ollama-haskell: Haskell client for Ollama.

[ library, mit, ollama, web ]
Versions 0.1.0.0, 0.1.0.1, 0.1.0.2, 0.1.0.3, 0.1.1.3, 0.1.2.0, 0.1.3.0, 0.2.0.0
Change log CHANGELOG.md
Dependencies aeson (>=2 && <3), base (>=4.7 && <5), base64-bytestring (>=1 && <2), bytestring (>=0.10 && <0.13), containers (>=0.6 && <0.9), directory (>=1 && <1.4), filepath (>=1 && <1.6), http-client (>=0.6 && <0.8), http-client-tls (>=0.2 && <0.4), http-types (>=0.7 && <0.13), mtl (>=2 && <3), stm (>=2 && <3), text (>=1 && <3), time (>=1 && <2)
License MIT
Copyright 2024 tushar
Author tushar
Maintainer tusharadhatrao@gmail.com
Category Web
Home page https://github.com/tusharad/ollama-haskell#readme
Bug tracker https://github.com/tusharad/ollama-haskell/issues
Source repo head: git clone https://github.com/tusharad/ollama-haskell
Uploaded by tusharad at 2025-06-05T17:09:39Z
Distributions LTSHaskell:0.1.3.0, Stackage:0.1.3.0
Reverse Dependencies 2 direct, 0 indirect
Downloads 212 total (13 in the last 30 days)
Rating 2.0 (votes: 1) [estimated by Bayesian average]
Status Docs available
Last success reported on 2025-06-05

Readme for ollama-haskell-0.2.0.0


🦙 Ollama Haskell

ollama-haskell is an unofficial Haskell client for Ollama, inspired by ollama-python. It enables interaction with locally running LLMs through the Ollama HTTP API, directly from Haskell.


✨ Features

  • 💬 Chat with models
  • ✍️ Text generation (with streaming)
  • ✅ Chat with structured messages and tools
  • 🧠 Embeddings
  • 🧰 Model management (list, pull, push, show, delete; see the sketch just after this list)
  • 🗃️ In-memory conversation history
  • ⚙️ Configurable timeouts, retries, and streaming handlers
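
As a taste of the model management API, here is a minimal sketch that prints the locally installed models. The names it assumes (list taking an optional client config and returning Either an error or a Models wrapper of ModelInfo records with a name field) follow the 0.2.x API as best understood; confirm them against the Data.Ollama.List Haddocks before relying on this.

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Ollama.List
import qualified Data.Text.IO as T

-- ASSUMPTION: 'list', 'Models', 'ModelInfo', and its 'name'
-- field are taken from the 0.2.x API; check the Haddocks.
main :: IO ()
main = do
  eModels <- list Nothing -- 'Nothing' = default client config
  case eModels of
    Left err -> putStrLn $ "Could not list models: " ++ show err
    Right (Models infos) -> mapM_ (T.putStrLn . name) infos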

⚡ Quick Example

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "What is the meaning of life?"
          }
  -- The second argument is an optional client configuration;
  -- 'Nothing' uses the defaults (a locally running Ollama).
  eRes <- generate ops Nothing
  case eRes of
    Left err -> putStrLn $ "Something went wrong: " ++ show err
    Right r -> do
      putStr "LLM response: "
      T.putStrLn (genResponse r)
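
The same generate call can also stream. A minimal sketch, assuming the 0.2.x GenerateOps record exposes a stream :: Maybe (GenerateResponse -> IO ()) handler field (earlier 0.1.x versions used a slightly different stream type, so check the Haddocks for your version):

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Ollama.Generate
import qualified Data.Text.IO as T

-- Each chunk is printed as it arrives; 'generate' still returns
-- the final response on completion.
main :: IO ()
main = do
  let ops =
        defaultGenerateOps
          { modelName = "gemma3"
          , prompt = "Write a haiku about llamas."
          -- ASSUMPTION: handler-only stream field, per the 0.2.x API
          , stream = Just (\chunk -> T.putStr (genResponse chunk))
          }
  _ <- generate ops Nothing
  putStrLn ""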

📦 Installation

Add to your .cabal file:

build-depends:
  base >=4.7 && <5,
  ollama-haskell

Or use it with stack or nix-shell.
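
With stack, if your resolver's snapshot does not include the version you want, you can pin one as an extra-dep (the version below is only an example):

# stack.yaml
extra-deps:
  - ollama-haskell-0.2.0.0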


📚 More Examples

See examples/OllamaExamples.hs for:

  • Chat with conversation memory (sketched just after this list)
  • Structured JSON output
  • Embeddings
  • Tool/function calling
  • Multimodal input
  • Streaming and non-streaming variants
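
To give a flavor of the conversation-memory pattern, here is a sketch of a two-turn chat. Every name it leans on (chat, defaultChatOps, chatModelName, messages, userMessage, and the response's message/content fields) is an assumption based on the 0.2.x Data.Ollama.Chat module; verify them against the Haddocks before relying on this.

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NE
import Data.Ollama.Chat
import qualified Data.Text.IO as T

-- ASSUMPTION: all chat-related names follow the 0.2.x API.
-- "Memory" is simply the growing message list we resend.
main :: IO ()
main = do
  let history = userMessage "My name is Ada." :| []
  eRes <- chat defaultChatOps { chatModelName = "gemma3", messages = history } Nothing
  case eRes of
    Left err -> putStrLn $ "Chat failed: " ++ show err
    Right res -> case message res of
      Nothing -> putStrLn "No assistant message in the response"
      Just reply -> do
        T.putStrLn (content reply)
        -- Second turn: resend the history plus the reply and a follow-up.
        let history' = NE.fromList (NE.toList history ++ [reply, userMessage "What is my name?"])
        eRes' <- chat defaultChatOps { chatModelName = "gemma3", messages = history' } Nothing
        case eRes' of
          Left err -> putStrLn $ "Chat failed: " ++ show err
          Right res' -> mapM_ (T.putStrLn . content) (message res')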

🛠 Prerequisite

Make sure you have Ollama installed and running locally. Run ollama pull llama3 to download a model.


🧪 Dev & Nix Support

Use Nix:

nix-shell

This will install stack and Ollama.


👨‍💻 Author

Created and maintained by @tusharad. PRs and feedback are welcome!


๐Ÿค Contributing

Have ideas or improvements? Feel free to open an issue or submit a PR!