langchain-hs-0.0.1.0: Haskell implementation of Langchain
Copyright	(c) 2025 Tushar Adhatrao
License	MIT
Maintainer	Tushar Adhatrao <tusharadhatrao@gmail.com>
Stability	experimental
Safe Haskell	Safe-Inferred
Language	Haskell2010

Langchain.LLM.Ollama

Description

Ollama implementation of LangChain's LLM interface, supporting:

  • Text generation
  • Chat interactions
  • Streaming responses
  • Callback integration

Example usage:

-- Create Ollama configuration
ollamaLLM = Ollama "llama3" [stdOutCallback]

-- Generate text
response <- generate ollamaLLM "Explain Haskell monads" Nothing
-- Right "Monads in Haskell..."

-- Chat interaction
let messages = UserMessage "What's the capital of France?" :| []
chatResponse <- chat ollamaLLM messages Nothing
-- Right "The capital of France is Paris."

-- Streaming
streamHandler = StreamHandler print (putStrLn "Done")
streamResult <- stream ollamaLLM messages streamHandler Nothing

Documentation

data Ollama Source #

Ollama LLM configuration. Contains:

  • Model name (e.g., "llama3:latest")
  • Callbacks for event tracking

Example:

>>> Ollama "nomic-embed" [logCallback]
Ollama "nomic-embed"

Constructors

Ollama 

Fields

Instances

Instances details
Show Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

LLM Ollama Source #

Ollama implementation of the LLM typeclass. Note: the Params argument is currently ignored (see TODOs).

Example instance usage:

-- Generate text with error handling
result <- generate ollamaLLM "Hello" Nothing
case result of
  Left err -> putStrLn $ "Error: " ++ err
  Right res -> print res
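
Putting the pieces above together, a minimal program might look like the following. This is a sketch based on the examples on this page: the import list, the StdOutCallback / stdOutCallback name, and the exact message constructors are assumptions about the library's API, not verified signatures.

{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty ((:|)))
import Langchain.LLM.Ollama (Ollama (..))

main :: IO ()
main = do
  -- Configure the model with a stdout callback, as in the examples above
  let ollamaLLM = Ollama "llama3" [stdOutCallback]

  -- One-shot generation: prompt plus the (currently ignored) Params argument
  genResult <- generate ollamaLLM "Explain Haskell monads" Nothing
  either (putStrLn . ("generate failed: " ++)) print genResult

  -- Chat takes a NonEmpty list of messages
  let messages = UserMessage "What's the capital of France?" :| []
  chatResult <- chat ollamaLLM messages Nothing
  either (putStrLn . ("chat failed: " ++)) print chatResult

Both generate and chat return in IO, so their Either results must be bound with <- (or eliminated with either) rather than pattern-matched directly on the unevaluated action.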
Instance details

Defined in Langchain.LLM.Ollama

Runnable Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

type RunnableInput Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama

type RunnableOutput Ollama Source # 
Instance details

Defined in Langchain.LLM.Ollama