| Copyright | (c) 2025 Tushar Adhatrao |
| --- | --- |
| License | MIT |
| Maintainer | Tushar Adhatrao <tusharadhatrao@gmail.com> |
| Stability | experimental |
| Safe Haskell | Safe-Inferred |
| Language | Haskell2010 |
Langchain.LLM.Core
Description
This module provides the core types and typeclasses for the Langchain library in Haskell, which is designed to facilitate interaction with language models (LLMs). It defines a standardized interface that allows different LLM implementations to be used interchangeably, promoting code reuse and modularity.
The main components include:

- The LLM typeclass, which defines the interface for language models.
- Data types such as Params for configuring model invocations, Message for conversation messages, and StreamHandler for handling streaming responses.
- Default values such as defaultParams and defaultMessageData for convenience.
This module is intended to be used as the foundation for building applications that interact with LLMs, providing a consistent API across different model implementations.
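To give a feel for how the pieces fit together, here is a minimal sketch; askOnce is a hypothetical helper (not part of this module) that wraps a single question in a one-element chat history and sends it with the default parameters:

```haskell
import Data.List.NonEmpty (NonEmpty (..))
import Data.Text (Text)
import Langchain.LLM.Core

-- Hypothetical helper: wrap a question in a one-message chat history
-- and send it to any LLM instance with the default parameters.
askOnce :: LLM m => m -> Text -> IO (Either String Text)
askOnce llm question =
  chat llm (msg :| []) (Just defaultParams)
  where
    msg = Message User question defaultMessageData
```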
Synopsis
- class LLM m where
- data Message = Message { role :: Role, content :: Text, messageData :: MessageData }
- data Role
- type ChatMessage = NonEmpty Message
- data MessageData = MessageData {}
- data Params = Params {}
- data StreamHandler = StreamHandler { onToken :: Text -> IO (), onComplete :: IO () }
- defaultParams :: Params
- defaultMessageData :: MessageData
LLM Typeclass
Typeclass defining the interface for language models. This provides methods for invoking the model, chatting with it, and streaming responses.
For example, a minimal test instance:

```haskell
data TestLLM = TestLLM
  { responseText :: Text
  , shouldSucceed :: Bool
  }

instance LLM TestLLM where
  generate m _ _ =
    pure $
      if shouldSucceed m
        then Right (responseText m)
        else Left "Test error"
```
And example usage with a concrete backend:

```haskell
ollamaLLM = Ollama "llama3.2:latest" [stdOutCallback]

response <- generate ollamaLLM "What is Haskell?" Nothing
```
Methods
generate Source #

Arguments

:: m | The language model instance. |
-> Text | The prompt to send to the model. |
-> Maybe Params | Optional configuration parameters. |
-> IO (Either String Text) | The result: either an error message or the generated text. |

Invoke the language model with a single prompt. Suitable for simple queries; returns either an error message or the generated text.
chat Source #

Arguments

:: m | The language model instance. |
-> ChatMessage | A non-empty list of messages to send to the model. |
-> Maybe Params | Optional configuration parameters. |
-> IO (Either String Text) | The result of the chat: either an error message or the response text. |

Chat with the language model using a sequence of messages. Suitable for multi-turn conversations; returns either an error or the response.
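A sketch of a multi-turn exchange; chatDemo is an illustrative name and the message contents are placeholders:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.Text.IO as TIO

chatDemo :: LLM m => m -> IO ()
chatDemo llm = do
  let msgs =
        Message System "You are terse." defaultMessageData
          :| [ Message User "Define a functor." defaultMessageData
             , Message Assistant "A structure you can map over." defaultMessageData
             , Message User "Give a Haskell example." defaultMessageData
             ]
  result <- chat llm msgs Nothing
  case result of
    Left err -> putStrLn ("Error: " ++ err)
    Right res -> TIO.putStrLn res
```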
stream :: m -> ChatMessage -> StreamHandler -> Maybe Params -> IO (Either String ()) Source #
Stream responses from the language model for a sequence of messages. Uses callbacks to process tokens in real-time; returns either an error or unit.
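A sketch of streaming to the console; streamDemo is illustrative, and only stream, StreamHandler, Message, and defaultMessageData come from this module:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.Text.IO as TIO

streamDemo :: LLM m => m -> IO ()
streamDemo llm = do
  let msgs = Message User "Tell me a joke." defaultMessageData :| []
      handler =
        StreamHandler
          { onToken = TIO.putStr -- print each token as it arrives
          , onComplete = putStrLn "\n[stream complete]"
          }
  result <- stream llm msgs handler Nothing
  either (putStrLn . ("Stream error: " ++)) pure result
```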
Instances
LLM Ollama Source #

Ollama implementation of the LLM typeclass. Note: the Params argument is currently ignored (see TODOs).

Example instance usage:

```haskell
import qualified Data.Text.IO as TIO

-- Generate text with error handling
result <- generate ollamaLLM "Hello" Nothing
case result of
  Left err -> putStrLn $ "Error: " ++ err
  Right res -> TIO.putStrLn res
```
LLM OpenAI Source #
Parameters
data Message Source #

Represents a message in a conversation, including the sender's role, content, and additional metadata. See https://python.langchain.com/docs/concepts/messages/
Example usage:

```haskell
userMsg :: Message
userMsg = Message
  { role = User
  , content = "Explain functional programming"
  , messageData = defaultMessageData
  }
```
Constructors

Message | |

Fields

- role :: Role | The role of the message sender. |
- content :: Text | The textual content of the message. |
- messageData :: MessageData | Additional metadata for the message. |
data Role Source #

Enumeration of possible roles in a conversation.
Constructors
System | System role, typically for instructions or context |
User | User role, for user inputs |
Assistant | Assistant role, for model responses |
Tool | Tool role, for tool outputs or interactions |
Instances
FromJSON Role Source # | |
Defined in Langchain.LLM.Core | |
ToJSON Role Source # | |
Generic Role Source # | |
Show Role Source # | |
Eq Role Source # | |
type Rep Role Source # | |
Defined in Langchain.LLM.Core type Rep Role = D1 ('MetaData "Role" "Langchain.LLM.Core" "langchain-hs-0.0.1.0-inplace" 'False) ((C1 ('MetaCons "System" 'PrefixI 'False) (U1 :: Type -> Type) :+: C1 ('MetaCons "User" 'PrefixI 'False) (U1 :: Type -> Type)) :+: (C1 ('MetaCons "Assistant" 'PrefixI 'False) (U1 :: Type -> Type) :+: C1 ('MetaCons "Tool" 'PrefixI 'False) (U1 :: Type -> Type))) |
type ChatMessage = NonEmpty Message Source #
Type alias for a non-empty list of Messages, ensuring that a chat history always contains at least one message.
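For example, a two-message history; the NonEmpty constructor (:|) makes an empty chat unrepresentable:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import Data.List.NonEmpty (NonEmpty (..))

history :: ChatMessage
history =
  Message System "You are a helpful assistant." defaultMessageData
    :| [Message User "What is a monoid?" defaultMessageData]
```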
data MessageData Source #
Additional data for a message, such as a name or tool calls. This type is designed for extensibility, allowing new fields to be added without breaking changes. Use defaultMessageData for typical usage.
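As an illustration only: the sketch below assumes MessageData exposes a selector name :: Maybe Text (the description above mentions a name and tool calls); check the actual record definition before relying on it.

```haskell
{-# LANGUAGE OverloadedStrings #-}

-- Hypothetical: assumes a selector `name :: Maybe Text` on MessageData.
namedMsg :: Message
namedMsg = Message
  { role = User
  , content = "Summarise this thread."
  , messageData = defaultMessageData { name = Just "alice" }
  }
```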
Constructors
MessageData | |
Instances
FromJSON MessageData Source # | JSON deserialization for MessageData. |
Defined in Langchain.LLM.Core | |
ToJSON MessageData Source # | JSON serialization for MessageData. |
Defined in Langchain.LLM.Core | |
Show MessageData Source # | |
Defined in Langchain.LLM.Core | |
Eq MessageData Source # | |
Defined in Langchain.LLM.Core |
data Params Source #

Parameters for configuring language model invocations. These parameters control aspects such as randomness, length, and stopping conditions of generated output. This type corresponds to the standard parameters in Python Langchain: https://python.langchain.com/docs/concepts/chat_models/#standard-parameters
Example usage:
```haskell
myParams :: Params
myParams = defaultParams
  { temperature = Just 0.7
  , maxTokens = Just 100
  }
```
Constructors

Params | |
data StreamHandler Source #
Callbacks for handling streaming responses from a language model. This allows real-time processing of tokens as they are generated and an action upon completion.
Example usage:

```haskell
import qualified Data.Text as T

printHandler :: StreamHandler
printHandler = StreamHandler
  { onToken = \token -> putStrLn ("Token: " ++ T.unpack token)
  , onComplete = putStrLn "Streaming complete"
  }
```
Constructors

StreamHandler | |

Fields

- onToken :: Text -> IO () | Callback invoked for each generated token. |
- onComplete :: IO () | Action run when streaming completes. |
Default Values
defaultParams :: Params Source #
Default parameters with all fields set to Nothing. Use this when no specific configuration is needed for the language model.
>>> generate myLLM "Hello" (Just defaultParams)
defaultMessageData :: MessageData Source #
Default message data with all fields set to Nothing. Use this for standard messages without additional metadata.
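For example, a plain user message carrying no extra metadata:

```haskell
{-# LANGUAGE OverloadedStrings #-}

plainMsg :: Message
plainMsg = Message User "Hi there!" defaultMessageData
```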