Safe Haskell | Safe-Inferred |
---|---|
Language | Haskell2010 |
Data.Ollama.Chat
Contents
Synopsis
- chat :: ChatOps -> IO (Either String ChatResponse)
- chatJson :: (FromJSON jsonResult, ToJSON jsonResult) => ChatOps -> jsonResult -> Maybe Int -> IO (Either String jsonResult)
- data Message = Message {}
- data Role
- defaultChatOps :: ChatOps
- data ChatOps = ChatOps {}
- data ChatResponse = ChatResponse {}
- data Format
- schemaFromType :: ToJSON a => a -> ByteString
Chat APIs
chat :: ChatOps -> IO (Either String ChatResponse) Source #
Initiates a chat session with the specified ChatOps
configuration and returns either
a ChatResponse
or an error message.
This function sends a request to the Ollama chat API with the given options.
Example:
let ops = defaultChatOps
result <- chat ops
case result of
  Left errorMsg -> putStrLn ("Error: " ++ errorMsg)
  Right response -> print response
To request a JSON format response:
let ops = defaultChatOps { format = Just JsonFormat }
result <- chat ops
To request a structured output with a JSON schema:
import Data.Aeson (object, (.=))

let ops = defaultChatOps { format = Just (SchemaFormat schema) }
result <- chat ops
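The `schema` value above is left undefined. One way to build such a schema `Value` with aeson — a sketch mirroring the example shape shown for `SchemaFormat` below; the field names are illustrative, not part of this API — is:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Aeson (Value, decode, encode, object, (.=))
import qualified Data.ByteString.Lazy.Char8 as BSL

-- A JSON Schema describing the object we want the model to return:
-- an object with required "age" (integer) and "available" (boolean)
-- fields, mirroring the SchemaFormat example in this module.
schema :: Value
schema = object
  [ "type" .= ("object" :: String)
  , "properties" .= object
      [ "age"       .= object ["type" .= ("integer" :: String)]
      , "available" .= object ["type" .= ("boolean" :: String)]
      ]
  , "required" .= (["age", "available"] :: [String])
  ]

main :: IO ()
main = BSL.putStrLn (encode schema)
```

A `Value` built this way can then be passed as `format = Just (SchemaFormat schema)` in the `ChatOps` record.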
chatJson Source #
Arguments
:: (FromJSON jsonResult, ToJSON jsonResult) | |
=> ChatOps | |
-> jsonResult | Haskell type that you want your result in |
-> Maybe Int | Max retries |
-> IO (Either String jsonResult) |
chatJson is a higher-level function that takes ChatOps (similar to chat) along with a Haskell type (one that has ToJSON and FromJSON instances) and returns the response decoded into that type.
This function simply calls chat with an extra prompt appended, telling the LLM to return the response in a certain JSON format, and then deserializes the response. It is helpful when you want to use the LLM to do something programmatic.
Note: This function predates the format parameter in the API. For new code, consider using
the format
parameter with a SchemaFormat instead, which leverages the model's native
JSON output capabilities.
For example:

let expectedJsonStructure = Example
      { sortedList = ["sorted List here"]
      , wasListAlreadySorted = False
      }
let msg0 = Ollama.Message User "Sort given list: [4, 2, 3, 67]. Also tell whether the list was already sorted or not." Nothing Nothing
eRes3 <-
  chatJson
    defaultChatOps
      { Chat.chatModelName = "llama3.2"
      , Chat.messages = msg0 :| []
      }
    expectedJsonStructure
    (Just 2)
print eRes3

Output:

Example {sortedList = ["1","2","3","4"], wasListAlreadySorted = False}
Note: While passing the type, construct the value so that it helps the LLM understand each field. For example, in the example above the sortedList field is given the placeholder value "sorted List here", which gives the LLM more context.
You can also provide a number of retries in case the LLM fails to return the response in correct JSON on the first attempt.
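The `Example` type used in the chatJson example is not defined in this module. A minimal definition with the required instances derived via Generic — only the shape matters; the names mirror the example, and the placeholder field values act as hints to the model — might look like:

```haskell
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE DeriveAnyClass #-}
import Data.Aeson (FromJSON, ToJSON, decode, encode)
import GHC.Generics (Generic)

-- The shape we want chatJson to fill in.
data Example = Example
  { sortedList :: [String]
  , wasListAlreadySorted :: Bool
  } deriving (Show, Eq, Generic, FromJSON, ToJSON)

-- Placeholder contents hint the model at what each field should hold.
expectedJsonStructure :: Example
expectedJsonStructure = Example
  { sortedList = ["sorted List here"]
  , wasListAlreadySorted = False
  }

main :: IO ()
main =
  -- Round-trip through JSON to confirm the derived instances agree.
  print (decode (encode expectedJsonStructure) == Just expectedJsonStructure)
  -- prints True
```

A value of this type is then passed as the second argument to chatJson, as in the example above.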
Represents a message within a chat, including its role and content.
Constructors
Message | |
Fields |
Instances
FromJSON Message Source # | |
Defined in Data.Ollama.Chat | |
ToJSON Message Source # | |
Generic Message Source # | |
Show Message Source # | |
Eq Message Source # | |
type Rep Message Source # | |
Defined in Data.Ollama.Chat type Rep Message = D1 ('MetaData "Message" "Data.Ollama.Chat" "ollama-haskell-0.1.3.0-inplace" 'False) (C1 ('MetaCons "Message" 'PrefixI 'True) ((S1 ('MetaSel ('Just "role") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedLazy) (Rec0 Role) :*: S1 ('MetaSel ('Just "content") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedLazy) (Rec0 Text)) :*: (S1 ('MetaSel ('Just "images") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedLazy) (Rec0 (Maybe [Text])) :*: S1 ('MetaSel ('Just "tool_calls") 'NoSourceUnpackedness 'NoSourceStrictness 'DecidedLazy) (Rec0 (Maybe [Value]))))) |
Enumerated roles that can participate in a chat.
defaultChatOps :: ChatOps Source #
A default configuration for initiating a chat with a model. This can be used as a starting point and modified as needed.
Example:
let ops = defaultChatOps { chatModelName = "customModel" }
chat ops
Constructors
ChatOps | |
Fields
|
data ChatResponse Source #
Constructors
ChatResponse | |
Fields
|
Instances
FromJSON ChatResponse Source # | |
Defined in Data.Ollama.Chat | |
Show ChatResponse Source # | |
Defined in Data.Ollama.Chat Methods showsPrec :: Int -> ChatResponse -> ShowS # show :: ChatResponse -> String # showList :: [ChatResponse] -> ShowS # | |
Eq ChatResponse Source # | |
Defined in Data.Ollama.Chat |
data Format Source #

Format specification for the chat output.

E.g. SchemaFormat with the schema:

{
  "type": "object",
  "properties": {
    "age": { "type": "integer" },
    "available": { "type": "boolean" }
  },
  "required": ["age", "available"]
}

Since: 0.1.3.0
Constructors
JsonFormat | |
SchemaFormat Value |
schemaFromType :: ToJSON a => a -> ByteString Source #
Helper function to create a JSON schema from a Haskell type
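The exact output of schemaFromType is not documented here. As a rough, hand-rolled illustration of the idea — mapping a sample value's JSON encoding to a schema skeleton — the following sketch shows one way such a helper could work. This is not the library's implementation, and it assumes aeson >= 2 for Data.Aeson.KeyMap:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Aeson
import qualified Data.Aeson.KeyMap as KM
import qualified Data.ByteString.Lazy.Char8 as BSL

-- Map a JSON value to a JSON-Schema "type" skeleton. This only
-- sketches the idea behind schemaFromType; the real function may
-- differ in detail.
sketchSchema :: Value -> Value
sketchSchema v = case v of
  Object o -> object
    [ "type" .= ("object" :: String)
    , "properties" .= Object (KM.map sketchSchema o)
    , "required" .= KM.keys o
    ]
  Array _  -> object ["type" .= ("array" :: String)]
  String _ -> object ["type" .= ("string" :: String)]
  Number _ -> object ["type" .= ("number" :: String)]
  Bool _   -> object ["type" .= ("boolean" :: String)]
  Null     -> object ["type" .= ("null" :: String)]

main :: IO ()
main = BSL.putStrLn (encode (sketchSchema (object ["age" .= (30 :: Int)])))
```

A schema produced along these lines is what SchemaFormat expects as its Value argument.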