Copyright | (c) 2025 Tushar Adhatrao |
---|---|
License | MIT |
Maintainer | Tushar Adhatrao <tusharadhatrao@gmail.com> |
Safe Haskell | Safe-Inferred |
Language | Haskell2010 |
Langchain.Runnable.Utils
Description
This module provides various utility wrappers for Runnable components that
enhance their behavior with common patterns like:
- Configuration management
- Result caching
- Automatic retries
- Timeout handling
These utilities follow the decorator pattern, wrapping existing Runnable
instances with additional functionality while preserving the original
input/output types.
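Because every wrapper is itself a Runnable, the wrappers compose freely. A minimal sketch, assuming the OpenAI runnable and default configuration used in the examples further down this page:

-- Hedged sketch: OpenAI and defaultOpenAIConfig are placeholders taken from
-- the examples below; the field names match the wrappers in this module.
let baseModel = OpenAI defaultOpenAIConfig
    resilientModel = Retry
      { retryRunnable = WithTimeout
          { timeoutRunnable = baseModel
          , timeoutMicroseconds = 30000000  -- 30 seconds per attempt
          }
      , maxRetries = 3
      , retryDelay = 1000000  -- 1 second between attempts
      }
result <- invoke resilientModel "Explain monads in Haskell"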
Note: This module is experimental and the API may change in future versions.
Synopsis
- data WithConfig config r = Runnable r => WithConfig {
    - configuredRunnable :: r
    - runnableConfig :: config
  }
- data Cached r = (Runnable r, Ord (RunnableInput r)) => Cached {
    - cachedRunnable :: r
    - cacheMap :: MVar (Map (RunnableInput r) (RunnableOutput r))
  }
- cached :: (Runnable r, Ord (RunnableInput r)) => r -> IO (Cached r)
- data Retry r = Runnable r => Retry {
    - retryRunnable :: r
    - maxRetries :: Int
    - retryDelay :: Int
  }
- data WithTimeout r = Runnable r => WithTimeout {
    - timeoutRunnable :: r
    - timeoutMicroseconds :: Int
  }
Configuration Management
data WithConfig config r Source #
Wrapper for Runnable components with configurable behavior.

This wrapper allows attaching configuration data to a Runnable instance. The
configuration data can be accessed and modified without changing the underlying
Runnable implementation.
Example:
data LLMConfig = LLMConfig
  { temperature :: Float
  , maxTokens :: Int
  }

let baseModel = OpenAI defaultOpenAIConfig
    configuredModel = WithConfig
      { configuredRunnable = baseModel
      , runnableConfig = LLMConfig 0.7 100
      }

-- Later, modify the configuration without changing the model
let updatedModel = configuredModel { runnableConfig = LLMConfig 0.9 150 }

-- Use the model as a regular Runnable
result <- invoke updatedModel "Explain monads in Haskell"
Constructors

Runnable r => WithConfig

Fields

- configuredRunnable :: r
- runnableConfig :: config
Instances
Runnable r => Runnable (WithConfig config r) Source #

  Make WithConfig a Runnable that applies the configuration.

  Defined in Langchain.Runnable.Utils

  Associated Types
    type RunnableInput (WithConfig config r) Source #
    type RunnableOutput (WithConfig config r) Source #

  Methods
    invoke :: WithConfig config r -> RunnableInput (WithConfig config r) -> IO (Either String (RunnableOutput (WithConfig config r))) Source #
    batch :: WithConfig config r -> [RunnableInput (WithConfig config r)] -> IO (Either String [RunnableOutput (WithConfig config r)]) Source #
    stream :: WithConfig config r -> RunnableInput (WithConfig config r) -> (RunnableOutput (WithConfig config r) -> IO ()) -> IO (Either String ()) Source #

type RunnableInput (WithConfig config r) Source #
  Defined in Langchain.Runnable.Utils

type RunnableOutput (WithConfig config r) Source #
  Defined in Langchain.Runnable.Utils
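Since the wrapper preserves the underlying input/output types, a WithConfig value is invoked exactly like the runnable it wraps. The following is a minimal sketch of a hypothetical pass-through helper, not the library source; the actual instance may also consult the configuration:

-- Hypothetical helper: the configuration travels with the runnable, and
-- invocation simply delegates to the wrapped component.
invokeConfigured
  :: Runnable r
  => WithConfig config r
  -> RunnableInput r
  -> IO (Either String (RunnableOutput r))
invokeConfigured (WithConfig r _config) input = invoke r input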
Caching
data Cached r Source #

Cache results of a Runnable to avoid duplicate computations.
This wrapper stores previously computed results in a thread-safe cache. When an input is encountered again, the cached result is returned instead of recomputing it, which can significantly improve performance for expensive operations or when the same inputs are frequently processed.
Note: The cached results are stored in-memory and will be lost when the program terminates. For persistent caching, consider implementing a custom wrapper that uses database storage.
The RunnableInput type must be an instance of Ord for map lookups.
Constructors

(Runnable r, Ord (RunnableInput r)) => Cached

Fields

- cachedRunnable :: r
- cacheMap :: MVar (Map (RunnableInput r) (RunnableOutput r))
Instances
(Runnable r, Ord (RunnableInput r)) => Runnable (Cached r) Source #

  Make Cached a Runnable that uses a cache.

  Defined in Langchain.Runnable.Utils

  Methods
    invoke :: Cached r -> RunnableInput (Cached r) -> IO (Either String (RunnableOutput (Cached r))) Source #
    batch :: Cached r -> [RunnableInput (Cached r)] -> IO (Either String [RunnableOutput (Cached r)]) Source #
    stream :: Cached r -> RunnableInput (Cached r) -> (RunnableOutput (Cached r) -> IO ()) -> IO (Either String ()) Source #

type RunnableInput (Cached r) Source #
  Defined in Langchain.Runnable.Utils

type RunnableOutput (Cached r) Source #
  Defined in Langchain.Runnable.Utils
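The lookup-then-insert behaviour described above can be pictured with the following hypothetical helper (cachedInvoke); it is a sketch of the idea, not the library source:

import Control.Concurrent.MVar (readMVar, modifyMVar_)
import qualified Data.Map as Map

-- Hypothetical helper sketching the cache logic: return a hit from the Map
-- held in the MVar, otherwise invoke the wrapped runnable and store the result.
cachedInvoke
  :: (Runnable r, Ord (RunnableInput r))
  => Cached r
  -> RunnableInput r
  -> IO (Either String (RunnableOutput r))
cachedInvoke (Cached r ref) input = do
  cache <- readMVar ref
  case Map.lookup input cache of
    Just out -> pure (Right out)          -- cache hit, no recomputation
    Nothing  -> do
      result <- invoke r input            -- cache miss, run the wrapped Runnable
      case result of
        Right out -> do
          modifyMVar_ ref (pure . Map.insert input out)
          pure (Right out)
        Left err -> pure (Left err)       -- failures are not cached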
cached :: (Runnable r, Ord (RunnableInput r)) => r -> IO (Cached r) Source #
Create a new cached Runnable.

This function initializes an empty cache and wraps the provided Runnable in a
Cached wrapper.
Example:
main = do
  -- Create a cached LLM to avoid redundant API calls
  let expensiveModel = OpenAI { model = "gpt-4", temperature = 0.7 }
  cachedModel <- cached expensiveModel

  -- These will all use the same cached result for identical inputs
  result1 <- invoke cachedModel "What is functional programming?"
  result2 <- invoke cachedModel "What is functional programming?"
  result3 <- invoke cachedModel "What is functional programming?"

  -- This will compute a new result
  result4 <- invoke cachedModel "What is Haskell?"
Resilience Patterns
data Retry r Source #

Add retry capability to any Runnable.
This wrapper automatically retries failed operations up to a specified number of times with a configurable delay between attempts. This is particularly useful for network operations or external API calls that might fail transiently.
Example:
-- Create an LLM with automatic retry for network failures
let baseModel = OpenAI defaultConfig
    resilientModel = Retry
      { retryRunnable = baseModel
      , maxRetries = 3
      , retryDelay = 1000000  -- 1 second delay between retries
      }

-- If the API call fails, it will retry up to 3 times
result <- invoke resilientModel "Generate a story about a Haskell programmer"
Constructors

Runnable r => Retry

Fields

- retryRunnable :: r
- maxRetries :: Int
- retryDelay :: Int
Instances
Runnable r => Runnable (Retry r) Source #

  Make Retry a Runnable that retries on failure.

  Defined in Langchain.Runnable.Utils

  Methods
    invoke :: Retry r -> RunnableInput (Retry r) -> IO (Either String (RunnableOutput (Retry r))) Source #
    batch :: Retry r -> [RunnableInput (Retry r)] -> IO (Either String [RunnableOutput (Retry r)]) Source #
    stream :: Retry r -> RunnableInput (Retry r) -> (RunnableOutput (Retry r) -> IO ()) -> IO (Either String ()) Source #

type RunnableInput (Retry r) Source #
  Defined in Langchain.Runnable.Utils

type RunnableOutput (Retry r) Source #
  Defined in Langchain.Runnable.Utils
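The retry behaviour described above amounts to a bounded loop with a delay between attempts. The following hypothetical helper (retryingInvoke) sketches that loop; it is not the library source:

import Control.Concurrent (threadDelay)

-- Hypothetical helper sketching the retry loop: re-run the wrapped Runnable
-- on Left results, waiting retryDelay microseconds between attempts, so
-- maxRetries = 3 allows one initial attempt plus up to three retries.
retryingInvoke
  :: Runnable r
  => Retry r
  -> RunnableInput r
  -> IO (Either String (RunnableOutput r))
retryingInvoke (Retry r retries delayUs) input = go retries
  where
    go attemptsLeft = do
      result <- invoke r input
      case result of
        Right out -> pure (Right out)
        Left err
          | attemptsLeft <= 0 -> pure (Left err)                    -- retries exhausted
          | otherwise -> threadDelay delayUs >> go (attemptsLeft - 1)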
data WithTimeout r Source #
Add timeout capability to any Runnable.

This wrapper enforces a maximum execution time for the wrapped Runnable. If the
operation takes longer than the specified timeout, it is cancelled and an error
is returned. This is useful for limiting the execution time of potentially
long-running operations.
Example:
-- Create an LLM with a 30-second timeout
let baseModel = OpenAI defaultConfig
    timeboxedModel = WithTimeout
      { timeoutRunnable = baseModel
      , timeoutMicroseconds = 30000000  -- 30 seconds
      }

-- If the API call takes longer than 30 seconds, it will be cancelled
result <- invoke timeboxedModel "Generate a detailed analysis of Haskell's type system"
Note: This implementation uses forkIO and killThread, which may not always
cleanly terminate the underlying operation, especially for certain types of I/O.
For critical applications, consider implementing a more robust timeout mechanism.
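The forkIO/killThread pattern mentioned in the note can be pictured with the following hypothetical helper (invokeWithTimeout); it is a sketch of the idea, not the library source:

import Control.Concurrent (forkIO, killThread, threadDelay)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Hypothetical helper: race the wrapped Runnable against a timer thread and
-- kill the worker if the timer wins.
invokeWithTimeout
  :: Runnable r
  => r
  -> Int                               -- timeout in microseconds
  -> RunnableInput r
  -> IO (Either String (RunnableOutput r))
invokeWithTimeout r timeoutUs input = do
  resultVar <- newEmptyMVar
  worker <- forkIO $ invoke r input >>= putMVar resultVar . Just
  _timer <- forkIO $ threadDelay timeoutUs >> putMVar resultVar Nothing
  outcome <- takeMVar resultVar
  case outcome of
    Just result -> pure result
    Nothing     -> do
      killThread worker                -- may not interrupt some blocking I/O
      pure (Left "Runnable timed out")

For a more robust alternative, System.Timeout.timeout from base provides a similar capability using asynchronous exceptions.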
Constructors

Runnable r => WithTimeout

Fields

- timeoutRunnable :: r
- timeoutMicroseconds :: Int
Instances
Runnable r => Runnable (WithTimeout r) Source #

  Make WithTimeout a Runnable that times out.

  Defined in Langchain.Runnable.Utils

  Associated Types
    type RunnableInput (WithTimeout r) Source #
    type RunnableOutput (WithTimeout r) Source #

  Methods
    invoke :: WithTimeout r -> RunnableInput (WithTimeout r) -> IO (Either String (RunnableOutput (WithTimeout r))) Source #
    batch :: WithTimeout r -> [RunnableInput (WithTimeout r)] -> IO (Either String [RunnableOutput (WithTimeout r)]) Source #
    stream :: WithTimeout r -> RunnableInput (WithTimeout r) -> (RunnableOutput (WithTimeout r) -> IO ()) -> IO (Either String ()) Source #

type RunnableInput (WithTimeout r) Source #
  Defined in Langchain.Runnable.Utils

type RunnableOutput (WithTimeout r) Source #
  Defined in Langchain.Runnable.Utils