This package provides a GHC plugin that uses LLMs to generate valid hole-fits.
It supports multiple backends including Ollama, OpenAI, and Gemini.
The following flags are available:
To specify the model to use:
-fplugin-opt=GHC.Plugin.OllamaHoles:model=<model_name>
To include documentation in the LLM's context (not recommended for small models):
-fplugin-opt=GHC.Plugin.OllamaHoles:include-docs
To specify the backend to use (ollama, openai, or gemini):
-fplugin-opt=GHC.Plugin.OllamaHoles:backend=<backend_name>
When using the openai backend, you can specify a custom base_url, e.g.
-fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=api.groq.com/api
You can also specify which environment variable holds the API key:
-fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY
To specify how many fits to generate (passed to the model):
-fplugin-opt=GHC.Plugin.OllamaHoles:n=5
To enable debug output:
-fplugin-opt=GHC.Plugin.OllamaHoles:debug
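Taken together, these flags typically live in the source file's pragmas. A sketch (the backend, model name, and fit count are just example values):

```haskell
{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:backend=ollama #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=gemma3:27b #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:n=5 #-}
module Example where
```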
For the Ollama backend, make sure you have the Ollama CLI installed and the model
you want to use is available. You can install the Ollama CLI by following the
instructions at https://ollama.com/download,
and you can install the default model (gemma3:27b) by running `ollama pull gemma3:27b`.
For the OpenAI backend, you'll need to set the OPENAI_API_KEY environment variable with your API key.
For the Gemini backend, you'll need to set the GEMINI_API_KEY environment variable with your API key.
Note that the speed and quality of the hole-fits generated by the plugin depend on
the model you use, and the default model requires a GPU to run efficiently.
For a smaller model, we suggest `gemma3:4b-it-qat`, or `deepcoder:1.5b`.
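The suggested smaller models can be fetched the same way as the default one (these commands assume the Ollama CLI is already installed):

```shell
ollama pull gemma3:4b-it-qat
ollama pull deepcoder:1.5b
```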
Ollama Holes

Introduction
This is an example of a typed-hole plugin for GHC that uses Ollama to host a local LLM and fill in holes in Haskell code.
Before using this plugin, make sure you have the Ollama CLI installed and the model you want to use is available.
You can install the Ollama CLI by following the instructions at https://ollama.com/download,
and you can install the default model (gemma3:27b) by running `ollama pull gemma3:27b`.
Note that the speed and quality of the hole-fits generated by the plugin depend on
the model you use, and the default model requires a GPU to run efficiently.
For a smaller model, we suggest `gemma3:4b-it-qat` or `deepcoder:1.5b`.
This plugin is also available on Hackage: https://hackage.haskell.org/package/ollama-holes-plugin
Example
Given
{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=gemma3:27b #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:n=5 #-}
module Main where
import Data.List
main :: IO ()
main = do let k = (_b :: [Int] -> [String])
          print (k [1,2,3])
We get the following output:
Main.hs:12:20: error: [GHC-88464]
    • Found hole: _b :: [Int] -> [String]
      Or perhaps ‘_b’ is mis-spelled, or not in scope
    • In the expression: _b :: [Int] -> [String]
      In an equation for ‘k’: k = (_b :: [Int] -> [String])
      In the expression:
        do let k = (_b :: [Int] -> [String])
           print (k [1, 2, ....])
    • Relevant bindings include
        k :: [Int] -> [String] (bound at Main.hs:12:15)
        main :: IO () (bound at Main.hs:12:1)
      Valid hole fits include
        map show
        Prelude.map show
        (\xs -> map show xs)
        (\xs -> [show x | x <- xs])
        (\xs -> concatMap (return . show) xs)
   |
12 | main = do let k = (_b :: [Int] -> [String])
   |                    ^^
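Any of the suggested fits can be pasted in place of the hole. For instance, replacing `_b` with the first fit gives a complete program:

```haskell
module Main where

-- The hole _b replaced by the suggested fit `map show`.
main :: IO ()
main = do
    let k = map show :: [Int] -> [String]
    print (k [1, 2, 3])
-- prints ["1","2","3"]
```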
Guidance
We can also provide some guidance to the LLM by having an identifier in scope called `_guide`,
defined as `_guide = Proxy :: Proxy (Text "<guidance>")`.
Note that this requires GHC.TypeError and Data.Proxy, with DataKinds enabled.
Given
{-# LANGUAGE DataKinds #-}
{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=gemma3:27b #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:n=5 #-}
module Main where
import qualified Data.List as L
import GHC.TypeError
import Data.Proxy
main :: IO ()
main = do let _guide = Proxy :: Proxy (Text "The function should take the list, sort it, and then print each integer.")
          let k = (_b :: [Int] -> [String])
          print (k [1,2,3])
We get:
Main.hs:16:20: error: [GHC-88464]
    • Found hole: _b :: [Int] -> [String]
      Or perhaps ‘_b’ is mis-spelled, or not in scope
    • In the expression: _b :: [Int] -> [String]
      In an equation for ‘k’: k = (_b :: [Int] -> [String])
      In the expression:
        do let _guide = ...
           let k = (_b :: [Int] -> [String])
           print (k [1, 2, ....])
    • Relevant bindings include
        k :: [Int] -> [String] (bound at Main.hs:16:15)
        _guide :: Proxy
                    (Text
                       "The function should take the list, sort it, and then print each integer.")
          (bound at Main.hs:15:15)
        main :: IO () (bound at Main.hs:15:1)
   |
16 |           let k = (_b :: [Int] -> [String])
   |                     ^^
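The guided fits themselves are not shown above, but with this `_guide` in scope the model is steered toward fits that also sort the list, i.e. something along the lines of `map show . L.sort` (a hypothetical fit for illustration, not taken from the plugin's output):

```haskell
module Main where

import qualified Data.List as L

-- Hypothetical guided fit: sort the list, then show each element.
main :: IO ()
main = do
    let k = map show . L.sort :: [Int] -> [String]
    print (k [3, 1, 2])
-- prints ["1","2","3"]
```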
Including Documentation
You can also pass the -fplugin-opt=GHC.Plugin.OllamaHoles:include-docs flag,
which will look up the Haddock documentation (if available) for the functions in scope
and provide it to the LLM. E.g. if Data.List is imported as L, the request to the
LLM will include:
...
Documentation for `L.subsequences`:
The 'subsequences' function returns the list of all subsequences of the argument.
Documentation for `L.tails`:
\(\mathcal{O}(n)\). The 'tails' function returns all final segments of the
argument, longest first.
Documentation for `L.transpose`:
The 'transpose' function transposes the rows and columns of its argument.
...
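The excerpt above is ordinary Haddock text, so, assuming the plugin can locate Haddock documentation for your own modules, the same mechanism applies to doc comments you write yourself. A sketch:

```haskell
module MyLib where

-- | The 'pairwise' function returns adjacent pairs of its argument.
-- With include-docs enabled, comments like this one can be forwarded
-- to the LLM whenever 'pairwise' is in scope at a hole.
pairwise :: [a] -> [(a, a)]
pairwise xs = zip xs (drop 1 xs)
```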
Installation
- Install Ollama
- Install the gemma3:27b model (or any other model you prefer) using the following command:
  ollama pull gemma3:27b
- Clone this repository, navigate to the directory, and build the project using:
  cabal build
- Run the example using:
  cabal build Test
- Enjoy! If you want to change the underlying model, make sure to pass the model name via the plugin arguments (see example above).
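As a shell session, the steps above look roughly like this (the repository URL and directory name are placeholders, and `ollama`, `git`, and `cabal` are assumed to be installed):

```shell
ollama pull gemma3:27b     # fetch the default model
git clone <this-repository>
cd <repository-directory>
cabal build                # build the plugin
cabal build Test           # compile the example; hole fits print at compile time
```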
OpenAI and Gemini backends
The plugin now supports using the OpenAI and Gemini APIs to generate valid hole fits.
Simply set the backend flag -fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai
or -fplugin-opt=GHC.Plugin.OllamaHoles:backend=gemini, and make sure that
you have OPENAI_API_KEY or GEMINI_API_KEY set in your environment.
To use any other OpenAI-compatible API (like Groq or OpenRouter), simply set
-fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai,
-fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=https://api.groq.com/openai, and
-fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY.
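Putting the Groq example together, the flags live in the source file and the key in the environment (the model name here is only an illustration; pick one your provider actually serves):

```haskell
{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=https://api.groq.com/openai #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=llama-3.3-70b-versatile #-}
module Main where
```

with `GROQ_API_KEY` exported in the environment before running `cabal build`.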