ollama-holes-plugin: A typed-hole plugin that uses LLMs to generate valid hole-fits

[ compiler-plugin, development, library, mit ]

This package provides a GHC plugin that uses LLMs to generate valid hole-fits. It supports multiple backends, including Ollama, OpenAI, and Gemini.

The following flags are available:

To specify the model to use:

-fplugin-opt=GHC.Plugin.OllamaHoles:model=<model_name>

To include documentation in the LLM's context (not recommended for small models):

-fplugin-opt=GHC.Plugin.OllamaHoles:include-docs

To specify the backend to use (ollama, openai, or gemini):

-fplugin-opt=GHC.Plugin.OllamaHoles:backend=<backend_name>

When using the openai backend, you can specify a custom base_url, e.g.

-fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=api.groq.com/api 

You can also specify the name of the environment variable the API key is read from:

-fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY 

To specify how many fits to generate (passed to the model):

-fplugin-opt=GHC.Plugin.OllamaHoles:n=5

To enable debug output:

-fplugin-opt=GHC.Plugin.OllamaHoles:debug

For custom model options:

-fplugin-opt=GHC.Plugin.OllamaHoles:model-options=<json>

Where <json> is a JSON object containing your parameters, e.g.:

-fplugin-opt=GHC.Plugin.OllamaHoles:model-options={"num_ctx": 32000, "temperature": 1.0}
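For example, to set several of these options at once from a source file, you might write (a minimal sketch; the flags are the ones listed above, with the default model used as an illustration):

{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:backend=ollama #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=qwen3 #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:n=5 #-}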

For the Ollama backend, make sure you have the Ollama CLI installed and the model you want to use is available. You can install the Ollama CLI by following the instructions at https://ollama.com/download, and you can install the default model (qwen3) by running `ollama pull qwen3`.

For the OpenAI backend, you'll need to set the OPENAI_API_KEY environment variable with your API key.

For the Gemini backend, you'll need to set the GEMINI_API_KEY environment variable with your API key.
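Both keys are read from the environment; for example, in a POSIX shell (a sketch, where <your-api-key> is a placeholder for your actual key):

export OPENAI_API_KEY=<your-api-key>

and likewise for GEMINI_API_KEY.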

Note that the speed and quality of the hole-fits generated by the plugin depend on the model you use, and the default model requires a GPU to run efficiently. For a smaller model, we suggest `gemma3:4b-it-qat`, `phi4-mini-reasoning`, or `deepcoder:1.5b`, or one of the smaller qwen3 models, such as `qwen3:1.7b`, `qwen3:4b`, or even `qwen3:0.6b`, though results may vary.
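For example, to fetch one of the smaller models suggested above (any of the other names works the same way):

ollama pull qwen3:1.7b

and then select it with the model=<model_name> flag described earlier.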


Versions 0.1.0.0, 0.1.1.0, 0.1.2.0, 0.1.3.0, 0.1.4.0, 0.1.5.0, 0.1.5.1, 0.1.5.2, 0.1.5.3, 0.1.6.0
Change log CHANGELOG.md
Dependencies aeson (>=2.2 && <2.3), base (>=4.18 && <4.22), containers (>=0.6 && <0.8), ghc (>=9.6 && <9.14), modern-uri (>=0.3 && <0.4), ollama-haskell (>=0.1 && <0.2), req (>=3.13 && <3.14), text (>=2.1 && <2.2)
Tested with ghc >=9.6 && <9.7, ghc >=9.8 && <9.9, ghc >=9.10 && <9.11, ghc >=9.12 && <9.13
License MIT
Copyright 2025 (c) Matthias Pall Gissurarson
Author Matthias Pall Gissurarson <mpg@mpg.is>
Maintainer Matthias Pall Gissurarson <mpg@mpg.is>
Category Development, Compiler Plugin
Home page https://github.com/Tritlo/OllamaHoles
Source repo head: git clone git://github.com/Tritlo/OllamaHoles.git
Uploaded by tritlo at 2025-05-02T12:50:17Z
Downloads 10 total (10 in the last 30 days)
Status Docs uploaded by user
Build status unknown [no reports yet]

Readme for ollama-holes-plugin-0.1.6.0


Ollama Holes


Introduction

This is an example of a typed-hole plugin for GHC that uses Ollama to host a local LLM that fills in holes in Haskell code.

Before using this plugin, make sure you have the Ollama CLI installed and the model you want to use is available. You can install the Ollama CLI by following the instructions at https://ollama.com/download, and you can install the default model (qwen3) by running ollama pull qwen3.

Note that the speed and quality of the hole-fits generated by the plugin depend on the model you use, and the default model requires a GPU to run efficiently. For a smaller model, we suggest gemma3:4b-it-qat, phi4-mini-reasoning, or deepcoder:1.5b, or one of the smaller qwen3 models, such as qwen3:1.7b, qwen3:4b, or even qwen3:0.6b, though results may vary.

This plugin is also available on Hackage: https://hackage.haskell.org/package/ollama-holes-plugin

Installation

  1. Install Ollama.
  2. Install the qwen3 model (or any other model you prefer) using the following command:

ollama pull qwen3

  3. Clone this repository, navigate to its directory, and build the project using:

cabal build

  4. Build the example (the plugin runs at compile time, so building the Test target is what produces the hole-fits) using:

cabal build Test

  5. Enjoy! If you want to change the underlying model, make sure to pass the model name via the plugin arguments (see the Example section below, or the sketch right after this list).
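If you prefer configuring the plugin in a .cabal file rather than in source pragmas, the same flags can be passed through ghc-options (a minimal sketch; the flag spellings are the ones used elsewhere on this page, and the model name is just an illustration):

ghc-options: -fplugin=GHC.Plugin.OllamaHoles
             -fplugin-opt=GHC.Plugin.OllamaHoles:model=qwen3:14b

Any of the other -fplugin-opt options shown in this README can be added the same way.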

Example

Given

{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}

module Main where

import qualified Data.List as L

main :: IO ()
main = do let k = (_b :: [Int] -> [String])
          print (k [1,2,3])

We get the following output:

Main.hs:8:20: error: [GHC-88464]
    • Found hole: _b :: [Int] -> [String]
      Or perhaps ‘_b’ is mis-spelled, or not in scope
    • In the expression: _b :: [Int] -> [String]
      In an equation for ‘k’: k = (_b :: [Int] -> [String])
      In the expression:
        do let k = (_b :: [Int] -> [String])
           print (k [1, 2, ....])
    • Relevant bindings include
        k :: [Int] -> [String] (bound at Main.hs:8:15)
        main :: IO () (bound at Main.hs:8:1)
      Valid hole fits include
        map show
        pure . show
        fmap show
        L.map show
        (\xs -> map show xs)
  |
8 | main = do let k = (_b :: [Int] -> [String])
  |                    ^^

Guidance

We can also provide some guidance to the LLM by having an identifier in scope called _guide, defined as _guide = Proxy :: Proxy (Text "<guidance>").

Note that this requires importing GHC.TypeError and Data.Proxy, with the DataKinds extension enabled.

Given

{-# LANGUAGE DataKinds #-}
{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
-- Use a bigger model to make it better at following instructions
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=qwen3:14b #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:n=10 #-}

module Main where

import qualified Data.List as L
import Data.Proxy
import GHC.TypeError

main :: IO ()
main = do
  let _guide = Proxy :: Proxy (Text "The function should sort the list and then show each element")
  let k = (_b :: [Int] -> [String])
  print (k [1, 2, 3])

We get:

Main.hs:15:12: error: [GHC-88464]
    • Found hole: _b :: [Int] -> [String]
      Or perhaps ‘_b’ is mis-spelled, or not in scope
    • In the expression: _b :: [Int] -> [String]
      In an equation for ‘k’: k = (_b :: [Int] -> [String])
      In the expression:
        do let _guide = ...
           let k = (_b :: [Int] -> [String])
           print (k [1, 2, ....])
    • Relevant bindings include
        k :: [Int] -> [String] (bound at Main.hs:15:7)
        _guide :: Proxy
                    (Text
                       "The function should sort the list and then show each element")
          (bound at Main.hs:14:7)
        main :: IO () (bound at Main.hs:13:1)
      Valid hole fits include
        map show . L.sort
        \xs -> map show (L.sort xs)
        L.map show . L.sort
        \xs -> L.map show (L.sort xs)
        \xs -> [show x | x <- L.sort xs]
        \xs -> L.sort xs >>= \x -> [show x]
        (Some hole fits suppressed; use -fmax-valid-hole-fits=N or -fno-max-valid-hole-fits)
   |
15 |   let k = (_b :: [Int] -> [String])
   |            ^^

Including Documentation

You can also pass the -fplugin-opt=GHC.Plugin.OllamaHoles:include-docs flag, which will look up the Haddock documentation (if available) for the functions in scope and provide it to the LLM. For example, if Data.List is imported as L, the request to the LLM will include:

...
Documentation for `L.subsequences`:
 The 'subsequences' function returns the list of all subsequences of the argument.
Documentation for `L.tails`:
 \(\mathcal{O}(n)\). The 'tails' function returns all final segments of the
 argument, longest first.
Documentation for `L.transpose`:
 The 'transpose' function transposes the rows and columns of its argument.
...

Model Options

Using

-fplugin-opt=GHC.Plugin.OllamaHoles:model-options={\"num_ctx\": 32000, \"temperature\": 1.0}

you can pass further custom options to the model; here, for example, we increase the context length and set the temperature.

OpenAI and Gemini backends

The plugin now also supports the OpenAI and Gemini APIs for generating valid hole fits. Simply set the backend flag (-fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai or -fplugin-opt=GHC.Plugin.OllamaHoles:backend=gemini) and make sure you have OPENAI_API_KEY or GEMINI_API_KEY set in your environment.

To use any other OpenAI-compatible API (such as Groq or OpenRouter), set -fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai together with -fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=https://api.groq.com/openai and -fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY.
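Put together in source pragmas, a Groq setup might look like this (a minimal sketch; the base URL and key name are the values from above, and <model_name> stands for whatever model your provider serves):

{-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:backend=openai #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=https://api.groq.com/openai #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY #-}
{-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=<model_name> #-}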