cabal-version: 3.4
-- The cabal-version field refers to the version of the .cabal specification,
-- and can be different from the cabal-install (the tool) version and the
-- Cabal (the library) version you are using. As such, the Cabal (the library)
-- version used must be equal or greater than the version stated in this field.
-- Starting from the specification version 2.2, the cabal-version field must be
-- the first thing in the cabal file.

-- Initial package description 'OllamaHoles' generated by
-- 'cabal init'. For further documentation, see:
--   http://haskell.org/cabal/users-guide/
--
-- The name of the package.
name: ollama-holes-plugin

-- The package version.
-- See the Haskell package versioning policy (PVP) for standards
-- guiding when and how versions should be incremented.
-- https://pvp.haskell.org
-- PVP summary:     +-+------- breaking API changes
--                  | | +----- non-breaking API additions
--                  | | | +--- code changes with no API change
version: 0.1.6.0

-- A short (one-line) description of the package.
synopsis: A typed-hole plugin that uses LLMs to generate valid hole-fits

-- A longer description of the package.
description:
    This package provides a GHC plugin that uses LLMs to generate valid
    hole-fits. It supports multiple backends, including Ollama, OpenAI,
    and Gemini.

    The following flags are available:

    To specify the model to use:

    > -fplugin-opt=GHC.Plugin.OllamaHoles:model=<model>

    To include documentation in the LLM's context (not recommended for small models):

    > -fplugin-opt=GHC.Plugin.OllamaHoles:include-docs

    To specify the backend to use (ollama, openai, or gemini):

    > -fplugin-opt=GHC.Plugin.OllamaHoles:backend=<backend>

    When using the openai backend, you can specify a custom base_url, e.g.

    > -fplugin-opt=GHC.Plugin.OllamaHoles:openai_base_url=api.groq.com/api

    You can also specify which environment variable the API key is read from:

    > -fplugin-opt=GHC.Plugin.OllamaHoles:openai_key_name=GROQ_API_KEY

    To specify how many fits to generate (passed to the model):

    > -fplugin-opt=GHC.Plugin.OllamaHoles:n=5

    To enable debug output:

    > -fplugin-opt=GHC.Plugin.OllamaHoles:debug

    For custom model options:

    > -fplugin-opt=GHC.Plugin.OllamaHoles:model-options=<json>

    Where json is an object with your parameters, e.g.:

    > -fplugin-opt=GHC.Plugin.OllamaHoles:model-options={"num_ctx": 32000, "temperature": 1.0}

    For the Ollama backend, make sure you have the Ollama CLI installed and
    that the model you want to use is available. You can install the Ollama
    CLI by following the instructions at https://ollama.com/download, and you
    can install the default model (qwen3) by running `ollama pull qwen3`.

    For the OpenAI backend, you'll need to set the `OPENAI_API_KEY`
    environment variable to your API key.

    For the Gemini backend, you'll need to set the `GEMINI_API_KEY`
    environment variable to your API key.

    Note that the speed and quality of the hole-fits generated by the plugin
    depend on the model you use, and the default model requires a GPU to run
    efficiently. For a smaller model, we suggest `gemma3:4b-it-qat`,
    `phi4-mini-reasoning`, or `deepcoder:1.5b`, or one of the smaller `qwen3`
    models, such as `qwen3:1.7b`, `qwen3:4b`, or even `qwen3:0.6b`, though
    results may vary.

-- The license under which the package is released.
license: MIT

-- The file containing the license text.
license-file: LICENSE

-- The package author(s).
author: Matthias Pall Gissurarson

-- An email address to which users can send suggestions, bug reports, and patches.
maintainer: Matthias Pall Gissurarson

-- A copyright notice.
copyright: 2025 (c) Matthias Pall Gissurarson
category: Development, Compiler Plugin
tested-with: GHC == 9.6.*, GHC == 9.8.*, GHC == 9.10.*, GHC == 9.12.*
build-type: Simple

-- Extra doc files to be distributed with the package, such as a CHANGELOG or a README.
extra-doc-files: CHANGELOG.md
                 README.md

homepage: https://github.com/Tritlo/OllamaHoles

-- Extra source files to be distributed with the package, such as examples, or a tutorial module.
-- extra-source-files:

source-repository head
    type: git
    location: git://github.com/Tritlo/OllamaHoles.git

common warnings
    ghc-options: -Wall

library
    -- Import common warning flags.
    import: warnings

    -- Modules exported by the library.
    exposed-modules: GHC.Plugin.OllamaHoles

    -- Modules included in this library but not exported.
    other-modules: GHC.Plugin.OllamaHoles.Backend
                 , GHC.Plugin.OllamaHoles.Backend.Ollama
                 , GHC.Plugin.OllamaHoles.Backend.OpenAI
                 , GHC.Plugin.OllamaHoles.Backend.Gemini

    -- LANGUAGE extensions used by modules in this package.
    -- other-extensions:

    -- Other library packages from which modules are imported.
    build-depends: base >= 4.18 && < 4.22,
                   ghc >= 9.6 && < 9.14,
                   ollama-haskell ^>= 0.1,
                   text ^>= 2.1,
                   req ^>= 3.13,
                   modern-uri ^>= 0.3,
                   aeson ^>= 2.2,
                   containers >= 0.6 && < 0.8

    -- Directories containing source files.
    hs-source-dirs: src

    -- Base language which the package is written in.
    default-language: GHC2021
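
-- A minimal usage sketch (kept as a comment; it is not built as part of this
-- package): a consuming project adds ollama-holes-plugin to its build-depends,
-- enables the plugin with GHC's -fplugin flag, and configures it with the
-- -fplugin-opt flags documented in the description above. The module below is
-- illustrative; the model shown is the default (qwen3) mentioned above.
--
--   {-# OPTIONS_GHC -fplugin=GHC.Plugin.OllamaHoles #-}
--   {-# OPTIONS_GHC -fplugin-opt=GHC.Plugin.OllamaHoles:model=qwen3 #-}
--   module Example where
--
--   -- Compiling this module triggers GHC's typed-hole machinery; the plugin
--   -- augments the reported valid hole-fits with LLM-generated suggestions.
--   safeHead :: [a] -> Maybe a
--   safeHead = _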