Whether to enable pay-respects' LLM integration. When there is no rule for a given error, pay-respects can query an OpenAI-compatible API endpoint for command corrections.
- If this is set to `false`, all LLM-related features are disabled.
- If this is set to `true`, the default OpenAI endpoint will be used, using upstream's API key. This default API key may be rate-limited.
- You can also set a custom API endpoint, large language model and locale for command corrections. Simply set the `aiIntegration.url`, `aiIntegration.model` and `aiIntegration.locale` options, as described in the example.
- Take a look at the `services.ollama` NixOS module if you wish to host a local large language model for pay-respects.
For all of these methods, you can set a custom secret API key by using the `_PR_AI_API_KEY` environment variable.
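Putting the pieces together, a configuration that points pay-respects at a locally hosted model might look like the sketch below. The attribute path `programs.pay-respects` is an assumption (adjust it to wherever this option is actually declared in your module set), and `services.ollama.loadModels` is used here on the assumption that your nixpkgs revision provides it:

```nix
{
  # Hypothetical module path; adjust to where this option is declared.
  programs.pay-respects = {
    enable = true;
    aiIntegration = {
      # Point at a locally hosted model instead of the default endpoint.
      url = "http://127.0.0.1:11434/v1/chat/completions";
      model = "llama3";
      locale = "en-us";
    };
  };

  # Serve the model locally; see the services.ollama module options.
  services.ollama = {
    enable = true;
    loadModels = [ "llama3" ];
  };
}
```

Note that a secret `_PR_AI_API_KEY` should be provided through your shell environment or a secrets manager rather than written into the Nix configuration, since values in the configuration end up world-readable in the Nix store.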
Declarations
Type
boolean or (submodule)

Default
false

Example
{
  locale = "nl-be";
  model = "llama3";
  url = "http://127.0.0.1:11434/v1/chat/completions";
}