option

services.nextjs-ollama-llm-ui.enable

Whether to enable the simple Ollama web UI service; an easy-to-use web frontend for an Ollama backend service. Run state-of-the-art AI large language models (LLMs), similar to ChatGPT, locally and privately on your personal computer. This service is stateless and doesn't store any data on the server; all data is kept locally in your web browser. See https://github.com/jakobhoeg/nextjs-ollama-llm-ui.

Required: you need an Ollama backend service running, with "services.nextjs-ollama-llm-ui.ollamaUrl" pointing to its URL. You can host such a backend service on NixOS through "services.ollama".
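A minimal configuration sketch enabling both services together, assuming Ollama's default listen address of 127.0.0.1:11434 (adjust the URL if your backend listens elsewhere):

```nix
{
  # Host the Ollama backend locally on this machine.
  services.ollama.enable = true;

  services.nextjs-ollama-llm-ui = {
    enable = true;
    # Assumed default Ollama address; must match the backend's actual listen address.
    ollamaUrl = "http://127.0.0.1:11434";
  };
}
```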

Declarations
Type
boolean
Default
false
Example
true