LLM setup

eunha supports three LLM backends: OpenAI, Anthropic, and Ollama (local).

All keys are stored in ~/.eunha/config.toml (permissions 0600). They are never written to the database.

OpenAI:

```toml
[llm]
provider = "openai"
api_key = "sk-..."
model = "gpt-4o-mini" # optional, default: gpt-4o-mini
```

Anthropic:

```toml
[llm]
provider = "anthropic"
api_key = "sk-ant-..."
model = "claude-haiku-4-5-20251001" # optional
```

Install Ollama and pull a model with JSON mode support (e.g. llama3.2):

```sh
ollama pull llama3.2
```
Then point eunha at it:

```toml
[llm]
provider = "ollama"
model = "llama3.2"
base_url = "http://localhost:11434" # optional, this is the default
```
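JSON mode matters because structured descriptions need to come back as valid JSON rather than free text. As a sketch of what that entails, this builds the request body Ollama's `/api/generate` endpoint accepts, where `"format": "json"` is Ollama's documented option for constraining output; the function name and prompt are illustrative, not eunha's actual code:

```python
import json

BASE_URL = "http://localhost:11434"  # default from the config above

def describe_request(model: str, prompt: str) -> dict:
    """Build an Ollama /api/generate body that requests JSON-mode output."""
    return {
        "model": model,
        "prompt": prompt,
        "format": "json",   # constrain the model to emit valid JSON
        "stream": False,    # return one response object, not a stream
    }

body = describe_request("llama3.2", "Describe this repository as JSON.")
payload = json.dumps(body)  # send e.g. via POST to f"{BASE_URL}/api/generate"
```

A model pulled without JSON-mode support will still accept this request but may ignore the constraint, which is why the docs suggest a model like llama3.2.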

Once configured, navigate to any repo in the library and press d to describe it. Press shift-D to re-describe (regenerate). Press shift-A to batch-describe all undescribed repos.