# Provider Setup

aidaemon supports three provider types, all configured in the `[provider]` section.

## Provider Kinds

### openai_compatible (default)

Works with any API that implements the OpenAI chat completions format. This includes OpenAI, OpenRouter, Ollama, and many others.

```toml
[provider]
kind = "openai_compatible"
api_key = "sk-..."
base_url = "https://api.openai.com/v1"

[provider.models]
primary = "gpt-4o"
fast = "gpt-4o-mini"
smart = "o1-preview"
```
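
Every `openai_compatible` provider speaks the same wire format, which is why one kind covers so many services. As a rough sketch (the endpoint path and field names follow the public Chat Completions format; the model name is just an example), the request body looks like this:

```python
import json

# Minimal sketch of an OpenAI-compatible chat completions request body.
# Any provider configured with kind = "openai_compatible" expects this shape
# at POST {base_url}/chat/completions, authenticated with an
# "Authorization: Bearer <api_key>" header.
payload = {
    "model": "gpt-4o-mini",  # one of the names from [provider.models]
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
body = json.dumps(payload)
```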

### anthropic

Native Anthropic API (Messages API format). Use this for direct Anthropic access without going through an OpenAI-compatible proxy.

```toml
[provider]
kind = "anthropic"
api_key = "sk-ant-..."

[provider.models]
primary = "claude-3.5-sonnet"
fast = "claude-3-haiku"
smart = "claude-3-opus"
```
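
When debugging, it helps to know how the Messages API differs from the OpenAI format: authentication uses an `x-api-key` header plus an `anthropic-version` header, `max_tokens` is required, and the system prompt is a top-level field rather than a message. A hedged sketch of the request (the key and model name are placeholders):

```python
import json

# Sketch of an Anthropic Messages API request (POST /v1/messages).
# Unlike the OpenAI format: "system" is a top-level field, "max_tokens"
# is mandatory, and auth goes in an "x-api-key" header rather than
# "Authorization: Bearer".
payload = {
    "model": "claude-3-haiku",  # example; use your [provider.models] entry
    "max_tokens": 1024,         # required by the Messages API
    "system": "You are a helpful assistant.",
    "messages": [{"role": "user", "content": "Hello!"}],
}
headers = {
    "x-api-key": "sk-ant-...",  # placeholder key
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}
body = json.dumps(payload)
```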

### google_genai

Native Google Generative AI API. Use this for direct Gemini access.

```toml
[provider]
kind = "google_genai"
api_key = "AIza..."

[provider.models]
primary = "gemini-3-flash-preview"
fast = "gemini-2.5-flash-lite"
smart = "gemini-3-pro-preview"
```
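
The native Generative AI API uses yet another request shape: content goes in a `contents` array of `parts`, sent to a per-model `generateContent` endpoint, with the key passed as an `x-goog-api-key` header (or a `?key=` query parameter). A sketch with a placeholder model name:

```python
import json

# Sketch of a Google Generative AI request:
# POST https://generativelanguage.googleapis.com/v1beta/models/{model}:generateContent
model = "gemini-2.5-flash-lite"  # example name from [provider.models]
url = (
    "https://generativelanguage.googleapis.com/v1beta/"
    f"models/{model}:generateContent"
)
payload = {
    "contents": [
        {"role": "user", "parts": [{"text": "Hello!"}]},
    ],
}
body = json.dumps(payload)
```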

## OpenRouter

OpenRouter provides access to models from multiple providers through a single API key and the OpenAI-compatible format.

```toml
[provider]
kind = "openai_compatible"
api_key = "sk-or-..."
base_url = "https://openrouter.ai/api/v1"

[provider.models]
primary = "anthropic/claude-3.5-sonnet"
fast = "anthropic/claude-3-haiku"
smart = "anthropic/claude-3-opus"
```

## Ollama (Local)

Run models locally with Ollama. No real API key is required; Ollama's OpenAI-compatible endpoint accepts any placeholder value.

```toml
[provider]
kind = "openai_compatible"
api_key = "ollama"
base_url = "http://localhost:11434/v1"

[provider.models]
primary = "llama3.1"
fast = "llama3.1"
smart = "llama3.1"
```

### Ollama Discovery

The setup wizard auto-discovers available Ollama models by querying `http://localhost:11434/api/tags`.
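
That endpoint returns JSON listing the locally pulled models. As a sketch of how such discovery works (the helper name is ours, and the sample response below is a trimmed fabricated example; the real list depends on what you have pulled):

```python
import json
from urllib.request import urlopen

def list_ollama_models(base="http://localhost:11434"):
    """Query Ollama's tags endpoint and return installed model names."""
    with urlopen(f"{base}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

# Shape of the /api/tags response (trimmed example):
sample = {"models": [{"name": "llama3.1:latest"}, {"name": "qwen2.5:7b"}]}
names = [m["name"] for m in sample.get("models", [])]
# names == ["llama3.1:latest", "qwen2.5:7b"]
```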