aidaemon supports three provider types, all configured in the [provider] section.
The `openai_compatible` kind works with any API that implements the OpenAI chat completions format. This includes OpenAI itself, OpenRouter, Ollama, and many others.
[provider] kind = "openai_compatible" api_key = "sk-..." base_url = "https://api.openai.com/v1" [provider.models] primary = "gpt-4o" fast = "gpt-4o-mini" smart = "o1-preview"
The `anthropic` kind speaks the native Anthropic Messages API. Use it for direct Anthropic access without going through an OpenAI-compatible proxy.
[provider] kind = "anthropic" api_key = "sk-ant-..." [provider.models] primary = "claude-3.5-sonnet" fast = "claude-3-haiku" smart = "claude-3-opus"
The `google_genai` kind speaks the native Google Generative AI API. Use it for direct Gemini access.
[provider] kind = "google_genai" api_key = "AIza..." [provider.models] primary = "gemini-3-flash-preview" fast = "gemini-2.5-flash-lite" smart = "gemini-3-pro-preview"
OpenRouter provides access to models from multiple providers through a single API key and the OpenAI-compatible format.
[provider] kind = "openai_compatible" api_key = "sk-or-..." base_url = "https://openrouter.ai/api/v1" [provider.models] primary = "anthropic/claude-3.5-sonnet" fast = "anthropic/claude-3-haiku" smart = "anthropic/claude-3-opus"
Run models locally with Ollama. No real API key is required; the `api_key` value below is just a placeholder.
[provider] kind = "openai_compatible" api_key = "ollama" base_url = "http://localhost:11434/v1" [provider.models] primary = "llama3.1" fast = "llama3.1" smart = "llama3.1"
To verify that Ollama is running and see which models have been pulled locally, check http://localhost:11434/api/tags.
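A quick way to script that check (Python `requests`, not part of aidaemon):

```python
# Sketch: confirm Ollama is reachable and list locally pulled models
# via its /api/tags endpoint.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
print([m["name"] for m in resp.json()["models"]])
```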