For self-hosting, configure AI through environment variables in apps/web/.env.
If you used inbox-zero setup, many of these values are configured automatically.
Note: API keys require billing credits on the provider's platform. A ChatGPT Plus or Claude Pro subscription does not include API access.
Providers
Use one of these values for *_LLM_PROVIDER:
| Provider | Value |
|---|---|
| OpenAI | openai |
| Anthropic | anthropic |
| Azure OpenAI | azure |
| Google Gemini (AI Studio) | google |
| Google Vertex AI | vertex |
| OpenRouter | openrouter |
| Groq | groq |
| Vercel AI Gateway | aigateway |
| AWS Bedrock | bedrock |
| Ollama | ollama |
| OpenAI-compatible (LM Studio, vLLM, LiteLLM, etc.) | openai-compatible |
Tiers
For most self-hosted setups, configure these two tiers:
- DEFAULT_LLM_* (required): the primary model used for normal AI tasks.
- ECONOMY_LLM_* (optional): a lower-cost model for high-volume tasks. If unset, it falls back to the DEFAULT tier.
Minimal example:

```env
DEFAULT_LLM_PROVIDER=openai
DEFAULT_LLM_MODEL=gpt-4o
ECONOMY_LLM_PROVIDER=openai
ECONOMY_LLM_MODEL=gpt-4o-mini
LLM_API_KEY=sk-...
```
Provider-specific keys (for example OPENAI_API_KEY, ANTHROPIC_API_KEY) also work. See Environment Variables for the full list.
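As a sketch, the same tier setup pointed at Anthropic with a provider-specific key might look like the following (the model name is illustrative; check your provider's model list for current values):

```env
DEFAULT_LLM_PROVIDER=anthropic
DEFAULT_LLM_MODEL=claude-3-5-sonnet-latest
ECONOMY_LLM_PROVIDER=anthropic
ECONOMY_LLM_MODEL=claude-3-5-haiku-latest
# Provider-specific key; used instead of the generic LLM_API_KEY
ANTHROPIC_API_KEY=sk-ant-...
```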
App Settings
The app also has Settings → AI for per-user keys/models, but self-hosted deployments usually keep configuration at the environment-variable level.
Provider-specific details
The openai-compatible provider also requires OPENAI_COMPATIBLE_BASE_URL, pointing at your server's OpenAI-compatible endpoint.
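For example, a local LM Studio or vLLM server might be configured like this (the port, model name, and key value are illustrative; many local servers accept any non-empty API key):

```env
DEFAULT_LLM_PROVIDER=openai-compatible
DEFAULT_LLM_MODEL=llama-3.1-8b-instruct
# Base URL of the OpenAI-compatible endpoint (LM Studio defaults to port 1234)
OPENAI_COMPATIBLE_BASE_URL=http://localhost:1234/v1
LLM_API_KEY=local
```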