LLM Provider Switching

Nova can run with different LLM providers without changing business logic.

What LLMs are used for

  • understanding and drafting language
  • extracting structured information
  • summarization and response generation

What LLMs are not used for

LLMs never directly trigger unsafe side effects. Deterministic services remain responsible for:

  • target resolution
  • policy checks
  • delivery execution

Supported providers

  • openai
  • anthropic
  • ollama
  • openai_compatible
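Switching between providers like these is commonly done through a small factory keyed by the provider name, so business logic only ever sees a generic client. A minimal sketch, assuming hypothetical client classes and a `make_client` helper (these names are illustrative, not Nova's actual API):

```python
from dataclasses import dataclass

# Hypothetical client stubs -- stand-ins for the real provider SDK clients.
@dataclass
class OpenAIClient:
    model: str

@dataclass
class AnthropicClient:
    model: str

@dataclass
class OllamaClient:
    base_url: str
    model: str

SUPPORTED = {"openai", "anthropic", "ollama", "openai_compatible"}

def make_client(provider: str, **kwargs):
    """Return a client for the named provider; callers never branch on it again."""
    if provider not in SUPPORTED:
        raise ValueError(f"unknown LLM provider: {provider}")
    if provider == "openai":
        return OpenAIClient(model=kwargs["model"])
    if provider == "anthropic":
        return AnthropicClient(model=kwargs["model"])
    # ollama and openai_compatible both require an explicit endpoint
    return OllamaClient(base_url=kwargs["base_url"], model=kwargs["model"])
```

Because unsupported names fail fast with a `ValueError`, a misconfigured deployment surfaces at startup rather than mid-request.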

Typical setup example

LLM_PROVIDER=ollama
OLLAMA_BASE_URL=https://<your-endpoint>
OLLAMA_API_KEY=<token>
OLLAMA_MODEL=<model>
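Settings like the above are typically read once at startup and validated before any client is built. A hedged sketch of that parsing step (the variable names match the example; the function itself is illustrative):

```python
import os

def load_ollama_config(env=os.environ):
    """Collect the Ollama settings shown above; fail early if a required key is missing."""
    required = ["OLLAMA_BASE_URL", "OLLAMA_MODEL"]
    missing = [k for k in required if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing env vars: {', '.join(missing)}")
    return {
        "base_url": env["OLLAMA_BASE_URL"],
        "api_key": env.get("OLLAMA_API_KEY"),  # may be absent for local instances
        "model": env["OLLAMA_MODEL"],
    }
```

Passing `env` as a parameter keeps the function testable without mutating the process environment.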

Provider selection model

  • global default from env
  • optional per-assistant override
  • controlled fallback to compatible mode when configured
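The precedence above can be sketched as a single resolution function: a per-assistant override wins, then the global env default, then the configured compatible-mode fallback. A minimal sketch (the function and parameter names are assumptions, not Nova's actual API):

```python
def resolve_provider(assistant_override=None, env_default=None,
                     fallback="openai_compatible"):
    """Resolve the effective provider in precedence order:
    per-assistant override > global env default > compatible-mode fallback."""
    if assistant_override:
        return assistant_override
    if env_default:
        return env_default
    return fallback
```

Keeping the precedence in one pure function makes the selection rule easy to audit and test in isolation.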