
Module providers


Provider factory — builds embedding and LLM providers from environment config.

Provider priority (LLM and embeddings)

  1. Mistral native (MISTRAL_API_KEY set) — calls api.mistral.ai directly. Preferred for the European deployment stack: no Google Cloud dependency, EU-hosted, GDPR-compliant. Set MISTRAL_MODEL to override the model.

  2. OpenAI (OPENAI_API_KEY set) — calls OpenAI or any compatible API. Override base URL with OPENAI_API_BASE for Ollama, Together, etc.

  3. Anthropic (ANTHROPIC_API_KEY set) — calls api.anthropic.com. LLM only (no embeddings).

  4. Vertex AI (VERTEX_AI_PROJECT + credentials set) — Google Cloud. Supports Mistral, Gemini, and Claude via Vertex Model Garden.

  5. Mock (fallback) — deterministic hash-based embeddings, echo LLM. Used in tests and when no cloud credentials are present.

Environment variables

| Variable | Description | Default |
|---|---|---|
| `MISTRAL_API_KEY` | Native Mistral API key (console.mistral.ai) | |
| `MISTRAL_MODEL` | Mistral model name | `mistral-small-latest` |
| `MISTRAL_EMBEDDING_MODEL` | Mistral embedding model | `mistral-embed` |
| `OPENAI_API_KEY` | OpenAI API key (or compatible) | |
| `OPENAI_MODEL` | OpenAI model name | `gpt-4o-mini` |
| `OPENAI_API_BASE` | OpenAI-compatible base URL | `https://api.openai.com/v1` |
| `OPENAI_EMBEDDING_MODEL` | OpenAI embedding model | `text-embedding-3-small` |
| `ANTHROPIC_API_KEY` | Anthropic API key | |
| `ANTHROPIC_MODEL` | Anthropic model name | `claude-sonnet-4-20250514` |
| `VERTEX_AI_PROJECT` | GCP project ID | required (or `GOOGLE_CLOUD_PROJECT`) |
| `VERTEX_AI_LOCATION` | GCP region | `europe-west1` (or `GOOGLE_CLOUD_LOCATION`) |
| `VERTEX_AI_TOKEN` | Static GCP auth token | auto-detect |
| `VERTEX_AI_MODEL` | Vertex model name | `mistral-small-2503` |
| `NOETHER_LLM_PROVIDER` | Force: `mistral` \| `openai` \| `anthropic` \| `vertex` \| `mock` | auto |
| `NOETHER_EMBEDDING_PROVIDER` | Force: `mistral` \| `openai` \| `vertex` \| `mock` | auto |
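As a concrete example of combining these variables, the following shell fragment forces the OpenAI-compatible path against a local Ollama server (the model name `llama3.1` is illustrative; any model pulled into Ollama works):

```shell
# Point the OpenAI-compatible provider at a local Ollama instance.
export OPENAI_API_KEY=ollama                      # any non-empty value; Ollama ignores it
export OPENAI_API_BASE=http://localhost:11434/v1  # Ollama's OpenAI-compatible endpoint
export OPENAI_MODEL=llama3.1                      # illustrative model name
export NOETHER_LLM_PROVIDER=openai                # skip auto-detection entirely
```

Forcing the provider with `NOETHER_LLM_PROVIDER` avoids surprises when a higher-priority key (e.g. `MISTRAL_API_KEY`) is also present in the environment.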

Functions

build_embedding_provider
Build the best available embedding provider based on env config.
build_llm_provider
Build the best available LLM provider based on env config.