# Converge Provider

Multi-provider LLM abstraction layer for the Converge runtime.

Website: converge.zone | Docs: docs.rs | Crates.io: converge-provider
## Installation

```toml
[dependencies]
converge-provider = "0.2"
```
## Related Crates
| Crate | Version | Description |
|---|---|---|
| converge-core | 0.6.1 | Runtime engine, agent traits, capabilities |
| converge-provider | 0.2.3 | 14+ LLM providers, model selection |
| converge-domain | 0.2.3 | 12 business use cases |
## Supported Providers
| Provider | Models | Region |
|---|---|---|
| Anthropic | Claude 3.5 Sonnet, Haiku, Opus 4 | US |
| OpenAI | GPT-4o, GPT-4o-mini, GPT-4 Turbo | US |
| Google Gemini | Gemini Pro, Flash | US |
| Alibaba Qwen | Qwen-Max, Qwen-Plus, Qwen3-VL | CN |
| DeepSeek | DeepSeek Chat, Coder | CN |
| Mistral | Mistral Large, Medium | EU |
| xAI Grok | Grok models | US |
| Perplexity | Online models (web search) | US |
| OpenRouter | Multi-provider gateway | US |
| Baidu ERNIE | ERNIE models | CN |
| Zhipu GLM | GLM-4 models | CN |
| Kimi (Moonshot) | Moonshot models | CN |
| Apertus | EU digital sovereignty | EU |
| Ollama | Local models (Llama, Mistral, etc.) | Local |
## Features

### Model Selection Engine
- Cost-aware: VeryLow → VeryHigh cost tiers
- Latency-aware: Interactive, Realtime, Batch classes
- Quality-scored: 0.0-1.0 quality ratings
- Capability matching: Tool use, vision, reasoning, code
- Data sovereignty: Region-aware selection (US, EU, CN, Local)
### Capability Registry
- Unified discovery for LLM, embedding, and reranking providers
- Capability-based selection
- Local preference support
### Vector Stores
- In-memory store (testing/development)
- LanceDB embedded store (optional feature)
- Qdrant distributed store (planned)
### Embedding & Reranking
- Qwen3-VL multimodal embeddings
- Two-stage retrieval pipeline support
### Prompt Optimization
- EDN format: ~40% token reduction vs markdown
- XML wrapping: Claude-specific optimization
- Structured response parsing: Extract proposals with confidence
## Quick Start

The original example was truncated, so the snippet below is a reconstruction: the type names (`Provider`, `CompletionRequest`), the `.await`, and the argument values are assumptions — check docs.rs for the exact API.

```rust
use converge_provider::{CompletionRequest, Provider};

// Create provider from environment variable
let provider = Provider::from_env()?;

// Make a request
let request = CompletionRequest::new("Summarize this document.")
    .with_max_tokens(512)
    .with_temperature(0.2);

let response = provider.complete(request).await?;
println!("{}", response.text);
```
## Model Selection

Another reconstruction from a truncated snippet; the names `ModelSelector`, `Requirements`, and `CostTier` are assumed:

```rust
use converge_provider::{CostTier, ModelSelector, Requirements};

let selector = ModelSelector::default();

// Fast and cheap for simple tasks
let requirements = Requirements::fast_extraction()
    .with_max_cost(CostTier::Low);

let result = selector.select(&requirements)?;
println!("{}", result.model_id);
```
## Factory Pattern

Reconstructed from a truncated snippet; the import path and the provider-name argument are assumptions:

```rust
use converge_provider::can_create_provider;

// Check if a provider is available (API key set)
if can_create_provider("anthropic") {
    // construct and use the provider here
}
```
## Feature Flags

```toml
[dependencies]
converge-provider = { version = "0.2", features = ["lancedb"] }
```
| Feature | Description |
|---|---|
| lancedb | LanceDB embedded vector store |
| qdrant | Qdrant distributed vector store (planned) |
| neo4j | Neo4j graph store (planned) |
| all-vector | All vector stores |
| all-stores | All stores |
## Environment Variables
Each provider requires an API key environment variable:
| Provider | Environment Variable |
|---|---|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Gemini | GOOGLE_API_KEY |
| Qwen | DASHSCOPE_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
| Mistral | MISTRAL_API_KEY |
| Grok | XAI_API_KEY |
| Perplexity | PERPLEXITY_API_KEY |
| OpenRouter | OPENROUTER_API_KEY |
## Repository
This crate is part of the Converge project.
Standalone repo: github.com/kpernyer/converge-provider
## License
MIT