Converge Provider

Multi-provider LLM abstraction layer for the Converge runtime.


Website: converge.zone | Docs: docs.rs | Crates.io: converge-provider

Installation

```toml
[dependencies]
converge-provider = "0.2"
```

Related Crates

| Crate | Version | Description |
|---|---|---|
| converge-core | 0.6.1 | Runtime engine, agent traits, capabilities |
| converge-provider | 0.2.3 | 14+ LLM providers, model selection |
| converge-domain | 0.2.3 | 12 business use cases |

Supported Providers

| Provider | Models | Region |
|---|---|---|
| Anthropic | Claude 3.5 Sonnet, Haiku, Opus 4 | US |
| OpenAI | GPT-4o, GPT-4o-mini, GPT-4 Turbo | US |
| Google Gemini | Gemini Pro, Flash | US |
| Alibaba Qwen | Qwen-Max, Qwen-Plus, Qwen3-VL | CN |
| DeepSeek | DeepSeek Chat, Coder | CN |
| Mistral | Mistral Large, Medium | EU |
| xAI Grok | Grok models | US |
| Perplexity | Online models (web search) | US |
| OpenRouter | Multi-provider gateway | US |
| Baidu ERNIE | ERNIE models | CN |
| Zhipu GLM | GLM-4 models | CN |
| Kimi (Moonshot) | Moonshot models | CN |
| Apertus | EU digital sovereignty | EU |
| Ollama | Local models (Llama, Mistral, etc.) | Local |

Features

Model Selection Engine

  • Cost-aware: VeryLow → VeryHigh cost tiers
  • Latency-aware: Interactive, Realtime, Batch classes
  • Quality-scored: 0.0-1.0 quality ratings
  • Capability matching: Tool use, vision, reasoning, code
  • Data sovereignty: Region-aware selection (US, EU, CN, Local)
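The cost and quality axes above can be pictured with a small self-contained sketch. The types here are illustrative, not the crate's actual `CostClass` or registry types: filter out candidates above a cost ceiling, then rank the survivors by quality score.

```rust
/// Illustrative cost tiers, ordered cheapest to most expensive
/// (mirrors the VeryLow -> VeryHigh scale; not the crate's real type).
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum CostTier {
    VeryLow,
    Low,
    Medium,
    High,
    VeryHigh,
}

/// A hypothetical model entry with a 0.0-1.0 quality rating.
struct Candidate {
    name: &'static str,
    cost: CostTier,
    quality: f64,
}

/// Keep candidates at or under the cost ceiling, then pick the best quality.
fn select(candidates: &[Candidate], max_cost: CostTier) -> Option<&Candidate> {
    candidates
        .iter()
        .filter(|c| c.cost <= max_cost)
        .max_by(|a, b| a.quality.total_cmp(&b.quality))
}
```

The real `AgentRequirements` layers latency class, capability flags, and region constraints on top of this filter-then-rank idea.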

Capability Registry

  • Unified discovery for LLM, embedding, and reranking providers
  • Capability-based selection
  • Local preference support

Vector Stores

  • In-memory store (testing/development)
  • LanceDB embedded store (optional feature)
  • Qdrant distributed store (planned)

Embedding & Reranking

  • Qwen3-VL multimodal embeddings
  • Two-stage retrieval pipeline support

Prompt Optimization

  • EDN format: ~40% token reduction vs markdown
  • XML wrapping: Claude-specific optimization
  • Structured response parsing: Extract proposals with confidence

Quick Start

```rust
use converge_provider::{AnthropicProvider, LlmProvider, LlmRequest};

// Create a provider from the ANTHROPIC_API_KEY environment variable
let provider = AnthropicProvider::from_env("claude-3-5-sonnet-20241022")?;

// Build and send a request
let request = LlmRequest::new("Analyze market trends for Q4")
    .with_max_tokens(1000)
    .with_temperature(0.7);

let response = provider.complete(&request)?;
println!("Response: {}", response.content);
```

Model Selection

```rust
use converge_provider::{AgentRequirements, CostClass, ModelSelector};

let selector = ModelSelector::default();

// Fast and cheap for simple tasks
let requirements = AgentRequirements::fast_extraction()
    .with_max_cost(CostClass::Low);

let result = selector.select(&requirements)?;
println!("Selected: {} / {}", result.selected.provider, result.selected.model);
```

Factory Pattern

```rust
use converge_provider::factory::{can_create_provider, create_provider};

// Check if the provider is available (i.e., its API key is set)
if can_create_provider("anthropic") {
    let provider = create_provider("anthropic", "claude-3-5-sonnet-20241022")?;
    // Use provider...
}
```

Feature Flags

```toml
[dependencies]
converge-provider = { version = "0.2", features = ["lancedb"] }
```

| Feature | Description |
|---|---|
| lancedb | LanceDB embedded vector store |
| qdrant | Qdrant distributed vector store (planned) |
| neo4j | Neo4j graph store (planned) |
| all-vector | All vector stores |
| all-stores | All stores |
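Multiple backends can be pulled in through one dependency line; for example, the umbrella feature from the table above enables every store at once:

```toml
[dependencies]
converge-provider = { version = "0.2", features = ["all-stores"] }
```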

Environment Variables

Each hosted provider requires an API key supplied via an environment variable (Ollama runs locally and needs none):

| Provider | Environment Variable |
|---|---|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Gemini | GOOGLE_API_KEY |
| Qwen | DASHSCOPE_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
| Mistral | MISTRAL_API_KEY |
| Grok | XAI_API_KEY |
| Perplexity | PERPLEXITY_API_KEY |
| OpenRouter | OPENROUTER_API_KEY |
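Keys are read from the environment when a provider is constructed (see `from_env` and `can_create_provider` above), so a typical shell setup looks like this (the values are placeholders):

```shell
# Export the key for each provider you plan to use
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```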

Repository

This crate is part of the Converge project.

Standalone repo: github.com/kpernyer/converge-provider

License

MIT