Capability adapters for the Converge runtime.

Providers produce observations, never decisions. Converge converges; providers adapt.

This crate provides capability adapters (providers) that connect Converge
workflows to external systems. Providers implement traits defined in
`converge-core` and return structured observations with provenance.
# What Is a Provider?

A provider is an adapter that:

- Implements capability traits (`LlmProvider`, `Embedding`, `VectorRecall`, etc.)
- Returns observations (not facts, not decisions)
- Includes provenance metadata for tracing
- Is stateless (no hidden lifecycle state)

A provider is NOT:

- An agent (agents live in `converge-core`)
- Orchestration (no workflows, no scheduling)
- Domain logic (business rules live in `converge-domain`)
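The shape described above can be sketched with a self-contained toy. The trait and types below are illustrative stand-ins, not the real `converge-traits` API; they only show the pattern of a stateless adapter returning observations with provenance:

```rust
/// Toy observation type: data plus provenance, never a decision.
/// (Illustrative only; not the actual converge-core type.)
struct Observation {
    content: String,
    provenance: String, // which provider produced this, for tracing
}

/// A minimal capability trait in the spirit of `LlmProvider`.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> Observation;
}

/// A stateless adapter: configuration only, no hidden lifecycle state.
struct EchoProvider {
    name: &'static str,
}

impl LlmProvider for EchoProvider {
    fn complete(&self, prompt: &str) -> Observation {
        Observation {
            content: format!("echo: {prompt}"),
            provenance: self.name.to_string(),
        }
    }
}

fn main() {
    let provider = EchoProvider { name: "echo" };
    let obs = provider.complete("hello");
    println!("{} (from {})", obs.content, obs.provenance);
}
```

Because the adapter holds only configuration, two calls with the same input produce the same observation, which is what makes providers safe to swap under a converging workflow.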
# Available Providers

## Remote Providers

- [AnthropicProvider] - Claude API (Anthropic)
- [OpenAiProvider] - GPT-4, GPT-3.5 (OpenAI)
- [GeminiProvider] - Gemini Pro (Google)
- [PerplexityProvider] - Perplexity AI
- [QwenProvider] - Qwen models (Alibaba Cloud)
- [OpenRouterProvider] - Multi-provider aggregator
- [MinMaxProvider] - MinMax AI
- [GrokProvider] - Grok (xAI)
- [MistralProvider] - Mistral AI
- [DeepSeekProvider] - DeepSeek AI
- [BaiduProvider] - Baidu ERNIE
- [ZhipuProvider] - Zhipu GLM
- [KimiProvider] - Kimi (Moonshot AI)
- [ApertusProvider] - Apertus (Switzerland, EU digital sovereignty)

## Local Providers

- [OllamaProvider] - Local models via Ollama (Qwen, Llama, Mistral, etc.)
# Prompt Structuring

This crate provides provider-specific prompt structuring and optimization:

- [ProviderPromptBuilder]: Builds prompts optimized for specific providers
- [StructuredResponseParser]: Parses structured responses (XML/JSON)
- Helper functions: [build_claude_prompt], [build_openai_prompt]
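The idea behind structured-response parsing can be shown with a self-contained sketch. This is not the real `StructuredResponseParser`; it is a toy that pulls each `<proposal>…</proposal>` span out of a model reply, which is the general shape of extracting XML-tagged observations:

```rust
// Toy parser: collect the text inside every <tag>…</tag> pair.
// (Illustrative only; the crate's parser also attaches provenance.)
fn extract_tag<'a>(text: &'a str, tag: &str) -> Vec<&'a str> {
    let open = format!("<{tag}>");
    let close = format!("</{tag}>");
    let mut out = Vec::new();
    let mut rest = text;
    while let Some(start) = rest.find(&open) {
        let after = &rest[start + open.len()..];
        match after.find(&close) {
            Some(end) => {
                out.push(after[..end].trim());
                rest = &after[end + close.len()..];
            }
            None => break, // unterminated tag: stop rather than guess
        }
    }
    out
}

fn main() {
    let reply = "<proposal>Acme Corp</proposal>\nnoise\n<proposal>Globex</proposal>";
    let proposals = extract_tag(reply, "proposal");
    println!("{proposals:?}");
}
```

Tag-delimited output is why the Claude helpers emit XML structure: the delimiters survive even when the model wraps them in extra prose.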
# Examples

## Using Anthropic (Claude)

```rust
use converge_provider::{AnthropicProvider, build_claude_prompt, StructuredResponseParser};
use converge_traits::llm::{LlmProvider, LlmRequest};
use converge_core::prompt::{AgentRole, OutputContract, PromptContext};
use converge_core::context::ContextKey;

let provider = AnthropicProvider::from_env("claude-sonnet-4-6")?;

// Build an optimized prompt with XML structure
let prompt = build_claude_prompt(
    AgentRole::Proposer,
    "extract-competitors",
    PromptContext::new(),
    OutputContract::new("proposed-fact", ContextKey::Competitors),
    vec![],
);

let response = provider.complete(&LlmRequest::new(prompt))?;

// Parse the structured XML response
let proposals = StructuredResponseParser::parse_claude_xml(
    &response,
    ContextKey::Competitors,
    "anthropic",
);
```
## Using OpenAI

```rust
use converge_provider::OpenAiProvider;
use converge_traits::llm::{LlmProvider, LlmRequest};

let provider = OpenAiProvider::from_env("gpt-4")?;
let response = provider.complete(&LlmRequest::new("Hello!"))?;
```
## Using OpenRouter (Multi-Provider)

```rust
use converge_provider::OpenRouterProvider;
use converge_traits::llm::{LlmProvider, LlmRequest};

// Access any provider's models through OpenRouter
let provider = OpenRouterProvider::from_env("anthropic/claude-3-opus")?;
let response = provider.complete(&LlmRequest::new("Hello!"))?;
```
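Since every adapter implements the same trait, calling code can stay provider-agnostic and pick an adapter from a config string. The sketch below is self-contained with a toy trait; the dispatch rule (treat `ollama/...` as local, anything else as remote) and the type names are assumptions for illustration, not the crate's actual behavior:

```rust
// Toy trait standing in for the shared `LlmProvider` capability.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> String;
}

struct Remote { model: String }
struct Local { model: String }

impl LlmProvider for Remote {
    fn complete(&self, prompt: &str) -> String {
        format!("[remote:{}] {prompt}", self.model)
    }
}

impl LlmProvider for Local {
    fn complete(&self, prompt: &str) -> String {
        format!("[local:{}] {prompt}", self.model)
    }
}

/// Pick an adapter at runtime; callers only ever see the trait.
fn provider_for(spec: &str) -> Box<dyn LlmProvider> {
    match spec.split_once('/') {
        Some(("ollama", model)) => Box::new(Local { model: model.into() }),
        Some((_, model)) => Box::new(Remote { model: model.into() }),
        None => Box::new(Remote { model: spec.into() }),
    }
}

fn main() {
    let provider = provider_for("ollama/qwen2");
    println!("{}", provider.complete("Hello!"));
}
```

Keeping dispatch behind `Box<dyn LlmProvider>` is what lets a workflow swap Anthropic for Ollama without touching the converging logic.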