saorsa-ai: Unified multi-provider LLM API.
Provides a common interface for streaming completions, tool calling, and authentication across multiple LLM providers.
§Architecture Overview
┌─────────────────────────────────────────────────────────────┐
│ Application / Agent Layer │
│ (Sends CompletionRequest, receives StreamEvent stream) │
└─────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ ProviderRegistry (Factory) │
│ ProviderKind → ProviderConfig → Box<dyn Provider> │
└─────────────────────────────────────────────────────────────┘
│
┌─────────────┼─────────────┬─────────────┐
▼ ▼ ▼ ▼
┌──────────────┬──────────────┬──────────────┬──────────────┐
│ Anthropic │ OpenAI │ Gemini │ Ollama │
│ Provider │ Provider │ Provider │ Provider │
└──────────────┴──────────────┴──────────────┴──────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ Streaming HTTP (reqwest, Server-Sent Events) │
│ POST /v1/messages → stream of JSON events → StreamEvent │
└─────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────┐
│ Message Protocol (vendor-agnostic types) │
│ Message, ContentBlock, ToolDefinition, ContentDelta │
└─────────────────────────────────────────────────────────────┘
§Provider Abstraction
All providers implement the Provider trait:
- stream_completion: Returns Pin<Box<dyn Stream<Item = Result<StreamEvent>>>>
- Unified event types: StreamEvent::{ContentDelta, ToolUse, Done, Error}
- Model metadata: Context windows, tool support, vision capabilities
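A minimal sketch of this abstraction, with stand-ins for the crate's `Provider` trait and `StreamEvent` enum. The real trait is async and returns `Pin<Box<dyn Stream<Item = Result<StreamEvent>>>>`; here a plain iterator stands in for the stream so the sketch runs with std alone, and the event payloads are simplified.

```rust
// Simplified stand-in for the crate's StreamEvent; the real variants
// carry richer payloads (deltas, tool-call arguments, stop reasons).
#[derive(Debug, PartialEq)]
enum StreamEvent {
    ContentDelta(String), // incremental text from the model
    ToolUse(String),      // the model requested a tool call (name only here)
    Done,                 // the completion finished
    Error(String),        // a provider-side error
}

// Stand-in for the vendor-agnostic Provider trait: every backend
// exposes the same streaming entry point.
trait Provider {
    fn stream_completion(&self, prompt: &str) -> Box<dyn Iterator<Item = StreamEvent>>;
}

// A mock provider that yields a fixed event sequence.
struct MockProvider;

impl Provider for MockProvider {
    fn stream_completion(&self, prompt: &str) -> Box<dyn Iterator<Item = StreamEvent>> {
        let events = vec![
            StreamEvent::ContentDelta(format!("echo: {prompt}")),
            StreamEvent::Done,
        ];
        Box::new(events.into_iter())
    }
}

fn main() {
    // Callers hold a trait object, so any backend is interchangeable.
    let provider: Box<dyn Provider> = Box::new(MockProvider);
    for event in provider.stream_completion("hello") {
        println!("{event:?}");
    }
}
```

The point of the trait-object shape is that application code consumes one event stream regardless of which vendor produced it.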
§Supported Providers
- Anthropic: Claude models with streaming, tool use, vision
- OpenAI: GPT models with streaming, function calling, vision
- Gemini: Google Gemini with streaming and tool use
- Ollama: Local model hosting with OpenAI-compatible API
- OpenAI-Compatible: Generic adapter for compatible APIs (Groq, etc.)
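The registry flow in the diagram (ProviderKind → ProviderConfig → Box<dyn Provider>) can be sketched as a simple factory. The names and fields below are illustrative stand-ins, not the crate's actual API.

```rust
// Stand-in provider trait for the sketch.
trait Provider {
    fn name(&self) -> &'static str;
}

#[derive(Clone, Copy)]
enum ProviderKind {
    Anthropic,
    OpenAi,
    Gemini,
    Ollama,
}

// Hypothetical config shape: local providers like Ollama need no key.
struct ProviderConfig {
    kind: ProviderKind,
    api_key: Option<String>,
}

struct Anthropic;
impl Provider for Anthropic { fn name(&self) -> &'static str { "anthropic" } }
struct Ollama;
impl Provider for Ollama { fn name(&self) -> &'static str { "ollama" } }

// The factory dispatches on kind and erases the concrete type.
fn build(config: &ProviderConfig) -> Box<dyn Provider> {
    match config.kind {
        ProviderKind::Anthropic => Box::new(Anthropic),
        ProviderKind::Ollama => Box::new(Ollama),
        // The remaining kinds would dispatch the same way.
        _ => unimplemented!("sketch covers two providers"),
    }
}

fn main() {
    let cfg = ProviderConfig { kind: ProviderKind::Ollama, api_key: None };
    println!("{}", build(&cfg).name()); // prints "ollama"
}
```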
§Key Types
- Provider: Core trait for LLM completion providers
- CompletionRequest: Vendor-agnostic request (messages, tools, params)
- StreamEvent: Streaming events (content deltas, tool calls, completion)
- Message: Conversation message with role and content blocks
- ToolDefinition: JSON Schema-based tool specification
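To show how these types fit together, here is a hedged sketch of building a request from messages and content blocks. The field names and the model string are assumptions for illustration; consult the individual type docs for the real definitions.

```rust
// Simplified stand-ins for the crate's message protocol types.
#[derive(Debug)]
enum Role { System, User, Assistant }

#[derive(Debug)]
enum ContentBlock {
    Text(String),
    // The real crate also models tool-use and tool-result blocks.
}

#[derive(Debug)]
struct Message {
    role: Role,
    content: Vec<ContentBlock>,
}

// Hypothetical request shape: messages plus basic parameters.
#[derive(Debug)]
struct CompletionRequest {
    model: String,
    messages: Vec<Message>,
    max_tokens: u32,
}

fn main() {
    let request = CompletionRequest {
        model: "example-model".into(),
        messages: vec![Message {
            role: Role::User,
            content: vec![ContentBlock::Text("Hello!".into())],
        }],
        max_tokens: 1024,
    };
    println!("{} message(s) to {}", request.messages.len(), request.model);
}
```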
Re-exports§
pub use anthropic::AnthropicProvider;
pub use error::Result;
pub use error::SaorsaAiError;
pub use gemini::GeminiProvider;
pub use message::ContentBlock;
pub use message::Message;
pub use message::Role;
pub use message::ToolDefinition;
pub use models::ModelInfo;
pub use models::all_models;
pub use models::get_context_window;
pub use models::lookup_by_provider_prefix;
pub use models::lookup_model;
pub use models::lookup_model_by_prefix;
pub use models::supports_tools;
pub use models::supports_vision;
pub use ollama::OllamaProvider;
pub use openai::OpenAiProvider;
pub use openai_compat::OpenAiCompatBuilder;
pub use openai_compat::OpenAiCompatProvider;
pub use provider::Provider;
pub use provider::ProviderConfig;
pub use provider::ProviderKind;
pub use provider::ProviderRegistry;
pub use provider::StreamingProvider;
pub use provider::determine_provider;
pub use types::CompletionRequest;
pub use types::CompletionResponse;
pub use types::ContentDelta;
pub use types::StopReason;
pub use types::StreamEvent;
pub use types::ThinkingConfig;
pub use types::Usage;
Modules§
- anthropic
- Anthropic Messages API provider.
- error
- Error types for saorsa-ai.
- gemini
- Google Gemini generateContent/streamGenerateContent API provider.
- message
- Message and content types for LLM conversations.
- models
- Model registry for known LLM models.
- ollama
- Ollama Chat API provider for local inference.
- openai
- OpenAI Chat Completions API provider.
- openai_compat
- Generic OpenAI-compatible provider.
- provider
- Provider trait for LLM backends.
- tokens
- Token counting and context window management.
- types
- Request, response, and streaming types for LLM APIs.