§Siumai - A Unified LLM Interface Library
Siumai is a unified LLM interface library for Rust, supporting multiple AI providers. It adopts a trait-separated architectural pattern and provides a type-safe API.
§Features
- Capability Separation: Uses traits to distinguish different AI capabilities (chat, audio, vision, etc.)
- Shared Parameters: Common AI parameters (model, temperature, etc.) are shared across providers, with extension points for provider-specific parameters.
- Builder Pattern: Supports a builder pattern for chained method calls.
- Type Safety: Leverages Rust’s type system to ensure compile-time safety.
- HTTP Customization: Supports passing in a reqwest client and custom HTTP configurations.
- Library First: Focuses on core library functionality, avoiding application-layer features.
- Flexible Capability Access: Capability checks serve as hints rather than restrictions, allowing users to try new model features.
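Because common parameters are shared across providers, switching backends mostly means changing the builder entry point. A minimal sketch, assuming the Anthropic builder mirrors the OpenAI builder shown in the Quick Start below (the method chain and the model name are illustrative, not verified against the `AnthropicBuilder` API):

```rust
use siumai::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Same parameter chain as the OpenAI Quick Start; only the
    // provider entry point changes. `.anthropic()` is assumed to
    // mirror `.openai()`.
    let client = LlmBuilder::new()
        .anthropic()
        .api_key("your-api-key")
        .model("claude-3-5-sonnet-20241022") // illustrative model name
        .temperature(0.7)
        .build()
        .await?;

    let response = client.chat(vec![user!("Hello!")]).await?;
    if let Some(text) = response.content_text() {
        println!("{}", text);
    }
    Ok(())
}
```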
§Quick Start
use siumai::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create an OpenAI client
    let client = LlmBuilder::new()
        .openai()
        .api_key("your-api-key")
        .model("gpt-4")
        .temperature(0.7)
        .build()
        .await?;

    // Send a chat request
    let messages = vec![user!("Hello, world!")];
    let response = client.chat(messages).await?;
    if let Some(text) = response.content_text() {
        println!("Response: {}", text);
    }

    Ok(())
}

§Capability Access Philosophy
Siumai takes a permissive and quiet approach to capability access. It never blocks operations based on static capability information, and doesn’t generate noise with automatic warnings. The actual API determines what’s supported:
use siumai::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Siumai::builder()
        .openai()
        .api_key("your-api-key")
        .model("gpt-4o") // This model supports vision
        .build()
        .await?;

    // Get vision capability - this always works, regardless of "official" support
    let vision = client.vision_capability();

    // Optionally check support status if you want to (no automatic warnings)
    if !vision.is_reported_as_supported() {
        // You can choose to show a warning, or just proceed silently
        println!("Note: Vision not officially supported, but trying anyway!");
    }

    // The actual operation will succeed or fail based on the model's real capabilities
    // No pre-emptive blocking, no automatic noise
    // vision.analyze_image(...).await?;

    Ok(())
}

Re-exports§
pub use error::LlmError;
pub use traits::AudioCapability;
pub use traits::ChatCapability;
pub use traits::CompletionCapability;
pub use traits::EmbeddingCapability;
pub use traits::FileManagementCapability;
pub use traits::ImageGenerationCapability;
pub use traits::ModelListingCapability;
pub use traits::ModerationCapability;
pub use traits::ProviderCapabilities;
pub use traits::RerankCapability;
pub use traits::VisionCapability;
pub use client::LlmClient;
pub use types::ChatMessage;
pub use types::ChatResponse;
pub use types::CommonParams;
pub use types::CompletionRequest;
pub use types::CompletionResponse;
pub use types::EmbeddingRequest;
pub use types::EmbeddingResponse;
pub use types::FinishReason;
pub use types::HttpConfig;
pub use types::ImageGenerationRequest;
pub use types::ImageGenerationResponse;
pub use types::MessageContent;
pub use types::MessageRole;
pub use types::ModelInfo;
pub use types::ModerationRequest;
pub use types::ModerationResponse;
pub use types::ProviderType;
pub use types::ResponseMetadata;
pub use types::Tool;
pub use types::ToolCall;
pub use types::Usage;
pub use builder::LlmBuilder;
pub use providers::anthropic::AnthropicBuilder;
pub use providers::gemini::GeminiBuilder;
pub use providers::ollama::OllamaBuilder;
pub use providers::openai::OpenAiBuilder;
pub use stream::ChatStream;
pub use stream::ChatStreamEvent;
pub use types::WebSearchConfig;
pub use types::WebSearchResult;
pub use performance::PerformanceMetrics;
pub use performance::PerformanceMonitor;
pub use retry_strategy::RetryStrategy; (Deprecated)
pub use retry_api::RetryBackend;
pub use retry_api::RetryOptions;
pub use retry_api::retry;
pub use retry_api::retry_for_provider;
pub use retry_api::retry_with;
pub use benchmarks::BenchmarkConfig;
pub use benchmarks::BenchmarkResults;
pub use benchmarks::BenchmarkRunner;
pub use custom_provider::CustomProvider;
pub use custom_provider::CustomProviderConfig;
pub use provider_features::ProviderFeatures;
pub use types::models::model_constants as models;
pub use types::models::constants;
pub use tracing::OutputFormat;
pub use tracing::TracingConfig;
pub use tracing::init_debug_tracing;
pub use tracing::init_tracing;
pub use crate::builder::quick_anthropic;
pub use crate::builder::quick_anthropic_with_model;
pub use crate::builder::quick_gemini;
pub use crate::builder::quick_gemini_with_model;
pub use crate::builder::quick_groq;
pub use crate::builder::quick_groq_with_model;
pub use crate::builder::quick_openai;
pub use crate::builder::quick_openai_with_model;
pub use providers::convenience::core::anthropic;
pub use providers::convenience::core::gemini;
pub use providers::convenience::core::groq;
pub use providers::convenience::core::ollama;
pub use providers::convenience::core::openai;
pub use providers::convenience::core::xai;
pub use providers::convenience::openai_compatible::*;
Modules§
- analysis - Analysis Tools
- benchmarks - Benchmarking and Performance Testing
- builder - LLM Client Builder - Client Configuration Layer
- client - Client Module
- custom_provider - Custom Provider Framework
- defaults - Default Configuration Values
- error - Error handling module
- multimodal - Enhanced Multimodal Support
- params - Parameter Management Module
- performance - Performance Optimization and Monitoring
- prelude - Convenient pre-import module
- provider - Siumai LLM Interface
- provider_builders - SiumaiBuilder Provider Methods
- provider_features - Provider-Specific Features
- providers - Provider Module
- request_factory - Request Factory Module - Parameter Management Layer
- retry - Retry Mechanism Module
- retry_api - Public Retry API Facade
- retry_backoff - Professional Retry Mechanism using backoff crate
- retry_strategy (Deprecated) - Advanced Retry Strategy and Error Handling
- stream - Streaming Processing Module
- tracing - Tracing and Observability Module
- traits - Core Trait Definitions
- types - Core Data Type Definitions
- utils - Utility modules for siumai
- web_search - Web Search Functionality
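The stream module's re-exported ChatStream and ChatStreamEvent types suggest an event-driven streaming flow. A hedged sketch, assuming ChatStream implements futures::Stream, that the client exposes a chat_stream method, and that ChatStreamEvent has a content-delta variant (only the ChatStream and ChatStreamEvent names are taken from the docs above; consult the stream and traits modules for the actual API):

```rust
use futures::StreamExt; // assumed: ChatStream implements futures::Stream
use siumai::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = LlmBuilder::new()
        .openai()
        .api_key("your-api-key")
        .model("gpt-4")
        .build()
        .await?;

    // `chat_stream` and the `ContentDelta` variant are assumptions here.
    let mut stream = client.chat_stream(vec![user!("Tell me a story")]).await?;
    while let Some(event) = stream.next().await {
        match event? {
            // Print incremental text as it arrives.
            ChatStreamEvent::ContentDelta { delta, .. } => print!("{}", delta),
            // Ignore other events (tool calls, usage updates, stream end, ...).
            _ => {}
        }
    }
    Ok(())
}
```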
Macros§
- assistant - Creates an assistant message
- conversation - Creates a conversation with alternating user and assistant messages
- conversation_with_system - Creates a conversation with a system prompt
- impl_json_converter - Helper macro to create JSON event converters
- impl_sse_converter - Helper macro to create SSE event converters
- messages - Creates a collection of messages with convenient syntax
- quick_chat - Creates a quick chat request with a single user message
- system - Creates a system message
- tool - Creates a tool message
- traced_http_request - Macro for creating traced HTTP requests
- traced_llm_chat - Macro for creating traced LLM interactions
- user - Creates a user message
- user_builder - Creates a user message builder for complex messages
- user_with_image - Multimodal user message macro
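Taken together, the role macros above replace manual ChatMessage construction. A short sketch using system!, user!, and assistant! (the macro names come from the list above; that each expands to a ChatMessage of the matching role is an assumption):

```rust
use siumai::prelude::*;

// Build a multi-turn conversation; each macro is assumed to
// produce a ChatMessage with the corresponding role.
fn build_conversation() -> Vec<ChatMessage> {
    vec![
        system!("You are a concise assistant."),
        user!("What is Rust's ownership model?"),
        assistant!("Ownership ties each value to a single owning variable."),
        user!("How does borrowing relate to it?"),
    ]
}
```

The resulting Vec<ChatMessage> can then be passed to a client's chat method, as in the Quick Start.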
Structs§
- Provider - Provider entry point for creating specific provider clients
Constants§
- ENABLED_PROVIDERS - Enabled providers at compile time
- PROVIDER_COUNT - Number of enabled providers at compile time