ChatDelta AI Client Library
A Rust library for connecting to multiple AI APIs (OpenAI, Google Gemini, Anthropic Claude) with a unified interface. Supports parallel execution, retry logic, and configurable parameters.
Example
use chatdelta::{AiClient, ClientConfig, create_client};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = ClientConfig {
        timeout: Duration::from_secs(30),
        retries: 3,
        temperature: Some(0.7),
        max_tokens: Some(1024),
    };

    let client = create_client("openai", "your-api-key", "gpt-4o", config)?;
    let response = client.send_prompt("Hello, world!").await?;
    println!("{}", response);
    Ok(())
}
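The same factory call works for every supported provider; only the provider name, API key, and model string change. Below is a minimal sketch of driving all three backends through the one interface; the "gemini" and "claude" identifiers and their model names are assumptions, since only "openai" appears in the example above.

use chatdelta::{AiClient, ClientConfig, create_client};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Provider identifiers and model names below are illustrative guesses,
    // not confirmed by this page.
    let providers = [
        ("openai", "gpt-4o"),
        ("gemini", "gemini-1.5-pro"),
        ("claude", "claude-3-5-sonnet-latest"),
    ];

    for (provider, model) in providers {
        // Rebuild the config per client to avoid assuming ClientConfig implements Clone.
        let config = ClientConfig {
            timeout: Duration::from_secs(30),
            retries: 3,
            temperature: Some(0.7),
            max_tokens: Some(1024),
        };
        let client = create_client(provider, "your-api-key", model, config)?;
        println!("{provider}: {}", client.send_prompt("Hello, world!").await?);
    }
    Ok(())
}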
Re-exports
Modules
Structs
- ClientConfig - Configuration for AI clients
Traits
- AiClient - Common trait implemented by all AI clients
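Because every backend implements AiClient, downstream code can be written once against the trait. A minimal sketch, assuming the trait is object-safe, that send_prompt resolves to a String (the basic example prints the response directly), and that its error type converts into Box<dyn std::error::Error>:

use chatdelta::AiClient;

// Provider-agnostic helper: depends only on the AiClient trait,
// never on a concrete backend.
async fn ask(client: &dyn AiClient, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    Ok(client.send_prompt(prompt).await?)
}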
Functions
- create_client - Factory function to create AI clients
- execute_parallel - Execute multiple AI clients in parallel and return all results (sketched below)
- generate_summary - Generate a summary using one of the provided clients
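Neither function's signature is spelled out on this page, so the sketch below only guesses at the intended shape: execute_parallel is assumed to take the clients plus a prompt and return one result per client, and generate_summary to take the clients and the collected outputs. Consult the individual function docs for the real parameter and return types.

use chatdelta::{ClientConfig, create_client, execute_parallel, generate_summary};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = ClientConfig {
        timeout: Duration::from_secs(30),
        retries: 3,
        temperature: Some(0.7),
        max_tokens: Some(1024),
    };

    // One confirmed provider; Gemini or Claude clients would be added the same way.
    let clients = vec![create_client("openai", "your-api-key", "gpt-4o", config)?];

    // Assumed shape: fan the prompt out to every client and collect all results.
    let results = execute_parallel(&clients, "Compare these APIs").await;

    // Assumed shape: ask one of the clients to condense the collected results.
    let summary = generate_summary(&clients, &results).await?;
    println!("{summary}");
    Ok(())
}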