Crate chatdelta

§ChatDelta AI Client Library

A Rust library for connecting to multiple AI APIs (OpenAI, Google Gemini, Anthropic Claude) with a unified interface. Supports parallel execution, retry logic, and configurable parameters.

§Example

use chatdelta::{AiClient, ClientConfig, create_client};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Shared configuration: request timeout, retry count, and sampling options
    let config = ClientConfig {
        timeout: Duration::from_secs(30),
        retries: 3,
        temperature: Some(0.7),
        max_tokens: Some(1024),
    };

    // Build an OpenAI-backed client and send a single prompt
    let client = create_client("openai", "your-api-key", "gpt-4o", config)?;
    let response = client.send_prompt("Hello, world!").await?;
    println!("{}", response);

    Ok(())
}

Re-exports§

pub use clients::*;
pub use error::*;

Modules§

clients
AI client implementations
error
Error types for the ChatDelta AI client library
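
Both create_client and send_prompt return Results whose error type is defined in error. Below is a minimal sketch of handling those errors at the call site instead of propagating them with ?; it relies only on the error implementing Display, and the concrete error type and its variants are documented in the error module rather than assumed here.

use chatdelta::{AiClient, ClientConfig, create_client};
use std::time::Duration;

#[tokio::main]
async fn main() {
    let config = ClientConfig {
        timeout: Duration::from_secs(30),
        retries: 3,
        temperature: Some(0.7),
        max_tokens: Some(1024),
    };

    // Inspect errors directly rather than bubbling them up with `?`.
    match create_client("openai", "your-api-key", "gpt-4o", config) {
        Err(e) => eprintln!("could not create client: {e}"),
        Ok(client) => match client.send_prompt("Hello, world!").await {
            Err(e) => eprintln!("request failed: {e}"),
            Ok(reply) => println!("{reply}"),
        },
    }
}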

Structs§

ClientConfig
Configuration for AI clients

Traits§

AiClient
Common trait implemented by all AI clients
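
Because every provider implements AiClient, calling code can be written once against the trait. The sketch below assumes, as the example above suggests, that create_client returns a boxed trait object, that send_prompt is the trait's prompt method, and that the crate's error type implements std::error::Error; the helper ask is hypothetical and not part of the crate.

use chatdelta::AiClient;

// Hypothetical helper: works with any provider, since the OpenAI, Gemini,
// and Claude clients all implement AiClient.
async fn ask(client: &dyn AiClient, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    let reply = client.send_prompt(prompt).await?;
    Ok(reply)
}

A client obtained from create_client can then be passed as &*client, regardless of which provider backs it.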

Functions§

create_client
Factory function to create AI clients
execute_parallel
Execute multiple AI clients in parallel and return all results (see the example after this list)
generate_summary
Generate a summary using one of the provided clients
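
The two helpers above combine naturally: fan one prompt out to several providers, then ask one client to summarize the results. The exact signatures of execute_parallel and generate_summary are not shown on this page, so the shapes assumed below (a collection of boxed clients plus a prompt, one result per client, and a summary produced from those results) are illustrative assumptions, as are the provider and model name strings; consult the function docs for the definitive API.

use chatdelta::{AiClient, ClientConfig, create_client, execute_parallel, generate_summary};
use std::time::Duration;

fn config() -> ClientConfig {
    ClientConfig {
        timeout: Duration::from_secs(30),
        retries: 3,
        temperature: Some(0.7),
        max_tokens: Some(1024),
    }
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Provider names and model ids are placeholders; check create_client's
    // documentation for the strings it accepts.
    let clients: Vec<Box<dyn AiClient>> = vec![
        create_client("openai", "openai-key", "gpt-4o", config())?,
        create_client("gemini", "gemini-key", "gemini-1.5-pro", config())?,
        create_client("claude", "anthropic-key", "claude-3-opus", config())?,
    ];

    // Assumed shape: execute_parallel sends the same prompt to every client
    // and returns one result per client.
    let results = execute_parallel(&clients, "Compare Rust and Go for CLI tools").await;

    // Assumed shape: generate_summary uses one of the clients to condense those results.
    let summary = generate_summary(&clients[0], &results).await?;
    println!("{summary}");

    Ok(())
}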