# ChatDelta
A unified Rust library for connecting to multiple AI APIs (OpenAI, Google Gemini, Anthropic Claude) with a common interface. Supports parallel execution, retry logic, and configurable parameters.
## Features

- Unified Interface: Single trait (`AiClient`) for all AI providers
- Multiple Providers: OpenAI ChatGPT, Google Gemini, Anthropic Claude
- Parallel Execution: Run multiple AI models concurrently
- Retry Logic: Configurable retry attempts with exponential backoff
- Async/Await: Built with tokio for efficient async operations
- Type Safety: Full Rust type safety with comprehensive error handling
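To make the retry behavior concrete, here is a standalone sketch of how exponential backoff spaces out attempts. This is an illustration only, not ChatDelta's internal implementation; the function name `backoff_delays` is invented for the example:

```rust
use std::time::Duration;

/// Delays for `retries` attempts, doubling from `base`: base, 2*base, 4*base, ...
fn backoff_delays(base: Duration, retries: u32) -> Vec<Duration> {
    (0..retries).map(|attempt| base * 2u32.pow(attempt)).collect()
}

fn main() {
    // Three retries starting at 100 ms: 100ms, 200ms, 400ms.
    for delay in backoff_delays(Duration::from_millis(100), 3) {
        println!("waiting {delay:?} before next attempt");
    }
}
```

Doubling the delay on each failed attempt keeps transient errors cheap to retry while backing off quickly when a provider is rate-limiting.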
## Quick Start

Add this to your `Cargo.toml`:

```toml
[dependencies]
chatdelta = "0.1"
tokio = { version = "1", features = ["full"] }
```
## Usage

### Basic Example

Create a client for one provider and send a prompt. Exact item names (`create_client`, `send_prompt`, the `ClientConfig` fields) may differ slightly; see the crate documentation for the authoritative API:

```rust
use chatdelta::{create_client, AiClient, ClientConfig};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = ClientConfig {
        timeout: Duration::from_secs(30),
        ..ClientConfig::default()
    };
    let client = create_client("openai", "your-api-key", "gpt-4", config)?;
    let response = client.send_prompt("Explain Rust lifetimes in one sentence.").await?;
    println!("{response}");
    Ok(())
}
```
### Parallel Execution

Run the same prompt against several providers concurrently. The `execute_parallel` helper and the `ClientConfig` fields shown here are assumptions; consult the crate documentation for exact signatures:

```rust
use chatdelta::{create_client, execute_parallel, ClientConfig};
use std::time::Duration;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let config = ClientConfig {
        timeout: Duration::from_secs(60),
        ..ClientConfig::default()
    };
    let clients = vec![
        create_client("openai", "openai-key", "gpt-4", config.clone())?,
        create_client("gemini", "google-key", "gemini-1.5-pro", config.clone())?,
        create_client("claude", "anthropic-key", "claude-3-5-sonnet-20241022", config)?,
    ];
    // The same prompt is sent to every provider concurrently.
    let results = execute_parallel(clients, "Summarize the Rust borrow checker.").await;
    for result in results {
        println!("{result:?}");
    }
    Ok(())
}
```
## Supported Providers

### OpenAI

- Provider: `"openai"`, `"gpt"`, or `"chatgpt"`
- Models: `"gpt-4"`, `"gpt-3.5-turbo"`, etc.
- API Key: OpenAI API key

### Google Gemini

- Provider: `"google"` or `"gemini"`
- Models: `"gemini-1.5-pro"`, `"gemini-1.5-flash"`, etc.
- API Key: Google AI API key

### Anthropic Claude

- Provider: `"anthropic"` or `"claude"`
- Models: `"claude-3-5-sonnet-20241022"`, `"claude-3-haiku-20240307"`, etc.
- API Key: Anthropic API key
## Configuration

`ClientConfig` controls timeouts, retry behavior, and model parameters. The field names below are reconstructed assumptions; see the crate documentation for the authoritative set:

```rust
use chatdelta::ClientConfig;
use std::time::Duration;

let config = ClientConfig {
    timeout: Duration::from_secs(30), // per-request timeout
    retries: 3,                       // retry attempts (exponential backoff)
    ..ClientConfig::default()
};
```
## Error Handling

The library provides comprehensive error handling through the `ClientError` enum:

- `ClientError::Network`: Connection and timeout errors
- `ClientError::Api`: API-specific errors and rate limits
- `ClientError::Authentication`: Invalid API keys
- `ClientError::Configuration`: Invalid parameters
- `ClientError::Parse`: Response parsing errors
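For illustration, handling these variants with a `match` might look like the sketch below. The stand-in enum and its `String` payloads are assumptions made so the example compiles on its own; the crate's real variants may carry different data:

```rust
// Stand-in mirroring the documented ClientError variants (payload types assumed).
#[derive(Debug)]
enum ClientError {
    Network(String),
    Api(String),
    Authentication(String),
    Configuration(String),
    Parse(String),
}

/// Map each error category to a human-readable description.
fn describe(err: &ClientError) -> String {
    match err {
        ClientError::Network(e) => format!("network or timeout error: {e}"),
        ClientError::Api(e) => format!("API error or rate limit: {e}"),
        ClientError::Authentication(e) => format!("invalid API key: {e}"),
        ClientError::Configuration(e) => format!("invalid parameter: {e}"),
        ClientError::Parse(e) => format!("response parsing error: {e}"),
    }
}

fn main() {
    let err = ClientError::Network("connection refused".into());
    println!("{}", describe(&err));
}
```

Matching exhaustively on the error enum lets callers retry network errors, surface authentication problems immediately, and log parse failures separately.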
## License
This project is licensed under the MIT License - see the LICENSE file for details.