qai-sdk 0.1.2

QAI SDK

A modular, type-safe Rust SDK for AI providers. One unified API across OpenAI, Anthropic Claude, Google Gemini, DeepSeek, xAI Grok, and any OpenAI-compatible endpoint.

Features

Capabilities (support varies by provider):

  • Chat / Language Model
  • Streaming
  • Tool Calling
  • Vision / Multimodal
  • Embeddings
  • Image Generation
  • Speech (TTS)
  • Transcription (STT)
  • Text Completion
  • Responses API

Quick Start

Add to your Cargo.toml:

[dependencies]
qai-sdk = "0.1"
tokio = { version = "1", features = ["full"] }

By default, all providers are enabled. To optimize compile times, disable default features and select only the providers you need:

[dependencies]
qai-sdk = { version = "0.1", default-features = false, features = ["openai", "anthropic"] }

Basic Usage

use qai_sdk::prelude::*;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create a provider
    let provider = create_openai(ProviderSettings {
        api_key: Some("sk-...".to_string()),
        ..Default::default()
    });

    // Get a chat model
    let model = provider.chat("gpt-4o");

    // Generate a response
    let result = model.generate(
        Prompt {
            messages: vec![Message {
                role: Role::User,
                content: vec![Content::Text {
                    text: "Hello, world!".to_string(),
                }],
            }],
        },
        GenerateOptions {
            model_id: "gpt-4o".to_string(),
            max_tokens: Some(100),
            temperature: Some(0.7),
            top_p: None,
            stop_sequences: None,
            tools: None,
        },
    ).await?;

    println!("{}", result.text);
    Ok(())
}

Streaming

use qai_sdk::prelude::*;
use futures::StreamExt;

let model = provider.chat("gpt-4o");
let mut stream = model.generate_stream(prompt, options).await?;

while let Some(part) = stream.next().await {
    match part {
        StreamPart::TextDelta { delta } => print!("{delta}"),
        StreamPart::Finish { finish_reason } => println!("\n[{finish_reason}]"),
        _ => {}
    }
}
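For illustration, here is one way to accumulate the deltas into a full response. `StreamPart` below is a simplified stand-in for the SDK's type, and the parts come from a plain Vec instead of a live stream, so the sketch is self-contained:

```rust
// Simplified stand-in for the SDK's stream parts (illustrative only;
// the real enum has more variants).
enum StreamPart {
    TextDelta { delta: String },
    Finish { finish_reason: String },
}

// Concatenate the text deltas into the full response, echoing the
// match arms from the streaming loop above.
fn collect_text(parts: Vec<StreamPart>) -> String {
    let mut full_text = String::new();
    for part in parts {
        match part {
            StreamPart::TextDelta { delta } => full_text.push_str(&delta),
            StreamPart::Finish { finish_reason } => {
                println!("finished: {finish_reason}");
            }
        }
    }
    full_text
}

fn main() {
    // In real use these arrive from `generate_stream`; a Vec keeps the
    // sketch runnable without a provider.
    let parts = vec![
        StreamPart::TextDelta { delta: "Hello, ".to_string() },
        StreamPart::TextDelta { delta: "world!".to_string() },
        StreamPart::Finish { finish_reason: "stop".to_string() },
    ];
    let text = collect_text(parts);
    assert_eq!(text, "Hello, world!");
    println!("{text}");
}
```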

Switch Providers in One Line

// OpenAI
let provider = create_openai(settings.clone());
// Anthropic
let provider = create_anthropic(settings.clone());
// Google Gemini
let provider = create_google(settings.clone());
// DeepSeek
let provider = create_deepseek(settings.clone());
// xAI Grok
let provider = create_xai(settings.clone());
// Any OpenAI-compatible API
let provider = create_openai_compatible(settings);
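The examples directory includes a provider-factory pattern built on this one-line switch. A minimal sketch of the idea, using a stand-in `Provider` struct in place of the SDK's handles (the real `create_*` functions take a shared settings struct instead of a name string):

```rust
// Stand-in type: the real create_* functions return provider handles;
// a plain struct keeps this sketch self-contained.
#[derive(Debug, PartialEq)]
pub struct Provider {
    pub name: String,
}

// Select a backend by name, mirroring the create_openai /
// create_anthropic / ... family of constructors.
pub fn create_provider(name: &str) -> Option<Provider> {
    const KNOWN: [&str; 6] = [
        "openai", "anthropic", "google", "deepseek", "xai", "openai-compatible",
    ];
    if KNOWN.contains(&name) {
        Some(Provider { name: name.to_string() })
    } else {
        None
    }
}

fn main() {
    let provider = create_provider("anthropic").expect("known backend");
    assert_eq!(provider.name, "anthropic");
    assert!(create_provider("unknown").is_none());
    println!("selected backend: {}", provider.name);
}
```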

Documentation

Provider-specific features and initialization parameters are covered in detail in the per-module documentation on docs.rs.

Architecture

qai-sdk is a single crate built on zero-cost abstractions. Each provider lives in its own module, gated behind a Cargo feature, so compile times stay fast when you only need specific integrations:

qai-sdk
├── core                — Core traits: LanguageModel, EmbeddingModel, ImageModel
├── openai              — OpenAI API (GPT, DALL-E, Whisper, TTS, Responses)
├── anthropic           — Anthropic API (Claude)
├── google              — Google API (Gemini)
├── deepseek            — DeepSeek API (via OpenAI-compatible pipeline)
├── xai                 — xAI API (Grok, via OpenAI-compatible pipeline)
└── openai_compatible   — Any OpenAI-compatible endpoint (Ollama, LM Studio)
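The gating pattern can be sketched as follows. Module bodies are placeholders, not the real SDK internals, and it assumes each provider module maps to a same-named Cargo feature:

```rust
// Always compiled: the shared traits every provider implements.
pub mod core {
    pub trait LanguageModel {
        fn model_id(&self) -> String;
    }
}

// Compiled only when the matching feature is enabled, so disabled
// providers cost nothing at build time.
#[cfg(feature = "openai")]
pub mod openai {}

#[cfg(feature = "anthropic")]
pub mod anthropic {}

// Generic code depends only on the core traits, never on a
// specific provider module.
pub fn describe<M: core::LanguageModel>(m: &M) -> String {
    format!("model: {}", m.model_id())
}

fn main() {
    // The core traits are usable with every provider feature off.
    struct Dummy;
    impl crate::core::LanguageModel for Dummy {
        fn model_id(&self) -> String {
            "dummy-model".to_string()
        }
    }
    assert_eq!(describe(&Dummy), "model: dummy-model");
    println!("core compiles with no provider features enabled");
}
```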

Examples

See the examples/ directory for 17 comprehensive examples covering:

  • Basic chat, streaming, and multimodal conversations
  • Tool calling / function calling
  • Embeddings, image generation, speech, and transcription
  • OpenAI Responses API
  • Error handling patterns
  • Provider factory pattern
  • OpenAI-compatible endpoints (Ollama, LM Studio, etc.)

Run an example:

cp .env.example .env
# Fill in your API keys
cargo run --example chat_basic

Environment Variables

Variable            Provider
OPENAI_API_KEY      OpenAI
ANTHROPIC_API_KEY   Anthropic
GOOGLE_API_KEY      Google Gemini
DEEPSEEK_API_KEY    DeepSeek
XAI_API_KEY         xAI
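Since `ProviderSettings.api_key` is an `Option<String>`, one way to wire these up is a small lookup that reads the right variable at startup. A self-contained sketch using only the standard library (provider names here are illustrative):

```rust
use std::env;

// Map a provider name to the environment variable listed above.
fn provider_env_var(provider: &str) -> Option<&'static str> {
    match provider {
        "openai" => Some("OPENAI_API_KEY"),
        "anthropic" => Some("ANTHROPIC_API_KEY"),
        "google" => Some("GOOGLE_API_KEY"),
        "deepseek" => Some("DEEPSEEK_API_KEY"),
        "xai" => Some("XAI_API_KEY"),
        _ => None,
    }
}

// Read the key from the environment; None if the provider is unknown
// or the variable is unset.
fn api_key_for(provider: &str) -> Option<String> {
    provider_env_var(provider).and_then(|var| env::var(var).ok())
}

fn main() {
    // The key is only present if the variable is set in your shell.
    match api_key_for("openai") {
        Some(_) => println!("OPENAI_API_KEY is set"),
        None => println!("OPENAI_API_KEY is not set"),
    }
}
```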

Contributing

See CONTRIBUTING.md for guidelines.

License

Licensed under either of:

at your option.

Author

Keyvan Arasteh (@keyvanarasteh)