# QAI SDK
A modular, type-safe Rust SDK for AI providers. One unified API across OpenAI, Anthropic Claude, Google Gemini, DeepSeek, xAI Grok, and any OpenAI-compatible endpoint.
## Features
| Capability | OpenAI | Anthropic | Google | DeepSeek | xAI | Compatible |
|---|---|---|---|---|---|---|
| Chat / Language Model | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Streaming | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Tool Calling | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Vision / Multimodal | ✅ | ✅ | ✅ | — | — | — |
| Embeddings | ✅ | — | ✅ | — | — | — |
| Image Generation | ✅ | — | ✅ | — | — | — |
| Speech (TTS) | ✅ | — | — | — | — | — |
| Transcription (STT) | ✅ | — | — | — | — | — |
| Text Completion | ✅ | — | — | — | — | — |
| Responses API | ✅ | — | — | — | — | — |
## Quick Start
Add to your `Cargo.toml`:

```toml
[dependencies]
qai-sdk = "0.1"
tokio = { version = "1", features = ["full"] }
```

By default, all providers are enabled. To optimize compile times, disable default features and select only the providers you need:

```toml
[dependencies]
qai-sdk = { version = "0.1", default-features = false, features = ["openai", "anthropic"] }
```
### Basic Usage

```rust
use qai_sdk::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Method names beyond `chat` and factory functions are illustrative;
    // see the module docs below for exact signatures.
    let provider = create_openai(std::env::var("OPENAI_API_KEY")?);
    let model = provider.chat("gpt-4o"); // model id is illustrative
    let response = model.generate("Hello!").await?;
    println!("{response:?}");
    Ok(())
}
```
### Streaming

```rust
use qai_sdk::*;
use futures::StreamExt; // assumes the `futures` crate for stream combinators

// Model id and prompt are illustrative.
let model = provider.chat("gpt-4o");
let mut stream = model.generate_stream("Tell me a story").await?;
while let Some(chunk) = stream.next().await {
    print!("{:?}", chunk?);
}
```
### Switch Providers in One Line

```rust
// `api_key` stands in for your credential; argument lists are illustrative.

// OpenAI
let provider = create_openai(api_key);

// Anthropic
let provider = create_anthropic(api_key);

// Google Gemini
let provider = create_google(api_key);

// DeepSeek
let provider = create_deepseek(api_key);

// xAI Grok
let provider = create_xai(api_key);

// Any OpenAI-compatible API
let provider = create_openai_compatible(api_key);
```
## Documentation

Dive deep into specific provider features and initialization parameters in the module docs:

- Core Interoperability: `qai_sdk::core`
- OpenAI Provider: `qai_sdk::openai`
- Anthropic Provider: `qai_sdk::anthropic`
- Google Gemini Provider: `qai_sdk::google`
- DeepSeek Provider: `qai_sdk::deepseek`
- xAI Grok Provider: `qai_sdk::xai`
- OpenAI Compatible Provider: `qai_sdk::openai_compatible`
## Architecture

qai-sdk is a single crate built on zero-cost abstractions. Each provider lives in its own module and is gated behind a Cargo feature, keeping compile times fast when you only need specific integrations:
```text
qai-sdk
├── core — Core traits: LanguageModel, EmbeddingModel, ImageModel
├── openai — OpenAI API (GPT, DALL-E, Whisper, TTS, Responses)
├── anthropic — Anthropic API (Claude)
├── google — Google API (Gemini)
├── deepseek — DeepSeek API (via OpenAI-compatible pipeline)
├── xai — xAI API (Grok, via OpenAI-compatible pipeline)
└── openai_compatible — Any OpenAI-compatible endpoint (Ollama, LM Studio)
```
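The core traits are what make the provider modules interchangeable: code written against a trait like `LanguageModel` is monomorphized per provider, so the abstraction costs nothing at runtime. A minimal self-contained sketch of that pattern (the trait and types here are illustrative stand-ins, not the SDK's actual definitions):

```rust
// Illustrative sketch of qai-sdk's trait-based architecture: a shared trait
// in a `core` module, provider modules implementing it, and caller code that
// is generic over any provider.
mod core {
    pub trait LanguageModel {
        fn generate(&self, prompt: &str) -> String;
    }
}

mod openai {
    pub struct OpenAiModel;
    impl crate::core::LanguageModel for OpenAiModel {
        fn generate(&self, prompt: &str) -> String {
            format!("[openai] {prompt}") // stand-in for a real API call
        }
    }
}

mod anthropic {
    pub struct ClaudeModel;
    impl crate::core::LanguageModel for ClaudeModel {
        fn generate(&self, prompt: &str) -> String {
            format!("[anthropic] {prompt}") // stand-in for a real API call
        }
    }
}

use crate::core::LanguageModel;

// Generic over any provider; monomorphized at compile time (zero-cost).
fn ask(model: &impl LanguageModel, prompt: &str) -> String {
    model.generate(prompt)
}

fn main() {
    println!("{}", ask(&openai::OpenAiModel, "hello"));    // [openai] hello
    println!("{}", ask(&anthropic::ClaudeModel, "hello")); // [anthropic] hello
}
```

Because `ask` takes `&impl LanguageModel`, swapping providers changes only the value you construct, which is what makes the one-line provider switch above possible.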
## Examples

See the `examples/` directory for 17 examples covering:
- Basic chat, streaming, and multimodal conversations
- Tool calling / function calling
- Embeddings, image generation, speech, and transcription
- OpenAI Responses API
- Error handling patterns
- Provider factory pattern
- OpenAI-compatible endpoints (Ollama, LM Studio, etc.)
Run an example:

```sh
# Fill in your API keys
export OPENAI_API_KEY=...
cargo run --example basic_chat   # example name is illustrative
```
## Environment Variables

| Variable | Provider |
|---|---|
| `OPENAI_API_KEY` | OpenAI |
| `ANTHROPIC_API_KEY` | Anthropic |
| `GOOGLE_API_KEY` | Google Gemini |
| `DEEPSEEK_API_KEY` | DeepSeek |
| `XAI_API_KEY` | xAI |
## Contributing
See CONTRIBUTING.md for guidelines.
## License

Licensed under either of:

- Apache License, Version 2.0
- MIT License

at your option.
## Author
Keyvan Arasteh — @keyvanarasteh