# QAI SDK
A modular, type-safe Rust SDK for AI providers. One unified API across OpenAI, Anthropic Claude, Google Gemini, DeepSeek, xAI Grok, and any OpenAI-compatible endpoint.
## Features
| Capability | OpenAI | Anthropic | Google | DeepSeek | xAI | Compatible |
|---|---|---|---|---|---|---|
| Chat / Language Model | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Streaming | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Tool Calling | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Structured Output (generate_object) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Provider Registry | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Middleware Layer | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Universal Agent | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Vision / Multimodal | ✅ | ✅ | ✅ | — | — | — |
| Embeddings | ✅ | — | ✅ | — | — | — |
| Image Generation | ✅ | — | ✅ | — | — | — |
| Speech (TTS) | ✅ | — | — | — | — | — |
| Transcription (STT) | ✅ | — | — | — | — | — |
| Text Completion | ✅ | — | — | — | — | — |
| Responses API | ✅ | — | — | — | — | — |
| Model Context Protocol (MCP) | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
## Unified API Demo

The playground.html showcase demonstrates the unified API across providers. Open it locally in a browser to try it interactively.
## Quick Start

Add to your `Cargo.toml`:

```toml
[dependencies]
qai-sdk = "0.1"
tokio = { version = "1", features = ["full"] }
```
By default, all providers are enabled. To optimize compile times, disable default features and select only the providers you need:
```toml
[dependencies]
qai-sdk = { version = "0.1", default-features = false, features = ["openai", "anthropic"] }
```
## Basic Usage

A minimal sketch (the model name and response fields are illustrative):

```rust
use qai_sdk::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let provider = create_openai(std::env::var("OPENAI_API_KEY")?);
    let model = provider.chat("gpt-4o"); // model name is illustrative
    let result = model.generate("Hello!").await?;
    println!("{}", result.text);
    Ok(())
}
```
## Streaming

```rust
use qai_sdk::*;
use futures::StreamExt;

// The prompt, model name, and chunk fields are illustrative.
let model = provider.chat("gpt-4o");
let mut stream = model.generate_stream("Tell me a story").await?;
while let Some(chunk) = stream.next().await {
    print!("{}", chunk?.text);
}
```
## Switch Providers in One Line

```rust
// Constructor arguments are illustrative; see the provider module docs.

// OpenAI
let provider = create_openai(api_key);
// Anthropic
let provider = create_anthropic(api_key);
// Google Gemini
let provider = create_google(api_key);
// DeepSeek
let provider = create_deepseek(api_key);
// xAI Grok
let provider = create_xai(api_key);
// Any OpenAI-compatible API (Ollama, LM Studio, ...)
let provider = create_openai_compatible(base_url, api_key);
```
## Provider Registry — Resolve Models by String

```rust
use qai_sdk::ProviderRegistry;

// Registration keys and the model ID are illustrative.
let registry = ProviderRegistry::new()
    .register("openai", create_openai(api_key))
    .register("anthropic", create_anthropic(api_key));

let model = registry.language_model("openai:gpt-4o")?;
let result = model.generate("Hello!").await?;
```
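Under the hood, resolving a `"provider:model"` string comes down to splitting on the first colon and looking up the provider. A minimal, self-contained sketch of that lookup (not the SDK's actual implementation; the `HashMap`-of-strings registry here is a stand-in):

```rust
// Sketch of "provider:model" string resolution, assuming a registry is
// just a map from provider name to some provider handle.
use std::collections::HashMap;

fn resolve<'a>(
    registry: &'a HashMap<String, String>,
    id: &str,
) -> Option<(&'a str, String)> {
    // Split "openai:gpt-4o" into ("openai", "gpt-4o") on the FIRST colon,
    // so model IDs that themselves contain ':' still resolve.
    let (provider, model) = id.split_once(':')?;
    let handle = registry.get(provider)?;
    Some((handle.as_str(), model.to_string()))
}

fn main() {
    let mut registry = HashMap::new();
    registry.insert("openai".to_string(), "openai-client".to_string());

    let (client, model) = resolve(&registry, "openai:gpt-4o").unwrap();
    println!("{client} {model}"); // prints "openai-client gpt-4o"
}
```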
## Structured Output — Force JSON Schema Conformance

```rust
use qai_sdk::*;

// Arguments and result fields are illustrative.
let result = generate_object(&model, schema, "Extract: John Doe is 30 years old").await?;
println!("{}", result.object); // {"name": "John Doe", "age": 30}
```
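A JSON Schema that the example output above would conform to, shown for illustration (the SDK's schema-building API may differ):

```json
{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "age": { "type": "integer" }
  },
  "required": ["name", "age"]
}
```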
## Middleware — Composable Model Wrappers

```rust
use qai_sdk::*;

// `settings` is a DefaultSettings middleware carrying temperature=0.7;
// its construction is illustrative (see qai_sdk::core::middleware).
let wrapped = wrap_language_model(model, settings);
// Every call now uses temperature=0.7 if not explicitly set
```
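The default-settings pattern itself is easy to see in isolation. A self-contained sketch (the `Settings`/`LanguageModel` types here are stand-ins, not the SDK's): a wrapper fills in a temperature only when the caller left it unset.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Settings {
    temperature: Option<f32>,
}

trait LanguageModel {
    fn generate(&self, prompt: &str, settings: Settings) -> String;
}

// Toy inner model that just echoes what it was asked with.
struct Echo;
impl LanguageModel for Echo {
    fn generate(&self, prompt: &str, settings: Settings) -> String {
        format!("{} @ temp={:?}", prompt, settings.temperature)
    }
}

// The middleware: wraps any model and supplies a default temperature.
struct DefaultTemperature<M> {
    inner: M,
    default: f32,
}

impl<M: LanguageModel> LanguageModel for DefaultTemperature<M> {
    fn generate(&self, prompt: &str, mut settings: Settings) -> String {
        // Apply the default only if the caller did not set one.
        settings.temperature = settings.temperature.or(Some(self.default));
        self.inner.generate(prompt, settings)
    }
}

fn main() {
    let wrapped = DefaultTemperature { inner: Echo, default: 0.7 };
    let out = wrapped.generate("hi", Settings { temperature: None });
    println!("{out}"); // prints "hi @ temp=Some(0.7)"
}
```

Because the wrapper implements the same trait as the model it wraps, middlewares compose by nesting.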
## Universal Agent — Multi-Step Tool Loop

```rust
use qai_sdk::core::agent::Agent;

// Builder arguments are illustrative; see qai_sdk::core::agent.
let agent = Agent::builder()
    .model(model)
    .tools(tools)
    .tool_handler(handler)
    .max_steps(5)
    .system("You are a helpful assistant.")
    .build()
    .expect("valid agent configuration");

let result = agent.run("What's the weather in Paris?").await?;
println!("{}", result.text);
```
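The `max_steps` tool loop can be sketched in isolation: call the model; if it requests a tool, run the handler and feed the result back; stop when the model produces text or the step budget runs out. This is a self-contained illustration of the pattern, not the SDK's implementation:

```rust
enum ModelOutput {
    ToolCall(String), // the model wants a tool run with this input
    Text(String),     // the model produced a final answer
}

fn run_agent(
    mut call_model: impl FnMut(&str) -> ModelOutput,
    tool_handler: impl Fn(&str) -> String,
    prompt: &str,
    max_steps: usize,
) -> Option<String> {
    let mut input = prompt.to_string();
    for _ in 0..max_steps {
        match call_model(&input) {
            ModelOutput::Text(answer) => return Some(answer),
            ModelOutput::ToolCall(args) => {
                // Feed the tool result back as the next model input.
                input = tool_handler(&args);
            }
        }
    }
    None // step budget exhausted without a final answer
}

fn main() {
    // Fake model: first call asks for a tool, second call answers.
    let mut step = 0;
    let model = move |input: &str| {
        step += 1;
        if step == 1 {
            ModelOutput::ToolCall("weather(Paris)".to_string())
        } else {
            ModelOutput::Text(format!("Based on {input}: sunny"))
        }
    };
    let tool = |args: &str| format!("result of {args}");

    let answer = run_agent(model, tool, "Weather in Paris?", 5).unwrap();
    println!("{answer}"); // prints "Based on result of weather(Paris): sunny"
}
```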
## Documentation

The module docs cover provider-specific features and initialization parameters:

- `qai_sdk::core` - Core Interoperability
- `qai_sdk::openai` - OpenAI Provider
- `qai_sdk::anthropic` - Anthropic Provider
- `qai_sdk::google` - Google Gemini Provider
- `qai_sdk::deepseek` - DeepSeek Provider
- `qai_sdk::xai` - xAI Grok Provider
- `qai_sdk::openai_compatible` - OpenAI Compatible Provider
- `qai_sdk::mcp` - Model Context Protocol
- `qai_sdk::core::structured` - Structured Output
- `qai_sdk::core::registry` - Provider Registry
- `qai_sdk::core::middleware` - Middleware
- `qai_sdk::core::agent` - Universal Agent
## Architecture

qai-sdk is a single crate built on zero-cost abstractions. Providers live in separate modules gated by Cargo features, so compile times stay fast when you enable only the integrations you need:
```text
qai-sdk
├── core
│   ├── traits — LanguageModel, EmbeddingModel, ImageModel, SpeechModel, TranscriptionModel
│   ├── structured — generate_object() / stream_object() with JSON Schema validation
│   ├── registry — ProviderRegistry for "provider:model" string resolution
│   ├── middleware — Composable LanguageModelMiddleware (DefaultSettings, ExtractReasoning)
│   └── agent — Universal Agent with builder pattern & max_steps tool loop
├── openai — OpenAI API (GPT, DALL-E, Whisper, TTS, Responses)
├── anthropic — Anthropic API (Claude)
├── google — Google API (Gemini)
├── deepseek — DeepSeek API (via OpenAI-compatible pipeline)
├── xai — xAI API (Grok, via OpenAI-compatible pipeline)
├── openai_compatible — Any OpenAI-compatible endpoint (Ollama, LM Studio)
└── mcp — Model Context Protocol (JSON-RPC, Stdio/SSE, resources, prompts)
```
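For orientation on the `mcp` module's wire format: an MCP session opens with a JSON-RPC `initialize` request shaped like the following (per the MCP specification; shown for illustration, and the `clientInfo` values are made up):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```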
## Examples
See the examples/ directory for 17 comprehensive examples covering:
- Basic chat, streaming, and multimodal conversations
- Tool calling / function calling
- Embeddings, image generation, speech, and transcription
- OpenAI Responses API
- Error handling patterns
- Provider factory pattern
- OpenAI-compatible endpoints (Ollama, LM Studio, etc.)
Run an example:

```sh
# Fill in your API keys first (see Environment Variables below)
cargo run --example <example_name>
```
## Environment Variables

| Variable | Provider |
|---|---|
| `OPENAI_API_KEY` | OpenAI |
| `ANTHROPIC_API_KEY` | Anthropic |
| `GOOGLE_API_KEY` | Google Gemini |
| `DEEPSEEK_API_KEY` | DeepSeek |
| `XAI_API_KEY` | xAI |
## Contributing
See CONTRIBUTING.md for guidelines.
## License
Licensed under either of:
at your option.
## Author
Keyvan Arasteh — @keyvanarasteh