# litellm-rust
Minimal Rust SDK port of LiteLLM (library only). Provides a unified interface for chat, embeddings, images, and video across multiple LLM providers.
Note: This project is under active development. APIs may change.
## Features
- Unified client for OpenAI-compatible, Anthropic, Gemini, and xAI providers
- Chat completions with streaming (SSE) support
- Text embeddings
- Image generation (OpenAI DALL-E / GPT Image)
- Video generation (Gemini Veo, OpenAI Sora)
- Model routing with `provider/model` format
- Automatic retry with exponential backoff (see the sketch after this list)
- Cost tracking via response headers
- Embedded model registry with pricing and context window data
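The retry behavior mentioned above follows the standard exponential backoff pattern. A minimal sketch of that technique (illustrative only, not the crate's internal implementation; assumes a `tokio` runtime):

```rust
use std::time::Duration;

/// Retry an async operation, sleeping `base * 2^attempt` between tries.
/// Illustrative sketch of exponential backoff, not the crate's retry code.
async fn retry_with_backoff<T, E, F, Fut>(mut op: F, max_attempts: u32) -> Result<T, E>
where
    F: FnMut() -> Fut,
    Fut: std::future::Future<Output = Result<T, E>>,
{
    let base = Duration::from_millis(250);
    let mut attempt = 0;
    loop {
        match op().await {
            Ok(value) => return Ok(value),
            Err(err) if attempt + 1 >= max_attempts => return Err(err),
            Err(_) => {
                tokio::time::sleep(base * 2u32.pow(attempt)).await;
                attempt += 1;
            }
        }
    }
}
```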
## Supported Providers
| Provider | Chat | Streaming | Embeddings | Images | Video |
|---|---|---|---|---|---|
| OpenAI-compatible | yes | yes | yes | yes | yes |
| Anthropic | yes | yes | - | - | - |
| Gemini | yes | - | - | yes | yes |
| xAI | yes | yes | - | - | - |
## Installation

Add to your `Cargo.toml`:

```toml
[dependencies]
# Crate name assumed to match the repository; adjust if it differs.
litellm-rust = { git = "https://github.com/avivsinai/litellm-rust" }
```
## Quick Start
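A minimal chat completion. The type and method names below (`Client`, `ChatRequest`, `Message`, `chat`) are illustrative assumptions standing in for the crate's actual API; check the crate docs for the real surface:

```rust
use litellm::{ChatRequest, Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // NOTE: illustrative API sketch; names are assumed, not confirmed.
    let client = Client::from_env()?; // reads OPENAI_API_KEY and friends
    let request = ChatRequest::new("openai/gpt-4o") // provider/model routing
        .message(Message::user("Say hello in one sentence."));
    let response = client.chat(request).await?;
    println!("{}", response.content());
    Ok(())
}
```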
## Streaming
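Streaming follows the same shape, consuming the response as a `futures` stream of SSE deltas. As in the Quick Start, the names are assumptions standing in for the crate's actual API:

```rust
use futures_util::StreamExt;
use litellm::{ChatRequest, Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // NOTE: illustrative API sketch; names are assumed, not confirmed.
    let client = Client::from_env()?;
    let request = ChatRequest::new("anthropic/claude-sonnet-4-5")
        .message(Message::user("Stream a short haiku."));
    let mut stream = client.chat_stream(request).await?;
    while let Some(chunk) = stream.next().await {
        // Each chunk corresponds to one SSE delta from the provider.
        print!("{}", chunk?.delta());
    }
    println!();
    Ok(())
}
```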
## Provider Configuration
Set API keys via environment variables:
| Variable | Provider |
|---|---|
| `OPENAI_API_KEY` | OpenAI |
| `ANTHROPIC_API_KEY` | Anthropic |
| `GEMINI_API_KEY` | Google Gemini |
| `OPENROUTER_API_KEY` | OpenRouter |
| `XAI_API_KEY` | xAI / Grok |
Model routing uses the `provider/model` format (e.g., `openai/gpt-4o`, `openrouter/anthropic/claude-sonnet-4-5`).
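One way to read such a route is to split on the first `/` only, so OpenRouter's nested model IDs pass through intact. A small sketch of that rule (not the crate's actual routing code):

```rust
/// Split "provider/model" on the first '/' only, so nested model IDs
/// such as "openrouter/anthropic/claude-sonnet-4-5" keep their tail intact.
fn split_route(route: &str) -> Option<(&str, &str)> {
    route.split_once('/')
}

fn main() {
    assert_eq!(split_route("openai/gpt-4o"), Some(("openai", "gpt-4o")));
    assert_eq!(
        split_route("openrouter/anthropic/claude-sonnet-4-5"),
        Some(("openrouter", "anthropic/claude-sonnet-4-5"))
    );
}
```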
## Minimum Supported Rust Version
The MSRV is Rust 1.88. This is verified in CI.
## Notes

- xAI uses OpenAI-compatible endpoints. Configure provider `xai` with base URL `https://api.x.ai/v1` and `XAI_API_KEY`; the sketch below shows an equivalent direct request.
- This crate intentionally excludes LiteLLM proxy/server features.
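To illustrate the first note, here is a direct request to xAI's OpenAI-compatible chat completions endpoint using plain `reqwest` (independent of this crate; the model name is an example, not a guaranteed ID):

```rust
use reqwest::Client;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Calls the OpenAI-compatible endpoint noted above; "grok-3" is an
    // example model name, not a guaranteed ID.
    let api_key = std::env::var("XAI_API_KEY")?;
    let body = json!({
        "model": "grok-3",
        "messages": [{ "role": "user", "content": "Hello!" }],
    });
    let response = Client::new()
        .post("https://api.x.ai/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()
        .await?
        .error_for_status()?;
    println!("{}", response.text().await?);
    Ok(())
}
```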
## Contributing
See CONTRIBUTING.md for development setup and guidelines.
## License

MIT. This project is a Rust port of LiteLLM by Berri AI (also MIT licensed).