# LLMG — LLM Gateway
A high-performance Rust LLM gateway and provider library. One OpenAI-compatible API for 70+ LLM providers.
## Features
- Unified API — Single OpenAI-compatible endpoint for every provider
- 70+ Providers — OpenAI, Anthropic, Azure, Groq, Mistral, Cohere, DeepSeek, Ollama, OpenRouter, and many more
- Library + Gateway — Use as a Rust crate or deploy the HTTP gateway
- Feature-Gated — Compile only the providers you need
- Streaming — Server-Sent Events across all providers
- Rig Integration — Drop-in provider for Rig agents
## Quick Start

### Gateway
Install the gateway via cargo:
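Assuming the gateway binary is published under the same name as the `llmg-gateway` crate listed in the project structure below, installation would look like:

```sh
cargo install llmg-gateway
```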
Then run it with your API keys:

```sh
OPENAI_API_KEY=sk-... llmg-gateway
```
> **Note:** The gateway requires an `Authorization: Bearer <token>` header. The token is not validated in the current release — any value works. See the Authentication docs for details.
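Since the API is OpenAI-compatible, a request can be sketched as follows (the `/v1/chat/completions` path follows the OpenAI convention, and the port is a placeholder — adjust both to your deployment):

```sh
curl http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{"model": "openai/gpt-4", "messages": [{"role": "user", "content": "Hello"}]}'
```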
### Docker
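A containerized run would follow the same pattern as the binary — pass keys via `-e` and publish the listen port. The image name and port below are placeholders, not a published image:

```sh
docker run --rm -p 8080:8080 \
  -e OPENAI_API_KEY=sk-... \
  ghcr.io/<org>/llmg-gateway
```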
### Library
```toml
[dependencies]
llmg-core = "0.4.0"
llmg-providers = { version = "0.4.0", features = ["openai"] }
```
The example below reconstructs the intended flow; exact import paths and type names may differ from the crate — see the documentation.

```rust
use llmg_core::{ChatCompletionRequest, Client};
use llmg_providers::ProviderRegistry;

// 1. Create registry and auto-load from env (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)
let mut registry = ProviderRegistry::new();
registry.register_all_from_env();

// 2. Create the provider-agnostic client
let client = Client::new(registry);

// 3. Use "provider/model" routing syntax
let request = ChatCompletionRequest {
    model: "openai/gpt-4".to_string(),
    ..Default::default()
};
let response = client.chat_completion(request).await?;
```
## How Routing Works

Requests use the `provider/model` format:

- `openai/gpt-4` → OpenAI
- `anthropic/claude-3-opus` → Anthropic
- `groq/llama3-70b-8192` → Groq
- `ollama/llama3` → Ollama (local)
- `openrouter/openai/gpt-4` → OpenRouter (nested)

Built-in aliases let you use short names like `gpt-4`, `claude`, or `gemini`.
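The routing rule above can be sketched as splitting on the first `/` only, which is what keeps nested IDs like `openrouter/openai/gpt-4` intact. This is an illustrative sketch, not LLMG's actual implementation:

```rust
/// Split a "provider/model" ID on the FIRST '/' only, so the remainder
/// (which may itself contain slashes) is passed through as the model ID.
fn split_route(id: &str) -> Option<(&str, &str)> {
    let mut parts = id.splitn(2, '/');
    match (parts.next(), parts.next()) {
        (Some(provider), Some(model)) if !provider.is_empty() && !model.is_empty() => {
            Some((provider, model))
        }
        // Bare names like "gpt-4" carry no provider; the alias table handles them.
        _ => None,
    }
}

fn main() {
    assert_eq!(split_route("openai/gpt-4"), Some(("openai", "gpt-4")));
    assert_eq!(
        split_route("openrouter/openai/gpt-4"),
        Some(("openrouter", "openai/gpt-4"))
    );
    assert_eq!(split_route("gpt-4"), None);
}
```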
## Project Structure

| Crate | Purpose |
|---|---|
| `llmg-core` | Shared types, traits, error handling |
| `llmg-providers` | Provider implementations (feature-gated) |
| `llmg-gateway` | HTTP gateway server (Axum) |
## Configuration
Set API keys as environment variables. The gateway auto-registers providers based on which keys are present.
```sh
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GROQ_API_KEY=gsk_...
```
See the documentation for the full list of providers and their environment variables.
## License
Licensed under either of Apache License 2.0 or MIT at your option.
## Contributing
Contributions welcome! See CONTRIBUTING.md for guidelines.