# limit-llm
Multi-provider LLM client for Rust with streaming support.
Unified API for Anthropic Claude, OpenAI, and z.ai models with built-in token tracking, state persistence, and automatic model handoff.
Part of the Limit ecosystem.
## Features
- Multi-provider support: Anthropic Claude, OpenAI GPT, z.ai
- Streaming responses: Async streaming with `futures::Stream`
- Token tracking: SQLite-based usage tracking and cost estimation
- State persistence: Serialize/restore conversation state
- Model handoff: Automatic fallback between providers
- Tool calling: Full function/tool support for providers that support it
- Type-safe: Full Rust type system with serde integration
## Installation
Add to your `Cargo.toml`:
```toml
[dependencies]
limit-llm = "0.0.25"
```
## Quick Start
```rust
use limit_llm::{AnthropicClient, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Constructor and method names are illustrative; see the crate docs.
    let client = AnthropicClient::new(std::env::var("ANTHROPIC_API_KEY")?);

    let messages = vec![Message::user("Hello!")];
    let response = client.complete(messages).await?;
    println!("{}", response.content);
    Ok(())
}
```
## Streaming
```rust
use futures::StreamExt;

// `complete_stream` yields response chunks as a `futures::Stream`.
let mut stream = client.complete_stream(messages).await?;
while let Some(chunk) = stream.next().await {
    // Chunk field name is illustrative.
    print!("{}", chunk?.content);
}
```
## Tool Calling
```rust
use limit_llm::{Tool, ToolCall};

// Tool definitions were elided here; build them with the crate's `Tool` type.
let tools = vec![/* your `Tool` definitions */];
let response = client.complete_with_tools(messages, tools).await?;
```
## API

### Providers
| Provider | Client | Streaming | Tools | Thinking |
|---|---|---|---|---|
| Anthropic Claude | `AnthropicClient` | ✓ | ✓ | ✓ |
| OpenAI | `OpenAiProvider` | ✓ | ✓ | — |
| z.ai | `ZaiProvider` | ✓ | ✓ | ✓ |
| Local/Ollama | `LocalProvider` | ✓ | — | — |
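The handoff feature tries providers in order and falls back when one fails. As a self-contained sketch of that idea (the trait, types, and function names here are hypothetical, not the crate's API):

```rust
// Illustrative sketch of primary/fallback handoff; all names are hypothetical.
trait Provider {
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

struct Flaky;    // stands in for a rate-limited or unavailable provider
struct Reliable; // stands in for a healthy fallback provider

impl Provider for Flaky {
    fn complete(&self, _prompt: &str) -> Result<String, String> {
        Err("rate limited".into())
    }
}

impl Provider for Reliable {
    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
}

// Try each provider in order, returning the first success.
fn with_handoff(providers: &[&dyn Provider], prompt: &str) -> Result<String, String> {
    let mut last = Err("no providers".to_string());
    for p in providers {
        last = p.complete(prompt);
        if last.is_ok() {
            return last;
        }
    }
    last
}

fn main() {
    let providers: Vec<&dyn Provider> = vec![&Flaky, &Reliable];
    let out = with_handoff(&providers, "hi").unwrap();
    println!("{out}"); // prints "echo: hi"
}
```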
### Core Types
- `Message` — Chat message with role, content, and tool calls
- `Role` — User, Assistant, or System
- `Tool` / `ToolCall` — Function calling definitions
- `Usage` — Token counting for prompt/completion
- `Response` — Complete response with content and metadata
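To picture how these types fit together, here is a minimal self-contained sketch of the data model; these are not the crate's actual definitions (field and method names are illustrative):

```rust
// Illustrative sketch of the core data model, not the crate's real types.
#[derive(Debug, Clone, PartialEq)]
enum Role {
    User,
    Assistant,
    System,
}

#[derive(Debug, Clone)]
struct Message {
    role: Role,
    content: String,
}

#[derive(Debug, Default)]
struct Usage {
    prompt_tokens: u32,
    completion_tokens: u32,
}

impl Usage {
    // Total tokens consumed by a request/response pair.
    fn total(&self) -> u32 {
        self.prompt_tokens + self.completion_tokens
    }
}

fn main() {
    let msg = Message { role: Role::User, content: "Hello".into() };
    let usage = Usage { prompt_tokens: 12, completion_tokens: 30 };
    println!("{:?}: {} tokens total", msg.role, usage.total());
}
```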
## Advanced Features
```rust
// The type names below are illustrative; the method names come from the
// crate's API. See the crate docs for exact imports.
use limit_llm::{UsageTracker, StatePersistence, ModelHandoff};

// Track usage across sessions (SQLite-backed)
let tracking = UsageTracker::new("usage.db")?;
tracking.record_usage(&response.usage)?;

// Persist conversation state
let persistence = StatePersistence::new("state.db")?;
persistence.save(&messages)?;

// Automatic model fallback
let handoff = ModelHandoff::new()
    .with_primary(anthropic_client)
    .with_fallback(openai_client);
```
## Configuration
Environment variables:
```sh
ANTHROPIC_API_KEY=your-key  # For Claude
OPENAI_API_KEY=your-key     # For GPT
ZAI_API_KEY=your-key        # For z.ai
```
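At runtime, keys can be read from the environment with a small helper like the one below (this helper is not part of the crate; it is just a sketch):

```rust
// Reads a provider key from the environment, with a clear error if unset.
// This helper is illustrative and not part of limit-llm.
fn api_key(var: &str) -> Result<String, String> {
    std::env::var(var).map_err(|_| format!("{var} is not set"))
}

fn main() {
    match api_key("ANTHROPIC_API_KEY") {
        Ok(_) => println!("key found"),
        Err(e) => eprintln!("{e}"),
    }
}
```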
Or use `Config` for programmatic configuration:
```rust
use limit_llm::Config;

// The original field list was elided here; field names are illustrative.
let config = Config {
    anthropic_api_key: Some("your-key".into()),
    ..Config::default()
};
```
## License
MIT