# letta
A Rust client library for the Letta REST API, providing idiomatic Rust bindings for building stateful AI agents with persistent memory and context.
Unlike the Letta-provided TypeScript and Python libraries, this was not generated from the OpenAPI spec, but implemented by hand (with substantial LLM assistance). As such, it exposes things in slightly different, mildly opinionated ways, and includes a number of Rust-oriented affordances.
## Features

- Pagination: Automatic cursor-based pagination with `PaginatedStream`
- Type Safety: Comprehensive type definitions for all API requests/responses
- Flexible Configuration: Support for cloud and local deployments
- Rich Error Handling: Detailed error types
- Well Tested: Extensive test coverage with integration tests
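As background for the pagination feature, here is a minimal sketch of the cursor-based pattern in plain Rust. This is illustrative only, not the crate's `PaginatedStream` API: each page carries an optional `next_cursor`, and fetching continues until it is `None`.

```rust
// Illustrative cursor-based pagination (not the crate's actual API).
struct Page {
    items: Vec<u32>,
    next_cursor: Option<usize>,
}

// Simulate one page request: return up to `limit` items starting at `cursor`,
// plus the cursor for the next page (or None when exhausted).
fn fetch_page(data: &[u32], cursor: usize, limit: usize) -> Page {
    let end = (cursor + limit).min(data.len());
    Page {
        items: data[cursor..end].to_vec(),
        next_cursor: if end < data.len() { Some(end) } else { None },
    }
}

// Drain all pages by following cursors, as a paginated stream does internally.
fn collect_all(data: &[u32], limit: usize) -> Vec<u32> {
    let mut out = Vec::new();
    let mut cursor = Some(0);
    while let Some(c) = cursor {
        let page = fetch_page(data, c, limit);
        out.extend(page.items);
        cursor = page.next_cursor;
    }
    out
}

fn main() {
    let data: Vec<u32> = (0..10).collect();
    assert_eq!(collect_all(&data, 3), data);
    println!("fetched {} items across pages", data.len());
}
```

`PaginatedStream` wraps this loop behind an async `Stream`, so callers never handle cursors directly.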
## Usage

Add this to your `Cargo.toml`:

```toml
[dependencies]
letta = "0.1.2"
```
### CLI Installation

The letta crate includes an optional CLI tool for interacting with Letta servers:

```shell
# Install from crates.io
# (if the CLI is feature-gated, add the appropriate --features flag)
cargo install letta

# Or build from source (from a local checkout)
cargo install --path .
```

After installation, the `letta-client` command will be available in your PATH.
### CLI Configuration

Set your API key (for cloud deployments):

```shell
# Environment variable name assumed; check `letta-client --help`
export LETTA_API_KEY="your-api-key"
```

Or specify the base URL for local servers:

```shell
export LETTA_BASE_URL="http://localhost:8283"
```
### CLI Usage Examples

The subcommands shown below are illustrative; run `letta-client --help` for the authoritative command list.

```shell
# Check server health
letta-client health

# List all agents
letta-client agents list

# Create a new agent
letta-client agents create --name my-agent

# Send a message to an agent (with streaming)
letta-client messages send <agent-id> "Hello!" --stream

# View agent memory
letta-client memory view <agent-id>

# Upload a document to a source
letta-client sources upload <source-id> ./document.pdf

# Get help for any command
letta-client --help
```
The CLI supports multiple output formats:

- `--output summary` (default) - Human-readable format
- `--output json` - JSON output for scripting
- `--output pretty` - Pretty-printed JSON
## Compatibility
| letta client | letta server |
|---|---|
| 0.1.2 | 0.8.8 |
| 0.1.0-0.1.1 | 0.8.x |
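If you need to stay on a client release matching your server, you can pin the exact version in `Cargo.toml`:

```toml
[dependencies]
# Pin to the release tested against server 0.8.8
letta = "=0.1.2"
```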
## Quick Start

Type and method names below are reconstructed from the examples in this README and may differ slightly from the crate's actual API; see the crate docs for authoritative signatures.

```rust
use letta::{ClientConfig, LettaClient};
use letta::types::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to a local Letta server
    let config = ClientConfig::new("http://localhost:8283")?;
    let client = LettaClient::new(config)?;

    // List existing agents (method signature illustrative)
    let agents = client.agents.list(Default::default()).await?;
    println!("found {} agents", agents.len());
    Ok(())
}
```
## API Coverage

### Core APIs
- ✅ Agents - Create, update, delete, and manage AI agents
- ✅ Messages - Send messages and stream responses with SSE
- ✅ Memory - Manage core and archival memory with semantic search
- ✅ Tools - Register and manage agent tools (functions)
- ✅ Sources - Upload documents and manage knowledge sources
- ✅ Blocks - Manage memory blocks and persistent storage
### Advanced APIs
- ✅ Groups - Multi-agent conversations
- ✅ Runs - Execution tracking and debugging
- ✅ Jobs - Asynchronous job management
- ✅ Batch - Batch message processing
- ✅ Templates - Agent templates for quick deployment
- ✅ Projects - Project organization
- ✅ Models - LLM and embedding model configuration
- ✅ Providers - LLM provider management
- ✅ Identities - Identity and permissions management
- ✅ Tags - Tag-based organization
- ✅ Telemetry - Usage tracking and monitoring
- 🚧 Voice - Voice conversation support (beta)
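The Messages API streams responses over Server-Sent Events. As a rough sketch of how an SSE byte stream decomposes into events (plain Rust, independent of the crate's `eventsource-stream`-based implementation): events are separated by a blank line, and each `data:` line contributes one line of the event payload.

```rust
// Minimal SSE frame parser sketch (illustrative, not the crate's implementation).
fn parse_sse(input: &str) -> Vec<String> {
    let mut events = Vec::new();
    let mut current = Vec::new();
    for line in input.lines() {
        if line.is_empty() {
            // Blank line terminates the current event.
            if !current.is_empty() {
                events.push(current.join("\n"));
                current.clear();
            }
        } else if let Some(data) = line.strip_prefix("data: ") {
            // Multiple data: lines accumulate into one event payload.
            current.push(data.to_string());
        }
        // Other field lines (event:, id:, retry:, comments) are ignored here.
    }
    if !current.is_empty() {
        events.push(current.join("\n"));
    }
    events
}

fn main() {
    let stream = "data: hello\n\ndata: world\n\n";
    let events = parse_sse(stream);
    assert_eq!(events, vec!["hello", "world"]);
    println!("{} events", events.len());
}
```

In practice the crate exposes streamed messages as an async stream of typed events, so you never parse frames by hand.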
## Examples

### Creating an Agent with the Builder Pattern
```rust
use letta::types::*;

// Create agent using builder pattern
// (builder argument values and enum variants are illustrative)
let request = CreateAgentRequest::builder()
    .name("my-agent")
    .agent_type(AgentType::MemGPT)
    .description("A helpful assistant")
    .model("openai/gpt-4o")                      // Shorthand for LLM config
    .embedding("openai/text-embedding-3-small")  // Shorthand for embedding config
    .build();

let agent = client.agents.create(request).await?;

// Create custom memory blocks with builder
let human_block = Block::human("The user's name is Alice.")
    .label("human");

let persona_block = Block::persona("I am a helpful assistant.")
    .label("persona");
```
### Working with Archival Memory
```rust
// Add to archival memory (argument shapes illustrative)
client
    .memory
    .insert_archival_memory(&agent.id, "Important fact to remember")
    .await?;

// Search archival memory
let memories = client
    .memory
    .search_archival_memory(&agent.id, "important fact")
    .await?;

for memory in memories {
    println!("{}", memory.text);
}
```
### Streaming with Pagination

```rust
use futures::StreamExt;

// Get paginated list of agents (limit value illustrative)
let mut agent_stream = client
    .agents
    .paginated()
    .limit(50)
    .build();

while let Some(agent) = agent_stream.next().await {
    let agent = agent?;
    println!("{}", agent.name);
}
```
### Managing Tools

```rust
use letta::types::CreateToolRequest;

// Create a custom tool
// Note: this example is simplified, see the tool documentation for details.
let tool = CreateToolRequest {
    // Fields elided; assumes Default is derived,
    // otherwise set all fields explicitly.
    ..Default::default()
};

let created_tool = client.tools.create(tool).await?;

// Add tool to agent
client
    .agents
    .add_tool(&agent.id, &created_tool.id)
    .await?;
```
## Configuration

### Local Development Server

```rust
// No authentication required for local server
let config = ClientConfig::new("http://localhost:8283")?;
let client = LettaClient::new(config)?;
```
### Letta Cloud

```rust
// Use API key for cloud deployment
let config = ClientConfig::new("https://api.letta.com")?
    .with_api_key("your-api-key");
let client = LettaClient::new(config)?;
```
### Custom Headers

```rust
// Add custom headers like X-Project
let config = ClientConfig::new("https://api.letta.com")?
    .with_header("X-Project", "my-project")?;
```
## Error Handling

The library provides comprehensive error handling with detailed context:

```rust
use letta::LettaError;

// (variant names illustrative; see the error module docs)
match client.agents.get(&agent_id).await {
    Ok(agent) => println!("Found agent: {}", agent.name),
    Err(LettaError::NotFound { .. }) => eprintln!("Agent not found"),
    Err(e) => eprintln!("Request failed: {e}"),
}
```
## Development

### Building from Source

```shell
# Clone the repository
git clone <repository-url> && cd letta

# Build the library
cargo build

# Run tests
cargo test

# Build documentation
cargo doc --open
```
### Running the Local Test Server

```shell
# Start a local Letta server for testing
# (requires the Letta server itself; see its documentation)
letta server

# Run integration tests against it
cargo test
```
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- Built with `reqwest` for HTTP operations
- Uses `tokio` for the async runtime
- Streaming support via `eventsource-stream`
- Error handling with `miette`
## Related Projects
- Letta - The official Letta server
- letta-node - Official TypeScript/JavaScript SDK
- letta-python - Official Python SDK