# Talk
A Rust library for creating controlled LLM agents with behavioral guidelines, tool integration, and multi-step conversation journeys.
Talk lets developers build production-ready AI agents with predictable behavior in under 50 lines of Rust. It ships with pluggable LLM providers (OpenAI and Anthropic), configurable session storage backends, and comprehensive explainability features.
## Features
- 🎯 Behavioral Guidelines: Define predictable agent behavior with pattern matching and priority-based execution
- 🔧 Tool Integration: Register async functions as tools with configurable timeouts
- 🗺️ Conversation Journeys: Multi-step conversation state machines for guided user flows
- 🔌 Pluggable LLM Providers: Built-in support for OpenAI and Anthropic with trait-based extensibility
- 💾 Session Storage: In-memory default with optional Redis and PostgreSQL backends
- 🔍 Explainability: Understand agent decisions with comprehensive decision tracking
- ⚡ Performance: <2s response times, 1000+ concurrent sessions support
- 🦀 Type-Safe: Full Rust type safety with compile-time guarantees
## Quick Start
Add Talk to your `Cargo.toml`:

```toml
[dependencies]
talk = "0.1"
tokio = { version = "1", features = ["full"] }
```
### Simple Agent with Guidelines

A minimal sketch of the intended usage; API names such as `Agent::builder`, `Guideline::new`, and `OpenAiProvider::from_env` are illustrative, so check the crate docs for the actual API:

```rust
use talk::{Agent, Guideline, OpenAiProvider}; // illustrative paths

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // The provider reads OPENAI_API_KEY from the environment.
    let provider = OpenAiProvider::from_env()?;

    let agent = Agent::builder()
        .provider(provider)
        .guideline(
            Guideline::new("refund") // pattern matched against user messages
                .priority(10)
                .instruction("Ask for the order number before discussing refunds"),
        )
        .build()?;

    let reply = agent.send("I want a refund").await?;
    println!("{reply}");
    Ok(())
}
```
## Installation
### Prerequisites
- Rust 1.90 or later
- OpenAI or Anthropic API key
- Basic familiarity with async Rust
### Basic Installation
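A one-line install with cargo, assuming the crate is published under the name `talk`:

```shell
cargo add talk
```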
### With Optional Storage Backends

The feature flag names below are illustrative; check the crate's feature list for the exact names:

```shell
# Redis storage
cargo add talk --features redis

# PostgreSQL storage
cargo add talk --features postgres

# All storage backends
cargo add talk --features redis,postgres
```
## Documentation
### Examples
See the `examples/` directory for more complex use cases:

- `simple_agent.rs` - Basic agent with guidelines
- `weather_agent.rs` - Agent with tool integration
- `onboarding.rs` - Multi-step journey example
## Performance
- Agent response time: <2s (excluding LLM latency)
- Tool integration overhead: <100ms
- Guideline matching: O(n) linear time with SIMD acceleration
- Concurrent sessions: 1000+ without degradation
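The configurable tool timeouts mentioned above can be illustrated with a std-only sketch. Talk itself runs tools as async functions on Tokio; this thread-based version (`call_with_timeout` is an illustrative name, not the library's API) only demonstrates the deadline concept:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Run a tool on a worker thread and wait at most `limit` for its result.
// If the deadline passes first, report a timeout instead of blocking.
fn call_with_timeout<F>(tool: F, limit: Duration) -> Result<String, &'static str>
where
    F: FnOnce() -> String + Send + 'static,
{
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // The receiver may be gone if we already timed out; ignore that case.
        let _ = tx.send(tool());
    });
    rx.recv_timeout(limit).map_err(|_| "tool timed out")
}
```

An async runtime would instead wrap the tool future in something like `tokio::time::timeout`, which cancels the future rather than leaving a thread running.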
## Architecture
Talk is built on:
- Tokio: Async runtime for high-performance concurrent operations
- serde/serde_json: Zero-cost serialization with type safety
- async-openai & anthropic-sdk: Rust client crates for the OpenAI and Anthropic APIs
- Aho-Corasick + regex: Efficient pattern matching for guidelines
- thiserror: Type-safe error handling
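To make the guideline-matching idea concrete, here is a minimal std-only sketch of a priority-based linear scan. Talk's real matcher layers Aho-Corasick and regex on top of this idea; `Guideline` and `best_match` are illustrative names, not the library's API:

```rust
#[derive(Debug, Clone)]
pub struct Guideline {
    pub pattern: &'static str,       // substring to look for in the user message
    pub priority: u8,                // higher wins when several guidelines match
    pub response_hint: &'static str, // what the agent should do when matched
}

/// Scan every guideline once (O(n) in the number of guidelines) and
/// return the highest-priority one whose pattern occurs in `message`.
pub fn best_match<'a>(guidelines: &'a [Guideline], message: &str) -> Option<&'a Guideline> {
    guidelines
        .iter()
        .filter(|g| message.contains(g.pattern))
        .max_by_key(|g| g.priority)
}
```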
## Contributing
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
## License
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
## Acknowledgments
Inspired by Parlant, the Python library for creating LLM-based agents.
Ready to build production-ready AI agents in Rust! 🦀