Ceylon - AI Agent Framework
A powerful and flexible Rust framework for building AI agents with goal-oriented capabilities, memory management, and tool integration.
Features
- Goal-Oriented Agents: Create agents that can analyze tasks, break them into sub-goals, and track progress
- Memory Management: Built-in conversation history, context management, and vector memory support
- Tool Integration: Extensible tool system for adding custom capabilities to your agents
- Multiple LLM Support: Works with 13+ providers including OpenAI, Anthropic, Ollama, Google, Groq, and more
- Async-First: Built on Tokio for efficient async/await support
- Vector Memory: Optional support for semantic search with OpenAI, Ollama, HuggingFace embeddings
- Interactive Runner: Optional CLI runner for interactive agent sessions
- WASM Support: Can be compiled to WebAssembly for browser-based applications
Quick Start
Add Ceylon to your Cargo.toml:
```toml
[dependencies]
ceylon = "0.1.0"
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```
Basic Usage
```rust
use ceylon::Agent;       // import paths are illustrative; see the crate docs
use ceylon::TaskRequest;

#[tokio::main]
async fn main() { /* create an Agent and submit a TaskRequest */ }
```
Set your API key as an environment variable:
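For example, with OpenAI (use whichever variable name matches your provider; see the provider table below):

```shell
# Replace the placeholder with your real key; the variable name
# depends on the provider you use (see the provider table below).
export OPENAI_API_KEY="sk-your-key-here"
```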
Working with Tools
Extend your agent's capabilities with custom tools:
```rust
use ceylon::Agent;            // import paths are illustrative;
use ceylon::tools::ToolTrait; // see the crate docs for exact locations
use serde_json::json;

// Define a custom tool type, implement ToolTrait for it,
// and register it with the agent before running tasks.
struct MyTool;

#[tokio::main]
async fn main() { /* register MyTool and run the agent */ }
```
Working with Memory
Agents automatically maintain conversation history:
```rust
use ceylon::Agent;       // import paths are illustrative; see the crate docs
use ceylon::TaskRequest;

#[tokio::main]
async fn main() { /* successive TaskRequests share the agent's history */ }
```
Supported LLM Providers
Ceylon supports 13+ LLM providers out of the box:
| Provider | Example Model String | API Key Env Var |
|---|---|---|
| OpenAI | `openai::gpt-4` | `OPENAI_API_KEY` |
| Anthropic | `anthropic::claude-3-5-sonnet-20241022` | `ANTHROPIC_API_KEY` |
| Ollama | `ollama::llama3.2` | (local) |
| DeepSeek | `deepseek::deepseek-coder` | `DEEPSEEK_API_KEY` |
| X.AI (Grok) | `xai::grok-beta` | `XAI_API_KEY` |
| Google Gemini | `google::gemini-pro` | `GOOGLE_API_KEY` |
| Groq | `groq::mixtral-8x7b-32768` | `GROQ_API_KEY` |
| Azure OpenAI | `azure::gpt-4` | `AZURE_OPENAI_API_KEY` |
| Cohere | `cohere::command` | `COHERE_API_KEY` |
| Mistral | `mistral::mistral-large-latest` | `MISTRAL_API_KEY` |
| Phind | `phind::Phind-CodeLlama-34B-v2` | `PHIND_API_KEY` |
| OpenRouter | `openrouter::anthropic/claude-3-opus` | `OPENROUTER_API_KEY` |
| ElevenLabs | `elevenlabs::eleven_monolingual_v1` | `ELEVENLABS_API_KEY` |
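The model strings above follow a `provider::model` convention, split on a double colon. A minimal sketch of that convention (`split_model_string` is a hypothetical helper for illustration, not Ceylon's own parser):

```rust
// Hypothetical helper (not part of Ceylon's API) showing how the
// `provider::model` strings in the table above are structured.
fn split_model_string(s: &str) -> Option<(&str, &str)> {
    // split_once returns the text before and after the first "::"
    s.split_once("::")
}

fn main() {
    let (provider, model) = split_model_string("openai::gpt-4").unwrap();
    println!("provider = {provider}, model = {model}");
}
```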
Cargo Features
Ceylon uses Cargo features to enable optional functionality:
```toml
[dependencies]
# Default: std features, vector memory, and CLI runner
ceylon = "0.1.0"

# Minimal installation (no tokio, no LLM, suitable for WASM)
ceylon = { version = "0.1.0", default-features = false }

# With specific vector providers
ceylon = { version = "0.1.0", features = ["vector-openai"] }
ceylon = { version = "0.1.0", features = ["vector-huggingface-local"] }

# All vector providers
ceylon = { version = "0.1.0", features = ["full-vector"] }
```
Available Features
- `std` (default): Standard features including tokio, LLM support, SQLite memory, and MessagePack serialization
- `vector`: Base vector memory functionality
- `vector-openai`: OpenAI embeddings for vector memory
- `vector-ollama`: Ollama embeddings for vector memory
- `vector-huggingface`: HuggingFace API embeddings
- `vector-huggingface-local`: Local HuggingFace embeddings using Candle
- `full-vector`: All vector providers
- `runner`: Interactive CLI runner
- `wasm`: WebAssembly support
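Downstream code can adapt to whichever features are enabled using ordinary `cfg` gates. A minimal sketch, assuming your own crate declares a `vector` feature that forwards to `ceylon/vector` (the helper function here is hypothetical, not part of Ceylon's API):

```rust
// Sketch of feature-gated code. Only the `vector` feature name comes
// from the list above; `semantic_search_enabled` is hypothetical.
#[cfg(feature = "vector")]
fn semantic_search_enabled() -> bool {
    true
}

#[cfg(not(feature = "vector"))]
fn semantic_search_enabled() -> bool {
    false
}

fn main() {
    println!("semantic search enabled: {}", semantic_search_enabled());
}
```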
Goal-Oriented Programming
Create agents that can break down complex tasks:
```rust
use ceylon::Agent;       // import paths are illustrative; see the crate docs
use ceylon::Goal;
use ceylon::TaskRequest;

#[tokio::main]
async fn main() { /* define Goals and let the agent track sub-goal progress */ }
```
Examples
The repository includes numerous examples:
- 01_basic_agent: Simple agent creation and usage
- 02_with_tools: Custom tool implementation
- 03_with_memory: Working with conversation history
- 04_advanced_agent: Complex agent configurations
- 05_with_goals: Goal-oriented task management
- 08_llm_providers: Using different LLM providers
- 10_file_saving: Creating file-saving tools
- 11_persistent_memory: SQLite-backed memory
- 12_vector_memory: Semantic search with Ollama
- 13_vector_memory_openai: OpenAI embeddings
- 14_vector_memory_huggingface: HuggingFace API embeddings
- 15_vector_memory_huggingface_local: Local embeddings with Candle
Run examples from the repository:
```shell
# Clone the repository (substitute the project's actual GitHub URL)
git clone <repository-url>
cd ceylon

# Run an example
cargo run --example 01_basic_agent
```
Documentation
Architecture
Ceylon is organized into several core modules:
- `agent`: Core agent implementation and lifecycle management
- `tools`: Tool system and built-in tools
- `memory`: Memory backends (in-memory, SQLite, vector)
- `llm`: LLM provider integrations and abstractions
- `goal`: Goal-oriented task management
- `runner`: Interactive CLI runner
- `tasks`: Task definitions and execution
Contributing
We welcome contributions! Please see our GitHub repository for more information.
License
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.
Acknowledgments
Ceylon is built on top of the excellent llm crate for LLM provider integrations.