Ceylon - AI Agent Framework

A powerful and flexible Rust framework for building AI agents with goal-oriented capabilities, memory management, and tool integration.

Features

  • Goal-Oriented Agents: Create agents that can analyze tasks, break them into sub-goals, and track progress
  • Memory Management: Built-in conversation history, context management, and vector memory support
  • Tool Integration: Extensible tool system for adding custom capabilities to your agents
  • Multiple LLM Support: Works with 13+ providers including OpenAI, Anthropic, Ollama, Google, Groq, and more
  • Async-First: Built on Tokio for efficient async/await support
  • Vector Memory: Optional support for semantic search using OpenAI, Ollama, or HuggingFace embeddings
  • Interactive Runner: Optional CLI runner for interactive agent sessions
  • WASM Support: Can be compiled to WebAssembly for browser-based applications

Quick Start

Add Ceylon to your Cargo.toml:

[dependencies]
ceylon = "0.1.0"
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }

Basic Usage

use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    // Create a new agent
    let mut agent = Agent::new("MyAssistant", "openai::gpt-4");

    // Create a task
    let task = TaskRequest::new("What is the capital of France?");

    // Run the agent
    let response = agent.run(task).await;
    println!("Response: {:?}", response.result());
}
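
The second argument to Agent::new is a provider-qualified model string of the form provider::model; the full list of supported prefixes is in the provider table below.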

Set your API key as an environment variable:

export OPENAI_API_KEY="your-api-key-here"

Working with Tools

Extend your agent's capabilities with custom tools:

use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;
use ceylon::tools::ToolTrait;
use serde_json::json;

// Define a custom tool
struct CalculatorTool;

impl ToolTrait for CalculatorTool {
    fn name(&self) -> String {
        "calculator".to_string()
    }

    fn description(&self) -> String {
        "Performs basic arithmetic operations".to_string()
    }

    fn input_schema(&self) -> serde_json::Value {
        json!({
            "type": "object",
            "properties": {
                "operation": {"type": "string", "enum": ["add", "subtract", "multiply", "divide"]},
                "a": {"type": "number"},
                "b": {"type": "number"}
            },
            "required": ["operation", "a", "b"]
        })
    }

    fn execute(&self, input: serde_json::Value) -> serde_json::Value {
        // These unwraps panic if the model sends malformed arguments;
        // a production tool should validate the input instead.
        let op = input["operation"].as_str().unwrap();
        let a = input["a"].as_f64().unwrap();
        let b = input["b"].as_f64().unwrap();

        let result = match op {
            "add" => a + b,
            "subtract" => a - b,
            "multiply" => a * b,
            "divide" => a / b, // yields f64 infinity when b == 0
            _ => 0.0,
        };

        json!({"result": result})
    }
}

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("Calculator Agent", "openai::gpt-4");
    agent.add_tool(CalculatorTool);

    let task = TaskRequest::new("What is 15 multiplied by 7?");
    let response = agent.run(task).await;
    println!("{:?}", response.result());
}
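
In the usual tool-calling pattern, which this trait appears to follow, the agent forwards each tool's name, description, and input_schema to the LLM; the model decides when to invoke the tool and supplies the JSON arguments that execute receives.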

Working with Memory

Agents automatically maintain conversation history:

use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("MemoryAgent", "openai::gpt-4");

    // First conversation
    let task1 = TaskRequest::new("My name is Alice");
    agent.run(task1).await;

    // Second conversation - agent remembers context
    let task2 = TaskRequest::new("What is my name?");
    let response = agent.run(task2).await;
    // Agent should respond with "Alice"

    // Search memory
    let memories = agent.search_memory("Alice").await;
    println!("Found {} relevant conversations", memories.len());
}
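
With one of the optional vector-* features enabled (see Cargo Features below), memory search can use embeddings for semantic matching rather than plain-text lookup.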

Supported LLM Providers

Ceylon supports 13+ LLM providers out of the box:

Provider      | Example Model String                   | API Key Env Var
------------- | -------------------------------------- | ---------------------
OpenAI        | openai::gpt-4                          | OPENAI_API_KEY
Anthropic     | anthropic::claude-3-5-sonnet-20241022  | ANTHROPIC_API_KEY
Ollama        | ollama::llama3.2                       | (local)
DeepSeek      | deepseek::deepseek-coder               | DEEPSEEK_API_KEY
X.AI (Grok)   | xai::grok-beta                         | XAI_API_KEY
Google Gemini | google::gemini-pro                     | GOOGLE_API_KEY
Groq          | groq::mixtral-8x7b-32768               | GROQ_API_KEY
Azure OpenAI  | azure::gpt-4                           | AZURE_OPENAI_API_KEY
Cohere        | cohere::command                        | COHERE_API_KEY
Mistral       | mistral::mistral-large-latest          | MISTRAL_API_KEY
Phind         | phind::Phind-CodeLlama-34B-v2          | PHIND_API_KEY
OpenRouter    | openrouter::anthropic/claude-3-opus    | OPENROUTER_API_KEY
ElevenLabs    | elevenlabs::eleven_monolingual_v1      | ELEVENLABS_API_KEY
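
Switching providers only means changing the model string. The sketch below reuses the Agent::new constructor from the Quick Start with model names from the table above:

use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    // Local model via Ollama; no API key required.
    let mut agent = Agent::new("LocalAssistant", "ollama::llama3.2");

    // Or a hosted model (reads ANTHROPIC_API_KEY from the environment):
    // let mut agent = Agent::new("ClaudeAssistant", "anthropic::claude-3-5-sonnet-20241022");

    let task = TaskRequest::new("Summarize the Rust ownership model in one sentence.");
    let response = agent.run(task).await;
    println!("{:?}", response.result());
}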

Cargo Features

Ceylon uses Cargo features to enable optional functionality:

[dependencies]
# Default: std features, vector memory, and CLI runner
ceylon = "0.1.0"

# Minimal installation (no tokio, no LLM, suitable for WASM)
ceylon = { version = "0.1.0", default-features = false }

# With specific vector providers
ceylon = { version = "0.1.0", features = ["vector-openai"] }
ceylon = { version = "0.1.0", features = ["vector-huggingface-local"] }

# All vector providers
ceylon = { version = "0.1.0", features = ["full-vector"] }

Available Features

  • std (default): Standard features including tokio, LLM support, SQLite memory, and MessagePack serialization
  • vector: Base vector memory functionality
  • vector-openai: OpenAI embeddings for vector memory
  • vector-ollama: Ollama embeddings for vector memory
  • vector-huggingface: HuggingFace API embeddings
  • vector-huggingface-local: Local HuggingFace embeddings using Candle
  • full-vector: All vector providers
  • runner: Interactive CLI runner
  • wasm: WebAssembly support
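
Features can also be combined; for example, a browser-targeted build might start from the minimal install and add the wasm feature (a sketch based on the feature list above, not a documented configuration):

[dependencies]
# Hypothetical minimal WASM build: drop the default std/tokio stack, enable WASM support
ceylon = { version = "0.1.0", default-features = false, features = ["wasm"] }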

Goal-Oriented Programming

Create agents that can break down complex tasks:

use ceylon::agent::Agent;
use ceylon::goal::Goal;
use ceylon::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("ProjectManager", "openai::gpt-4");

    // Create a goal with success criteria
    let mut goal = Goal::new(
        "Launch Product",
        "Successfully launch the new product to market"
    );

    goal.add_criterion("Product is tested and bug-free");
    goal.add_criterion("Marketing materials are ready");
    goal.add_criterion("Launch event is scheduled");

    // Add sub-goals
    goal.add_sub_goal(Goal::new("Development", "Complete development"));
    goal.add_sub_goal(Goal::new("Marketing", "Create marketing campaign"));
    goal.add_sub_goal(Goal::new("Launch", "Execute launch"));

    // Track progress
    println!("Progress: {}%", goal.get_progress());
}

Examples

The repository includes numerous examples:

  • 01_basic_agent: Simple agent creation and usage
  • 02_with_tools: Custom tool implementation
  • 03_with_memory: Working with conversation history
  • 04_advanced_agent: Complex agent configurations
  • 05_with_goals: Goal-oriented task management
  • 08_llm_providers: Using different LLM providers
  • 10_file_saving: Creating file-saving tools
  • 11_persistent_memory: SQLite-backed memory
  • 12_vector_memory: Semantic search with Ollama
  • 13_vector_memory_openai: OpenAI embeddings
  • 14_vector_memory_huggingface: HuggingFace API embeddings
  • 15_vector_memory_huggingface_local: Local embeddings with Candle

Run examples from the repository:

# Clone the repository
git clone https://github.com/ceylonai/next.git
cd next

# Run an example
cargo run --example 01_basic_agent --manifest-path ceylon/Cargo.toml
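
The vector memory examples presumably need their corresponding Cargo features enabled; if so, a command along these lines would run them (the exact per-example feature requirements aren't listed here):

# Hypothetical: enable the Ollama vector backend for the vector memory example
cargo run --example 12_vector_memory --features vector-ollama --manifest-path ceylon/Cargo.toml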

Architecture

Ceylon is organized into several core modules:

  • agent: Core agent implementation and lifecycle management
  • tools: Tool system and built-in tools
  • memory: Memory backends (in-memory, SQLite, vector)
  • llm: LLM provider integrations and abstractions
  • goal: Goal-oriented task management
  • runner: Interactive CLI runner
  • tasks: Task definitions and execution
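
The import paths in the examples above line up with these module names (memory, llm, and runner aren't imported directly in the snippets on this page):

use ceylon::agent::Agent;       // agent: core agent implementation
use ceylon::goal::Goal;         // goal: goal-oriented task management
use ceylon::tasks::TaskRequest; // tasks: task definitions
use ceylon::tools::ToolTrait;   // tools: the custom tool trait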

Contributing

We welcome contributions! Please see our GitHub repository for more information.

License

Licensed under either of:

at your option.

Acknowledgments

Ceylon is built on top of the excellent llm crate for LLM provider integrations.