Crate ceylon_runtime


§Ceylon Runtime

A Rust-based agent mesh framework for building local and distributed AI agent systems.

§Architecture

Ceylon Runtime is modular, split into focused sub-crates that are re-exported here:

Crate                Description
core                 Core abstractions: Agent, Memory, Mesh, Message
llm                  LLM integration with 13+ providers
local                Local mesh implementation
memory               Memory backend implementations
logging / metrics    Observability
mcp                  Model Context Protocol (feature: mcp)

§Features

  • Agent Framework: Build autonomous AI agents with a simple trait-based API
  • Local Mesh: Connect multiple agents for inter-agent communication
  • LLM Integration: Built-in support for OpenAI, Anthropic, Ollama, Google, and 10+ more
  • Memory Backends: Pluggable memory (in-memory, SQLite, Redis)
  • TOML Configuration: Load agents from config files
  • Async-first: Built on Tokio for high performance
  • Observability: Structured logging and metrics

§Quick Start

§Custom Agent

use ceylon_runtime::{Agent, AgentContext, LocalMesh, Message};
use ceylon_runtime::core::error::Result;
use ceylon_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent { name: String }

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String { self.name.clone() }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(MyAgent { name: "agent".into() })).await?;
    mesh.start().await?;
    Ok(())
}

§LLM Agent

use ceylon_runtime::{LlmAgent, AgentContext};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are helpful.")
        .with_temperature(0.7)
        .build()?;

    let mut ctx = AgentContext::new("test".into(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("{}", response);
    Ok(())
}

§TOML Configuration

use ceylon_runtime::{LlmAgent, LlmAgentFromConfig};
use ceylon_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml").unwrap();
let agent = LlmAgent::from_config(config).unwrap();
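A config file for the example above might look like the following sketch. The field names here are assumptions based on the `LlmAgent` builder options shown in the Quick Start (model string, system prompt, temperature), not a verified schema; consult the `config::AgentConfig` docs for the actual keys.

```toml
# Hypothetical agent.toml -- field names are illustrative,
# mirroring LlmAgent::builder options rather than a confirmed schema.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are helpful."
temperature = 0.7
```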

§Feature Flags

  • sqlite - SQLite memory backend
  • redis - Redis memory backend
  • mcp - Model Context Protocol support
  • full - All optional features
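To opt in to these backends, enable the corresponding features on the dependency in Cargo.toml. The snippet below is a minimal sketch; the version is a placeholder, and the exact crates.io package name should be confirmed against the crate's registry entry.

```toml
# Example Cargo.toml entry -- version is a placeholder.
[dependencies]
ceylon_runtime = { version = "*", features = ["sqlite", "redis"] }

# Or enable everything at once:
# ceylon_runtime = { version = "*", features = ["full"] }
```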

Re-exports§

pub use config::AgentConfig;
pub use config::MeshConfig;
pub use ceylon_core as core;
pub use ceylon_llm as llm;
pub use ceylon_local as local;
pub use ceylon_memory as memory;

Modules§

config
Configuration module for Ceylon Runtime.
logging
metrics

Structs§

AgentContext
Runtime context provided to agents during lifecycle and message handling.
InMemoryBackend
In-memory storage backend with TTL support
LLMConfig
Configuration for LLM providers with all builder options.
LlmAgent
An agent that uses an LLM (Large Language Model) to process messages.
LlmAgentBuilder
Builder for constructing an LlmAgent with fluent API.
LocalMesh
MemoryEntry
Entry stored in memory
MemoryQuery
Query filter for memory retrieval
Message
A message for inter-agent communication.
UniversalLLMClient

Traits§

Agent
The core trait that defines an autonomous agent.
LlmAgentFromConfig
Extension trait for creating LlmAgent from TOML configuration.
Memory
Base memory trait - all backends must implement this
Mesh
Trait for agent coordination and message routing.
VectorMemory
Extended trait for vector-based semantic search

Functions§

metrics