§Enki Runtime
A Rust-based agent mesh framework for building local and distributed AI agent systems.
§Architecture
Enki Runtime is modular, split into focused sub-crates that are re-exported here:
| Crate | Description |
|---|---|
| `core` | Core abstractions: `Agent`, `Memory`, `Mesh`, `Message` |
| `llm` | LLM integration with 13+ providers |
| `local` | Local mesh implementation |
| `memory` | Memory backend implementations |
| `logging` / `metrics` | Observability |
| `mcp` | Model Context Protocol (feature: `mcp`) |
§Features
- Agent Framework: Build autonomous AI agents with a simple trait-based API
- Local Mesh: Connect multiple agents for inter-agent communication
- LLM Integration: Built-in support for OpenAI, Anthropic, Ollama, Google, and 10+ more
- Memory Backends: Pluggable memory (in-memory, SQLite, Redis)
- TOML Configuration: Load agents from config files
- Async-first: Built on Tokio for high performance
- Observability: Structured logging and metrics
§Quick Start
§Custom Agent
```rust
use enki_runtime::{Agent, AgentContext, LocalMesh, Message};
use enki_runtime::core::error::Result;
use enki_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent { name: String }

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String { self.name.clone() }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(MyAgent { name: "agent".into() })).await?;
    mesh.start().await?;
    Ok(())
}
```
§LLM Agent
```rust
use enki_runtime::{LlmAgent, AgentContext};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are helpful.")
        .with_temperature(0.7)
        .build()?;

    let mut ctx = AgentContext::new("test".into(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("{}", response);
    Ok(())
}
```
§TOML Configuration
```rust
use enki_runtime::{LlmAgent, LlmAgentFromConfig};
use enki_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml").unwrap();
let agent = LlmAgent::from_config(config).unwrap();
```
§Feature Flags
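The contents of `agent.toml` are not reproduced in these docs. As a rough sketch, assuming the fields mirror the builder options from the LLM Agent example above (`name`, `model`, `system_prompt`, and `temperature` are guesses at the schema, not documented keys):

```toml
# Hypothetical agent.toml - field names are assumptions mirroring the
# LlmAgent builder options, not a documented schema.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are helpful."
temperature = 0.7
```

Check the `AgentConfig` struct for the actual schema before relying on this layout.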
- `sqlite` - SQLite memory backend
- `redis` - Redis memory backend
- `mcp` - Model Context Protocol support
- `full` - All optional features
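Optional features are selected in the consuming crate's `Cargo.toml`. A minimal sketch, assuming the package is published as `enki-runtime` (the version is a placeholder, not a real release number):

```toml
[dependencies]
# Placeholder version; substitute the current release from crates.io.
enki-runtime = { version = "*", features = ["sqlite", "redis"] }
```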
Re-exports§
- `pub use config::AgentConfig;`
- `pub use config::MemoryBackendType;`
- `pub use config::MemoryConfig;`
- `pub use config::MeshConfig;`
- `pub use enki_core as core;`
- `pub use enki_llm as llm;`
- `pub use enki_local as local;`
- `pub use enki_memory as memory;`
Modules§
Structs§
- `AgentContext`
- `InMemoryBackend` - In-memory storage backend with TTL support
- `LLMConfig` - Configuration for LLM providers with all builder options.
- `LlmAgent` - An agent that uses an LLM (Large Language Model) to process messages.
- `LlmAgentBuilder` - Builder for constructing an `LlmAgent` with a fluent API.
- `LocalMesh`
- `MemoryEntry` - Entry stored in memory
- `MemoryQuery` - Query filter for memory retrieval
- `Message` - A message for inter-agent communication (v1).
- `UniversalLLMClient`
Enums§
- `AgentEvent` - Events specific to agent execution flow
Traits§
- `Agent` - The core trait that defines an autonomous agent.
- `LlmAgentFromConfig` - Extension trait for creating `LlmAgent` from TOML configuration.
- `Memory` - Base memory trait; all backends must implement this
- `MemoryFromConfig` - Extension trait for creating memory backends from TOML configuration.
- `Mesh` - Trait for agent coordination and message routing.
- `VectorMemory` - Extended trait for vector-based semantic search