Crate enki_runtime


§Enki Runtime

A Rust-based agent mesh framework for building local and distributed AI agent systems.

§Architecture

Enki Runtime is modular, split into focused sub-crates that are re-exported here:

Crate               Description
core                Core abstractions: Agent, Memory, Mesh, Message
llm                 LLM integration with 13+ providers
local               Local mesh implementation
memory              Memory backend implementations
logging, metrics    Observability
mcp                 Model Context Protocol (feature: mcp)

§Features

  • Agent Framework: Build autonomous AI agents with a simple trait-based API
  • Local Mesh: Connect multiple agents for inter-agent communication
  • LLM Integration: Built-in support for OpenAI, Anthropic, Ollama, Google, and 10+ more
  • Memory Backends: Pluggable memory (in-memory, SQLite, Redis)
  • TOML Configuration: Load agents from config files
  • Async-first: Built on Tokio for high performance
  • Observability: Structured logging and metrics

§Quick Start

§Custom Agent

use enki_runtime::{Agent, AgentContext, LocalMesh, Message};
use enki_runtime::core::error::Result;
use enki_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent { name: String }

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String { self.name.clone() }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(MyAgent { name: "agent".into() })).await?;
    mesh.start().await?;
    Ok(())
}

§LLM Agent

use enki_runtime::{LlmAgent, AgentContext};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are helpful.")
        .with_temperature(0.7)
        .build()?;

    let mut ctx = AgentContext::new("test".into(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("{}", response);
    Ok(())
}

§TOML Configuration

use enki_runtime::{LlmAgent, LlmAgentFromConfig};
use enki_runtime::config::AgentConfig;

let config = AgentConfig::from_file("agent.toml").unwrap();
let agent = LlmAgent::from_config(config).unwrap();
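For orientation, a minimal `agent.toml` might look like the following. The field names here are assumptions inferred from the builder options shown above (`with_system_prompt`, `with_temperature`), not the crate's actual `AgentConfig` schema — consult the config module docs for the authoritative layout:

```toml
# Hypothetical agent configuration -- field names are illustrative
# assumptions, not the verified AgentConfig schema.
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are helpful."
temperature = 0.7
```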

§Feature Flags

  • sqlite - SQLite memory backend
  • redis - Redis memory backend
  • mcp - Model Context Protocol support
  • full - All optional features
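Optional features are enabled from a consuming project's `Cargo.toml`. The dependency name and version below are placeholders; check the crate's actual package name and latest release before copying:

```toml
# Illustrative dependency entry; the version is a placeholder.
[dependencies]
enki_runtime = { version = "0.1", features = ["sqlite", "redis"] }

# Or pull in every optional feature at once:
# enki_runtime = { version = "0.1", features = ["full"] }
```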

Re-exports§

pub use config::AgentConfig;
pub use config::MemoryBackendType;
pub use config::MemoryConfig;
pub use config::MeshConfig;
pub use enki_core as core;
pub use enki_llm as llm;
pub use enki_local as local;
pub use enki_memory as memory;

Modules§

config
Configuration module for Enki Runtime.
logging
Structured logging support.
metrics
Metrics support.

Structs§

AgentContext
InMemoryBackend
In-memory storage backend with TTL support
LLMConfig
Configuration for LLM providers with all builder options.
LlmAgent
An agent that uses an LLM (Large Language Model) to process messages.
LlmAgentBuilder
Builder for constructing an LlmAgent with fluent API.
LocalMesh
Local mesh implementation for inter-agent communication.
MemoryEntry
Entry stored in memory
MemoryQuery
Query filter for memory retrieval
Message
A message for inter-agent communication (v1).
UniversalLLMClient

Enums§

AgentEvent
Events specific to agent execution flow

Traits§

Agent
The core trait that defines an autonomous agent.
LlmAgentFromConfig
Extension trait for creating LlmAgent from TOML configuration.
Memory
Base memory trait - all backends must implement this
MemoryFromConfig
Extension trait for creating memory backends from TOML configuration.
Mesh
Trait for agent coordination and message routing.
VectorMemory
Extended trait for vector-based semantic search

Functions§

metrics