# Enki Runtime

A Rust-based agent mesh framework for building local and distributed AI agent systems.

[![Crates.io](https://img.shields.io/crates/v/enki-runtime.svg)](https://crates.io/crates/enki-runtime)
[![Documentation](https://docs.rs/enki-runtime/badge.svg)](https://docs.rs/enki-runtime)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

## Features

- **Agent Framework**: Build autonomous AI agents with a simple trait-based API
- **Local Mesh**: Connect multiple agents in a local mesh for inter-agent communication
- **LLM Integration**: Built-in support for 13+ LLM providers (OpenAI, Anthropic, Ollama, Google, etc.)
- **Memory Backends**: Pluggable memory systems with in-memory, SQLite, and Redis support
- **MCP Support**: Model Context Protocol for tool integration
- **Async-first**: Built on Tokio for high-performance async operations
- **Observability**: Structured logging and metrics collection built-in

## Architecture

Enki Runtime is modular, split into focused sub-crates:

| Crate                  | Description                                     |
| ---------------------- | ----------------------------------------------- |
| `enki-core`          | Core abstractions: Agent, Memory, Mesh, Message |
| `enki-llm`           | LLM integration with multi-provider support     |
| `enki-local`         | Local mesh implementation                       |
| `enki-memory`        | Memory backend implementations                  |
| `enki-observability` | Logging and metrics                             |
| `enki-mcp`           | Model Context Protocol (optional)               |

The `enki-runtime` umbrella crate re-exports all components for convenience.

## Installation

```toml
[dependencies]
enki-runtime = "0.1"
```

### Feature Flags

- `sqlite` - Enable SQLite memory backend
- `redis` - Enable Redis memory backend
- `mcp` - Enable Model Context Protocol support
- `full` - Enable all optional features

```toml
[dependencies]
enki-runtime = { version = "0.1", features = ["sqlite", "redis", "mcp"] }
```
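
Since `full` enables all of the optional features listed above, the same configuration can be written more compactly as:

```toml
[dependencies]
enki-runtime = { version = "0.1", features = ["full"] }
```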

## Quick Start

```rust
use enki_runtime::{Agent, AgentContext, LocalMesh, Message};
use enki_runtime::core::error::Result;
use enki_runtime::core::mesh::Mesh;
use async_trait::async_trait;

struct MyAgent {
    name: String,
}

#[async_trait]
impl Agent for MyAgent {
    fn name(&self) -> String {
        self.name.clone()
    }

    async fn on_message(&mut self, msg: Message, _ctx: &mut AgentContext) -> Result<()> {
        println!("Received: {:?}", msg.topic);
        Ok(())
    }
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let agent = MyAgent { name: "my-agent".to_string() };
    let mesh = LocalMesh::new("my-mesh");
    mesh.add_agent(Box::new(agent)).await?;
    mesh.start().await?;
    Ok(())
}
```

## Using LLM Agents

```rust
use enki_runtime::LlmAgent;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create an LLM agent with Ollama (no API key needed)
    let mut agent = LlmAgent::builder("assistant", "ollama::gemma3:latest")
        .with_system_prompt("You are a helpful assistant.")
        .with_temperature(0.7)
        .build()?;

    // Use directly (without mesh)
    let mut ctx = enki_runtime::AgentContext::new("test".to_string(), None);
    let response = agent.send_message_and_get_response("Hello!", &mut ctx).await?;
    println!("Response: {}", response);

    Ok(())
}
```

## TOML Configuration

Load agents from TOML files:

```rust
use enki_runtime::{LlmAgent, LlmAgentFromConfig};
use enki_runtime::config::AgentConfig;

fn main() -> anyhow::Result<()> {
    // Parse the TOML file, then build the agent from the resulting config
    let config = AgentConfig::from_file("agent.toml")?;
    let _agent = LlmAgent::from_config(config)?;
    Ok(())
}
```
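
As a starting point, a minimal `agent.toml` might look like the following. The exact schema is defined by `AgentConfig`, so the keys shown here (`name`, `model`, `system_prompt`, `temperature`) are illustrative assumptions based on the builder API above, not the definitive format:

```toml
# Hypothetical agent.toml -- key names are assumptions; consult the AgentConfig docs
name = "assistant"
model = "ollama::gemma3:latest"
system_prompt = "You are a helpful assistant."
temperature = 0.7
```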

## Core Components

| Component         | Description                                    |
| ----------------- | ---------------------------------------------- |
| `Agent`           | Trait for defining agent behavior              |
| `LocalMesh`       | Local multi-agent coordination                 |
| `LlmAgent`        | Pre-built agent with LLM capabilities          |
| `Memory`          | Trait for memory backends                      |
| `InMemoryBackend` | In-memory storage (default)                    |
| `SqliteBackend`   | SQLite persistent storage (feature: `sqlite`)  |
| `RedisBackend`    | Redis distributed storage (feature: `redis`)   |
| `McpClient`       | MCP client for external tools (feature: `mcp`) |

## Examples

Run examples with:

```bash
cargo run --example llm_ollama
cargo run --example toml_agents
cargo run --example mesh_architecture
cargo run --example mcp_client --features mcp
```

## Documentation

- [API Documentation](https://docs.rs/enki-runtime)
- [GitHub Repository](https://github.com/Enkiai/Enki)

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.