# AI Agents Framework

**One YAML = Any Agent.**
A Rust framework for building AI agents from a single YAML specification. No code required for common use cases.
ai-agents.rs - Documentation, guides, and examples
- Declarative behavior - everything in YAML, not code
- Language-agnostic semantics - intent, extraction, validation via LLM (no regex)
- Layered overrides - global → agent → state → skill → turn
- Safety by default - tool policies, HITL approvals, error recovery
- Extensible - custom LLMs, tools, memory, storage, hooks
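
The "layered overrides" bullet can be pictured as follows; the key and state names here are illustrative only, not the actual schema (see the YAML Reference):

```yaml
# Illustrative sketch only: a value set at a narrower layer
# overrides the same value set at a broader one.
llm:
  model: gpt-4.1-nano          # global default
states:
  escalation:
    llm:
      model: gpt-4.1           # state-level override for this state only
    skills:
      refund:
        llm:
          model: gpt-4.1-mini  # skill-level override, narrower still
```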
Status: 1.0.0-rc.10 — Under active development. APIs and YAML schema may change between minor versions.
## Features
- Multi-LLM with fallback - 12 providers (OpenAI, Anthropic, Google, Ollama, DeepSeek, Groq, Mistral, Cohere, xAI, Phind, OpenRouter, any OpenAI-compatible); named aliases (default, router); auto-fallback on failure
- Hierarchical state machine - nested sub-states, LLM-evaluated transitions, guard-based short-circuiting, intent-based routing, entry/exit actions
- Skill system - reusable tool + prompt workflows with LLM-based intent routing
- Built-in tools + MCP - datetime, JSON, HTTP, file, text, template, math, calculator, random, echo; connect any MCP server for hundreds more
- Tool scoping & conditions - 3-level filtering (state → spec → registry), context/state/time/semantic conditions, multi-language aliases, parallel execution
- Input/output process pipeline - normalize, detect, extract, sanitize, validate, transform, format - all LLM-based, works across languages
- CompactingMemory - LLM-based rolling summarization, token budgeting, SQLite/Redis/file persistence
- Dynamic context - runtime, file, HTTP, env, and callback sources with Jinja2 templates in prompts
- Reasoning & reflection - chain-of-thought, ReAct, plan-and-execute, auto mode; LLM self-evaluation with criteria and retry
- Intent disambiguation - LLM-based ambiguity detection, clarification generation, multi-turn resolution
- Safety & control - error recovery with backoff, tool security (rate limits, domain restrictions), human-in-the-loop approvals with multi-language messages
- Dynamic agent spawning - runtime agent creation from YAML/templates, agent registry, inter-agent messaging
- Extensible via traits - `LLMProvider`, `Memory`, `Tool`, `ApprovalHandler`, `Summarizer`, `AgentHooks`, `ToolProvider`
See Concepts for architecture details and Providers for per-provider setup.
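
The CompactingMemory idea (rolling summarization under a token budget) can be sketched as below. This is a toy illustration, not the crate's actual types: tokens are approximated by word count and the summarizer is a stub standing in for an LLM call.

```rust
// Illustrative sketch of rolling summarization under a token budget.
// Not the crate's real API: types and names here are hypothetical.

struct CompactingMemory {
    budget: usize,       // max tokens kept verbatim
    summary: String,     // rolling summary of evicted turns
    turns: Vec<String>,  // recent turns kept verbatim
}

impl CompactingMemory {
    fn new(budget: usize) -> Self {
        Self { budget, summary: String::new(), turns: Vec::new() }
    }

    // Crude token estimate: whitespace-separated words.
    fn tokens(text: &str) -> usize {
        text.split_whitespace().count()
    }

    // Stub summarizer: a real implementation would ask an LLM to fold
    // the evicted turns into the running summary.
    fn summarize(summary: &str, evicted: &[String]) -> String {
        format!("{} [compacted {} turns]", summary, evicted.len())
    }

    fn push(&mut self, turn: &str) {
        self.turns.push(turn.to_string());
        // Evict oldest turns until the verbatim window fits the budget,
        // always keeping at least the newest turn.
        while self.turns.iter().map(|t| Self::tokens(t)).sum::<usize>() > self.budget
            && self.turns.len() > 1
        {
            let evicted = vec![self.turns.remove(0)];
            self.summary = Self::summarize(&self.summary, &evicted);
        }
    }
}

fn main() {
    let mut mem = CompactingMemory::new(8);
    mem.push("user: hello there agent");
    mem.push("agent: hi how can I help you today");
    mem.push("user: summarize our chat so far please");
    println!("turns={} summary={}", mem.turns.len(), mem.summary.trim());
}
```

The real implementation additionally persists state (SQLite/Redis/file) and uses an LLM for the summarization step.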
## Install
```toml
[dependencies]
ai-agents = "1.0.0-rc.10"  # crate name assumed from ai-agents.rs
```
## Quick Start

### From CLI (no Rust code needed)
Create agent.yaml:
```yaml
# agent.yaml
name: MyAgent
system_prompt: "You are a helpful assistant."
llm:
  provider: openai
  model: gpt-4.1-nano

# For any OpenAI-compatible server:
# llm:
#   provider: openai-compatible
#   model: qwen3:8b
#   base_url: http://localhost:11434/v1

# Provider-specific extra params are also allowed.
# Example for OpenAI reasoning-capable models:
# llms:
#   default:
#     provider: openai
#     model: gpt-5.4-mini
#     reasoning_effort: low
# llm:
#   default: default
```
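
System prompts can also be Jinja2 templates filled from dynamic context sources at runtime (see Features). A hypothetical sketch, with the `context` key name assumed purely for illustration:

```yaml
# Hypothetical sketch: key names are illustrative, check the YAML Reference.
system_prompt: "You are a helpful assistant. Today is {{ today }}."
context:
  today:
    source: runtime
```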
Run it:
### From YAML + Rust

The snippet below is an illustrative sketch; see the Rust API docs for the real entry points.

```rust
// Hypothetical API sketch: names are assumptions, not the real signatures.
use ai_agents::Agent;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the YAML spec above and run one turn.
    let agent = Agent::from_yaml_file("agent.yaml").await?;
    let reply = agent.chat("Hello!").await?;
    println!("{reply}");
    Ok(())
}
```
### From Rust API

Again an illustrative sketch; the exact builder and method names are in the Rust API docs.

```rust
// Hypothetical API sketch: names are assumptions, not the real signatures.
use ai_agents::AgentBuilder;
use std::sync::Arc; // Arc is typically how custom tools/memory are shared

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let agent = AgentBuilder::new("MyAgent")
        .system_prompt("You are a helpful assistant.")
        .build()?;
    let reply = agent.chat("Hello!").await?;
    println!("{reply}");
    Ok(())
}
```
See the examples/ directory for more.
## CLI

```sh
# Install from crates.io (crate name assumed from ai-agents.rs):
cargo install ai-agents

# Or run directly from source:
cargo run --release
```
See the CLI Guide for REPL commands, metadata configuration, and full reference.
## Roadmap
See the full roadmap for what's shipped, what's next, and the complete feature catalog.
## Documentation
| Resource | Description |
|---|---|
| Getting Started | Install and run your first agent in under a minute |
| YAML Reference | Complete spec for agent definition files |
| CLI Guide | All commands, flags, and REPL features |
| Rust API | Embedding agents in your Rust application |
| Providers | Setup for all 12 LLM providers |
| Concepts | Architecture, lifecycle, and core ideas |
| Examples | YAML and Rust examples for every feature |
| API Docs | Auto-generated Rust API reference |
## Key Dependencies
| Crate | Role |
|---|---|
| llm | Unified LLM provider interface (OpenAI, Anthropic, Google, Ollama, and more) |
| rmcp | Official Rust SDK for Model Context Protocol (MCP) |
| tokio | Async runtime |
| minijinja | Jinja2-compatible template engine for system prompts and spawner templates |
| sqlx | SQLite storage backend (optional, sqlite feature) |
| redis | Redis storage backend (optional, redis-storage feature) |
## Independence Notice
This repository is an independent open-source project maintained by the author in a personal capacity.
It is not an official product or offering of any employer, and no employer owns or governs this project.
See INDEPENDENCE.md for details.
## License
Licensed under either of
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT)