Appam: AI Agent Framework
A comprehensive framework for building AI agents with minimal configuration. Create powerful agents by writing TOML configs, tool definitions, and tool implementations in Rust or Python.
§Overview
Appam provides:
- Agent system: Define agents with system prompts and tool sets
- Tool framework: Implement tools in Rust or Python with automatic loading
- LLM integration: Streaming OpenRouter client with tool calling
- OpenAI Codex auth: Local OAuth cache for ChatGPT Codex subscription access
- Configuration: Hierarchical TOML-based configuration
- Interfaces: Built-in TUI and web API
- Logging: Structured tracing and session transcripts
§Quick Start
§Option 1: Pure Rust SDK (No TOML Required)
Build agents entirely in Rust with the builder API:
use appam::prelude::*;
use anyhow::Result;
use std::sync::Arc;
#[tokio::main]
async fn main() -> Result<()> {
    let agent = AgentBuilder::new("my-agent")
        .model("openai/gpt-4o-mini")
        .system_prompt("You are a helpful AI assistant.")
        .with_tool(Arc::new(MyTool))
        .build()?;
    agent.run("Hello!").await?;
    Ok(())
}

§Option 2: TOML Configuration
Create an agent configuration file (agent.toml):
[agent]
name = "assistant"
model = "openai/gpt-5"
system_prompt = "prompt.txt"
[[tools]]
name = "bash"
schema = "tools/bash.json"
implementation = { type = "python", script = "tools/bash.py" }

Load and run the agent:
use appam::prelude::*;
use anyhow::Result;
#[tokio::main]
async fn main() -> Result<()> {
    let agent = TomlAgent::from_file("agent.toml")?;
    agent.run("What can you do?").await?;
    Ok(())
}

§Option 3: Hybrid Approach
Load from TOML and extend with Rust tools:
let agent = TomlAgent::from_file("agent.toml")?
    .with_additional_tool(Arc::new(CustomTool));
agent.run("Use custom tool").await?;

§Architecture
§Agents
Agents are defined by:
- A system prompt that establishes behavior and capabilities
- A tool set that provides executable functions
- A model that powers the agent’s reasoning
The Agent trait provides the core interface. Use TomlAgent to load
agents from configuration files.
§Tools
Tools are executable functions exposed to the LLM. Each tool has:
- A JSON schema defining parameters and description
- An implementation in Rust or Python
Tools can be implemented as:
- Rust: Native performance, type safety, full access to Rust ecosystem
- Python: Easy prototyping, access to Python libraries
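The "automatic loading" of Python tools can be pictured as importing a script and calling its execute(args) function. The sketch below is a standalone illustration of that pattern using importlib, not Appam's actual loader; the function name load_python_tool is invented for the example.

```python
import importlib.util
import pathlib
import tempfile

def load_python_tool(script_path):
    """Load a tool script and return its execute(args) callable."""
    spec = importlib.util.spec_from_file_location("tool", script_path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module.execute

# Demonstrate with a throwaway script that follows the execute(args) convention.
script = pathlib.Path(tempfile.mkdtemp()) / "echo.py"
script.write_text(
    'def execute(args):\n'
    '    return {"output": args.get("message", "")}\n'
)
execute = load_python_tool(script)
print(execute({"message": "hello"}))  # {'output': 'hello'}
```

Because a tool is just a module with an execute function, it can also be unit-tested this way without starting the agent runtime.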
§Configuration
Configuration is hierarchical:
- Default values (hardcoded)
- Global config (appam.toml)
- Agent config (per-agent TOML file)
- Environment variables (highest priority)
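The precedence above can be sketched as a layered merge where later layers win. This is an illustrative model only, not Appam's internals; the APPAM_ environment-variable prefix and merge_config helper are assumptions made for the example.

```python
def merge_config(defaults, global_cfg, agent_cfg, env):
    """Merge configuration layers; later layers override earlier ones."""
    merged = dict(defaults)
    for layer in (global_cfg, agent_cfg):
        merged.update({k: v for k, v in layer.items() if v is not None})
    # Environment variables have the highest priority.
    for key in merged:
        env_key = f"APPAM_{key.upper()}"  # hypothetical naming scheme
        if env_key in env:
            merged[key] = env[env_key]
    return merged

defaults = {"model": "openai/gpt-4o-mini", "log_format": "plain"}
global_cfg = {"log_format": "json"}    # from appam.toml
agent_cfg = {"model": "openai/gpt-5"}  # from agent.toml
env = {"APPAM_MODEL": "anthropic/claude-3.5-sonnet"}

print(merge_config(defaults, global_cfg, agent_cfg, env))
# {'model': 'anthropic/claude-3.5-sonnet', 'log_format': 'json'}
```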
§Interfaces
Run agents via:
- TUI: Interactive terminal interface with rich widgets
- CLI: Simple streaming output
- Web API: RESTful API with Server-Sent Events streaming
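For a sense of what a Server-Sent Events stream looks like on the wire, here is a minimal parser that collects the data: payloads of an SSE response, one event per blank line. The event payloads shown are invented for the example; see the web module for the actual event format.

```python
def parse_sse(raw: str):
    """Collect data payloads from an SSE stream; a blank line ends an event."""
    events, buffer = [], []
    for line in raw.splitlines():
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            events.append("\n".join(buffer))
            buffer = []
    if buffer:  # flush a trailing event with no final blank line
        events.append("\n".join(buffer))
    return events

stream = 'data: {"delta": "Hel"}\n\ndata: {"delta": "lo"}\n\n'
print(parse_sse(stream))  # ['{"delta": "Hel"}', '{"delta": "lo"}']
```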
§Examples
§Creating a Python Tool
Define the schema (echo.json):
{
  "type": "function",
  "function": {
    "name": "echo",
    "description": "Echo back the input message",
    "parameters": {
      "type": "object",
      "properties": {
        "message": {
          "type": "string",
          "description": "Message to echo"
        }
      },
      "required": ["message"]
    }
  }
}

Implement the tool (echo.py):
def execute(args):
    """Echo the input message."""
    return {"output": args.get("message", "")}

Register in agent config:
[[tools]]
name = "echo"
schema = "tools/echo.json"
implementation = { type = "python", script = "tools/echo.py" }

§Programmatic Agent Creation
use appam::agent::{Agent, TomlAgent};

async fn create_custom_agent() {
    let agent = TomlAgent::from_file("agent.toml")
        .unwrap()
        .with_model("anthropic/claude-3.5-sonnet");
    // Run with a custom prompt
    agent.run("Analyze this codebase").await.unwrap();
}

Re-exports§
pub use agent::history::SessionHistory;
pub use agent::Agent;
pub use agent::AgentBuilder;
pub use agent::RuntimeAgent;
pub use agent::Session;
pub use agent::TomlAgent;
pub use config::load_config_from_env;
pub use config::load_global_config;
pub use config::AgentConfigBuilder;
pub use config::AppConfig;
pub use config::AppConfigBuilder;
pub use config::HistoryConfig;
pub use config::LogFormat;
pub use config::LoggingConfig;
pub use config::TraceFormat;
pub use llm::DynamicLlmClient;
pub use llm::LlmClient;
pub use llm::LlmProvider;
pub use llm::UnifiedMessage;
pub use llm::UnifiedTool;
pub use llm::UnifiedToolCall;
pub use tools::Tool;
pub use tools::ToolRegistry;
Modules§
- agent
- Agent system and runtime.
- config
- Configuration system with hierarchical loading.
- http
- Shared HTTP utilities.
- llm
- LLM client types and abstractions for multiple providers.
- logging
- Logging and session transcript management.
- prelude
- Prelude module for convenient imports.
- tools
- Tool system for agent capabilities.
- web
- Experimental web API server with SSE streaming and session management.
Attribute Macros§
- tool
- Attribute macro for defining AI agent tools.
Derive Macros§
- Schema
- Derive macro for generating JSON schemas with simplified description attributes.