Helios Engine - LLM Agent Framework

Helios Engine is a powerful and flexible Rust framework for building LLM-powered agents with tool support, streaming chat capabilities, and easy configuration management. Create intelligent agents that can interact with users, call tools, and maintain conversation context - with both online and offline local model support.

Key Features

  • 🆕 ReAct Mode: Enable agents to reason and plan before taking actions with a simple .react() call - includes custom reasoning prompts for domain-specific tasks
  • 🆕 Forest of Agents: Multi-agent collaboration system where agents can communicate, delegate tasks, and share context
  • Agent System: Create multiple agents with different personalities and capabilities
  • 🆕 Tool Builder: Simplified tool creation with builder pattern - wrap any function as a tool without manual trait implementation
  • Tool Registry: Extensible tool system for adding custom functionality
  • Extensive Tool Suite: 16+ built-in tools including web scraping, JSON parsing, timestamp operations, file I/O, shell commands, HTTP requests, system info, and text processing
  • 🆕 RAG System: Retrieval-Augmented Generation with vector stores (InMemory and Qdrant)
  • 🆕 Custom Endpoints: Ultra-simple API for adding custom HTTP endpoints to your agent server - ~70% less code than before!
  • Streaming Support: True real-time response streaming for both remote and local models with immediate token delivery
  • Local Model Support: Run local models offline using llama.cpp with HuggingFace integration (optional local feature)
  • HTTP Server & API: Expose OpenAI-compatible API endpoints with full parameter support
  • Dual Mode Support: Auto, online (remote API), and offline (local) modes
  • CLI & Library: Use as both a command-line tool and a Rust library crate
  • 🆕 Feature Flags: Optional local feature for offline model support - build only what you need!
  • 🆕 Improved Syntax: Cleaner, more ergonomic API for adding multiple tools and agents - use .tools(vec![...]) and .agents(vec![...]) for bulk operations (see the sketch below)
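
A minimal sketch of the bulk syntax, assuming the same Agent::builder API used in the examples further down; CalculatorTool is the only tool shown here, and any other boxed tools can go into the same vec:

use helios_engine::{Agent, CalculatorTool, Config};

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    let config = Config::from_file("config.toml")?;

    // Register several tools in one .tools(vec![...]) call
    // instead of chaining one .tool(...) call per tool.
    let mut agent = Agent::builder("BulkBot")
        .config(config)
        .system_prompt("You are a helpful assistant.")
        .tools(vec![Box::new(CalculatorTool)]) // add more boxed tools here
        .build()
        .await?;

    let response = agent.chat("What is 7 * 6?").await?;
    println!("{}", response);
    Ok(())
}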

Documentation

Local Documentation

Full Documentation Index - Complete navigation and updated structure

Quick Start

Version 0.5.5

Install CLI Tool

# Install without local model support (lighter, faster install)
cargo install helios-engine

# Install with local model support (enables offline mode with llama-cpp-2)
cargo install helios-engine --features local

Basic Usage

# Initialize configuration
helios-engine init

# Start interactive chat
helios-engine chat

# Ask a quick question
helios-engine ask "What is Rust?"

As a Library Crate

Add to your Cargo.toml:

[dependencies]
helios-engine = "0.5.0"
tokio = { version = "1.35", features = ["full"] }

Simplest Agent (3 lines!)

use helios_engine::Agent;

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    let mut agent = Agent::quick("Bot").await?;
    let response = agent.ask("What is 2+2?").await?;
    println!("{}", response);
    Ok(())
}

With Tools & Custom Config

use helios_engine::{Agent, CalculatorTool, Config};

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    let config = Config::builder()
        .m("gpt-4")
        .key("your-api-key")
        .temp(0.7)
        .build();
    
    let mut agent = Agent::builder("MathBot")
        .config(config)
        .prompt("You are a helpful math assistant")
        .with_tool(Box::new(CalculatorTool))
        .build()
        .await?;
    
    let response = agent.ask("What is 15 * 8?").await?;
    println!("{}", response);
    Ok(())
}

For local model support:

[dependencies]
helios-engine = { version = "0.5.0", features = ["local"] }
tokio = { version = "1.35", features = ["full"] }

Simple Example

use helios_engine::{Agent, Config, CalculatorTool};

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    let config = Config::from_file("config.toml")?;

    let mut agent = Agent::builder("MyAssistant")
        .config(config)
        .system_prompt("You are a helpful AI assistant.")
        .tool(Box::new(CalculatorTool))
        .react()  // Enable ReAct mode for reasoning before acting!
        .build()
        .await?;

    let response = agent.chat("What is 15 * 8?").await?;
    println!("{}", response);

    Ok(())
}

See Getting Started Guide or visit the Official Book for detailed examples and comprehensive tutorials!

Custom Endpoints Made Simple

Create custom HTTP endpoints with minimal code:

use helios_engine::{ServerBuilder, get, EndpointBuilder, EndpointResponse};

let endpoints = vec![
    // Simple static endpoint
    get("/api/version", serde_json::json!({"version": "1.0"})),
    
    // Dynamic endpoint with request handling
    EndpointBuilder::post("/api/echo")
        .handle(|req| {
            let msg = req.and_then(|r| r.body).unwrap_or_default();
            EndpointResponse::ok(serde_json::json!({"echo": msg}))
        })
        .build(),
];

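// `agent` is an Agent instance built as in the examples above.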
ServerBuilder::with_agent(agent, "model")
    .endpoints(endpoints)
    .serve()
    .await?;

70% less code than the old API! See Custom Endpoints Guide for details.

Use Cases

  • Chatbots & Virtual Assistants: Build conversational AI with tool access and memory
  • Multi-Agent Systems: Coordinate multiple specialized agents for complex workflows
  • Data Analysis: Agents that can read files, process data, and generate reports
  • Web Automation: Scrape websites, make API calls, and process responses
  • Knowledge Management: Build RAG systems for semantic search and Q&A
  • API Services: Expose your agents via OpenAI-compatible HTTP endpoints
  • Local AI: Run models completely offline for privacy and security

Built-in Tools (16+)

Helios Engine includes a comprehensive suite of production-ready tools:

  • File Management: Read, write, edit, and search files
  • Web & API: Web scraping, HTTP requests
  • System Utilities: Shell commands, system information
  • Data Processing: JSON parsing, text manipulation, timestamps
  • Communication: Agent-to-agent messaging
  • Knowledge: RAG tool for semantic search and retrieval

Learn more in the Tools Guide.
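
The Tool Builder mentioned under Key Features lets you wrap an ordinary function as a tool without hand-writing the tool trait. The fragment below is only a sketch: ToolBuilder and every method on it are assumed names for illustration, not the confirmed API - see the Tools Guide for the real interface.

use helios_engine::ToolBuilder; // assumed import path - check the Tools Guide

// Hypothetical builder-pattern tool; names and signatures are illustrative only.
let word_count = ToolBuilder::new("word_count")
    .description("Counts the words in a piece of text")
    .handler(|input: String| input.split_whitespace().count().to_string())
    .build();

// The resulting tool is then registered like any built-in,
// e.g. via .tool(Box::new(...)) or .tools(vec![...]) on Agent::builder.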

Project Structure

helios-engine/
├── src/                    # Source code
├── examples/               # Example applications
├── docs/                   # Documentation
├── book/                   # mdBook source (deployed to Vercel)
├── tests/                  # Integration tests
├── Cargo.toml              # Project configuration
└── README.md               # This file

Contributing

We welcome contributions! See our Contributing Guide for details on:

  • Development setup
  • Code standards
  • Documentation guidelines
  • Testing procedures

License

This project is licensed under the MIT License - see the LICENSE file for details.