# 🔥 Helios Engine - LLM Agent Framework
Helios Engine is a powerful and flexible Rust framework for building LLM-powered agents with tool support, streaming chat capabilities, and easy configuration management. Create intelligent agents that can interact with users, call tools, and maintain conversation context - with both online and offline local model support.
## 🚀 Key Features
- 🆕 Forest of Agents: Multi-agent collaboration system where agents can communicate, delegate tasks, and share context
- Agent System: Create multiple agents with different personalities and capabilities
- Tool Registry: Extensible tool system for adding custom functionality
- Extensive Tool Suite: 16+ built-in tools including web scraping, JSON parsing, timestamp operations, file I/O, shell commands, HTTP requests, system info, and text processing
- 🆕 RAG System: Retrieval-Augmented Generation with vector stores (InMemory and Qdrant)
- Streaming Support: True real-time response streaming for both remote and local models with immediate token delivery
- Local Model Support: Run local models offline using llama.cpp with HuggingFace integration (optional `local` feature)
- HTTP Server & API: Expose OpenAI-compatible API endpoints with full parameter support
- Dual Mode Support: Auto, online (remote API), and offline (local) modes
- CLI & Library: Use as both a command-line tool and a Rust library crate
- 🆕 Feature Flags: Optional `local` feature for offline model support - build only what you need!
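The RAG feature above pairs stored embeddings with similarity search. As a minimal conceptual sketch (not Helios Engine's actual API; the `InMemoryStore` type and `cosine` helper below are illustrative assumptions), an in-memory vector store with cosine-similarity retrieval can look like:

```rust
// Conceptual sketch of an in-memory vector store, as used by RAG systems.
// This is NOT the Helios Engine API; it only illustrates the retrieval idea.

struct InMemoryStore {
    entries: Vec<(String, Vec<f32>)>, // (document text, embedding)
}

impl InMemoryStore {
    fn new() -> Self {
        InMemoryStore { entries: Vec::new() }
    }

    fn add(&mut self, text: &str, embedding: Vec<f32>) {
        self.entries.push((text.to_string(), embedding));
    }

    /// Return the stored text whose embedding is closest to `query`
    /// by cosine similarity.
    fn nearest(&self, query: &[f32]) -> Option<&str> {
        self.entries
            .iter()
            .max_by(|a, b| {
                cosine(&a.1, query)
                    .partial_cmp(&cosine(&b.1, query))
                    .unwrap()
            })
            .map(|(text, _)| text.as_str())
    }
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

fn main() {
    let mut store = InMemoryStore::new();
    store.add("Rust is a systems language", vec![1.0, 0.0, 0.0]);
    store.add("LLMs generate text", vec![0.0, 1.0, 0.0]);
    // The query embedding is closest to the first document.
    println!("{}", store.nearest(&[0.9, 0.1, 0.0]).unwrap());
}
```

A Qdrant-backed store would expose the same add/search shape, with persistence and approximate-nearest-neighbor indexing handled by the external service.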
## 📚 Documentation
| Guide | Description |
|---|---|
| 📖 Getting Started | 5-minute setup guide to get Helios running |
| 🛠️ Installation | Complete installation instructions and feature flags |
| 💻 CLI Usage | Command-line interface and common usage patterns |
| ⚙️ Configuration | Configuration options and local inference setup |
| 🔧 Tools | Built-in tools and creating custom tools |
| 🆕 Advanced Features | RAG, Forest of Agents, and advanced capabilities |
| 📋 API Reference | Complete API documentation |
| 🏗️ Architecture | System architecture and design principles |
## 🏃‍♂️ Quick Start
### Install CLI Tool
```sh
# Install without local model support (lighter, faster install)
cargo install helios-engine

# Install with local model support (enables offline mode with llama-cpp-2)
cargo install helios-engine --features local
```
### Basic Usage
```sh
# Initialize configuration

# Start interactive chat

# Ask a quick question
```
### As a Library Crate
Add this to your `Cargo.toml`:

```toml
[dependencies]
helios-engine = "0.3.4"
tokio = { version = "1.35", features = ["full"] }
```
See the 📖 Quick Start Guide for detailed examples!
## 📁 Project Structure
```text
helios-engine/
├── src/          # Source code
├── examples/     # Example applications
├── docs/         # Documentation
├── tests/        # Integration tests
├── Cargo.toml    # Project configuration
└── README.md     # This file
```
## 🤝 Contributing

We welcome contributions! See the Contributing Guide for details on:
- Development setup
- Code standards
- Documentation guidelines
- Testing procedures
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
Made with ❤️ in Rust