Crate helios_engine

§Helios Engine

Helios is a powerful and flexible Rust framework for building LLM-powered agents with tool support, chat capabilities, and easy configuration management.

§Quick Start

§Using as a Library (Direct LLM Calls)

§Example

use helios_engine::{LLMClient, ChatMessage};
use helios_engine::config::LLMConfig;

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    // Configure a remote, OpenAI-compatible endpoint.
    let llm_config = LLMConfig {
        model_name: "gpt-3.5-turbo".to_string(),
        base_url: "https://api.openai.com/v1".to_string(),
        api_key: std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set"),
        temperature: 0.7,
        max_tokens: 2048,
    };

    // Create the client for the remote provider and compose a short conversation.
    let client = LLMClient::new(helios_engine::llm::LLMProviderType::Remote(llm_config)).await?;
    let messages = vec![
        ChatMessage::system("You are a helpful assistant."),
        ChatMessage::user("What is the capital of France?"),
    ];

    // The remaining optional parameters are left as None for this simple call.
    let response = client.chat(messages, None, None, None, None).await?;
    println!("Response: {}", response.content);
    Ok(())
}

§Using the Agent System

The agent system builds on the LLM client, adding tools and a persistent conversation. The example below loads settings from config.toml, registers the built-in CalculatorTool, and asks the agent a question it can answer with that tool.

use helios_engine::{Agent, Config, CalculatorTool};

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    // Load LLM provider and agent settings from a TOML configuration file.
    let config = Config::from_file("config.toml")?;

    // Build an agent with a system prompt and the built-in calculator tool.
    let mut agent = Agent::builder("MyAgent")
        .config(config)
        .system_prompt("You are a helpful assistant.")
        .tool(Box::new(CalculatorTool))
        .build()
        .await?;

    let response = agent.chat("What is 15 * 7?").await?;
    println!("Agent: {}", response);
    Ok(())
}
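
The agent example loads its settings from config.toml via Config::from_file. The real schema is defined by the config module and is not reproduced on this page; the sketch below is only an illustrative guess that mirrors the LLMConfig fields from the direct-call example under an assumed [llm] table, so section and field names may differ from the actual format.

# config.toml -- hypothetical layout; see the config module for the real schema.
[llm]
model_name  = "gpt-3.5-turbo"
base_url    = "https://api.openai.com/v1"
api_key     = "sk-your-key-here"
temperature = 0.7
max_tokens  = 2048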

§Features

  • Direct LLM Access: Use LLMClient for simple, direct calls to LLM models.
  • Agent System: Create intelligent agents with tools and persistent conversation.
  • Tool Support: Extensible tool system for adding custom functionality (see the sketch after this list).
  • Multi-Provider: Works with OpenAI, Azure OpenAI, local models, and any OpenAI-compatible API.
  • Type-Safe: Leverages Rust’s type system for reliability.
  • Async: Built on Tokio for high-performance async operations.
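
The re-exported Tool trait is the extension point for custom functionality, with ToolParameter describing a tool's arguments and ToolResult carrying its output. The sketch below shows the general shape such an implementation might take; the method names, signatures, the use of async_trait and serde_json, and the ToolResult constructor are all assumptions rather than the crate's documented trait, so consult the tools module for the real definition.

use helios_engine::{Tool, ToolResult};

// Hypothetical tool: every method name and signature below is an assumption,
// not the documented Tool trait.
struct WordCountTool;

#[async_trait::async_trait]
impl Tool for WordCountTool {
    fn name(&self) -> String {
        "word_count".to_string()
    }

    fn description(&self) -> String {
        "Counts the words in a piece of text.".to_string()
    }

    async fn execute(&self, args: serde_json::Value) -> helios_engine::Result<ToolResult> {
        // Assumed argument shape: { "text": "..." }.
        let text = args["text"].as_str().unwrap_or_default();
        let count = text.split_whitespace().count();
        // ToolResult construction is also assumed; adapt it to the real type.
        Ok(ToolResult::success(count.to_string()))
    }
}

Once implemented, a custom tool is registered the same way as the built-ins, e.g. Agent::builder("MyAgent").tool(Box::new(WordCountTool)).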

Re-exports§

pub use agent::Agent;
pub use agent::AgentBuilder;
pub use chat::ChatMessage;
pub use chat::ChatSession;
pub use chat::Role;
pub use config::Config;
pub use config::LLMConfig;
pub use error::HeliosError;
pub use error::Result;
pub use llm::Delta;
pub use llm::LLMClient;
pub use llm::LLMProvider;
pub use llm::LLMRequest;
pub use llm::LLMResponse;
pub use llm::StreamChoice;
pub use llm::StreamChunk;
pub use tools::CalculatorTool;
pub use tools::EchoTool;
pub use tools::FileEditTool;
pub use tools::FileReadTool;
pub use tools::FileSearchTool;
pub use tools::FileWriteTool;
pub use tools::MemoryDBTool;
pub use tools::QdrantRAGTool;
pub use tools::Tool;
pub use tools::ToolParameter;
pub use tools::ToolRegistry;
pub use tools::ToolResult;
pub use serve::load_custom_endpoints_config;
pub use serve::start_server;
pub use serve::start_server_with_agent;
pub use serve::start_server_with_agent_and_custom_endpoints;
pub use serve::start_server_with_custom_endpoints;
pub use serve::CustomEndpoint;
pub use serve::CustomEndpointsConfig;
pub use serve::ServerState;

Modules§

agent
Defines the Agent struct and its associated builder, which are central to the Helios Engine.
chat
Provides chat-related functionality, including ChatMessage, ChatSession, and Role.
config
Handles configuration for the engine, including LLM providers and agent settings.
error
Defines the custom HeliosError and Result types for error handling.
llm
Manages interactions with Large Language Models (LLMs), including different providers.
serve
Provides HTTP server functionality for exposing OpenAI-compatible API endpoints (an illustrative sketch follows this list).
tools
Contains the tool system, including the Tool trait and various tool implementations.
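
The serve module exposes an agent over an OpenAI-compatible HTTP API via the re-exported start_server* functions. The call below is illustrative only: start_server_with_agent is a real re-export, but the argument list shown here (an agent plus a bind address) is an assumption, so check the serve module documentation for the actual signature and options.

use helios_engine::{Agent, Config};

#[tokio::main]
async fn main() -> helios_engine::Result<()> {
    let config = Config::from_file("config.toml")?;
    let agent = Agent::builder("ServerAgent")
        .config(config)
        .system_prompt("You are a helpful assistant.")
        .build()
        .await?;

    // Hypothetical invocation: the real parameters of start_server_with_agent
    // (bind address, port, additional options) may differ from what is shown here.
    helios_engine::start_server_with_agent(agent, "127.0.0.1:8080").await?;
    Ok(())
}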