You can import everything directly from the crate:

```rust
use reagent_rs::{Agent, AgentBuilder, Flow, Tool, Message};
```

Or pull in just the essentials through the prelude:

```rust
use reagent_rs::prelude::*;
```
Create an Agent using AgentBuilder:
```rust
use std::error::Error;

use reagent_rs::{init_default_tracing, AgentBuilder};
use schemars::JsonSchema;
use serde::Deserialize;

#[derive(Debug, Deserialize, JsonSchema)]
struct MyWeatherOutput {
    windy: bool,
    temperature: i32,
    description: String,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    init_default_tracing();

    let mut agent = AgentBuilder::default()
        .set_model("qwen3:0.6b")
        .set_system_prompt("You make up weather info in JSON")
        .set_response_format_from::<MyWeatherOutput>()
        .set_temperature(0.6)
        .set_top_k(20)
        .set_stream(true)
        .build()
        .await?;

    let resp: MyWeatherOutput = agent
        .invoke_flow_structured_output("What is the current weather in Koper?")
        .await?;
    println!("{resp:?}");

    Ok(())
}
```
Reagent talks to Ollama by default. It also supports OpenRouter.
To use OpenRouter, set the provider to Provider::OpenRouter
and supply your API key.
```rust
use reagent_rs::{AgentBuilder, Provider};

async {
    let agent = AgentBuilder::default()
        .set_provider(Provider::OpenRouter)
        .set_api_key("your_openrouter_key")
        .set_model("meta-llama/llama-3.1-8b-instruct:free")
        .build()
        .await;
};
```
Re-exports§
pub use crate::agent::*;
pub use crate::flows::*;
pub use crate::notifications::*;
pub use crate::prebuilds::*;
pub use crate::templates::*;
pub use crate::tools::*;
Enums§
- McpIntegrationError - Errors that can occur while integrating with an MCP server.
- McpServerType - Defines the type of MCP server the agent can connect to.
- Provider
- Role
Traits§
- JsonSchema - A type which can be described as a JSON Schema document.