pub struct AgentConfig {
pub name: String,
pub model: Option<String>,
pub instruction: Option<String>,
pub description: Option<String>,
pub tools: Vec<ToolConfig>,
pub sub_agents: Vec<AgentConfig>,
pub temperature: Option<f32>,
pub max_output_tokens: Option<u32>,
pub thinking_budget: Option<u32>,
pub output_key: Option<String>,
pub output_schema: Option<Value>,
pub max_llm_calls: Option<u32>,
pub agent_type: String,
pub max_iterations: Option<u32>,
pub metadata: HashMap<String, Value>,
pub voice: Option<String>,
pub greeting: Option<String>,
pub transcription: Option<bool>,
pub a2a: Option<bool>,
pub env: HashMap<String, String>,
}
Declarative agent configuration — loadable from YAML or TOML.
Example YAML
name: weather_agent
model: gemini-2.0-flash
instruction: "You are a helpful weather assistant."
description: "Provides weather information for cities."
tools:
  - name: get_weather
    description: "Get weather for a city"
  - builtin: google_search
sub_agents:
  - name: forecast_agent
    model: gemini-2.0-flash
    instruction: "Provide 5-day forecasts."
    output_key: weather_result

Fields
name: String
    Agent name (required).

model: Option<String>
    Model identifier (e.g., "gemini-2.0-flash", "gemini-2.5-pro").

instruction: Option<String>
    System instruction for the agent.

description: Option<String>
    Human-readable description of what this agent does.

tools: Vec<ToolConfig>
    Tool declarations.

sub_agents: Vec<AgentConfig>
    Sub-agent configurations (for multi-agent hierarchies).

temperature: Option<f32>
    Temperature for generation (0.0 - 2.0).

max_output_tokens: Option<u32>
    Maximum output tokens.

thinking_budget: Option<u32>
    Thinking budget (Google AI only).

output_key: Option<String>
    State key to auto-save the agent's final response into.

output_schema: Option<Value>
    JSON Schema for structured output.

max_llm_calls: Option<u32>
    Maximum number of LLM calls per invocation (safety limit).

agent_type: String
    Agent type: "llm" (default), "sequential", "parallel", "loop".

max_iterations: Option<u32>
    For loop agents: maximum iterations.

metadata: HashMap<String, Value>
    Custom metadata (passed through to state or callbacks).

voice: Option<String>
    Voice configuration for live agents.

greeting: Option<String>
    Greeting message (model speaks first on connect).

transcription: Option<bool>
    Whether to enable transcription.

a2a: Option<bool>
    Whether to enable A2A protocol endpoint.

env: HashMap<String, String>
    Environment variables to set when loading this agent.
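Workflow agents reuse the same schema. As an illustrative sketch (field names come from the struct above; the concrete values and comments are assumptions, not taken from the crate's docs), a loop agent might be declared as:

```yaml
# Hypothetical example: a loop workflow agent wrapping one LLM sub-agent.
name: refine_loop
agent_type: loop          # any non-"llm" type makes this a workflow agent
max_iterations: 3         # only meaningful for loop agents
sub_agents:
  - name: drafting_agent
    model: gemini-2.0-flash
    instruction: "Improve the draft stored in state."
    output_key: draft
```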
Implementations

impl AgentConfig
pub fn from_yaml_file(path: &Path) -> Result<Self, AgentConfigError>
    Load agent config from a YAML file.
pub fn from_yaml(yaml: &str) -> Result<Self, AgentConfigError>
    Parse agent config from a YAML string.
pub fn from_json(json: &str) -> Result<Self, AgentConfigError>
    Parse agent config from a JSON string.
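For instance, the YAML example above would map to a JSON document like the following (assuming the usual serde-style field mapping; this fragment is illustrative, not from the crate's docs):

```json
{
  "name": "weather_agent",
  "model": "gemini-2.0-flash",
  "instruction": "You are a helpful weather assistant.",
  "tools": [
    { "name": "get_weather", "description": "Get weather for a city" },
    { "builtin": "google_search" }
  ]
}
```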
pub fn from_value(value: Value) -> Result<Self, AgentConfigError>
    Parse agent config from a JSON value.
pub fn validate(&self) -> Result<(), AgentConfigError>
    Validate the configuration.
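A typical load-and-validate flow might look like the sketch below. The crate root `agent_config` is an assumption (substitute the actual crate name); only `from_yaml_file` and `validate` come from this page.

```rust
use std::path::Path;

// Hypothetical crate path -- adjust to the real crate name.
use agent_config::{AgentConfig, AgentConfigError};

fn load_agent(path: &Path) -> Result<AgentConfig, AgentConfigError> {
    // Parse the YAML file, then run the config-level checks
    // (e.g. required `name`, recognized `agent_type`) before use.
    let config = AgentConfig::from_yaml_file(path)?;
    config.validate()?;
    Ok(config)
}
```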
pub fn builtin_tools(&self) -> Vec<&str>
    List the names of the built-in tools referenced by this config.
pub fn is_workflow(&self) -> bool
    Check if this is a workflow agent (non-LLM).
Trait Implementations
impl Clone for AgentConfig

fn clone(&self) -> AgentConfig
fn clone_from(&mut self, source: &Self)
    Performs copy-assignment from `source`.