pub struct Agent { /* private fields */ }
An AI Agent with goal-oriented capabilities, memory, and tool integration.
Agent is the core component of the Ceylon framework. It manages:
- LLM interactions for generating responses
- Goal analysis and tracking
- Conversation history and memory
- Tool execution and management
§Examples
use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;
#[tokio::main]
async fn main() {
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let task = TaskRequest::new("Hello, how are you?");
let response = agent.run(task).await;
println!("{:?}", response.result());
}

§Implementations
impl Agent
pub fn new(name: &str, model: &str) -> Self
Creates a new Agent with the specified name and LLM model.
§Arguments
- name - The name of the agent (used in system prompts)
- model - The LLM model to use (e.g., "openai::gpt-4", "claude-3-opus", "llama2")
§Examples
use ceylon::agent::Agent;
let agent = Agent::new("MyAssistant", "openai::gpt-4");

§Supported Models
- OpenAI: “openai::gpt-4”, “gpt-3.5-turbo”
- Anthropic: “claude-3-opus”, “claude-3-sonnet”
- Ollama: “llama2”, “mistral”
§Panics
Panics if the model format is invalid or if an API key is required but not found
in environment variables. Use Agent::new_with_config to provide API keys explicitly.
pub fn new_with_config(name: &str, llm_config: LLMConfig) -> Result<Self, String>
Creates a new Agent with the specified name and LLM configuration.
This method allows you to provide comprehensive LLM configuration including API keys, temperature, max tokens, and other provider-specific settings.
§Arguments
- name - The name of the agent (used in system prompts)
- llm_config - LLM configuration with model, API key, and other settings
§Examples
use ceylon::agent::Agent;
use ceylon::llm::LLMConfig;
// Create agent with explicit API key
let config = LLMConfig::new("openai::gpt-4")
.with_api_key("your-api-key")
.with_temperature(0.7)
.with_max_tokens(2048);
let agent = Agent::new_with_config("Assistant", config).unwrap();

§Errors
Returns an error if:
- The model format is invalid (should be “provider::model-name”)
- An API key is required but not provided or found in environment variables
- The LLM provider fails to initialize
pub fn with_memory(&mut self, memory: Arc<dyn Memory>) -> &mut Self
Sets a custom memory implementation for the agent.
By default, agents use InMemoryStore. You can provide a custom
implementation of the Memory trait for persistent storage.
§Arguments
memory - An Arc-wrapped implementation of the Memory trait
§Examples
use ceylon::agent::Agent;
use std::sync::Arc;
// Assuming you have a custom RedisMemory implementation
// let redis_memory = Arc::new(RedisMemory::new());
// let mut agent = Agent::new("Assistant", "openai::gpt-4");
// agent.with_memory(redis_memory);
pub async fn get_history(&self, limit: Option<usize>) -> Vec<MemoryEntry>
Retrieves conversation history from memory.
§Arguments
limit - Optional limit on the number of conversations to retrieve. If None, returns all history.
§Returns
A vector of MemoryEntry objects sorted by recency (newest first).
§Examples
use ceylon::agent::Agent;
let agent = Agent::new("Assistant", "openai::gpt-4");
// Get last 5 conversations
let recent = agent.get_history(Some(5)).await;
// Get all history
let all = agent.get_history(None).await;
pub async fn search_memory(&self, query: &str) -> Vec<MemoryEntry>
Searches conversation history for messages containing the query string.
Performs a case-insensitive text search across all stored messages.
§Arguments
query - The search query string
§Returns
A vector of MemoryEntry objects containing the query.
§Examples
use ceylon::agent::Agent;
let agent = Agent::new("Assistant", "openai::gpt-4");
let results = agent.search_memory("Python").await;
println!("Found {} conversations about Python", results.len());
pub async fn clear_memory(&self) -> Result<(), String>
Clears all conversation history for this agent.
§Returns
Ok(()) if successful, or Err(String) with an error message on failure.
§Examples
use ceylon::agent::Agent;
let agent = Agent::new("Assistant", "openai::gpt-4");
if let Err(e) = agent.clear_memory().await {
eprintln!("Failed to clear memory: {}", e);
}
pub fn get_current_goal(&self) -> Option<&Goal>
Returns the current goal the agent is working on, if any.
§Examples
use ceylon::agent::Agent;
use ceylon::tasks::TaskRequest;
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let task = TaskRequest::new("Build a web server");
agent.run(task).await;
if let Some(goal) = agent.get_current_goal() {
println!("Current goal: {}", goal.description);
}
pub fn set_goal(&mut self, goal: Goal)
Manually sets a goal for the agent to work on.
This is useful when you want to provide a pre-structured goal instead of having the agent analyze the task automatically.
§Arguments
goal - The goal to set
§Examples
use ceylon::agent::Agent;
use ceylon::goal::Goal;
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let mut goal = Goal::new("Create a REST API".to_string());
goal.add_sub_goal("Design endpoints".to_string(), 1);
goal.add_sub_goal("Implement handlers".to_string(), 2);
agent.set_goal(goal);
pub fn with_system_prompt(&mut self, prompt: &str) -> &mut Self
Sets a custom system prompt for the agent.
The system prompt defines the agent’s behavior, personality, and instructions. By default, a goal-oriented system prompt is used.
§Arguments
prompt- The system prompt text
§Examples
use ceylon::agent::Agent;
let mut agent = Agent::new("Assistant", "openai::gpt-4");
agent.with_system_prompt("You are a helpful coding assistant specializing in Rust.");
pub fn get_system_prompt(&self) -> &str
Returns the current system prompt.
§Examples
use ceylon::agent::Agent;
let agent = Agent::new("Assistant", "openai::gpt-4");
println!("System prompt: {}", agent.get_system_prompt());
pub fn with_config(&mut self, config: AgentConfig) -> &mut Self
Sets the agent configuration.
§Arguments
config - The AgentConfig to use
§Examples
use ceylon::agent::{Agent, AgentConfig};
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let mut config = AgentConfig::new(5, 120);
config.with_goal_analysis(true);
agent.with_config(config);
pub fn with_llm_config(&mut self, llm_config: LLMConfig) -> Result<&mut Self, String>
Configures the agent with comprehensive LLM settings using LLMConfig.
This allows you to set advanced LLM parameters like temperature, top_p, reasoning, provider-specific options, and more.
§Arguments
llm_config - The LLMConfig containing comprehensive LLM settings
§Examples
use ceylon::agent::Agent;
use ceylon::llm::LLMConfig;
let mut agent = Agent::new("Assistant", "openai::gpt-4");
// Configure with advanced settings
let llm_config = LLMConfig::new("openai::gpt-4")
.with_api_key("your-api-key")
.with_temperature(0.7)
.with_max_tokens(2048)
.with_top_p(0.9);
agent.with_llm_config(llm_config);

§Provider-Specific Examples
§Azure OpenAI
use ceylon::agent::Agent;
use ceylon::llm::LLMConfig;
let mut agent = Agent::new("Assistant", "azure::gpt-4");
let config = LLMConfig::new("azure::gpt-4")
.with_api_key("your-azure-key")
.with_deployment_id("your-deployment-id")
.with_api_version("2024-02-01");
agent.with_llm_config(config);

§OpenAI with Web Search
use ceylon::agent::Agent;
use ceylon::llm::LLMConfig;
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let config = LLMConfig::new("openai::gpt-4")
.with_api_key("your-api-key")
.with_openai_web_search(true);
agent.with_llm_config(config);

§Anthropic with Reasoning
use ceylon::agent::Agent;
use ceylon::llm::LLMConfig;
let mut agent = Agent::new("Assistant", "anthropic::claude-3-opus");
let config = LLMConfig::new("anthropic::claude-3-opus")
.with_api_key("your-api-key")
.with_reasoning(true)
.with_reasoning_effort("high");
agent.with_llm_config(config);
pub fn add_tool<T>(&mut self, tool: T) -> &mut Self
Adds a tool to the agent’s toolset.
Tools extend the agent’s capabilities by allowing it to perform external actions like web searches, calculations, database queries, etc.
§Arguments
tool - Any type implementing ToolTrait
§Examples
use ceylon::agent::Agent;
use ceylon::tools::ToolTrait;
use serde_json::{json, Value};
struct WeatherTool;
impl ToolTrait for WeatherTool {
fn name(&self) -> String { "get_weather".to_string() }
fn description(&self) -> String { "Get weather for a location".to_string() }
fn input_schema(&self) -> Value { json!({"type": "object"}) }
fn execute(&self, input: Value) -> Value { json!({"temp": 72}) }
}
let mut agent = Agent::new("Assistant", "openai::gpt-4");
agent.add_tool(WeatherTool);
pub fn get_tool_invoker(&self) -> &ToolInvoker
Returns a reference to the agent’s tool invoker.
This can be used to inspect registered tools or invoke them manually.
pub async fn run(&mut self, task: TaskRequest) -> TaskResponse
Executes a task using the agent.
This is the main method for running the agent. It:
- Analyzes the task and creates a goal structure (if enabled)
- Loads relevant conversation history from memory
- Iteratively processes the task with the LLM
- Invokes tools as needed
- Tracks progress towards goal completion
- Saves the conversation to memory
§Arguments
task - A TaskRequest containing the user's request
§Returns
A TaskResponse containing the agent’s final output
§Examples
use ceylon::agent::Agent;
use ceylon::tasks::{TaskRequest, OutputData};
#[tokio::main]
async fn main() {
let mut agent = Agent::new("Assistant", "openai::gpt-4");
let task = TaskRequest::new("Write a haiku about Rust");
let response = agent.run(task).await;
match response.result() {
OutputData::Text(text) => println!("{}", text),
_ => println!("Unexpected output type"),
}
}

§Trait Implementations
§Auto Trait Implementations
impl Freeze for Agent
impl !RefUnwindSafe for Agent
impl Send for Agent
impl Sync for Agent
impl Unpin for Agent
impl !UnwindSafe for Agent
§Blanket Implementations
impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T where T: Clone
impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true; otherwise converts self into a Right variant.

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true; otherwise converts self into a Right variant.