pub struct Agent<M: CompletionModel> {
pub tools: ToolSet,
/* private fields */
}
Struct representing an LLM agent. An agent is an LLM model combined with a preamble (i.e. a system prompt) and a static set of context documents and tools. All context documents and tools are provided to the model every time the agent is prompted.
Example
use rig::{completion::Prompt, providers::openai};
let openai = openai::Client::from_env();
let comedian_agent = openai
.agent("gpt-4o")
.preamble("You are a comedian here to entertain the user using humour and jokes.")
.temperature(0.9)
.build();
let response = comedian_agent
    .prompt("Entertain me!")
    .await
    .expect("Failed to prompt the agent");
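Because Agent also implements the Chat trait, earlier turns can be replayed alongside the new prompt. A minimal sketch, assuming Message::user and Message::assistant constructors and the same client setup as above (exact module paths vary between rig versions):

```rust
use rig::{completion::{Chat, Message}, providers::openai};

let openai = openai::Client::from_env();
let comedian_agent = openai
    .agent("gpt-4o")
    .preamble("You are a comedian here to entertain the user using humour and jokes.")
    .build();

// Replay the earlier exchange so the model can keep the running joke going.
let history = vec![
    Message::user("Tell me a pun about crabs."),
    Message::assistant("Why did the crab never share? Because he was shellfish."),
];
let response = comedian_agent
    .chat("Another one, please!", history)
    .await
    .expect("Failed to chat with the agent");
```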
Fields

tools: ToolSet
Actual tool implementations

Trait Implementations
impl<M: CompletionModel> Chat for Agent<M>
impl<M: CompletionModel> Completion<M> for Agent<M>
async fn completion(
    &self,
    prompt: impl Into<Message> + Send,
    chat_history: Vec<Message>,
) -> Result<CompletionRequestBuilder<M>, CompletionError>
Generates a completion request builder for the given prompt and chat_history. This function is meant to be called by the user to further customize the request at prompt time before sending it.

impl<M: CompletionModel> Prompt for &Agent<M>
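The builder returned by completion can be customized per call before it is sent. A hedged sketch; the temperature, max_tokens, and send methods on CompletionRequestBuilder are assumptions about the builder API, not confirmed by this page:

```rust
use rig::completion::Completion;

// Start from the agent's preconfigured request, then override per call.
let completion_response = agent
    .completion("Summarise the loaded context documents.", vec![])
    .await?
    .temperature(0.2)   // assumed builder method: lower randomness for this call
    .max_tokens(256)    // assumed builder method: cap the reply length
    .send()
    .await?;
```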
impl<M: CompletionModel> Prompt for Agent<M>
impl<M: StreamingCompletionModel> StreamingChat for Agent<M>
async fn stream_chat(
    &self,
    prompt: &str,
    chat_history: Vec<Message>,
) -> Result<StreamingResult, CompletionError>
Stream a chat with history to the model
impl<M: StreamingCompletionModel> StreamingCompletion<M> for Agent<M>
async fn stream_completion(
    &self,
    prompt: impl Into<Message> + Send,
    chat_history: Vec<Message>,
) -> Result<CompletionRequestBuilder<M>, CompletionError>
Generate a streaming completion from a request
impl<M: StreamingCompletionModel> StreamingPrompt for Agent<M>
async fn stream_prompt(
    &self,
    prompt: &str,
) -> Result<StreamingResult, CompletionError>
Stream a simple prompt to the model
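stream_prompt returns a StreamingResult that can be consumed incrementally. A sketch assuming the result implements futures::Stream and yields displayable chunks (the item type and printing details are assumptions):

```rust
use futures::StreamExt;
use rig::streaming::StreamingPrompt;

let mut stream = agent.stream_prompt("Tell me a long story.").await?;
// Print chunks as they arrive instead of waiting for the full reply.
while let Some(chunk) = stream.next().await {
    match chunk {
        Ok(piece) => print!("{piece}"), // assumes the chunk type implements Display
        Err(e) => eprintln!("stream error: {e}"),
    }
}
```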
Auto Trait Implementations

impl<M> Freeze for Agent<M> where M: Freeze
impl<M> !RefUnwindSafe for Agent<M>
impl<M> Send for Agent<M>
impl<M> Sync for Agent<M>
impl<M> Unpin for Agent<M> where M: Unpin
impl<M> !UnwindSafe for Agent<M>
Blanket Implementations

impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.