pub struct Agent { /* private fields */ }
An autonomous MAGI agent with its own identity, system prompt, and LLM provider.
Each agent combines an AgentName identity, a mode-agnostic system prompt
(v0.3.0+), and an LlmProvider backend. The agent delegates LLM communication
to its provider via execute.
Implementations

impl Agent

pub fn new(name: AgentName, provider: Arc<dyn LlmProvider>) -> Self
Creates an agent with a mode-agnostic system prompt for the given name.
The prompt is selected from compiled-in markdown files via include_str!.
As of v0.3.0 the mode parameter has been removed; the agent uses a single
consolidated prompt per identity. Mode routing is handled by the orchestrator.
Parameters

name: Which MAGI agent (Melchior, Balthasar, or Caspar).
provider: The LLM backend for this agent.
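A hedged construction sketch (ClaudeProvider is a hypothetical LlmProvider implementation used for illustration; it is not part of this API):

```rust
use std::sync::Arc;

// Any Arc<dyn LlmProvider> works here; ClaudeProvider is illustrative.
let provider: Arc<dyn LlmProvider> = Arc::new(ClaudeProvider::default());

// The agent picks up the compiled-in prompt for Melchior automatically.
let agent = Agent::new(AgentName::Melchior, provider);
```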
pub fn with_custom_prompt(
    name: AgentName,
    provider: Arc<dyn LlmProvider>,
    prompt: String,
) -> Self
Creates an agent with a custom system prompt, bypassing the compiled-in defaults.
Parameters

name: Which MAGI agent.
provider: The LLM backend.
prompt: Custom system prompt string.
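For example, overriding the compiled-in prompt at runtime (the prompt text and provider value are illustrative):

```rust
// Bypasses the include_str! defaults entirely.
let prompt = String::from("You are Balthasar. Answer tersely.");
let agent = Agent::with_custom_prompt(AgentName::Balthasar, provider, prompt);
```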
pub fn from_file(
    name: AgentName,
    provider: Arc<dyn LlmProvider>,
    path: &Path,
) -> Result<Self, MagiError>
Creates an agent by loading the system prompt from a filesystem path.
Returns MagiError::Io if the file cannot be read.
Parameters

name: Which MAGI agent.
provider: The LLM backend.
path: Path to the prompt file.
Errors

Returns MagiError::Io if the file does not exist or cannot be read.
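This error contract makes a fallback-to-defaults pattern straightforward; a hedged sketch (the path and the fallback choice are illustrative, not prescribed by the API):

```rust
use std::path::Path;
use std::sync::Arc;

// Try a prompt file on disk; fall back to the compiled-in default
// if the file is missing or unreadable.
let agent = Agent::from_file(
    AgentName::Caspar,
    Arc::clone(&provider),
    Path::new("prompts/caspar.md"),
)
.unwrap_or_else(|_err| Agent::new(AgentName::Caspar, provider));
```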
pub async fn execute(
    &self,
    user_prompt: &str,
    config: &CompletionConfig,
) -> Result<String, ProviderError>
Executes the agent by sending the user prompt to the LLM provider.
Delegates to LlmProvider::complete with this agent’s system prompt.
Returns the raw LLM response string; parsing is the orchestrator’s responsibility.
Parameters

user_prompt: The user’s input content.
config: Completion parameters (max_tokens, temperature).
Errors

Returns ProviderError on LLM communication failure.
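A hedged call sketch inside an async context. The struct-literal construction of CompletionConfig below assumes public max_tokens and temperature fields, which this page does not confirm; the crate may expose a builder instead.

```rust
// Assumed construction; see the caveat above.
let config = CompletionConfig { max_tokens: 1024, temperature: 0.7 };

// `raw` is the unparsed model output; the orchestrator parses it.
let raw: String = agent.execute("Summarize the incident report.", &config).await?;
```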
pub fn system_prompt(&self) -> &str
Returns the agent’s system prompt.
pub fn provider_name(&self) -> &str
Returns the provider’s name (e.g., “claude”, “openai”).
pub fn provider_model(&self) -> &str
Returns the provider’s model identifier.
pub fn display_name(&self) -> &str
Returns the agent’s display name (e.g., “Melchior”).
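Taken together, the accessors are convenient for logging an agent’s configuration; a minimal sketch:

```rust
// Prints something like "Melchior via claude (<model id>)";
// actual values depend on the configured provider.
println!(
    "{} via {} ({})",
    agent.display_name(),
    agent.provider_name(),
    agent.provider_model()
);
```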