pub async fn query(
prompt: &str,
options: &AgentOptions,
) -> Result<Pin<Box<dyn Stream<Item = Result<ContentBlock>> + Send>>>
Simple query function for single-turn interactions without conversation history.
This is a stateless convenience function for simple queries that don’t require multi-turn conversations. It creates a temporary HTTP client, sends a single prompt, and returns a stream of content blocks.
For multi-turn conversations or more control over the interaction, use Client instead.
§Parameters
prompt: The user’s message to send to the model
options: Configuration including the model, API key, tools, etc.
§Returns
Returns a ContentStream that yields content blocks as they arrive from the model.
The stream must be polled to completion to receive all blocks.
§Behavior
- Creates a temporary HTTP client with configured timeout
- Builds message array (system prompt + user prompt)
- Converts tools to OpenAI format if provided
- Makes an HTTP POST request to /chat/completions
- Parses the Server-Sent Events (SSE) response stream
- Aggregates chunks into complete content blocks
- Returns stream that yields blocks as they complete
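The SSE parsing and aggregation steps above can be sketched roughly as follows. This is a simplified illustration, not the crate's actual implementation: the naive string scan here stands in for real JSON deserialization of each `data:` payload, and `aggregate_sse` is a hypothetical helper name.

```rust
// Minimal sketch of SSE aggregation: collect `data:` payloads from an
// SSE body and concatenate the text deltas into one content block.
fn aggregate_sse(body: &str) -> String {
    let mut text = String::new();
    for line in body.lines() {
        // SSE frames carry their payload on lines prefixed with "data:".
        if let Some(payload) = line.strip_prefix("data:") {
            let payload = payload.trim();
            if payload == "[DONE]" {
                break; // OpenAI-style streams end with a [DONE] sentinel
            }
            // Real code would deserialize JSON and read the delta content;
            // here we naively extract a "content" field for illustration.
            if let Some(idx) = payload.find("\"content\":\"") {
                let rest = &payload[idx + 11..];
                if let Some(end) = rest.find('"') {
                    text.push_str(&rest[..end]);
                }
            }
        }
    }
    text
}

fn main() {
    let body = "data: {\"content\":\"Par\"}\ndata: {\"content\":\"is\"}\ndata: [DONE]\n";
    println!("{}", aggregate_sse(body)); // prints "Paris"
}
```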
§Error Handling
This function can return errors for:
- HTTP client creation failures
- Network errors during the request
- API errors (authentication, invalid model, rate limits, etc.)
- SSE parsing errors
- JSON deserialization errors
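As an illustration, these categories could map onto an error type shaped like the sketch below. This is hypothetical: the crate's actual `Error` type, variant names, and payloads may differ.

```rust
// Hypothetical error taxonomy mirroring the categories above.
#[derive(Debug)]
enum QueryError {
    ClientBuild(String),          // HTTP client creation failures
    Network(String),              // network errors during the request
    Api { status: u16, message: String }, // auth, invalid model, rate limits
    SseParse(String),             // malformed SSE frames
    Json(String),                 // deserialization failures
}

// Map each variant to a short human-readable description.
fn describe(e: &QueryError) -> &'static str {
    match e {
        QueryError::ClientBuild(_) => "HTTP client creation failed",
        QueryError::Network(_) => "network error during request",
        QueryError::Api { .. } => "API error (auth, model, rate limit)",
        QueryError::SseParse(_) => "SSE parsing error",
        QueryError::Json(_) => "JSON deserialization error",
    }
}

fn main() {
    let e = QueryError::Api { status: 401, message: "invalid key".into() };
    println!("{}", describe(&e)); // prints "API error (auth, model, rate limit)"
}
```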
§Performance Notes
- Creates a new HTTP client for each call (consider using Client for repeated queries)
- Timeout is configurable via AgentOptions::timeout (default: 120 seconds)
- Streaming begins immediately; no buffering of the full response
§Examples
§Basic Usage
use open_agent::{query, AgentOptions};
use futures::StreamExt;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let options = AgentOptions::builder()
        .system_prompt("You are a helpful assistant")
        .model("gpt-4")
        .api_key("sk-...")
        .build()?;
    let mut stream = query("What's the capital of France?", &options).await?;
    while let Some(block) = stream.next().await {
        match block? {
            open_agent::ContentBlock::Text(text) => {
                print!("{}", text.text);
            }
            open_agent::ContentBlock::ToolUse(_)
            | open_agent::ContentBlock::ToolResult(_)
            | open_agent::ContentBlock::Image(_) => {}
        }
    }
    Ok(())
}
§With Tools
use open_agent::{query, AgentOptions, Tool, ContentBlock};
use futures::StreamExt;
use serde_json::json;
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let calculator = Tool::new(
        "calculator",
        "Performs calculations",
        json!({"type": "object"}),
        |_input| Box::pin(async move { Ok(json!({"result": 42})) }),
    );
    let options = AgentOptions::builder()
        .model("gpt-4")
        .api_key("sk-...")
        .tools(vec![calculator])
        .build()?;
    let mut stream = query("Calculate 2+2", &options).await?;
    while let Some(block) = stream.next().await {
        match block? {
            ContentBlock::ToolUse(tool_use) => {
                println!("Model wants to use: {}", tool_use.name());
                // Note: You'll need to manually execute tools and continue
                // the conversation. For automatic execution, use Client.
            }
            ContentBlock::Text(text) => print!("{}", text.text),
            ContentBlock::ToolResult(_) | ContentBlock::Image(_) => {}
        }
    }
    Ok(())
}
§Error Handling
use open_agent::{query, AgentOptions};
use futures::StreamExt;
#[tokio::main]
async fn main() {
    let options = AgentOptions::builder()
        .model("gpt-4")
        .api_key("invalid-key")
        .build()
        .unwrap();
    match query("Hello", &options).await {
        Ok(mut stream) => {
            while let Some(result) = stream.next().await {
                match result {
                    Ok(block) => println!("Block: {:?}", block),
                    Err(e) => {
                        eprintln!("Stream error: {}", e);
                        break;
                    }
                }
            }
        }
        Err(e) => eprintln!("Query failed: {}", e),
    }
}