Conversation primitives: messages, content blocks, and responses.
A conversation is a sequence of ChatMessages, each carrying a
ChatRole and one or more ContentBlocks. Providers return a
ChatResponse containing the assistant’s reply along with token
usage and stop-reason metadata.
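As a rough sketch of how these pieces relate, a multi-turn conversation is simply an ordered Vec of messages; note that the ChatRole::Assistant variant used below is assumed by analogy with ChatRole::User and is not confirmed on this page:

use llm_stack::{ChatMessage, ChatRole, ContentBlock};

// A conversation is an ordered sequence of ChatMessages.
let history = vec![
    ChatMessage::user("What is the capital of France?"),
    ChatMessage {
        // `ChatRole::Assistant` is an assumption by analogy with `ChatRole::User`.
        role: ChatRole::Assistant,
        content: vec![ContentBlock::Text("Paris.".into())],
    },
    ChatMessage::user("And of Italy?"),
];

A provider takes such a history and replies with a ChatResponse carrying the assistant's content blocks plus the token-usage and stop-reason metadata described above.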
§Content model
Every message holds a Vec<ContentBlock> rather than a plain string.
This uniform representation handles text, images, tool calls, tool
results, and reasoning traces without special-casing:
use llm_stack::{ChatMessage, ChatRole, ContentBlock, ImageSource};

// Simple text message
let msg = ChatMessage::user("Hello, world!");

// Mixed content (text + image) in a single message
let mixed = ChatMessage {
    role: ChatRole::User,
    content: vec![
        ContentBlock::Text("What's in this image?".into()),
        ContentBlock::Image {
            media_type: "image/png".into(),
            data: ImageSource::Base64("...".into()),
        },
    ],
};
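Because a message is always a Vec of blocks, consumers can walk every message the same way. A minimal sketch of that consumption side, matching only the two variants shown above and funnelling tool calls, tool results, and reasoning traces through a wildcard arm:

use llm_stack::{ChatMessage, ContentBlock};

fn summarize(msg: &ChatMessage) {
    for block in &msg.content {
        match block {
            ContentBlock::Text(text) => println!("text: {text}"),
            ContentBlock::Image { media_type, .. } => println!("image ({media_type})"),
            // Tool-call, tool-result, and reasoning variants are handled
            // generically here rather than enumerated.
            _ => println!("other block"),
        }
    }
}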
Structs§
- ChatMessage - A single message in a conversation.
- ChatResponse - A complete response from a model.
- ToolCall - A tool invocation requested by the assistant.
- ToolResult - The result of executing a tool, returned to the model.
Enums§
- ChatRole - The role of a participant in a conversation.
- ContentBlock - An individual piece of content within a ChatMessage.
- ImageSource - How an image is supplied to the model.
- StopReason - The reason the model stopped producing output.