
Module chat

Conversation primitives: messages, content blocks, and responses.

A conversation is a sequence of ChatMessages, each carrying a ChatRole and one or more ContentBlocks. Providers return a ChatResponse containing the assistant’s reply along with token usage and stop-reason metadata.

§Content model

Every message holds a Vec<ContentBlock> rather than a plain string. This uniform representation handles text, images, tool calls, tool results, and reasoning traces without special-casing:

use llm_stack_core::{ChatMessage, ContentBlock, ChatRole};

// Simple text message
let msg = ChatMessage::user("Hello, world!");

// Mixed content (text + image) in a single message
let mixed = ChatMessage {
    role: ChatRole::User,
    content: vec![
        ContentBlock::Text("What's in this image?".into()),
        ContentBlock::Image {
            media_type: "image/png".into(),
            data: llm_stack_core::ImageSource::Base64("...".into()),
        },
    ],
};
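Because every message is a `Vec<ContentBlock>`, consumers can walk the blocks uniformly instead of branching on a message-level type. A minimal, self-contained sketch of that pattern, using simplified stand-ins for the crate's types (the real `ContentBlock` has more variants and a different field layout):

```rust
// Simplified stand-ins for ChatMessage / ContentBlock (illustrative only).
#[derive(Debug, Clone)]
enum ContentBlock {
    Text(String),
    Image { media_type: String },
}

struct ChatMessage {
    content: Vec<ContentBlock>,
}

// One loop handles every block kind; adding a new variant extends the
// match rather than forcing a parallel message type.
fn render(msg: &ChatMessage) -> String {
    msg.content
        .iter()
        .map(|block| match block {
            ContentBlock::Text(text) => text.clone(),
            ContentBlock::Image { media_type } => format!("[image: {media_type}]"),
        })
        .collect::<Vec<_>>()
        .join(" ")
}
```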

Structs§

ChatMessage
A single message in a conversation.
ChatResponse
A complete response from a model.
ToolCall
A tool invocation requested by the assistant.
ToolResult
The result of executing a tool, returned to the model.
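`ToolCall` and `ToolResult` form a request/response loop: the assistant emits a call, the host executes it, and the result is sent back to the model. A sketch of the host side of that loop, using simplified stand-in structs (the field names here are assumptions, not the crate's actual layout):

```rust
// Stand-ins for the crate's ToolCall / ToolResult (field names assumed).
struct ToolCall {
    id: String,
    name: String,
    arguments: String, // JSON-encoded arguments from the model
}

struct ToolResult {
    call_id: String,
    output: String,
}

// Host-side dispatch: run the requested tool and pair the output with the
// originating call id so the model can correlate result with request.
fn execute(call: &ToolCall) -> ToolResult {
    let output = match call.name.as_str() {
        "echo" => call.arguments.clone(),
        other => format!("unknown tool: {other}"),
    };
    ToolResult {
        call_id: call.id.clone(),
        output,
    }
}
```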

Enums§

ChatRole
The role of a participant in a conversation.
ContentBlock
An individual piece of content within a ChatMessage.
ImageSource
How an image is supplied to the model.
StopReason
The reason the model stopped producing output.
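A caller typically branches on the stop reason to decide what happens next, e.g. whether to surface the reply, resume generation, or run requested tools. A sketch with an illustrative stand-in enum (the variant names are assumptions based on common provider APIs, not the crate's actual `StopReason`):

```rust
// Illustrative stand-in for the crate's StopReason (variant names assumed).
#[derive(Debug, PartialEq)]
enum StopReason {
    EndTurn,   // model finished its reply naturally
    MaxTokens, // output was cut off at the token limit
    ToolUse,   // model is requesting one or more tool calls
}

// Map each stop reason to the caller's next action.
fn next_step(reason: &StopReason) -> &'static str {
    match reason {
        StopReason::EndTurn => "deliver the reply",
        StopReason::MaxTokens => "continue or truncate",
        StopReason::ToolUse => "execute requested tools",
    }
}
```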