Module summarizer


LLM-based context summarization.

When the conversation grows beyond the context window, older messages are summarized into a compact representation to preserve important context while reducing token usage.
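A minimal sketch of this compression trigger, assuming a simple `Message` type, a ~4-characters-per-token heuristic, and a keep-the-recent-half split; none of these are this module's actual API.

```rust
struct Message {
    role: String,
    content: String,
}

/// Rough token estimate: ~4 characters per token (a common heuristic).
fn estimate_tokens(messages: &[Message]) -> usize {
    messages.iter().map(|m| (m.content.len() + 3) / 4).sum()
}

/// If the conversation exceeds the budget, split off the older half
/// for summarization and keep the recent half verbatim.
fn split_for_summarization(
    mut messages: Vec<Message>,
    budget: usize,
) -> (Vec<Message>, Vec<Message>) {
    if estimate_tokens(&messages) <= budget {
        return (Vec::new(), messages);
    }
    let recent = messages.split_off(messages.len() / 2);
    (messages, recent)
}
```

The older half would then be fed to the LLM to produce a `ContextSummary`, while the recent half stays in the prompt unchanged.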

Structs§

ContextSummarizer
Generates summaries of conversation history using the LLM.
ContextSummary
Summary of conversation context for compression.
TokenCostDisplay
Token and cost tracking display data.

Enums§

SummarizeError
Errors during summarization.
TokenAlert
Token budget alerts.
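Threshold-based budget alerts like `TokenAlert` might look like the following sketch; the variant names and the 75%/90% cutoffs are assumptions for illustration, not the crate's actual definitions.

```rust
#[derive(Debug, PartialEq)]
enum BudgetAlert {
    Ok,
    /// Above ~75% of the context window: consider summarizing soon.
    Warning,
    /// Above ~90%: summarization should run before the next turn.
    Critical,
}

/// Classify current usage against the context window size.
fn check_budget(used: usize, window: usize) -> BudgetAlert {
    let pct = used * 100 / window;
    if pct >= 90 {
        BudgetAlert::Critical
    } else if pct >= 75 {
        BudgetAlert::Warning
    } else {
        BudgetAlert::Ok
    }
}
```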

Functions§

smart_fallback_summary
Smart fallback summary that preserves structured information when LLM-based summarization fails. Instead of naively truncating, it extracts tool names and results, and preserves the first and last messages for continuity.
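The fallback strategy described above could be sketched roughly as follows: keep the first and last messages verbatim and list the tool names invoked in between. The `Msg` type and output format are assumptions, not the function's real signature.

```rust
struct Msg {
    content: String,
    /// Set when the message is a tool invocation or its result.
    tool_name: Option<String>,
}

/// Build a structured summary without calling the LLM: first message,
/// tool names from the middle of the conversation, last message.
fn fallback_summary(messages: &[Msg]) -> String {
    if messages.is_empty() {
        return String::new();
    }
    let middle: &[Msg] = if messages.len() > 2 {
        &messages[1..messages.len() - 1]
    } else {
        &[]
    };
    let tools: Vec<&str> = middle
        .iter()
        .filter_map(|m| m.tool_name.as_deref())
        .collect();
    format!(
        "Start: {}\nTools used: {}\nEnd: {}",
        messages.first().unwrap().content,
        tools.join(", "),
        messages.last().unwrap().content
    )
}
```

Because this path never calls the model, it stays available even when summarization fails due to rate limits or provider errors.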