
Module summarize


LLM-powered session summarization.

This module provides the ability to generate summaries of AI-assisted development sessions using various LLM providers (Anthropic, OpenAI, OpenRouter). It includes provider configuration, API communication, and error handling.

Usage

The main entry point is [generate_summary], which resolves the provider configuration and calls the appropriate LLM API. Configuration is read from ~/.lore/config.yaml with environment variable overrides.
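A hypothetical `~/.lore/config.yaml` fragment illustrating the shape such a configuration might take; the key names and values below are assumptions for illustration, not documented by this module:

```yaml
# Hypothetical keys — consult the crate's config documentation for the real schema.
summary:
  provider: anthropic      # one of: anthropic | openai | openrouter
  model: <model-name>      # placeholder; provider-specific model identifier
```

Environment variables, when set, take precedence over the values in this file.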

Re-exports

pub use provider::create_provider;
pub use provider::SummaryProvider;
pub use provider::SummaryProviderKind;

Modules

prompt
Prompt construction for LLM-powered session summaries.
provider
LLM provider integrations for session summary generation.

Structs

SummaryConfig
Resolved summary configuration from config file and environment variables.

Enums

SummarizeError
Errors that can occur during summary generation.

Functions

generate_summary
Generates a summary for a set of session messages using the configured LLM provider.
resolve_config
Resolves summary configuration from the config file and environment variables.
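A minimal, self-contained sketch of the override precedence that `resolve_config` describes (environment variable over config file). All names here are assumptions: the variable `LORE_SUMMARY_PROVIDER`, the enum `ProviderKind`, and the function `resolve_provider` are illustrative stand-ins, not this crate's API.

```rust
use std::env;

/// Hypothetical stand-in for `SummaryProviderKind` (variant names assumed).
#[derive(Debug, PartialEq)]
enum ProviderKind {
    Anthropic,
    OpenAi,
    OpenRouter,
}

/// Resolve the provider name: an environment variable (assumed name
/// `LORE_SUMMARY_PROVIDER`) overrides the value read from the config file.
fn resolve_provider(config_value: Option<&str>) -> Option<ProviderKind> {
    let name = env::var("LORE_SUMMARY_PROVIDER")
        .ok()
        .or_else(|| config_value.map(str::to_owned))?;
    match name.as_str() {
        "anthropic" => Some(ProviderKind::Anthropic),
        "openai" => Some(ProviderKind::OpenAi),
        "openrouter" => Some(ProviderKind::OpenRouter),
        _ => None, // unknown provider names are rejected
    }
}

fn main() {
    // With no environment override set, the config file value is used.
    let kind = resolve_provider(Some("anthropic"));
    println!("{:?}", kind);
}
```

The real `resolve_config` presumably also resolves model names and API keys; this sketch only shows the precedence rule stated above.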