LLM-powered session summarization.
This module generates summaries of AI-assisted development sessions using various LLM providers (Anthropic, OpenAI, OpenRouter). It covers provider configuration, API communication, and error handling.
Usage§
The main entry point is [generate_summary], which resolves the provider
configuration and calls the appropriate LLM API. Configuration is read
from ~/.lore/config.yaml with environment variable overrides.
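The precedence described above (environment variables override values from ~/.lore/config.yaml, with a built-in fallback) can be sketched as follows. This is a minimal illustration, not the crate's actual `resolve_config` implementation; the function name and the default model string are assumptions:

```rust
// Hypothetical sketch of the override rule: an environment-variable
// value beats the config-file value, which beats a built-in default.
fn resolve_model(env_value: Option<&str>, file_value: Option<&str>) -> String {
    env_value
        .or(file_value)
        .unwrap_or("claude-3-5-haiku-latest") // assumed default, for illustration
        .to_string()
}

fn main() {
    // No env var set: the config-file value wins.
    assert_eq!(resolve_model(None, Some("gpt-4o-mini")), "gpt-4o-mini");
    // Env var set: it overrides the file value.
    assert_eq!(
        resolve_model(Some("claude-sonnet-4"), Some("gpt-4o-mini")),
        "claude-sonnet-4"
    );
    // Neither set: fall back to the default.
    assert_eq!(resolve_model(None, None), "claude-3-5-haiku-latest");
}
```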
Re-exports§
pub use provider::create_provider;
pub use provider::SummaryProvider;
pub use provider::SummaryProviderKind;
Modules§
- prompt
- Prompt construction for LLM-powered session summaries.
- provider
- LLM provider integrations for session summary generation.
Structs§
- SummaryConfig
- Resolved summary configuration from config file and environment variables.
Enums§
- SummarizeError
- Errors that can occur during summary generation.
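As a rough illustration of the failure modes such an error type typically distinguishes (missing credentials, HTTP failures, unparseable responses), here is a hedged sketch; the variant names and messages below are assumptions, not the crate's actual `SummarizeError` definition:

```rust
use std::fmt;

// Illustrative only: plausible failure modes for LLM summary generation.
// These variants are assumptions, not the real enum.
#[derive(Debug)]
enum SummarizeError {
    MissingApiKey(String),   // provider with no key configured
    Http(u16),               // non-success status code from the provider
    InvalidResponse(String), // response body that failed to parse
}

impl fmt::Display for SummarizeError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            SummarizeError::MissingApiKey(p) => write!(f, "no API key configured for {p}"),
            SummarizeError::Http(code) => write!(f, "provider returned HTTP {code}"),
            SummarizeError::InvalidResponse(body) => write!(f, "unparseable response: {body}"),
        }
    }
}

fn main() {
    let e = SummarizeError::Http(429);
    assert_eq!(e.to_string(), "provider returned HTTP 429");
}
```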
Functions§
- generate_summary
- Generates a summary for a set of session messages using the configured LLM provider.
- resolve_config
- Resolves summary configuration from the config file and environment variables.