Text generation and LLM interactions (async feature only)
Modules§
- async_mock_llm - Async implementation of MockLLM demonstrating async trait patterns
Structs§
- AnswerContext - Context information assembled from search results
- AnswerGenerator - Main answer generator that orchestrates the response generation process
- GeneratedAnswer - Generated answer with metadata
- GenerationConfig - Configuration for answer generation
- GeneratorStatistics - Statistics about the answer generator
- MockLLM - Simple mock LLM implementation for testing
- PromptTemplate - Template system for constructing context-aware prompts
- SourceAttribution - Source attribution for generated answers
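A minimal sketch of how these structs might fit together: an `AnswerContext` assembled from search results is passed to an `AnswerGenerator`, which returns a `GeneratedAnswer` carrying source attributions. The field names and the `generate` signature here are assumptions for illustration, not the crate's actual API.

```rust
// Hypothetical stand-ins for the structs listed above; real definitions differ.
struct AnswerContext {
    passages: Vec<String>, // text snippets assembled from search results
}

struct GeneratedAnswer {
    text: String,
    sources: Vec<usize>, // indices into the context passages (SourceAttribution stand-in)
}

struct AnswerGenerator;

impl AnswerGenerator {
    // Orchestrates generation; here it only summarizes the context for illustration.
    fn generate(&self, ctx: &AnswerContext) -> GeneratedAnswer {
        GeneratedAnswer {
            text: format!("Answer based on {} passage(s)", ctx.passages.len()),
            sources: (0..ctx.passages.len()).collect(),
        }
    }
}

fn main() {
    let ctx = AnswerContext {
        passages: vec!["Rust is a systems language.".into()],
    };
    let ans = AnswerGenerator.generate(&ctx);
    println!("{} | sources: {:?}", ans.text, ans.sources);
}
```

In the real crate the generator would also consult a `GenerationConfig` and a `PromptTemplate` before calling the LLM; this sketch skips those steps.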
Enums§
- AnswerMode - Different modes for answer generation
Traits§
- LLMInterface - Mock LLM interface for testing without external dependencies
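To illustrate the mock-for-testing pattern behind `LLMInterface` and `MockLLM`: a trait abstracts the LLM call so tests can swap in a canned implementation. The real trait is async-only per the module docs; this simplified sync version, with an assumed `complete` method, only shows the shape of the pattern.

```rust
// Hypothetical simplification: the crate's trait is async; this sync version
// demonstrates the same substitution pattern without a runtime.
trait LLMInterface {
    fn complete(&self, prompt: &str) -> String;
}

struct MockLLM {
    canned: String, // fixed response returned for every prompt
}

impl LLMInterface for MockLLM {
    fn complete(&self, prompt: &str) -> String {
        // Echo the prompt length so a test can verify the prompt was forwarded.
        format!("{} [prompt len: {}]", self.canned, prompt.len())
    }
}

fn main() {
    let llm = MockLLM { canned: "mock answer".into() };
    println!("{}", llm.complete("What is Rust?"));
}
```

Because callers depend only on the trait, a production backend can replace `MockLLM` without touching the answer-generation code.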