
Module llm

LLM provider abstraction for AI-augmented data generation.

This module provides a trait-based LLM integration that supports:

  • Deterministic mock provider for testing (always available)
  • HTTP-based providers for OpenAI/Anthropic (requires llm feature)
  • Response caching for efficiency
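
A minimal sketch of how such a trait-based setup typically fits together: a provider trait, a deterministic mock implementation, and a prompt-keyed cache in front of it. Everything below is illustrative and assumed rather than this crate's actual API; only the re-exported `MockLlmProvider` and `LlmCache` names are real items here.

```rust
use std::collections::HashMap;

/// Illustrative provider trait; the crate's real trait lives in the
/// `provider` module and its exact signature may differ.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> String;
}

/// Deterministic mock: no network, no API key, same output for the same prompt.
struct MockProvider;

impl LlmProvider for MockProvider {
    fn complete(&self, prompt: &str) -> String {
        format!("mock response for: {prompt}")
    }
}

/// Minimal prompt-keyed cache, mirroring the role of `LlmCache`.
struct CachedProvider<P: LlmProvider> {
    inner: P,
    store: HashMap<String, String>,
}

impl<P: LlmProvider> CachedProvider<P> {
    fn new(inner: P) -> Self {
        Self { inner, store: HashMap::new() }
    }

    fn complete(&mut self, prompt: &str) -> String {
        if let Some(hit) = self.store.get(prompt) {
            return hit.clone();
        }
        let response = self.inner.complete(prompt);
        self.store.insert(prompt.to_string(), response.clone());
        response
    }
}

fn main() {
    let mut llm = CachedProvider::new(MockProvider);
    let first = llm.complete("generate a customer name");
    let second = llm.complete("generate a customer name"); // served from the cache
    assert_eq!(first, second);
    println!("{first}");
}
```

The same pattern lets tests run offline and reproducibly against the mock, while production code can swap in an HTTP-backed provider behind the same trait when the llm feature is enabled.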

Re-exports

pub use cache::LlmCache;
pub use mock_provider::MockLlmProvider;
pub use provider::*;

Modules

cache
Response caching for LLM calls.
mock_provider
Deterministic mock LLM provider for testing.
nl_config
Natural language to YAML configuration generator (see the sketch after this list).
provider
LLM provider trait and HTTP-based provider implementations.
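
As a rough illustration of what nl_config does, the flow is: prompt a provider for YAML, then parse the reply. The function name, prompt wording, and use of the external `serde_yaml` crate below are assumptions for the sketch, not the module's actual API.

```rust
use serde_yaml::Value; // external crate, not necessarily what `nl_config` uses

/// Illustrative only: ask an LLM for YAML and parse the reply.
fn config_from_description(
    complete: impl Fn(&str) -> String, // stand-in for any provider call
    description: &str,
) -> Result<Value, serde_yaml::Error> {
    let prompt = format!(
        "Produce a YAML configuration for the following request. Reply with YAML only.\n\n{description}"
    );
    serde_yaml::from_str(&complete(&prompt))
}

fn main() {
    // A canned reply keeps the example deterministic and offline.
    let fake_llm = |_prompt: &str| "rows: 100\nseed: 42\n".to_string();
    let cfg = config_from_description(fake_llm, "100 rows with a fixed seed")
        .expect("mock reply is valid YAML");
    println!("{cfg:?}");
}
```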