LLM provider abstractions and implementations
This module defines the core traits and types for integrating with various Language Model providers like Anthropic, OpenAI, Google, and others.
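This index does not show the trait definitions themselves. As a rough, non-authoritative sketch of what such an abstraction can look like, the block below uses hypothetical `Provider`, `CompletionRequest`, and `CompletionResponse` names that are illustrative assumptions rather than items of this module; a real implementation would most likely be async.

```rust
// Illustrative only: the type and trait names here are assumptions,
// not the actual items exported by this module.
use std::error::Error;

#[derive(Debug)]
pub struct CompletionRequest {
    pub model: String,
    pub prompt: String,
    pub max_tokens: u32,
}

#[derive(Debug)]
pub struct CompletionResponse {
    pub text: String,
    pub input_tokens: u32,
    pub output_tokens: u32,
}

/// Provider-agnostic interface that concrete backends
/// (Anthropic, OpenAI, Google, ...) would implement.
pub trait Provider {
    /// Stable identifier for the provider, e.g. "anthropic".
    fn name(&self) -> &str;

    /// Send a completion request and return the provider's response.
    fn complete(
        &self,
        request: &CompletionRequest,
    ) -> Result<CompletionResponse, Box<dyn Error>>;
}
```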
Structs§
- Cost - Cost structure for model pricing
- CurrentUsage - Current usage against rate limits
- Limits - Model limits structure
- ModelCapabilities - Model capabilities
- ModelConfig - Model configuration
- ModelInfo - Information about an available model
- ModelLimits - Model limits and constraints
- ModelMetadata - Model metadata
- ModelPricing - Model pricing information
- ModelUsage - Usage statistics for a specific model
- PeriodUsage - Usage statistics for a time period
- ProviderConfig - Provider configuration
- ProviderHealth - Provider health status
- ProviderRegistry - Registry for managing LLM providers (see the sketch after this list)
- RateLimitInfo - Rate limiting information
- RetryConfig - Retry configuration
- UsageStats - Usage statistics for a provider
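As with the trait above, the following is only an illustrative sketch of how a registry such as `ProviderRegistry` might store and look up providers by name; the field names and method signatures are assumptions, not this module's actual API.

```rust
use std::collections::HashMap;

/// Minimal stand-in for the provider abstraction sketched above.
pub trait Provider {
    fn name(&self) -> &str;
}

/// Hypothetical registry shape; the real `ProviderRegistry` API may differ.
pub struct ProviderRegistry {
    providers: HashMap<String, Box<dyn Provider>>,
}

impl ProviderRegistry {
    pub fn new() -> Self {
        Self { providers: HashMap::new() }
    }

    /// Register a provider under the name it reports.
    pub fn register(&mut self, provider: Box<dyn Provider>) {
        self.providers.insert(provider.name().to_string(), provider);
    }

    /// Look up a registered provider by name.
    pub fn get(&self, name: &str) -> Option<&dyn Provider> {
        self.providers.get(name).map(|p| p.as_ref())
    }
}
```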
Enums§
- ModelStatus - Model status
- ProviderSource - Provider source enum
- ProviderStatus - Provider status enum
Traits§
Functions§
- retry_with_backoff - Retry helper function with exponential backoff (see the sketch below)
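The helper's exact signature is not shown on this page. Below is a minimal synchronous sketch of retry with exponential backoff, assuming a hypothetical `RetryConfig` shape with `max_attempts`, `initial_delay`, and `backoff_factor` fields; the real `RetryConfig` fields and the real helper, which is likely async, may differ.

```rust
use std::thread;
use std::time::Duration;

// Hypothetical configuration; the actual RetryConfig fields may differ.
pub struct RetryConfig {
    pub max_attempts: u32,
    pub initial_delay: Duration,
    pub backoff_factor: u32,
}

/// Retry a fallible operation, multiplying the delay between attempts
/// by `backoff_factor` until `max_attempts` is reached.
pub fn retry_with_backoff<T, E>(
    config: &RetryConfig,
    mut operation: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut delay = config.initial_delay;
    let mut attempt = 1;
    loop {
        match operation() {
            Ok(value) => return Ok(value),
            // Out of attempts: surface the last error to the caller.
            Err(err) if attempt >= config.max_attempts => return Err(err),
            Err(_) => {
                thread::sleep(delay);
                delay *= config.backoff_factor;
                attempt += 1;
            }
        }
    }
}

fn main() {
    let config = RetryConfig {
        max_attempts: 3,
        initial_delay: Duration::from_millis(100),
        backoff_factor: 2,
    };
    // Example: an operation that always fails, so every attempt is consumed
    // (delays of 100 ms and 200 ms between the three attempts).
    let result: Result<(), &str> = retry_with_backoff(&config, || Err("transient error"));
    assert!(result.is_err());
}
```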