LLM client layer.
Handles all communication with LLM APIs. Supports OpenAI-compatible and Anthropic-native APIs with streaming via Server-Sent Events (SSE).
Architecture

- client — HTTP client with retry logic and streaming
- message — Message types for the conversation protocol
- stream — SSE parser that yields StreamEvent values
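As a rough illustration of how the stream module might surface parsed events, the sketch below maps SSE `data:` lines to a StreamEvent value. StreamEvent is named in these docs, but the variants and the `parse_sse_line` helper are assumptions for illustration, not the crate's actual API.

```rust
// Hypothetical sketch: the enum variants and parse_sse_line are
// illustrative assumptions; only the StreamEvent name comes from the docs.
#[derive(Debug, PartialEq)]
enum StreamEvent {
    /// A chunk of generated text.
    Delta(String),
    /// The server signalled end of stream.
    Done,
}

fn parse_sse_line(line: &str) -> Option<StreamEvent> {
    // SSE payload lines look like `data: <payload>`; comment lines
    // (starting with `:`) and blank keep-alives carry no event.
    let payload = line.strip_prefix("data:")?.trim();
    if payload == "[DONE]" {
        Some(StreamEvent::Done)
    } else {
        Some(StreamEvent::Delta(payload.to_string()))
    }
}

fn main() {
    assert_eq!(
        parse_sse_line("data: hello"),
        Some(StreamEvent::Delta("hello".into()))
    );
    assert_eq!(parse_sse_line("data: [DONE]"), Some(StreamEvent::Done));
    assert_eq!(parse_sse_line(": keep-alive"), None);
    println!("ok");
}
```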
Modules

- anthropic — Anthropic Messages API provider.
- azure_openai — Azure OpenAI provider.
- client — HTTP streaming client for LLM APIs.
- message — Message types for the conversation protocol.
- normalize — Message normalization and validation utilities.
- openai — OpenAI Chat Completions provider.
- provider — LLM provider abstraction.
- retry — Retry logic and streaming fallback handling.
- stream — SSE (Server-Sent Events) stream parser.
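The retry module is described only as "retry logic"; a common shape for such a policy is exponential backoff with a delay cap. The sketch below is a minimal illustration under that assumption; the function name and default values are hypothetical, not taken from the crate.

```rust
// Hypothetical backoff policy sketch: doubles a base delay per attempt,
// capped at max_ms. All names and constants here are assumptions.
use std::time::Duration;

fn backoff_delay(attempt: u32, base_ms: u64, max_ms: u64) -> Duration {
    // 2^attempt growth, with the shift clamped to avoid overflow.
    let ms = base_ms.saturating_mul(1u64 << attempt.min(16));
    Duration::from_millis(ms.min(max_ms))
}

fn main() {
    assert_eq!(backoff_delay(0, 250, 8_000), Duration::from_millis(250));
    assert_eq!(backoff_delay(1, 250, 8_000), Duration::from_millis(500));
    // Attempt 6 would be 16 s uncapped; the cap holds it at 8 s.
    assert_eq!(backoff_delay(6, 250, 8_000), Duration::from_millis(8_000));
    println!("ok");
}
```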