Module llm

LLM client layer.

Handles all communication with LLM APIs. Supports OpenAI-compatible and Anthropic-native APIs with streaming via Server-Sent Events (SSE).

Architecture

  • client — HTTP client with retry logic and streaming
  • message — Message types for the conversation protocol
  • stream — SSE parser that yields StreamEvent values
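To make the flow concrete, here is a minimal sketch of the last step: turning raw SSE `data:` lines into `StreamEvent` values. `StreamEvent` is named in the docs above, but its variants and the parser internals here are illustrative assumptions, not the crate's actual API.

```rust
// Hypothetical sketch: parse SSE "data:" lines into StreamEvent values.
// The StreamEvent variants below are assumptions for illustration.

#[derive(Debug, PartialEq)]
enum StreamEvent {
    Delta(String), // a text fragment emitted by the model
    Done,          // end-of-stream marker
}

fn parse_sse_line(line: &str) -> Option<StreamEvent> {
    // Only "data:" lines carry payloads; comments and blanks yield None.
    let data = line.strip_prefix("data:")?.trim();
    if data == "[DONE]" {
        Some(StreamEvent::Done)
    } else {
        // A real parser would decode the JSON payload here; this sketch
        // passes the raw payload through.
        Some(StreamEvent::Delta(data.to_string()))
    }
}

fn main() {
    let raw = "data: hello\n\ndata: [DONE]\n";
    let events: Vec<StreamEvent> = raw.lines().filter_map(parse_sse_line).collect();
    assert_eq!(
        events,
        vec![StreamEvent::Delta("hello".into()), StreamEvent::Done]
    );
    println!("{} events", events.len());
}
```

The `[DONE]` sentinel follows the OpenAI-compatible streaming convention; Anthropic's native API signals completion differently (via typed events), which is one reason a per-provider abstraction is useful.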

Modules

  • anthropic — Anthropic Messages API provider.
  • azure_openai — Azure OpenAI provider.
  • client — HTTP streaming client for LLM APIs.
  • message — Message types for the conversation protocol.
  • normalize — Message normalization and validation utilities.
  • openai — OpenAI Chat Completions provider.
  • provider — LLM provider abstraction.
  • retry — Retry logic and streaming fallback handling.
  • stream — SSE (Server-Sent Events) stream parser.
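The provider module's role, per the list above, is to abstract over OpenAI-compatible and Anthropic-native backends. A minimal sketch of what such an abstraction might look like follows; the trait name `LlmProvider`, its methods, and the struct fields are illustrative assumptions, not the crate's real interface.

```rust
// Hypothetical sketch of a provider abstraction: one trait that both
// OpenAI-compatible and Anthropic-native backends implement. All names
// here (LlmProvider, endpoint, request_body) are assumptions.

trait LlmProvider {
    fn endpoint(&self) -> String;
    fn request_body(&self, prompt: &str) -> String;
}

struct OpenAiCompatible {
    base_url: String,
    model: String,
}

struct AnthropicNative {
    model: String,
}

impl LlmProvider for OpenAiCompatible {
    fn endpoint(&self) -> String {
        // Chat Completions path shared by OpenAI-compatible servers.
        format!("{}/v1/chat/completions", self.base_url)
    }
    fn request_body(&self, prompt: &str) -> String {
        format!(
            r#"{{"model":"{}","stream":true,"messages":[{{"role":"user","content":"{}"}}]}}"#,
            self.model, prompt
        )
    }
}

impl LlmProvider for AnthropicNative {
    fn endpoint(&self) -> String {
        "https://api.anthropic.com/v1/messages".to_string()
    }
    fn request_body(&self, prompt: &str) -> String {
        // Anthropic's Messages API requires max_tokens in the request.
        format!(
            r#"{{"model":"{}","stream":true,"max_tokens":1024,"messages":[{{"role":"user","content":"{}"}}]}}"#,
            self.model, prompt
        )
    }
}

fn main() {
    let providers: Vec<Box<dyn LlmProvider>> = vec![
        Box::new(OpenAiCompatible {
            base_url: "https://api.openai.com".into(),
            model: "gpt-4o".into(),
        }),
        Box::new(AnthropicNative {
            model: "claude-sonnet".into(),
        }),
    ];
    for p in &providers {
        println!("{} -> {}", p.endpoint(), p.request_body("hi"));
    }
}
```

Dispatching through a trait object lets the client and retry layers stay provider-agnostic: they hold a `Box<dyn LlmProvider>` and never branch on which backend is in use.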