
Module llm


LLM abstraction layer for multiple providers

Structs

AssistantMessage
Assistant message
CacheControl
Cache control for prompt caching
ChatAnthropic
Anthropic Chat Model
ChatCompletion
Chat completion response
ChatDeepSeek
DeepSeek Chat Model
ChatGoogle
Google Gemini Chat Model
ChatGroq
Groq Chat Model
ChatMistral
Mistral Chat Model
ChatOllama
Ollama Chat Model
ChatOpenAI
OpenAI Chat Model
ChatOpenAICompatible
OpenAI-compatible Chat Model base implementation
ChatOpenRouter
OpenRouter Chat Model
ContentPartDocument
Document content part (for PDFs etc.)
ContentPartImage
Image content part
ContentPartRedactedThinking
Redacted thinking content
ContentPartRefusal
Refusal content part
ContentPartText
Text content part
ContentPartThinking
Thinking content part
DeveloperMessage
Developer message (for o1+ models)
DocumentSource
Function
Function call from the LLM
ImageUrl
Image URL structure
ModelBuilder
Builder pattern helpers for common model configurations
SchemaOptimizer
Schema optimizer for creating LLM-compatible JSON schemas
SystemMessage
System message
ToolCall
Tool call from the LLM
ToolDefinition
Definition of a tool that can be called by the LLM
ToolMessage
Tool result message
Usage
Token usage information
UserMessage
User message
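The message and tool-call structs above fit together as a conversation turn: the model emits an `AssistantMessage` carrying `ToolCall`s (each wrapping a `Function`), and the caller answers with `ToolMessage` results. A minimal self-contained sketch of that shape follows; the field names here are illustrative assumptions, not this crate's actual definitions.

```rust
// Illustrative stand-ins for the crate's message types (assumed fields).
#[derive(Debug, Clone, PartialEq)]
pub struct Function {
    pub name: String,
    /// JSON-encoded arguments produced by the model.
    pub arguments: String,
}

#[derive(Debug, Clone, PartialEq)]
pub struct ToolCall {
    pub id: String,
    pub function: Function,
}

#[derive(Debug, Clone, PartialEq)]
pub struct AssistantMessage {
    pub content: Option<String>,
    pub tool_calls: Vec<ToolCall>,
}

#[derive(Debug, Clone, PartialEq)]
pub struct ToolMessage {
    /// Links the result back to the originating call.
    pub tool_call_id: String,
    pub content: String,
}

/// Pair every tool call on an assistant turn with a result message.
pub fn answer_tool_calls(msg: &AssistantMessage, result: &str) -> Vec<ToolMessage> {
    msg.tool_calls
        .iter()
        .map(|tc| ToolMessage {
            tool_call_id: tc.id.clone(),
            content: result.to_string(),
        })
        .collect()
}
```

The `tool_call_id` field is what lets a provider match each result to its call when several tools fire in one turn.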

Enums

CacheControlType
ContentPart
Union type for all content parts
LlmError
Error types for LLM operations
Message
Union type for all messages
ReasoningEffort
Reasoning effort levels for o1+ models
StopReason
Stop reason for completion
ToolChoice
Tool choice strategy
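`Message` is described as a union type over the individual message structs, which in Rust is naturally an enum with one variant per role. A minimal sketch of that pattern follows; the variant and field names are assumptions for illustration, not the crate's actual definitions.

```rust
// Assumed minimal shapes for the per-role message structs.
pub struct SystemMessage { pub content: String }
pub struct UserMessage { pub content: String }
pub struct AssistantMessage { pub content: String }

/// Union type over all message kinds, one variant per role.
pub enum Message {
    System(SystemMessage),
    User(UserMessage),
    Assistant(AssistantMessage),
}

impl Message {
    /// Role string as chat APIs typically serialize it.
    pub fn role(&self) -> &'static str {
        match self {
            Message::System(_) => "system",
            Message::User(_) => "user",
            Message::Assistant(_) => "assistant",
        }
    }
}
```

An enum like this lets a provider implementation match exhaustively on message kinds, so adding a new variant forces every backend to handle it.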

Traits

BaseChatModel
Base trait for chat model implementations
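A base trait lets the many provider structs above (`ChatOpenAI`, `ChatAnthropic`, `ChatOllama`, and so on) be used interchangeably. The sketch below shows the general shape with a stub implementation; the method name, signature, and error type are assumptions, not `BaseChatModel`'s real API.

```rust
// Assumed minimal completion type.
pub struct ChatCompletion {
    pub content: String,
}

/// Sketch of a base trait that every provider implements (assumed signature).
pub trait BaseChatModel {
    /// Invoke the model on a prompt and return a completion.
    fn invoke(&self, prompt: &str) -> Result<ChatCompletion, String>;
}

/// A stub provider that echoes its prompt; handy for tests.
pub struct EchoModel;

impl BaseChatModel for EchoModel {
    fn invoke(&self, prompt: &str) -> Result<ChatCompletion, String> {
        Ok(ChatCompletion { content: format!("echo: {prompt}") })
    }
}
```

Code written against `&dyn BaseChatModel` (or a generic bound) can then swap providers without touching call sites.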

Type Aliases

ChatStream
Type alias for boxed stream
JsonSchema
JSON Schema for tool parameters
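A "boxed stream" alias like `ChatStream` typically erases the concrete stream type behind `Box<dyn ...>` so every provider can return the same type. The sketch below illustrates the idea using `Iterator` as a synchronous stand-in for an async `Stream`; the alias name and item type here are assumptions, not the crate's actual definition.

```rust
/// Boxed, type-erased sequence of chunks (Iterator standing in for Stream).
pub type ChatStreamSketch = Box<dyn Iterator<Item = Result<String, String>>>;

/// Wrap a fixed set of chunks in the erased type, as a provider might.
pub fn tokens(parts: Vec<&'static str>) -> ChatStreamSketch {
    Box::new(parts.into_iter().map(|p| Ok::<String, String>(p.to_string())))
}
```

Type erasure is what allows `ChatOpenAI` and `ChatAnthropic` to return differently-typed underlying streams through one alias.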