Module llm

LLM client abstraction layer

Provides a unified interface for interacting with LLM providers (Anthropic Claude, OpenAI, and OpenAI-compatible providers).

Re-exports

pub use anthropic::AnthropicClient;
pub use factory::create_client_with_config;
pub use factory::LlmConfig;
pub use http::default_http_client;
pub use http::HttpClient;
pub use http::HttpResponse;
pub use http::StreamingHttpResponse;
pub use openai::OpenAiClient;

Modules

anthropic: Anthropic Claude LLM client
factory: LLM client factory
http: HTTP utilities and abstraction for LLM API calls
openai: OpenAI-compatible LLM client

Structs

Attachment: Image attachment for multi-modal messages.
ImageSource: Image source for the ContentBlock::Image variant.
LlmResponse: LLM response
Message: Message in a conversation
SecretString: A string wrapper that redacts its value in Debug and Display output, preventing API keys from leaking into logs and error messages.
TokenUsage: Token usage statistics
ToolCall: Tool call from the LLM
ToolDefinition: Tool definition for the LLM
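The redaction pattern behind SecretString can be sketched with a minimal stand-alone implementation. This is an illustration of the technique, not the crate's actual code; in particular the `expose` accessor name is an assumption:

```rust
use std::fmt;

// Illustrative sketch (not the crate's real implementation): a wrapper
// whose Debug and Display impls never print the inner value.
pub struct SecretString(String);

impl SecretString {
    pub fn new(value: String) -> Self {
        SecretString(value)
    }

    // Hypothetical accessor: the raw value is only reachable through an
    // explicit call, e.g. when building an Authorization header.
    pub fn expose(&self) -> &str {
        &self.0
    }
}

impl fmt::Debug for SecretString {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("SecretString(***)")
    }
}

impl fmt::Display for SecretString {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("***")
    }
}

fn main() {
    let key = SecretString::new("sk-very-secret".to_string());
    // Logging the wrapper leaks nothing:
    println!("api key = {:?}", key); // api key = SecretString(***)
    // The secret itself must be requested explicitly:
    assert_eq!(key.expose(), "sk-very-secret");
}
```

Because both Debug and Display are overridden, an API key wrapped this way cannot leak through `{:?}` or `{}` formatting in logs or error messages.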

Enums

ContentBlock: Message content types
StreamEvent: Streaming event from the LLM
ToolResultContent: Content within a tool result, either text or an image.
ToolResultContentField: The content field of a ToolResult block.
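To illustrate how a streaming event enum like StreamEvent is typically consumed, here is a self-contained sketch. The variant names (`TextDelta`, `ToolCallStarted`, `Done`) are assumptions for illustration, not the crate's actual API:

```rust
// Hypothetical event enum; real StreamEvent variants may differ.
enum StreamEvent {
    TextDelta(String),
    ToolCallStarted { name: String },
    Done,
}

// Fold a stream of events into displayable output, stopping at Done.
fn render(events: Vec<StreamEvent>) -> String {
    let mut out = String::new();
    for event in events {
        match event {
            StreamEvent::TextDelta(chunk) => out.push_str(&chunk),
            StreamEvent::ToolCallStarted { name } => {
                out.push_str(&format!("[tool: {}]", name));
            }
            StreamEvent::Done => break,
        }
    }
    out
}

fn main() {
    let events = vec![
        StreamEvent::TextDelta("Hello, ".to_string()),
        StreamEvent::TextDelta("world".to_string()),
        StreamEvent::Done,
    ];
    assert_eq!(render(events), "Hello, world");
}
```

Matching exhaustively over the enum means the compiler flags any call site that forgets to handle a newly added event variant.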

Traits

LlmClient: LLM client trait
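The provider-agnostic design described above, where AnthropicClient and OpenAiClient sit behind a common LlmClient trait, can be sketched as follows. The `complete` method signature and the `EchoClient` stand-in are hypothetical, not the crate's real trait:

```rust
// Hypothetical trait: callers code against this, not a concrete provider.
trait LlmClient {
    fn complete(&self, prompt: &str) -> String;
}

// Stand-in for a concrete provider such as AnthropicClient or OpenAiClient.
struct EchoClient;

impl LlmClient for EchoClient {
    fn complete(&self, prompt: &str) -> String {
        format!("echo: {}", prompt)
    }
}

// Application code depends only on the trait object, so providers are
// interchangeable without touching the call sites.
fn run(client: &dyn LlmClient, prompt: &str) -> String {
    client.complete(prompt)
}

fn main() {
    let client = EchoClient;
    assert_eq!(run(&client, "hi"), "echo: hi");
}
```

This is the usual shape of such abstraction layers: a factory (here, the re-exported `create_client_with_config`) picks the concrete provider at runtime, while the rest of the program sees only the trait.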