Crate llm


§aether-llm

Multi-provider LLM abstraction layer for the Aether AI agent framework.

§Providers

| Provider | Example model string | Env var |
| --- | --- | --- |
| Anthropic | `anthropic:claude-sonnet-4-5-20250929` | `ANTHROPIC_API_KEY` |
| OpenAI | `openai:gpt-4o` | `OPENAI_API_KEY` |
| OpenRouter | `openrouter:moonshotai/kimi-k2` | `OPENROUTER_API_KEY` |
| ZAI | `zai:GLM-4.6` | `ZAI_API_KEY` |
| AWS Bedrock | `bedrock:us.anthropic.claude-sonnet-4-5-20250929-v1:0` | AWS credentials |
| Ollama | `ollama:llama3.2` | None (local) |
| Llama.cpp | `llamacpp` | None (local) |
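As the table suggests, model strings follow a `provider:model` pattern, with the model segment optional for purely local backends. A minimal sketch of how such a string might be split (the `parse_model_string` helper is illustrative, not part of the crate's public API):

```rust
// Hypothetical helper: split a "provider:model" string at the first ':'.
// Not part of aether-llm's API; shown only to illustrate the format.
fn parse_model_string(s: &str) -> (&str, Option<&str>) {
    match s.split_once(':') {
        Some((provider, model)) => (provider, Some(model)),
        None => (s, None), // e.g. "llamacpp" has no model segment
    }
}

fn main() {
    assert_eq!(
        parse_model_string("anthropic:claude-sonnet-4-5-20250929"),
        ("anthropic", Some("claude-sonnet-4-5-20250929"))
    );
    // Bedrock model ids contain further ':' segments; split_once keeps them intact.
    assert_eq!(
        parse_model_string("bedrock:us.anthropic.claude-sonnet-4-5-20250929-v1:0"),
        ("bedrock", Some("us.anthropic.claude-sonnet-4-5-20250929-v1:0"))
    );
    assert_eq!(parse_model_string("llamacpp"), ("llamacpp", None));
}
```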

§Key Types

  • StreamingModelProvider – Core trait for all LLM providers. Implement this to add a new provider.
  • Context – Manages the message history, tool definitions, and reasoning effort sent to the model.
  • ChatMessage – Message enum with variants for user, assistant, and tool call messages.
  • ToolDefinition – Describes a tool the model can invoke (name, description, JSON schema).
  • LlmModel – Catalog of known models with metadata (context window, capabilities).
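To make the relationships between these types concrete, here is a self-contained sketch of the shapes described above. The field names, variants, and methods are assumptions for illustration; the crate's actual definitions may differ.

```rust
// Illustrative sketch only: these definitions mirror the descriptions in
// the Key Types list, not the crate's real (and possibly different) API.
#[derive(Debug)]
enum ChatMessage {
    User(String),
    Assistant(String),
    ToolCall { id: String, name: String, arguments: String },
}

#[derive(Debug)]
struct ToolDefinition {
    name: String,
    description: String,
    // JSON schema for the tool's parameters, kept as a raw string here.
    parameters_schema: String,
}

// A minimal Context-like container: message history plus tool definitions.
struct Context {
    messages: Vec<ChatMessage>,
    tools: Vec<ToolDefinition>,
}

impl Context {
    fn new() -> Self {
        Context { messages: Vec::new(), tools: Vec::new() }
    }

    fn push(&mut self, msg: ChatMessage) {
        self.messages.push(msg);
    }
}

fn main() {
    let mut ctx = Context::new();
    ctx.tools.push(ToolDefinition {
        name: "get_weather".into(),
        description: "Look up the current weather".into(),
        parameters_schema: r#"{"type":"object","properties":{"city":{"type":"string"}}}"#.into(),
    });
    ctx.push(ChatMessage::User("What's the weather in Paris?".into()));
    assert_eq!(ctx.messages.len(), 1);
    assert_eq!(ctx.tools.len(), 1);
}
```

In the real crate, a `Context` like this would be handed to a `StreamingModelProvider` implementation, which serializes it into the provider's wire format.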

§Usage

```rust
use llm::providers::openrouter::OpenRouterProvider;
use llm::StreamingModelProvider;

// Create a provider from a model string.
let provider = OpenRouterProvider::default("moonshotai/kimi-k2").unwrap();
println!("Using model: {:?}", provider.model());
println!("Context window: {:?}", provider.context_window());
```

§Feature Flags

| Feature | Description |
| --- | --- |
| `bedrock` | AWS Bedrock provider support |
| `oauth` | OAuth authentication (used by the Codex provider) |
| `codex` | OpenAI Codex provider (implies `oauth`) |
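A hedged sketch of how these flags might be enabled from a dependent crate's `Cargo.toml` (the package name and version are assumptions; adjust to your workspace):

```toml
# Hypothetical dependency entry; crate name/version are illustrative.
[dependencies]
llm = { version = "0.1", features = ["bedrock", "codex"] }
# "codex" implies "oauth", so listing "oauth" separately is unnecessary.
```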

§License

MIT

Re-exports§

pub use catalog::LlmModel;
pub use error::ContextOverflowError;
pub use error::LlmError;
pub use error::Result;
pub use provider::LlmResponseStream;
pub use provider::ProviderFactory;
pub use provider::StreamingModelProvider;
pub use providers::codex::perform_codex_oauth_flow;

Modules§

alloyed
catalog
error
oauth
parser
provider
providers
testing
types

Structs§

AssistantReasoning
Context
EncryptedReasoningContent
ToolCallError
Error result of a tool call
ToolCallRequest
Tool call request from the LLM
ToolCallResult
Successful result of a tool call
ToolDefinition
Definition of a tool available to the LLM

Enums§

ChatMessage
ContentBlock
LlmResponse
ProviderCredential
Credential for an LLM provider (e.g., Anthropic, OpenRouter)
ReasoningEffort
StopReason