
Crate cognis_llm

cognis-llm

v2-beta LLM client. Re-exports error, schemars, and Message from cognis-core so v2 user code can target a single crate_path = "cognis_llm" and have all macro-generated paths resolve.
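As a sketch of what the single `crate_path` is for: user code imports everything through `cognis_llm`, including the items re-exported from `cognis-core`. The snippet below only uses names listed on this page; it is illustrative, not verified against the crate's actual signatures.

```rust
// Hypothetical sketch — imports resolve through `cognis_llm` alone,
// so macro-generated paths like `cognis_llm::Message` and
// `cognis_llm::schemars` work without depending on `cognis-core` directly.
use cognis_llm::{Client, ClientBuilder, Message, HumanMessage};
use cognis_llm::schemars; // re-exported from cognis-core
```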

Re-exports

pub use schema::schema_for_tool;
pub use chat::ChatOptions;
pub use chat::ChatResponse;
pub use chat::HealthStatus;
pub use chat::StreamChunk;
pub use chat::ToolCallDelta;
pub use chat::Usage;
pub use tools::BaseTool;
pub use tools::SchemaBasedTool;
pub use tools::Tool;
pub use tools::ToolDefinition;
pub use tools::ToolInput;
pub use tools::ToolOutput;
pub use tools::ToolRegistry;
pub use client::Client;
pub use client::ClientBuilder;
pub use factory::ProviderConstructor;
pub use factory::ProviderRegistry;
pub use factory::ProviderSpec;
pub use provider::openrouter::OpenRouterBuilder;
pub use provider::openrouter::OpenRouterProvider;
pub use provider::wrappers::Capability;
pub use provider::wrappers::ChatInterceptor;
pub use provider::wrappers::CircuitBreakerProvider;
pub use provider::wrappers::CircuitState;
pub use provider::wrappers::CircuitStats;
pub use provider::wrappers::FailureClassifier;
pub use provider::wrappers::FnChatInterceptor;
pub use provider::wrappers::GracefulDegradationProvider;
pub use provider::wrappers::InterceptorProvider;
pub use provider::wrappers::LoadBalancerProvider;
pub use provider::wrappers::LoadBalancingStrategy;
pub use provider::wrappers::ProviderRoute;
pub use provider::wrappers::RandomStrategy;
pub use provider::wrappers::RetryableClassifier;
pub use provider::wrappers::RoundRobinStrategy;
pub use provider::wrappers::RoutingProvider;
pub use provider::wrappers::RoutingStrategy;
pub use provider::wrappers::WeightedRoundRobinStrategy;
pub use provider::LLMProvider;
pub use provider::Provider;
pub use streaming::Aggregated;
pub use streaming::StreamAggregator;
pub use usage::UsageTracker;
pub use structured::StructuredClient;
pub use cognis_core::schemars;

Modules

chat
Chat-completion request/response types shared across providers.
client
User-facing Client. Holds an Arc<dyn LLMProvider> and dispatches through it. Implements Runnable<Vec<Message>, Message> so it composes inside graphs.
error
Error module re-exported from cognis-core.
factory
Provider factory + runtime registry.
message
Message types for LLM conversations.
prelude
Common imports for v2 user code building agents and tools.
provider
Provider enum + LLMProvider trait. Closed enum, not an open registry — adding a provider means editing the enum.
schema
Schema generation tuned for OpenAI / Anthropic / Ollama tool calling.
streaming
Stream-chunk aggregation utilities.
structured
Structured output: turn a Client into a Runnable<Vec<Message>, T> for any T: JsonSchema + DeserializeOwned.
tools
Tool trait + ergonomic tiers + supporting types.
usage
Running-total usage tracker for budget enforcement.
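The `structured` module's description (a `Client` becomes a `Runnable<Vec<Message>, T>` for any `T: JsonSchema + DeserializeOwned`) implies output types are plain derive-annotated structs. A sketch of such a type, using the `JsonSchema` derive macro listed on this page (the `serde` derive is an assumption, implied by the `DeserializeOwned` bound):

```rust
use cognis_llm::JsonSchema;
use serde::Deserialize;

// Any type satisfying both bounds can be the structured-output target `T`.
#[derive(JsonSchema, Deserialize)]
struct Sentiment {
    label: String,
    confidence: f64,
}
```

How a `StructuredClient` is constructed from a `Client` is not specified on this page; see the `structured` module docs.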

Macros

schema_for
Generates a RootSchema for the given type using default settings.
simple_tool
Build an Arc<dyn Tool> inline.

Structs

AiMessage
An AI/assistant message, optionally carrying tool call requests.
HumanMessage
A human/user message.
SystemMessage
A system prompt or instruction message.
ToolCall
One tool invocation requested by the LLM in an AiMessage.
ToolMessage
A tool execution result message.

Enums

Message
A single message in an LLM conversation.

Traits

JsonSchema
A type which can be described as a JSON Schema document.

Derive Macros

JsonSchema