cognis-llm§
v2-beta LLM client. Re-exports `error`, `schemars`, and `Message` from cognis-core so that v2 user code can target a single `crate_path = "cognis_llm"` and have all macro-generated paths resolve.
Re-exports§
- pub use schema::schema_for_tool;
- pub use chat::ChatOptions;
- pub use chat::ChatResponse;
- pub use chat::HealthStatus;
- pub use chat::StreamChunk;
- pub use chat::ToolCallDelta;
- pub use chat::Usage;
- pub use tools::BaseTool;
- pub use tools::SchemaBasedTool;
- pub use tools::Tool;
- pub use tools::ToolDefinition;
- pub use tools::ToolInput;
- pub use tools::ToolOutput;
- pub use tools::ToolRegistry;
- pub use client::Client;
- pub use client::ClientBuilder;
- pub use factory::ProviderConstructor;
- pub use factory::ProviderRegistry;
- pub use factory::ProviderSpec;
- pub use provider::openrouter::OpenRouterBuilder;
- pub use provider::openrouter::OpenRouterProvider;
- pub use provider::wrappers::Capability;
- pub use provider::wrappers::ChatInterceptor;
- pub use provider::wrappers::CircuitBreakerProvider;
- pub use provider::wrappers::CircuitState;
- pub use provider::wrappers::CircuitStats;
- pub use provider::wrappers::FailureClassifier;
- pub use provider::wrappers::FnChatInterceptor;
- pub use provider::wrappers::GracefulDegradationProvider;
- pub use provider::wrappers::InterceptorProvider;
- pub use provider::wrappers::LoadBalancerProvider;
- pub use provider::wrappers::LoadBalancingStrategy;
- pub use provider::wrappers::ProviderRoute;
- pub use provider::wrappers::RandomStrategy;
- pub use provider::wrappers::RetryableClassifier;
- pub use provider::wrappers::RoundRobinStrategy;
- pub use provider::wrappers::RoutingProvider;
- pub use provider::wrappers::RoutingStrategy;
- pub use provider::wrappers::WeightedRoundRobinStrategy;
- pub use provider::LLMProvider;
- pub use provider::Provider;
- pub use streaming::Aggregated;
- pub use streaming::StreamAggregator;
- pub use usage::UsageTracker;
- pub use structured::StructuredClient;
- pub use cognis_core::schemars;
Modules§
- chat
- Chat-completion request/response types shared across providers.
- client
- User-facing `Client`. Holds an `Arc<dyn LLMProvider>` and dispatches through it. Implements `Runnable<Vec<Message>, Message>` so it composes inside graphs.
- error
- Error module re-exported from cognis-core.
- factory
- Provider factory + runtime registry.
- message
- Message types for LLM conversations.
- prelude
- Common imports for v2 user code building agents and tools.
- provider
- Provider enum + LLMProvider trait. Closed enum, not an open registry — adding a provider means editing the enum.
- schema
- Schema generation tuned for OpenAI / Anthropic / Ollama tool calling.
- streaming
- Stream-chunk aggregation utilities.
- structured
- Structured output: turn a `Client` into a `Runnable<Vec<Message>, T>` for any `T: JsonSchema + DeserializeOwned`.
- tools
- Tool trait + ergonomic tiers + supporting types.
- usage
- Running-total usage tracker for budget enforcement.
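As a rough orientation to how the modules above fit together, the sketch below wires an `OpenRouterProvider` into a `Client` and issues one chat call. This is a non-runnable sketch: the builder methods, message constructors, and the `invoke` call are assumptions inferred from the item names and module descriptions, not verified API.

```rust
// HYPOTHETICAL SKETCH — method names beyond the listed types
// (builder shape, constructors, `invoke`) are assumptions.
use cognis_llm::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Client holds an Arc<dyn LLMProvider> and dispatches through it.
    let client: Client = Client::builder()
        .provider(OpenRouterBuilder::default().build()?) // assumed builder shape
        .build()?;

    // Client implements Runnable<Vec<Message>, Message>,
    // so a conversation history maps to one assistant Message.
    let history = vec![
        Message::from(SystemMessage::new("You are terse.")), // assumed constructors
        Message::from(HumanMessage::new("ping")),
    ];
    let reply: Message = client.invoke(history).await?; // assumed Runnable method
    println!("{reply:?}");
    Ok(())
}
```

Because `Client` is `Runnable`, the same value should slot into graph composition wherever a `Runnable<Vec<Message>, Message>` is expected.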
Macros§
- schema_for
- Generates a `RootSchema` for the given type using default settings.
- simple_tool
- Build an `Arc<dyn Tool>` inline.
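The `simple_tool` macro is documented here only as "Build an `Arc<dyn Tool>` inline". A plausible invocation might look like the following; the macro's argument list (name, description, closure) is a guess for illustration, not its documented signature.

```rust
// HYPOTHETICAL SKETCH — the exact argument shape of simple_tool! is assumed.
use cognis_llm::{simple_tool, ToolRegistry};

let tool = simple_tool!(
    "word_count",
    "Counts whitespace-separated words in the input string.",
    |input: String| async move {
        Ok(input.split_whitespace().count().to_string())
    }
);

let mut registry = ToolRegistry::default(); // assumed Default impl
registry.register(tool);                    // assumed method name
```

The point of the macro tier is to skip a hand-written `Tool` impl for one-off tools; the trait-based tiers (`BaseTool`, `SchemaBasedTool`) remain available when a tool needs a typed schema.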
Structs§
- AiMessage
- An AI/assistant message, optionally carrying tool call requests.
- HumanMessage
- A human/user message.
- SystemMessage
- A system prompt or instruction message.
- ToolCall
- One tool invocation requested by the LLM in an `AiMessage`.
- ToolMessage
- A tool execution result message.
Enums§
- Message
- A single message in an LLM conversation.
Traits§
- JsonSchema
- A type which can be described as a JSON Schema document.
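This trait is what the `structured` module leans on: any `T: JsonSchema + DeserializeOwned` can serve as the output type of a `StructuredClient`. A hedged sketch, assuming a derive-based workflow; `StructuredClient`'s generic parameter and `invoke` signature are inferred from the module description, not verified API.

```rust
// HYPOTHETICAL SKETCH — StructuredClient's shape is assumed from
// "Runnable<Vec<Message>, T> for any T: JsonSchema + DeserializeOwned".
use cognis_llm::{schemars, Message, StructuredClient};
use schemars::JsonSchema;
use serde::Deserialize;

#[derive(Debug, Deserialize, JsonSchema)]
struct Sentiment {
    label: String, // e.g. "positive" / "negative"
    score: f32,    // model-reported confidence
}

// Per the structured module docs, this behaves as
// Runnable<Vec<Message>, Sentiment>.
async fn classify(
    client: StructuredClient<Sentiment>, // assumed generic parameter
    history: Vec<Message>,
) -> Result<Sentiment, Box<dyn std::error::Error>> {
    Ok(client.invoke(history).await?) // assumed Runnable method
}
```

Note the crate re-exports `schemars` (see Re-exports above) precisely so the derive and the client agree on one schema version.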