# cognis-llm
A unified client for Large Language Models (LLMs) with built-in support for tool calling and multiple providers.
## Purpose
cognis-llm provides a standard `Client` and `Provider` abstraction to interact with various LLM APIs (OpenAI, Ollama, Anthropic, etc.). It simplifies sending chat messages, receiving responses, and handling complex tool-calling workflows.
## Key Features
- Multi-Provider Support: Switch between OpenAI, Ollama, Anthropic, Google, and Azure with minimal configuration.
- Unified Client: A single `Client` interface that implements `Runnable<Vec<Message>, AiMessage>`.
- Tool Ergonomics: Simplified 5-tier tool system for defining and executing functions that LLMs can call.
- Structured Output: Support for enforcing JSON schemas on model responses.
- Resilience: Built-in `CircuitBreaker`, `LoadBalancer`, and `Retryable` provider wrappers.
## Usage
Add this to your `Cargo.toml`:

```toml
[dependencies]
cognis-llm = "0.3"
```
### Basic Example: Simple Chat
### Tool Calling Example
The original snippet defined an async tool that gets the current weather for a location, registered it with the client, and let the model call it during a chat.