MiniLLMLib - A minimalist Rust library for LLM interactions
This library provides a clean, async-first interface for interacting with Large Language Models via HTTP APIs (OpenRouter, OpenAI, etc.).
§Features
- Conversation Trees: ChatNode provides a tree-based conversation structure supporting branching dialogues and conversation history
- Streaming Support: First-class support for streaming completions via SSE
- Multimodal: Support for images and audio in messages
- JSON Repair: Robust handling of malformed JSON from LLM outputs
- Async/Parallel: Built on Tokio for high-performance async operations
§Quick Start
use minillmlib::{ChatNode, GeneratorInfo};

#[tokio::main]
async fn main() -> minillmlib::error::Result<()> {
    // Create a generator for OpenRouter
    let generator = GeneratorInfo::openrouter("anthropic/claude-3.5-sonnet");

    // Start a conversation
    let root = ChatNode::root("You are a helpful assistant.");
    let response = root.chat("Hello!", &generator).await?;
    println!("Assistant: {}", response.text().unwrap_or_default());

    Ok(())
}
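Because each completion is a node in a conversation tree, one parent can hold several alternative continuations. The sketch below assumes that chat takes &self and can be called more than once on the same node, and that the value it returns is itself a node that can be continued; none of that is confirmed beyond the Quick Start above.

use minillmlib::{ChatNode, GeneratorInfo};

#[tokio::main]
async fn main() -> minillmlib::error::Result<()> {
    let generator = GeneratorInfo::openrouter("anthropic/claude-3.5-sonnet");
    let root = ChatNode::root("You are a helpful assistant.");

    // Two sibling branches under the same root (assumes `chat` can be called
    // repeatedly on one node, each call producing a new child).
    let formal = root.chat("Describe Rust's ownership model precisely.", &generator).await?;
    let casual = root.chat("Explain Rust's ownership model in plain words.", &generator).await?;

    // Continue only one branch; the other stays in the tree untouched
    // (assumes the returned value is itself a ChatNode).
    let follow_up = formal.chat("Now give a short code example.", &generator).await?;

    println!("formal branch: {}", follow_up.text().unwrap_or_default());
    println!("casual branch: {}", casual.text().unwrap_or_default());
    Ok(())
}

Presumably the history sent to the model on each branch is the path from the root to that node, which is what the conversation-history part of the Conversation Trees feature describes.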
Re-exports§
pub use chat_node::format_conversation;
pub use chat_node::pretty_messages;
pub use chat_node::ChatNode;
pub use chat_node::ConversationBuilder;
pub use chat_node::PrettyPrintConfig;
pub use chat_node::ThreadData;
pub use chat_node::ThreadMessage;
pub use error::MiniLLMError;
pub use error::Result;
pub use generator::CompletionParameters;
pub use generator::GeneratorInfo;
pub use generator::NodeCompletionParameters;
pub use generator::ProviderSettings;
pub use json_repair::loads;
pub use json_repair::repair_json;
pub use json_repair::JsonValue;
pub use json_repair::RepairOptions;
pub use message::AudioData;
pub use message::ImageData;
pub use message::Media;
pub use message::MediaData;
pub use message::Message;
pub use message::MessageContent;
pub use message::Role;
pub use message::VideoData;
pub use provider::CompletionResponse;
pub use provider::CostCallback;
pub use provider::CostInfo;
pub use provider::CostTrackingType;
pub use provider::LLMClient;
pub use provider::StreamingCompletion;
pub use utils::configure_logging;
pub use utils::extract_json;
pub use utils::extract_json_value;
pub use utils::pretty_json;
pub use utils::to_dict;
pub use utils::validate_json_response;
pub use utils::LogLevel;
Modules§
- chat_node - ChatNode - Core conversation tree structure
- error - Error types for MiniLLMLib
- generator - Generator configuration types for LLM interactions
- json_repair - JSON Repair module - fixes malformed JSON from LLM outputs (a hedged usage sketch follows this list)
- message - Message types for LLM conversations
- provider - LLM Provider implementations
- utils - Utility functions for MiniLLMLib
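The json_repair re-exports above (loads, repair_json, JsonValue, RepairOptions) suggest a repair-then-parse workflow. The sketch below is an assumption-heavy illustration: the function name comes from the re-export list, but its exact signature, error type, and option handling are not confirmed here.

use minillmlib::repair_json;

fn main() {
    // Deliberately malformed model output: trailing comma inside the array.
    let raw = r#"{"name": "Ada", "tags": ["rust", "llm",]}"#;

    // Assumed signature: repair_json(&str) -> Result<String, MiniLLMError>.
    match repair_json(raw) {
        Ok(fixed) => println!("repaired: {fixed}"),
        Err(err) => eprintln!("could not repair: {err}"),
    }
}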
Functions§
- init - Initialize the library with default settings
- init_with_logging - Initialize with a specific log level (a hedged sketch of both initializers follows this list)
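A minimal sketch of the two initializers listed above. Whether init takes arguments, which LogLevel variants exist, and whether either call returns a value are all assumptions based only on the names shown here.

use minillmlib::LogLevel;

fn main() {
    // Default setup (assumed to take no arguments):
    // minillmlib::init();

    // Explicit log level; the variant name `Debug` is assumed.
    minillmlib::init_with_logging(LogLevel::Debug);
}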