Crate miyabi_llm_core

Miyabi LLM Core - Unified LLM Interface for Rust

This crate provides core traits and types for interacting with various LLM providers. It is designed to be provider-agnostic and serves as the foundation for provider-specific implementations (OpenAI, Anthropic, Google, etc.).

§Features

  • Unified Interface: Single trait (LlmClient) for all providers
  • Tool Calling: First-class support for function/tool calling
  • Streaming: Optional streaming support via LlmStreamingClient
  • Type Safety: Strong typing for messages, roles, and tool definitions
  • Async/Await: Built on Tokio for efficient async operations

§Example

use miyabi_llm_core::{LlmClient, Message};

async fn chat_example(client: impl LlmClient) -> Result<String, Box<dyn std::error::Error>> {
    let messages = vec![
        Message::system("You are a helpful assistant"),
        Message::user("Hello!"),
    ];

    let response = client.chat(messages).await?;
    Ok(response)
}
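
Streaming follows the same pattern via LlmStreamingClient. The sketch below is hedged: the chat_stream method name and the shape of the yielded events are assumptions about the streaming module, not confirmed signatures.

use futures_util::StreamExt;
use miyabi_llm_core::{LlmStreamingClient, Message};

// Hedged sketch: `chat_stream` is a hypothetical method name, and the
// returned value (perhaps the re-exported StreamResponse) is assumed to
// be a Stream of StreamEvent values. See the `streaming` module for the
// real API.
async fn stream_example(client: impl LlmStreamingClient) -> Result<(), Box<dyn std::error::Error>> {
    let mut stream = client.chat_stream(vec![Message::user("Hello!")]).await?;
    while let Some(event) = stream.next().await {
        // Assumes StreamEvent derives Debug; each event is expected to
        // carry an incremental chunk of the response.
        println!("{event:?}");
    }
    Ok(())
}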

§Provider Implementations

This crate defines the core interfaces. Actual provider implementations are in separate crates:

  • miyabi-llm-openai - OpenAI (GPT-4o, GPT-4 Turbo, o1)
  • miyabi-llm-anthropic - Anthropic (Claude 3.5 Sonnet, Opus)
  • miyabi-llm-google - Google (Gemini 1.5 Pro/Flash)
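
Because every provider crate implements the same LlmClient trait, application code can stay generic over the provider. The sketch below assumes only the trait usage already shown in the example above; the commented call site uses a hypothetical constructor name, since the actual provider APIs live in their own crates.

use miyabi_llm_core::{LlmClient, Message};

// Generic over any provider: only the trait from this crate is referenced.
async fn greet(client: impl LlmClient) -> Result<String, Box<dyn std::error::Error>> {
    let reply = client.chat(vec![Message::user("Say hi in five words.")]).await?;
    Ok(reply)
}

// Hypothetical call site; `OpenAiClient::new` stands in for whatever
// constructor miyabi-llm-openai actually exposes.
// let reply = greet(OpenAiClient::new(api_key)).await?;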

§Architecture

miyabi-llm-core (this crate)
    ├── LlmClient trait
    ├── LlmStreamingClient trait
    ├── Message, Role types
    ├── ToolDefinition, ToolCall types
    └── LlmError type
        ↓ implements
    ┌───────────────────────────────┐
    │   Provider Implementations    │
    ├───────────────────────────────┤
    │ miyabi-llm-openai             │
    │ miyabi-llm-anthropic          │
    │ miyabi-llm-google             │
    └───────────────────────────────┘
        ↓ unified via
    ┌───────────────────────────────┐
    │   Integration Package         │
    ├───────────────────────────────┤
    │ miyabi-llm                    │
    │ ├── HybridRouter              │
    │ └── Feature flags             │
    └───────────────────────────────┘
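
To make the "implements" arrow concrete, here is a hedged sketch of a provider-side implementation. The chat signature is inferred from the example above and the crate-level Result alias; the real trait in the client module may differ (for example, it may require async_trait).

use miyabi_llm_core::{LlmClient, Message};

// A toy in-process provider, shown only to illustrate the layering.
// Assumes the trait declares `async fn chat` taking the conversation and
// resolving to the crate's Result<String>; a real provider would
// serialize `messages` and call its HTTP API here instead of echoing.
struct EchoClient;

impl LlmClient for EchoClient {
    async fn chat(&self, messages: Vec<Message>) -> miyabi_llm_core::Result<String> {
        Ok(format!("echo: received {} message(s)", messages.len()))
    }
}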

§Re-exports

pub use client::LlmClient;
pub use client::ToolCallResponse;
pub use error::LlmError;
pub use error::Result;
pub use message::Message;
pub use message::Role;
pub use streaming::LlmStreamingClient;
pub use streaming::StreamEvent;
pub use streaming::StreamResponse;
pub use tools::ToolCall;
pub use tools::ToolDefinition;
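
Taken together, the ToolDefinition, ToolCall, and ToolCallResponse re-exports suggest a tool-calling round trip along the following lines. Every constructor, method, and field name in this sketch is a hypothetical stand-in; the real signatures live in the client and tools modules.

use miyabi_llm_core::{LlmClient, Message, ToolDefinition};

// Hedged sketch of a tool-calling round trip; `ToolDefinition::new`,
// `chat_with_tools`, and the `tool_calls` field are all assumptions.
async fn tool_round_trip(client: impl LlmClient) -> Result<(), Box<dyn std::error::Error>> {
    // Describe a function the model may ask to call.
    let get_weather = ToolDefinition::new("get_weather", "Return current weather for a city");

    let response = client
        .chat_with_tools(vec![Message::user("Weather in Tokyo?")], vec![get_weather])
        .await?;

    // Assumes ToolCall derives Debug. A real caller would execute each
    // requested call and feed the result back in a follow-up turn.
    for call in response.tool_calls {
        println!("model requested: {call:?}");
    }
    Ok(())
}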

§Modules

  • client: LLM client trait and response types
  • error: Error types for LLM operations
  • message: Message types for LLM conversations
  • streaming: Streaming support for LLM responses
  • tools: Tool definitions for function calling
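
Since error re-exports a crate-wide LlmError and a Result alias, callers can handle failures uniformly regardless of provider. The sketch below assumes only that Result<T> aliases std::result::Result<T, LlmError> and that LlmError implements Display, which its use behind Box<dyn std::error::Error> in the example above implies.

use miyabi_llm_core::Result;

// Provider-agnostic error handling: only Display on LlmError is assumed.
fn report(outcome: Result<String>) {
    match outcome {
        Ok(text) => println!("reply: {text}"),
        Err(err) => eprintln!("llm request failed: {err}"),
    }
}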