Crate miyabi_llm_openai

OpenAI GPT SDK for Miyabi LLM

This crate provides OpenAI GPT API integration for the Miyabi LLM framework. It implements the LlmClient and LlmStreamingClient traits from miyabi-llm-core.

§Features

  • GPT-4o, GPT-4o-mini, GPT-4 Turbo support
  • Tool/function calling
  • Streaming responses via SSE
  • Environment variable configuration

§Example

use miyabi_llm_openai::OpenAIClient;
use miyabi_llm_core::{LlmClient, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OpenAIClient::from_env()?;
    let messages = vec![Message::user("Hello!")];
    let response = client.chat(messages).await?;
    println!("Response: {}", response);
    Ok(())
}
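Streaming works through the LlmStreamingClient trait listed under Traits below. A minimal sketch of consuming a stream, assuming a `chat_stream` method that yields StreamEvent items over a `futures` Stream (the method name and the `StreamEvent::Delta` variant are assumptions based on the names on this page, not confirmed signatures):

```rust
use futures::StreamExt; // assumed: StreamResponse implements futures::Stream
use miyabi_llm_core::{LlmStreamingClient, Message, StreamEvent};
use miyabi_llm_openai::OpenAIClient;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Reads OPENAI_API_KEY (or similar) from the environment.
    let client = OpenAIClient::from_env()?;
    let messages = vec![Message::user("Write a haiku about Rust.")];

    // `chat_stream` is a hypothetical method name on LlmStreamingClient;
    // this page documents only the trait and the StreamEvent enum.
    let mut stream = client.chat_stream(messages).await?;
    while let Some(event) = stream.next().await {
        match event? {
            // Assumed variant carrying an incremental text chunk.
            StreamEvent::Delta(text) => print!("{text}"),
            // Other events (tool calls, usage, end-of-stream) ignored here.
            _ => {}
        }
    }
    Ok(())
}
```

The SSE transport mentioned under Features is handled inside the client; callers only see the decoded StreamEvent values.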

Structs§

  • Message: Message in a conversation
  • OpenAIChoice: Choice in OpenAI response
  • OpenAIClient: OpenAI GPT client
  • OpenAIFunction: OpenAI function definition
  • OpenAIFunctionCall: Function call in OpenAI tool call
  • OpenAIMessage: OpenAI message format
  • OpenAIResponse: OpenAI API response
  • OpenAIResponseMessage: Response message in OpenAI choice
  • OpenAITool: OpenAI tool definition
  • OpenAIToolCall: Tool call in OpenAI response
  • OpenAIUsage: Token usage information
  • ToolCall: Tool call from LLM
  • ToolDefinition: Tool definition for LLM function calling

Enums§

  • LlmError: Core LLM error type
  • Role: Message role
  • StreamEvent: Stream event type for detailed streaming information
  • ToolCallResponse: Response from LLM with tool-calling support
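The ToolDefinition struct and ToolCallResponse enum above carry the tool-calling flow. A hedged sketch of one round trip, assuming a `ToolDefinition::new(name, description, json_schema)` constructor, a `chat_with_tools` client method, and `ToolCalls`/`Text` variants on ToolCallResponse (all of these names are illustrative guesses; only the type names appear on this page):

```rust
use miyabi_llm_core::{LlmClient, Message, ToolCallResponse, ToolDefinition};
use miyabi_llm_openai::OpenAIClient;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OpenAIClient::from_env()?;

    // Hypothetical constructor: tool name, description, JSON Schema parameters.
    let weather_tool = ToolDefinition::new(
        "get_weather",
        "Get the current weather for a city",
        json!({
            "type": "object",
            "properties": { "city": { "type": "string" } },
            "required": ["city"]
        }),
    );

    let messages = vec![Message::user("What's the weather in Tokyo?")];

    // `chat_with_tools` is an assumed method name; the page only lists the
    // ToolCall / ToolCallResponse types such a call would return.
    match client.chat_with_tools(messages, vec![weather_tool]).await? {
        // Assumed variant: the model asked us to invoke one or more tools.
        ToolCallResponse::ToolCalls(calls) => {
            for call in calls {
                println!("tool requested: {} args: {}", call.name, call.arguments);
            }
        }
        // Assumed variant: the model answered directly with text.
        ToolCallResponse::Text(text) => println!("{text}"),
    }
    Ok(())
}
```

In a full loop, the caller would execute each requested tool, append the results as messages, and call the client again until a text response comes back.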

Traits§

  • LlmClient: LLM client trait, the unified interface for all providers
  • LlmStreamingClient: LLM streaming client trait

Type Aliases§

  • Result: Result type for LLM operations
  • StreamResponse: Type alias for streaming response