Crate llm_stack_openai

OpenAI provider for the llm-stack SDK.

This crate implements Provider for OpenAI’s Chat Completions API, supporting both non-streaming and streaming generation with tool calling and structured output.

§Quick start

use llm_stack::{ChatMessage, ChatParams, Provider};
use llm_stack_openai::{OpenAiConfig, OpenAiProvider};

// The example needs an async runtime; Tokio is used here for illustration.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let provider = OpenAiProvider::new(OpenAiConfig {
        api_key: std::env::var("OPENAI_API_KEY")?,
        ..Default::default()
    });

    let params = ChatParams {
        messages: vec![ChatMessage::user("Hello!")],
        ..Default::default()
    };

    let response = provider.generate(&params).await?;
    println!("{}", response.text().unwrap_or("no text"));
    Ok(())
}
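Streaming generation follows the same shape. The sketch below reuses provider and params from the quick start and is illustrative only: the method name generate_stream and the text_delta accessor on stream events are assumptions, not the crate’s confirmed API.

use futures::StreamExt;

// Hypothetical sketch: `generate_stream` and `text_delta` are assumed names.
let mut stream = provider.generate_stream(&params).await?;
while let Some(event) = stream.next().await {
    // Print each incremental text delta as it arrives.
    if let Some(delta) = event?.text_delta() {
        print!("{delta}");
    }
}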

Structs§

OpenAiConfig
Configuration for the OpenAI provider.
OpenAiFactory
Factory for creating OpenAiProvider instances from configuration.
OpenAiProvider
OpenAI provider implementing Provider.
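As a construction sketch: only api_key is confirmed by the quick start above; any other OpenAiConfig field is an assumption and is left commented out, so consult the struct’s own documentation for the real field set.

use llm_stack_openai::{OpenAiConfig, OpenAiProvider};

// `api_key` is the only field confirmed by the quick start; other field
// names are assumptions and commented out.
let config = OpenAiConfig {
    api_key: std::env::var("OPENAI_API_KEY")?,
    // base_url: "https://example.com/v1".into(), // hypothetical field
    ..Default::default()
};
let provider = OpenAiProvider::new(config);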

Functions§

register_global
Registers the OpenAI factory with the global registry.
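
A minimal registration sketch, assuming register_global takes no arguments; how providers are later resolved from the global registry is not shown here.

// Call once at startup so the OpenAI factory is available globally.
// Assumes `register_global` takes no arguments.
llm_stack_openai::register_global();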