OpenAI provider for the llm-stack SDK.
This crate implements Provider for OpenAI’s
Chat Completions API, supporting both non-streaming and streaming
generation with tool calling and structured output.
§Quick start
use llm_stack::{ChatMessage, ChatParams, Provider};
use llm_stack_openai::{OpenAiConfig, OpenAiProvider};

// The crate does not prescribe an async runtime; tokio is assumed here.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let provider = OpenAiProvider::new(OpenAiConfig {
        api_key: std::env::var("OPENAI_API_KEY")?,
        ..Default::default()
    });

    let params = ChatParams {
        messages: vec![ChatMessage::user("Hello!")],
        ..Default::default()
    };

    let response = provider.generate(&params).await?;
    println!("{}", response.text().unwrap_or("no text"));
    Ok(())
}
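Streaming works along the same lines. The sketch below continues from the provider and params above; since the streaming side of the Provider trait is not shown on this page, the generate_stream method and the chunk's text() accessor are assumptions, included only to illustrate the consumption pattern.

use futures::StreamExt;

// Hedged sketch: `generate_stream` and `text()` on the chunk type are
// assumed names, not confirmed API.
let mut stream = provider.generate_stream(&params).await?;
while let Some(chunk) = stream.next().await {
    let chunk = chunk?;
    if let Some(delta) = chunk.text() {
        print!("{delta}");
    }
}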
Structs§
- OpenAiConfig - Configuration for the OpenAI provider.
- OpenAiFactory - Factory for creating OpenAiProvider instances from configuration.
- OpenAiProvider - OpenAI provider implementing Provider.
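Because the quick start builds OpenAiConfig with ..Default::default(), only non-default fields need to be spelled out. A sketch of a customized configuration follows; every field except api_key is hypothetical, named here only to illustrate the pattern.

use llm_stack_openai::OpenAiConfig;

let config = OpenAiConfig {
    api_key: std::env::var("OPENAI_API_KEY")?,
    // Hypothetical fields; the real struct may expose different options,
    // e.g. an alternate base URL for proxies or a default model name.
    base_url: "https://api.openai.com/v1".into(),
    model: "gpt-4o-mini".into(),
    ..Default::default()
};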
Functions§
- register_global - Registers the OpenAI factory with the global registry.
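Registering the factory lets provider-agnostic code construct an OpenAiProvider by name rather than depending on this crate directly. A minimal sketch; how the registry is queried afterwards is an assumption and shown only as a comment.

// Call once at startup; after this, generic code can resolve the OpenAI
// provider through llm-stack's global registry.
llm_stack_openai::register_global();

// Hypothetical lookup; the registry's query API is not shown on this page.
// let provider = llm_stack::registry::create("openai", config)?;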