pub struct OpenAIProvider { /* private fields */ }
OpenAI API provider.
Also supports OpenAI-compatible APIs (Groq, Together, Fireworks, etc.) via a custom base_url, and Azure OpenAI via an AzureConfig.
Implementations
impl OpenAIProvider
pub fn new(api_key: String, model: String) -> Self
Create a new OpenAI provider with API key and model.
pub fn with_base_url(api_key: String, model: String, base_url: String) -> Self
Create a new OpenAI-compatible provider with a custom base URL.
Use this for providers like Groq, Together, Fireworks, etc. The base_url should be the API base (e.g., “https://api.groq.com/openai/v1”).
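For illustration, OpenAI-compatible providers conventionally expose the chat endpoint at `{base_url}/chat/completions`. How this crate joins the two internally is not shown on this page, so the helper below is an assumption about that convention, not the crate's actual code:

```rust
/// Join an API base URL and the conventional chat-completions path.
/// Note: whether OpenAIProvider builds its request URL exactly this way
/// is an assumption; the path itself is the OpenAI-compatible standard.
fn chat_completions_url(base_url: &str) -> String {
    // Trim a trailing slash so both "…/v1" and "…/v1/" produce the same URL.
    format!("{}/chat/completions", base_url.trim_end_matches('/'))
}

fn main() {
    println!("{}", chat_completions_url("https://api.groq.com/openai/v1"));
    // https://api.groq.com/openai/v1/chat/completions
}
```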
pub fn azure(
    api_key: String,
    resource: String,
    deployment: String,
    api_version: String,
) -> Self
Create a new Azure OpenAI provider.
Azure OpenAI uses a different URL format and authentication header.

URL: https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={version}

Auth: api-key header instead of Authorization: Bearer
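The URL template above can be made concrete with a small sketch. The function name here is hypothetical (it is not part of this crate's API); it only demonstrates how the documented parameters slot into the Azure URL:

```rust
/// Build the Azure OpenAI chat-completions URL per the template above.
/// `azure_chat_url` is an illustrative helper, not a function of this crate.
fn azure_chat_url(resource: &str, deployment: &str, api_version: &str) -> String {
    format!(
        "https://{resource}.openai.azure.com/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
    )
}

fn main() {
    // Example values; your resource, deployment, and API version will differ.
    println!("{}", azure_chat_url("my-resource", "gpt-4o", "2024-02-01"));
    // https://my-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-01
}
```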
Trait Implementations
impl LlmProvider for OpenAIProvider
fn send_msg(
    &self,
    client: &HttpClient,
    messages: &[Message],
    options: &MessageOptions,
) -> Pin<Box<dyn Future<Output = Result<Message, LlmError>> + Send>>
Send a message to the LLM.
Returns the assistant’s response message or an error.
fn send_msg_stream(
    &self,
    client: &HttpClient,
    messages: &[Message],
    options: &MessageOptions,
) -> Pin<Box<dyn Future<Output = Result<Pin<Box<dyn Stream<Item = Result<StreamEvent, LlmError>> + Send>>, LlmError>> + Send>>
Send a streaming message to the LLM.
Returns a stream of events as they arrive from the API.
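The stream yields `Result<StreamEvent, LlmError>` items. As a rough, synchronous sketch of the consumption pattern, using a plain `Vec` of results and stand-in types in place of the real async `Stream`, `StreamEvent`, and `LlmError` (whose definitions are not shown on this page):

```rust
// Stand-in types: the real StreamEvent and LlmError are defined by this
// crate, but their variants and fields are not documented on this page.
#[derive(Debug)]
enum StreamEvent {
    TextDelta(String), // hypothetical variant carrying a text chunk
    Done,              // hypothetical end-of-stream marker
}

#[derive(Debug)]
struct LlmError(String);

// Accumulate text deltas until Done or an error, mirroring how a caller
// might drain the stream returned by send_msg_stream.
fn collect_text(events: Vec<Result<StreamEvent, LlmError>>) -> String {
    let mut text = String::new();
    for event in events {
        match event {
            Ok(StreamEvent::TextDelta(chunk)) => text.push_str(&chunk),
            Ok(StreamEvent::Done) => break,
            Err(e) => {
                eprintln!("stream error: {e:?}");
                break;
            }
        }
    }
    text
}

fn main() {
    // Pretend these arrived from the API one event at a time.
    let events = vec![
        Ok(StreamEvent::TextDelta("Hello".into())),
        Ok(StreamEvent::TextDelta(", world".into())),
        Ok(StreamEvent::Done),
    ];
    println!("{}", collect_text(events)); // prints "Hello, world"
}
```

In real use the items arrive asynchronously, so the loop would await each item from the boxed `Stream` (e.g., via a `StreamExt`-style combinator) rather than iterate a `Vec`.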
Auto Trait Implementations
impl Freeze for OpenAIProvider
impl RefUnwindSafe for OpenAIProvider
impl Send for OpenAIProvider
impl Sync for OpenAIProvider
impl Unpin for OpenAIProvider
impl UnwindSafe for OpenAIProvider
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.