OpenAI provider for the LLM library.
This crate provides an implementation of the LLM core traits for OpenAI’s API, including support for chat, completion, streaming, embeddings, and tool calling.
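For orientation, here is a minimal usage sketch. The crate name llm_openai, the constructors OpenAIConfig::new, OpenAIProvider::new, and OpenAIMessage::user, the async chat method from ChatProvider, and the use of tokio as the runtime are all assumptions for illustration; see the provider and types modules below for the actual signatures.

```rust
// Hypothetical usage sketch: the item names and signatures below are assumed,
// not taken verbatim from this crate's API.
use llm_openai::{ChatProvider, OpenAIConfig, OpenAIMessage, OpenAIProvider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed constructor: read the API key from the environment.
    let config = OpenAIConfig::new(std::env::var("OPENAI_API_KEY")?);
    let provider = OpenAIProvider::new(config);

    // Assumed ChatProvider::chat signature taking a slice of messages.
    let messages = vec![OpenAIMessage::user("Say hello in one sentence.")];
    let response = provider.chat(&messages).await?;
    println!("{response:?}");

    Ok(())
}
```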
Re-exports
pub use config::OpenAIConfig;
pub use error::OpenAIError;
pub use provider::OpenAIProvider;
pub use types::OpenAIChatChoice;
pub use types::OpenAIChatRequest;
pub use types::OpenAIChatResponse;
pub use types::OpenAICompletionChoice;
pub use types::OpenAICompletionRequest;
pub use types::OpenAICompletionResponse;
pub use types::OpenAIEmbeddingsRequest;
pub use types::OpenAIEmbeddingsResponse;
pub use types::OpenAIMessage;
pub use types::OpenAITool;
pub use types::OpenAIToolCall;
pub use types::OpenAIUsage;
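Because these items are re-exported at the crate root, they can be imported either directly or through their defining modules. The crate name llm_openai below is an assumption for illustration.

```rust
// Either path resolves to the same type, thanks to the `pub use` re-exports above.
use llm_openai::OpenAIConfig;                    // via the crate-root re-export
use llm_openai::config::OpenAIConfig as Config;  // same type, via the defining module
```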
Modules
- config - OpenAI provider configuration.
- error - OpenAI-specific error types.
- provider - OpenAI provider implementation.
- types - OpenAI-specific request and response types.
Traits
- ChatProvider - Base trait for chat-based LLM providers.
- CompletionProvider - Trait for providers that support text completion (non-chat).
- EmbeddingProvider - Trait for providers that support text embeddings.
- StreamingProvider - Optional trait for providers that support streaming responses.
- ToolProvider - Optional trait for providers that support tool/function calling.
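Because providers are exposed through these traits, calling code can stay generic over the backend. The sketch below assumes ChatProvider defines an async chat method with Message, Response, and Error associated types; the real trait definitions in the core LLM crate may be shaped differently.

```rust
// Hypothetical sketch of trait-generic calling code. The associated types
// (Message, Response, Error) and the async `chat` method are assumptions,
// not the crate's documented definitions.
use llm_openai::{ChatProvider, StreamingProvider, ToolProvider};

// Works with any chat-capable backend, OpenAI or otherwise.
async fn ask<P: ChatProvider>(
    provider: &P,
    messages: &[P::Message],
) -> Result<P::Response, P::Error> {
    provider.chat(messages).await
}

// Streaming and tool calling are optional capabilities, expressed as extra bounds.
fn requires_tools_and_streaming<P>(_provider: &P)
where
    P: ChatProvider + StreamingProvider + ToolProvider,
{
}
```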