Crate ferrous_llm_openai

OpenAI provider for the LLM library.

This crate provides an implementation of the LLM core traits for OpenAI’s API, including support for chat, completion, streaming, embeddings, and tool calling.
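
The sketch below shows how these pieces are intended to fit together. It is a minimal, hedged example assuming an async runtime such as tokio: the constructor and method names (OpenAIConfig::new, OpenAIProvider::new, the chat call) and the error handling are illustrative assumptions, not verified signatures; see the config, provider, and types modules for the actual API.

// Illustrative sketch only. Constructor and method names are assumptions;
// consult the `config` and `provider` modules for the real signatures.
use ferrous_llm_openai::{OpenAIConfig, OpenAIProvider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumed constructor: configure the provider with an API key and model.
    let config = OpenAIConfig::new(std::env::var("OPENAI_API_KEY")?, "gpt-4o-mini");
    let provider = OpenAIProvider::new(config)?;

    // Assumed ChatProvider usage: build a request and await the response.
    // let response = provider.chat(request).await?;
    // println!("{:?}", response);
    let _ = provider;
    Ok(())
}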

Re-exports§

pub use config::OpenAIConfig;
pub use error::OpenAIError;
pub use provider::OpenAIProvider;
pub use types::OpenAIChatChoice;
pub use types::OpenAIChatRequest;
pub use types::OpenAIChatResponse;
pub use types::OpenAICompletionChoice;
pub use types::OpenAICompletionRequest;
pub use types::OpenAICompletionResponse;
pub use types::OpenAIEmbeddingsRequest;
pub use types::OpenAIEmbeddingsResponse;
pub use types::OpenAIMessage;
pub use types::OpenAITool;
pub use types::OpenAIToolCall;
pub use types::OpenAIUsage;
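
Because these items are re-exported at the crate root, they can be imported either directly or through their defining modules; the two paths below name the same type.

// The crate root re-exports `config::OpenAIConfig`, so both imports work.
use ferrous_llm_openai::OpenAIConfig;
// use ferrous_llm_openai::config::OpenAIConfig;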

Modules§

config
OpenAI provider configuration.
error
OpenAI-specific error types.
provider
OpenAI provider implementation.
types
OpenAI-specific request and response types.

Traits§

ChatProvider
Base trait for chat-based LLM providers.
CompletionProvider
Trait for providers that support text completion (non-chat).
EmbeddingProvider
Trait for providers that support text embeddings.
StreamingProvider
Optional trait for providers that support streaming responses.
ToolProvider
Optional trait for providers that support tool/function calling.
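
Because OpenAIProvider implements these traits, it can be handed to code that is generic over them. Below is a minimal sketch, assuming the traits are importable from the crate root as the list above suggests; their methods and associated types are defined in the core crate and are not spelled out here.

// Sketch only: the bounds demonstrate trait-based composition; the traits'
// actual methods and associated types live in the core crate.
use ferrous_llm_openai::{ChatProvider, StreamingProvider, ToolProvider};

// Accepts any provider that supports chat, streaming, and tool calling,
// including OpenAIProvider from this crate.
fn accepts_full_provider<P>(_provider: &P)
where
    P: ChatProvider + StreamingProvider + ToolProvider,
{
}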