pub struct ProviderConfig {
    pub base_url: String,
    pub api_key_env: String,
    pub chat_endpoint: String,
    pub chat_model: String,
    pub multimodal_model: Option<String>,
    pub upload_endpoint: Option<String>,
    pub upload_size_limit: Option<u64>,
    pub models_endpoint: Option<String>,
    pub headers: HashMap<String, String>,
    pub field_mapping: FieldMapping,
}
Provider configuration template defining API access parameters
This struct contains all necessary configuration for connecting to an AI provider, including base URL, API endpoints, authentication, and model specifications.
Fields
base_url: String
Base URL for the provider’s API
api_key_env: String
Environment variable name for the API key
chat_endpoint: String
Chat completion endpoint path
chat_model: String
Default chat model for this provider
multimodal_model: Option<String>
Optional multimodal model for this provider (if supported)
upload_endpoint: Option<String>
Optional file upload endpoint path (e.g. OpenAI: “/v1/files”)
upload_size_limit: Option<u64>
Optional file size limit (bytes) above which files should be uploaded instead of inlined
models_endpoint: Option<String>
Optional model list endpoint path
headers: HashMap<String, String>
Request headers template
field_mapping: FieldMapping
Field mapping configuration
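The upload_size_limit field drives an inline-versus-upload decision for file attachments. A minimal standalone sketch of that decision (the helper name is hypothetical and not part of the crate):

```rust
/// Hypothetical helper: decide whether a file should go through
/// `upload_endpoint` instead of being inlined in the request body.
/// Mirrors the documented semantics of `upload_size_limit`:
/// `None` means no threshold is configured, so files are always inlined.
fn should_upload(file_size: u64, upload_size_limit: Option<u64>) -> bool {
    match upload_size_limit {
        Some(limit) => file_size > limit,
        None => false,
    }
}

fn main() {
    // With a 10 MiB limit, a 12 MiB file is routed to the upload endpoint.
    assert!(should_upload(12 * 1024 * 1024, Some(10 * 1024 * 1024)));
    // Below the limit, the file is inlined instead.
    assert!(!should_upload(1024, Some(10 * 1024 * 1024)));
    // No limit configured: never routed to the upload endpoint.
    assert!(!should_upload(u64::MAX, None));
    println!("ok");
}
```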
Implementations
impl ProviderConfig
pub fn openai_compatible(
    base_url: &str,
    api_key_env: &str,
    chat_model: &str,
    multimodal_model: Option<&str>,
) -> Self
OpenAI-compatible configuration template
Creates a standard OpenAI-compatible configuration with default models: the default chat model is “gpt-3.5-turbo” and the default multimodal model is “gpt-4o”.
Arguments
- base_url - The base URL for the provider’s API
- api_key_env - Environment variable name for the API key
- chat_model - Default chat model name
- multimodal_model - Optional multimodal model name
pub fn openai_compatible_default(base_url: &str, api_key_env: &str) -> Self
OpenAI-compatible configuration template with default models
This is a convenience method that uses standard default models. For custom models, use openai_compatible() with explicit model names.
pub fn validate(&self) -> Result<(), AiLibError>
Validate the configuration for completeness and correctness
Returns
- Result<(), AiLibError> - Ok(()) on success, error information on failure
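The crate does not document which checks validate() performs. A standalone sketch of the kind of checks a configuration validator typically runs (the rules below are assumptions, not taken from the crate source):

```rust
/// Hypothetical standalone validator: the real `ProviderConfig::validate`
/// lives in the crate and may apply different or additional rules.
fn validate(base_url: &str, chat_endpoint: &str, chat_model: &str) -> Result<(), String> {
    // A provider base URL should be an http(s) URL.
    if !base_url.starts_with("http") {
        return Err("base_url must be an http(s) URL".to_string());
    }
    // The chat endpoint and default model are required fields.
    if chat_endpoint.is_empty() || chat_model.is_empty() {
        return Err("chat_endpoint and chat_model must be non-empty".to_string());
    }
    Ok(())
}

fn main() {
    assert!(validate("https://api.openai.com", "/v1/chat/completions", "gpt-3.5-turbo").is_ok());
    assert!(validate("ftp://example.com", "/v1/chat/completions", "gpt-3.5-turbo").is_err());
    assert!(validate("https://api.openai.com", "", "gpt-3.5-turbo").is_err());
    println!("ok");
}
```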
pub fn models_url(&self) -> Option<String>
Get the complete models list URL
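Presumably the complete URL combines base_url with the optional models_endpoint path. A standalone sketch of that assembly (the exact slash-handling rules are an assumption, not taken from the crate source):

```rust
/// Hypothetical standalone mirror of `models_url`: joins the base URL
/// with the optional endpoint path, returning `None` when no models
/// endpoint is configured.
fn models_url(base_url: &str, models_endpoint: Option<&str>) -> Option<String> {
    models_endpoint.map(|path| {
        // Normalize slashes so "https://host/" + "/v1/models" does not
        // produce a doubled separator.
        format!(
            "{}/{}",
            base_url.trim_end_matches('/'),
            path.trim_start_matches('/')
        )
    })
}

fn main() {
    assert_eq!(
        models_url("https://api.openai.com", Some("/v1/models")),
        Some("https://api.openai.com/v1/models".to_string())
    );
    assert_eq!(models_url("https://api.openai.com", None), None);
    println!("ok");
}
```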
pub fn upload_url(&self) -> Option<String>
Get the complete file upload URL
pub fn default_chat_model(&self) -> &str
Get the default chat model for this provider
pub fn multimodal_model(&self) -> Option<&str>
Get the multimodal model if available
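The two getters above pair naturally: callers can prefer the multimodal model when one is configured and fall back to the default chat model otherwise. A standalone sketch of that selection (the helper name is hypothetical, not part of the crate):

```rust
/// Hypothetical selection helper over the documented getters:
/// `multimodal_model()` returns `Option<&str>` and
/// `default_chat_model()` returns `&str`, so `unwrap_or`
/// expresses the fallback directly.
fn pick_model<'a>(chat_model: &'a str, multimodal_model: Option<&'a str>) -> &'a str {
    multimodal_model.unwrap_or(chat_model)
}

fn main() {
    assert_eq!(pick_model("gpt-3.5-turbo", Some("gpt-4o")), "gpt-4o");
    assert_eq!(pick_model("gpt-3.5-turbo", None), "gpt-3.5-turbo");
    println!("ok");
}
```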
Trait Implementations
impl Clone for ProviderConfig
fn clone(&self) -> ProviderConfig
Returns a copy of the value.
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.