§Azure OpenAI Provider for LLM Kit
This crate provides an Azure OpenAI provider implementation for the LLM Kit. It allows you to use Azure OpenAI models for text generation, embeddings, and image generation.
§Features
- Chat Models: GPT-4, GPT-3.5-turbo, and other chat models
- Completion Models: GPT-3.5-turbo-instruct and other completion models
- Embedding Models: text-embedding-ada-002 and other embedding models
- Image Models: DALL-E 3 and other image generation models
- Azure-specific Authentication: Uses the api-key header
- Flexible URL Formats: Supports both v1 API and deployment-based URLs
§Quick Start (Recommended: Builder Pattern)
use llm_kit_azure::AzureClient;
use llm_kit_provider::LanguageModel;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create Azure OpenAI provider using the builder
    let provider = AzureClient::new()
        .resource_name("my-azure-resource")
        .api_key("your-api-key")
        .build();

    // Get a chat model using your deployment name
    let model = provider.chat_model("gpt-4-deployment");
    println!("Model: {}", model.model_id());
    println!("Provider: {}", model.provider());

    Ok(())
}

§Alternative: Direct Instantiation
use llm_kit_azure::{AzureOpenAIProvider, AzureOpenAIProviderSettings};
use llm_kit_provider::LanguageModel;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create provider with settings
    let provider = AzureOpenAIProvider::new(
        AzureOpenAIProviderSettings::new()
            .with_resource_name("my-azure-resource")
            .with_api_key("your-api-key")
    );

    let model = provider.chat_model("gpt-4-deployment");
    println!("Model: {}", model.model_id());

    Ok(())
}

§Configuration Options
§Using Resource Name
use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .resource_name("my-resource")
    .api_key("key")
    .build();

§Using Custom Base URL
use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .base_url("https://my-resource.openai.azure.com/openai")
    .api_key("key")
    .build();

§With Custom API Version
use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .resource_name("my-resource")
    .api_key("key")
    .api_version("2024-02-15-preview")
    .build();

§With Custom Headers
use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .resource_name("my-resource")
    .api_key("key")
    .header("X-Custom-Header", "value")
    .build();

§With Deployment-Based URLs (Legacy Format)
use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .resource_name("my-resource")
    .api_key("key")
    .use_deployment_based_urls(true)
    .build();

§URL Formats
Azure OpenAI supports two URL formats:
§V1 API Format (Default)
https://{resource}.openai.azure.com/openai/v1{path}?api-version={version}

§Deployment-Based Format (Legacy)
https://{resource}.openai.azure.com/openai/deployments/{deployment}{path}?api-version={version}

Use .use_deployment_based_urls(true) on the builder (as shown above) or .with_use_deployment_based_urls(true) on AzureOpenAIProviderSettings to enable the legacy format.
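As a concrete illustration (using hypothetical values: resource my-resource, deployment gpt-4-deployment, path /chat/completions, API version 2024-02-15-preview), the two templates expand to:

https://my-resource.openai.azure.com/openai/v1/chat/completions?api-version=2024-02-15-preview
https://my-resource.openai.azure.com/openai/deployments/gpt-4-deployment/chat/completions?api-version=2024-02-15-preview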
§Environment Variables
The provider will read from these environment variables if not explicitly configured:
- AZURE_API_KEY - API key for authentication
- AZURE_RESOURCE_NAME - Azure OpenAI resource name
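A minimal sketch, assuming both variables are set and that the builder falls back to them when .resource_name() and .api_key() are not called:

use llm_kit_azure::AzureClient;

// Relies on AZURE_API_KEY and AZURE_RESOURCE_NAME from the environment
// instead of explicit .api_key() / .resource_name() calls.
let provider = AzureClient::new().build();
let model = provider.chat_model("gpt-4-deployment");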
§Model Types
§Chat Models
Use .chat_model() or .model() for conversational AI:
let model = provider.chat_model("gpt-4-deployment");

§Completion Models
Use .completion_model() for text completion:
let model = provider.completion_model("gpt-35-turbo-instruct");

§Embedding Models
Use .text_embedding_model() for embeddings:
let model = provider.text_embedding_model("text-embedding-ada-002");

§Image Models
Use .image_model() for image generation:
let model = provider.image_model("dall-e-3");
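Putting these together, a single provider instance can hand out all of the above model kinds (a sketch; the deployment names are placeholders for your own Azure deployments):

use llm_kit_azure::AzureClient;

let provider = AzureClient::new()
    .resource_name("my-resource")
    .api_key("key")
    .build();

// The deployment names below are placeholders for your own Azure deployments.
let chat = provider.chat_model("gpt-4-deployment");
let completion = provider.completion_model("gpt-35-turbo-instruct");
let embedding = provider.text_embedding_model("text-embedding-ada-002");
let image = provider.image_model("dall-e-3");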
§Structs

- AzureClient - Builder for creating an Azure OpenAI provider.
- AzureOpenAIProvider - Azure OpenAI provider implementation.
- AzureOpenAIProviderSettings - Configuration options for creating an Azure OpenAI provider.