# ai
A simple-to-use AI library for Rust, primarily targeting OpenAI-compatible providers, with more to come.

This library is a work in progress, and the API is subject to change.
## Using the library
Add `ai` as a dependency along with `tokio`. This library uses `reqwest` directly as the HTTP client when making requests to the servers.
```sh
cargo add ai
```
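The examples below also need `tokio` for the async runtime; the feature names here are the standard tokio ones (`macros` for `#[tokio::main]`, `rt-multi-thread` for the runtime), not specific to this crate:

```sh
cargo add tokio --features macros,rt-multi-thread
```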
## Cargo Features

| Feature | Description | Default |
|---|---|---|
| `openai_client` | Enable OpenAI client | ✅ |
| `azure_openai_client` | Enable Azure OpenAI client | |
| `ollama_client` | Enable Ollama client | |
| `native_tls` | Enable native TLS for reqwest http client | ✅ |
| `rustls_tls` | Enable rustls TLS for reqwest http client | |
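For example, to use rustls instead of native TLS, disable default features in `Cargo.toml` and opt back in to the clients you need (the version below is a placeholder):

```toml
[dependencies]
ai = { version = "*", default-features = false, features = ["openai_client", "rustls_tls"] }
```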
## Examples
| Example Name | Description |
|---|---|
| openai_chat_completions | Basic chat completions using OpenAI API |
| azure_openai_chat_completions | Basic chat completions using Azure OpenAI API |
| clients_dynamic_runtime | Dynamic runtime client selection |
| chat_completions_tool_calling | Tool/Function calling example |
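Assuming these examples live in the repository's `examples/` directory, each can be run with Cargo by name:

```sh
cargo run --example openai_chat_completions
```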
## Chat Completion API
```rust
use ai::chat_completions::{ChatCompletion, ChatCompletionRequestBuilder};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let openai = ai::clients::openai::Client::from_env()?;

    // Using tuples for messages. An unrecognized role will cause a panic.
    let request = &ChatCompletionRequestBuilder::default()
        .model("gpt-4o-mini")
        .messages(vec![
            ("system", "You are a helpful assistant.").into(),
            ("user", "Hello!").into(),
        ])
        .build()?;

    let response = openai.chat_completions(request).await?;
    println!("{:?}", response);
    Ok(())
}
```
## Clients

### OpenAI
```rust
let openai = ai::clients::openai::Client::new("open-ai-api-key")?;
let openai = ai::clients::openai::Client::from_url("api-key", "http://localhost:11434/v1/")?;
let openai = ai::clients::openai::Client::from_env()?;
```
### Gemini API via OpenAI

Set `http1_title_case_headers` for the Gemini API.
```rust
let gemini = ai::clients::openai::ClientBuilder::default()
    .http_client(reqwest::Client::builder().http1_title_case_headers().build()?)
    .api_key(std::env::var("GEMINI_API_KEY")?.into())
    .base_url("https://generativelanguage.googleapis.com/v1beta/openai/".into())
    .build()?;
```
### Azure OpenAI
```rust
let azure_openai = ai::clients::azure_openai::ClientBuilder::default()
    .auth(ai::clients::azure_openai::Auth::BearerToken(bearer_token.into()))
    // .auth(ai::clients::azure_openai::Auth::ApiKey(
    //     std::env::var(ai::clients::azure_openai::AZURE_OPENAI_API_KEY_ENV_VAR)
    //         .map_err(|e| Error::EnvVarError(ai::clients::azure_openai::AZURE_OPENAI_API_KEY_ENV_VAR.to_string(), e))?
    //         .into(),
    // ))
    .api_version("2024-02-01".into())
    .base_url("https://your-resource-name.openai.azure.com".into())
    .build()?;
```
Pass the `deployment_id` as the `model` of the `ChatCompletionRequest`.

Use the following command to get a bearer token:
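With the Azure CLI (an assumption here; it requires a prior `az login`), a bearer token for Azure OpenAI can be obtained like this:

```sh
az account get-access-token --resource https://cognitiveservices.azure.com --query accessToken --output tsv
```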
### Ollama

We suggest using the `openai` client instead of the `ollama` client for maximum compatibility.
```rust
let ollama = ai::clients::ollama::Client::new()?;
let ollama = ai::clients::ollama::Client::from_url("http://localhost:11434")?;
```
## LICENSE
MIT