# ai

A simple-to-use AI library for Rust, primarily targeting OpenAI-compatible providers, with more to come.

This library is a work in progress, and the API is subject to change.
## Table of Contents

- [Using the library](#using-the-library)
- [Cargo Features](#cargo-features)
- [Examples](#examples)
- [Chat Completion API](#chat-completion-api)
- [Clients](#clients)
- [LICENSE](#license)
## Using the library

Add `ai` as a dependency along with `tokio`. For streaming, add the `futures`
crate; for `CancellationToken` support, add `tokio-util`. This library uses
`reqwest` directly as the HTTP client when making requests to the servers.

```sh
cargo add ai
```
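The companion crates can be added the same way; the `tokio` feature flag below is one common choice, not a requirement of this library:

```sh
cargo add tokio --features full  # async runtime
cargo add futures                # streaming support
cargo add tokio-util             # CancellationToken support
```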
## Cargo Features

| Feature | Description | Default |
|---|---|---|
| `openai_client` | Enable OpenAI client | ✅ |
| `azure_openai_client` | Enable Azure OpenAI client | ✅ |
| `ollama_client` | Enable Ollama client | |
| `native_tls` | Enable native TLS for the reqwest HTTP client | ✅ |
| `rustls_tls` | Enable rustls TLS for the reqwest HTTP client | |
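For example, to switch from native TLS to rustls, disable default features and opt back in to the clients you need; standard Cargo feature selection applies (the version number below is illustrative):

```toml
[dependencies]
ai = { version = "0.2", default-features = false, features = ["openai_client", "rustls_tls"] }
```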
## Examples
| Example Name | Description |
|---|---|
| azure_openai_chat_completions | Basic chat completions using Azure OpenAI API |
| chat_completions_streaming | Chat completions streaming example |
| chat_completions_streaming_with_cancellation_token | Chat completions streaming with cancellation token |
| chat_completions_tool_calling | Tool/Function calling example |
| chat_console | Console chat example |
| clients_dynamic_runtime | Dynamic runtime client selection |
| openai_chat_completions | Basic chat completions using OpenAI API |
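Assuming the examples live under the repository's `examples/` directory, each can be run with Cargo:

```sh
cargo run --example openai_chat_completions
```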
## Chat Completion API

Messages can be passed as `(role, content)` tuples. An unrecognized role will cause a panic.

```rust
use ai::chat_completions::{ChatCompletion, ChatCompletionRequestBuilder};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let openai = ai::clients::openai::Client::from_env()?;

    // Build a request; the model name here is illustrative.
    let request = &ChatCompletionRequestBuilder::default()
        .model("gpt-4o-mini")
        .messages(vec![
            ("system", "You are a helpful assistant.").into(),
            ("user", "Tell me a joke.").into(),
        ])
        .build()?;

    let response = openai.complete(request).await?;
    println!("{:?}", response);

    Ok(())
}
```
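As a sketch of streaming with the `futures` crate (the method name `complete_stream` and the chunk shape below are assumptions; see the `chat_completions_streaming` example for the actual API):

```rust
use futures::StreamExt;

// Hypothetical: assumes a `complete_stream` method that yields OpenAI-style
// delta chunks. Consult the chat_completions_streaming example for the real API.
let mut stream = openai.complete_stream(&request).await?;
while let Some(chunk) = stream.next().await {
    let chunk = chunk?;
    if let Some(content) = &chunk.choices[0].delta.content {
        print!("{content}");
    }
}
```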
## Clients

### OpenAI

```rust
let openai = ai::clients::openai::Client::new("open_api_key")?;
let openai = ai::clients::openai::Client::from_url("open_api_key", "http://localhost:11434/v1/")?;
let openai = ai::clients::openai::Client::from_env()?; // reads the API key from the environment
```
### Gemini API via OpenAI

Set `http1_title_case_headers` on the `reqwest` client when targeting the Gemini API.

```rust
let gemini = ai::clients::openai::ClientBuilder::default()
    .http_client(
        reqwest::Client::builder()
            .http1_title_case_headers()
            .build()?,
    )
    .api_key("gemini_api_key")
    .base_url("https://generativelanguage.googleapis.com/v1beta/openai")
    .build()?;
```
### Azure OpenAI

```sh
cargo add ai --features=azure_openai_client
```

```rust
let azure_openai = ai::clients::azure_openai::ClientBuilder::default()
    .auth(ai::clients::azure_openai::Auth::BearerToken("bearer_token".into()))
    // .auth(ai::clients::azure_openai::Auth::ApiKey(
    //     std::env::var(ai::clients::azure_openai::AZURE_OPENAI_API_KEY_ENV_VAR)
    //         .map_err(|e| Error::EnvVarError(ai::clients::azure_openai::AZURE_OPENAI_API_KEY_ENV_VAR.to_string(), e))?
    //         .into(),
    // ))
    .api_version("2024-02-15-preview")
    .base_url("https://your-resource-name.openai.azure.com")
    .build()?;
```
Pass the `deployment_id` as the `model` of the `ChatCompletionRequest`.
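A minimal sketch (the deployment name is illustrative):

```rust
let request = &ChatCompletionRequestBuilder::default()
    .model("my-gpt-4o-deployment") // the Azure OpenAI deployment_id, not a model name
    .messages(vec![("user", "Hello!").into()])
    .build()?;
```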
Use the following command to get a bearer token.
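One way to obtain one, assuming the Azure CLI and an active `az login` session:

```sh
az account get-access-token --resource https://cognitiveservices.azure.com --query accessToken --output tsv
```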
### Ollama

We suggest using the `openai` client instead of `ollama` for maximum compatibility.

```rust
let ollama = ai::clients::ollama::Client::new()?;
let ollama = ai::clients::ollama::Client::from_url("http://localhost:11434")?;
```
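Since Ollama exposes an OpenAI-compatible endpoint on its default port, a sketch of pointing the `openai` client at a local Ollama server (the API key is a placeholder; Ollama ignores it):

```rust
let ollama = ai::clients::openai::Client::from_url("ollama", "http://localhost:11434/v1/")?;
```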
## LICENSE

MIT