Overview
async-openai is an unofficial Rust library for OpenAI, based on the OpenAI OpenAPI spec. It implements all APIs from the spec.
| What | APIs | Crate Feature Flags |
|---|---|---|
| Responses API | Responses, Conversations, Streaming events | responses |
| Webhooks | Webhook Events | webhook |
| Platform APIs | Audio, Audio Streaming, Videos, Images, Image Streaming, Embeddings, Evals, Fine-tuning, Graders, Batch, Files, Uploads, Models, Moderations | audio, video, image, embedding, evals, finetuning, grader, batch, file, upload, model, moderation |
| Vector stores | Vector stores, Vector store files, Vector store file batches | vectorstore |
| ChatKit (Beta) | ChatKit | chatkit |
| Containers | Containers, Container Files | container |
| Skills | Skills | skill |
| Realtime | Realtime Calls, Client secrets, Client events, Server events | realtime |
| Chat Completions | Chat Completions, Streaming | chat-completion |
| Assistants (Beta) | Assistants, Threads, Messages, Runs, Run steps, Streaming | assistant |
| Administration | Admin API Keys, Invites, Users, Groups, Roles, Role assignments, Projects, Project users, Project groups, Project service accounts, Project API keys, Project rate limits, Audit logs, Usage, Certificates | administration |
| Legacy | Completions | completions |
- Bring your own custom types for Request or Response objects.
- Requests are retried with exponential backoff when rate limited.
- Ergonomic builder pattern for all request objects.
- SSE streaming.
- Customize path, query and headers per request or for all requests.
- Granular feature flags to enable only the types or APIs you need.
- Microsoft Azure OpenAI Service.
- WASM.
- Middleware support with tower ecosystem.
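Feature flags are enabled in Cargo.toml. A minimal sketch (the version number is illustrative; check crates.io for the latest release):

```toml
[dependencies]
# Only the listed API groups (and their types) are compiled in.
async-openai = { version = "0.30", default-features = false, features = [
    "chat-completion",
    "image",
] }
```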
Usage
The library reads the API key from the environment variable OPENAI_API_KEY.
```shell
# On macOS/Linux
export OPENAI_API_KEY='sk-...'

# On Windows Powershell
$Env:OPENAI_API_KEY='sk-...'
```
Other supported official environment variables are: OPENAI_ADMIN_KEY, OPENAI_BASE_URL, OPENAI_ORG_ID, and OPENAI_PROJECT_ID.
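For instance, to point the client at an OpenAI-compatible endpoint and scope requests to an organization and project (all values below are illustrative placeholders):

```shell
# Override the default https://api.openai.com/v1 base URL (illustrative endpoint)
export OPENAI_BASE_URL='https://api.example.com/v1'
# Optionally scope requests to an organization and project
export OPENAI_ORG_ID='org-...'
export OPENAI_PROJECT_ID='proj_...'
```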
- Visit the examples directory to see how to use async-openai.
- Visit docs.rs/async-openai for docs.
Image Generation Example
```rust
use async_openai::{
    types::images::{CreateImageRequestArgs, ImageResponseFormat, ImageSize},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to ./data directory.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```
Bring Your Own Types
Enable methods whose inputs and outputs are generic with the byot feature. It creates a new method with the same name and a _byot suffix.
For example, to use serde_json::Value as request and response type:
```rust
let response: Value = client
    .chat()
    .create_byot(serde_json::json!({
        "messages": [
            {
                "role": "user",
                "content": "How does a large language model work?"
            }
        ],
        "model": "gpt-4o"
    }))
    .await?;
```
This can be useful in many scenarios:
- To use this library with other OpenAI-compatible APIs whose types don't exactly match OpenAI's.
- To extend existing types in this crate with new fields using serde (for example with #[serde(flatten)]).
- To avoid typing out verbose types.
- To escape deserialization errors.
*_byot methods require the same trait bounds as the regular methods.
Visit the examples/bring-your-own-type directory to learn more.
References: Borrow Instead of Move
With byot, pass a reference to request types instead of moving them:
```rust
let response: Response = client
    .responses()
    .create_byot(&request)
    .await?;
```
Visit examples/borrow-instead-of-move to learn more.
Rust Types
To use only the Rust types from the crate, disable default features and enable the types feature flag.
There are granular feature flags like response-types, chat-completion-types, etc.
These granular types are enabled when the corresponding API feature is enabled - for example responses will enable response-types.
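For example, to depend only on the Responses types (the version number is illustrative):

```toml
[dependencies]
# No HTTP client or API methods are compiled in, only the Rust types.
async-openai = { version = "0.30", default-features = false, features = ["response-types"] }
```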
Configurable Requests
Individual Request
Certain APIs need additional query or header parameters; these can be provided by chaining .query(), .header(), or .headers() on the API group.
For example:
```rust
client
    .chat()
    // query can be a struct or a map too; values here are illustrative.
    .query(&[("limit", "2")])?
    // header for demo
    .header("x-my-header", "demo")?
    .list()
    .await?
```
All Requests
Use Config implementations such as OpenAIConfig to configure the URL, headers, or query parameters globally for all requests.
OpenAI-compatible Providers
Even though the scope of the crate is the official OpenAI APIs, it is configurable enough to work with compatible providers.
Configurable Path
In addition to .query(), .header(), and .headers(), the path for an individual request can be changed using the .path() method on the API group.
For example:
```rust
client
    .chat()
    // illustrative custom path for a compatible provider
    .path("/custom/chat/completions")?
    .create(request)
    .await?
```
Dynamic Dispatch
This allows you to use the same code (say, a function) to call APIs on different OpenAI-compatible providers.
Create a client with a Box- or Arc-wrapped configuration.
For example:
```rust
use async_openai::{config::Config, config::OpenAIConfig, Client};

// Use `Box` or `std::sync::Arc` to wrap the config
let config = Box::new(OpenAIConfig::default()) as Box<dyn Config>;

// create client
let client: Client<Box<dyn Config>> = Client::with_config(config);

// A function can now accept a `&Client<Box<dyn Config>>` parameter
// which can invoke any openai compatible api
```
Webhooks
Webhook support includes event types, signature verification, and building webhook events from payloads.
Middleware
Middleware is supported via the Tower ecosystem and can be enabled with the middleware feature. See the middleware docs for more detail.
Contributing
🎉 Thank you for taking the time to contribute and improve the project. I'd be happy to have you!
Please see contributing guide!
Complementary Crates
- async-openai-wasm provides WASM support.
- openai-func-enums provides macros for working with function/tool calls.
License
This project is licensed under the MIT license.