Crate async_openai


Async Rust library for the OpenAI REST API, based on the OpenAPI spec.

Creating a client

use async_openai as openai;

// Create a client with the API key from the env var OPENAI_API_KEY and the default base URL.
let client = openai::Client::new();

// OR use an API key from a different source.
let api_key = "sk-..."; // This could be read from a file; hard-coding secrets is not best practice.
let client = openai::Client::new().with_api_key(api_key);
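As the comment notes, the key is better loaded at runtime than hard-coded. A minimal sketch of doing so with only the standard library and the with_api_key builder shown above; the helper name load_api_key and the openai.key file path are illustrative, not part of the crate:

use async_openai as openai;

// Hypothetical helper: read the key from an env var, falling back to a local file.
// Both the variable name and the file path are illustrative only.
fn load_api_key() -> std::io::Result<String> {
    match std::env::var("OPENAI_API_KEY") {
        Ok(key) => Ok(key),
        Err(_) => Ok(std::fs::read_to_string("openai.key")?.trim().to_owned()),
    }
}

let api_key = load_api_key().expect("no API key available");
let client = openai::Client::new().with_api_key(api_key);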

Making requests

use async_openai as openai;
use openai::{Client, Completion, types::CreateCompletionRequest};

// Create a client.
let client = Client::new();

// Build the request; unspecified fields take their default values.
let request = CreateCompletionRequest {
    model: "text-davinci-003".to_owned(),
    prompt: Some("Tell me a joke about the universe".to_owned()),
    ..Default::default()
};

// Call the API.
let response = Completion::create(&client, request).await.unwrap();

println!("{}", response.choices.first().unwrap().text);
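Because the crate is async, the call above must run inside an async runtime, and real code will usually propagate errors rather than call unwrap. A minimal sketch of a complete program using the same Completion::create call; the Tokio runtime is an assumption (the crate does not prescribe one), and the ? conversion assumes the crate's error type implements std::error::Error:

use async_openai::{Client, Completion, types::CreateCompletionRequest};

// Assumption: tokio is used as the async runtime.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();

    let request = CreateCompletionRequest {
        model: "text-davinci-003".to_owned(),
        prompt: Some("Tell me a joke about the universe".to_owned()),
        ..Default::default()
    };

    // `?` propagates the API error instead of panicking on failure.
    let response = Completion::create(&client, request).await?;

    if let Some(choice) = response.choices.first() {
        println!("{}", choice.text);
    }

    Ok(())
}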

Examples

For full working examples of all supported features, see the examples directory in the repository.

Modules

error: Errors originating from API calls, parsing responses, and reading or writing to the file system.
types: Types used in OpenAI API requests and responses. These types are created from component schemas in the OpenAPI spec.

Structs

Client: Container for the API key, base URL, and other metadata required to make API calls.
Completion: Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
Edit: Given a prompt and an instruction, the model will return an edited version of the prompt.
Image: Given a prompt and/or an input image, the model will generate a new image.
Models: List and describe the various models available in the API. You can refer to the Models documentation to understand what models are available and the differences between them.
Moderation: Given an input text, outputs whether the model classifies it as violating OpenAI’s content policy.

Constants

Default v1 API base URL.