Struct OpenAI

Source
pub struct OpenAI<C: OpenAIConfig> {
    pub client: Client,
    pub api_key: String,
    pub disable_live_stream: bool,
    pub config: C,
}

The OpenAI struct is the main entry point for interacting with the OpenAI API. It holds the API key, the HTTP client, and an endpoint-specific configuration for the API call (for example, the chat completion endpoint), along with a boolean flag to disable live streaming of chat responses.

Fields§

§client: Client

The HTTP client used to make requests to the OpenAI API.

§api_key: String

The API key used to authenticate with the OpenAI API.

§disable_live_stream: bool

A boolean flag to disable the live stream of the chat endpoint.

§config: C

An endpoint specific configuration struct that holds all necessary parameters for the API call.

Implementations§

Source§

impl<C: OpenAIConfig + Serialize + Debug> OpenAI<C>

Source

pub fn new() -> Self

Examples found in repository?
examples/list_files.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let res = OpenAI::<Files>::new().list().await?;
6    println!("{:#?}", res);
7    Ok(())
8}
examples/moderation.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let resp = OpenAI::<Moderation>::new()
6        .moderate("I want to kill you.")
7        .await?;
8    println!("Moderation: {:?}", resp);
9    Ok(())
10}
examples/chat.rs (line 7)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6
7    OpenAI::<Chat>::new()
8        .set_primer(context_primer)
9        .chat()
10        .await?;
11    Ok(())
12}
examples/edit_image.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let image_list = OpenAI::<Image>::new()
6        .edit("Invert the colors", "./img/logo.png", None)
7        .await?;
8    println!("Image list: {:?}", image_list);
9
10    Ok(())
11}
examples/transcribe.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let transcribe = OpenAI::<Audio>::new()
6        .transcribe("examples/samples/sample-1.mp3")
7        .await?;
8    println!("Transcription: {:?}", transcribe.text);
9    Ok(())
10}
examples/translate.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let translate = OpenAI::<Audio>::new()
6        .translate("examples/samples/colours-german.mp3")
7        .await?;
8    println!("Translation: {:?}", translate.text);
9    Ok(())
10}
Source

pub fn with_config(self, config: C) -> Self

Allows batch-configuring the AI assistant with the settings provided in the configuration struct.

§Arguments
  • config: A configuration struct of type C that contains the settings for the AI assistant.
§Returns

This function returns the instance of the AI assistant with the new configuration.
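§Example

A minimal sketch of batch configuration. It assumes Chat implements Default; construct or adjust the config struct however your endpoint requires.

```rust
use aionic::openai::{Chat, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // Assumption: `Chat` implements `Default`. Adjust fields before passing it in.
    let config = Chat::default();
    let mut client = OpenAI::<Chat>::new().with_config(config);
    client.ask("Hello!", false).await?;
    Ok(())
}
```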

Source

pub fn disable_stdout(self) -> Self

Disables standard output for the instance of OpenAI, which is enabled by default. This is only relevant for chat completions, which otherwise print the AI assistant’s messages to the terminal.
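§Example

A sketch of reading the assistant’s reply from the return value instead of the terminal:

```rust
use aionic::openai::{Chat, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // With stdout disabled, the reply is only available via the returned String.
    let answer = OpenAI::<Chat>::new()
        .disable_stdout()
        .ask("What is the meaning of life?", false)
        .await?;
    println!("{answer}");
    Ok(())
}
```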

Source

pub fn is_valid_temperature(&mut self, temperature: f64, limit: f64) -> bool

Checks whether the given temperature falls within the allowed limit.

Source

pub async fn models( &mut self, ) -> Result<Vec<String>, Box<dyn Error + Send + Sync>>

Fetches a list of available models from the OpenAI API.

This method sends a GET request to the OpenAI API and returns a vector of identifiers of all available models.

§Returns

A Result which is:

  • Ok if the request was successful, carrying a Vec<String> of model identifiers.
  • Err if the request or the parsing failed, carrying the error of type Box<dyn std::error::Error + Send + Sync>.
§Errors

This method will return an error if the GET request fails, or if the response from the OpenAI API cannot be parsed into a ModelsResponse.

§Example
use aionic::openai::{OpenAI, Chat};


#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    match client.models().await {
        Ok(models) => println!("Models: {:?}", models),
        Err(e) => println!("Error: {}", e),
    }
   Ok(())
}
§Note

This method is async and needs to be awaited.

Source

pub async fn check_model( &mut self, model: &str, ) -> Result<Model, Box<dyn Error + Send + Sync>>

Fetches a specific model by identifier from the OpenAI API.

This method sends a GET request to the OpenAI API for a specific model and returns the Model.

§Parameters
  • model: A &str that represents the name of the model to fetch.
§Returns

A Result which is:

  • Ok if the request was successful, carrying the Model.
  • Err if the request or the parsing failed, carrying the error of type Box<dyn std::error::Error + Send + Sync>.
§Errors

This method will return an error if the GET request fails, or if the response from the OpenAI API cannot be parsed into a Model.

§Example
use aionic::openai::{OpenAI, Chat};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    match client.check_model("gpt-3.5-turbo").await {
        Ok(model) => println!("Model: {:?}", model),
        Err(e) => println!("Error: {}", e),
    }
    Ok(())
}
§Note

This method is async and needs to be awaited.

Source

pub async fn create_file_upload_part<P: AsRef<Path> + Send>( &mut self, path: P, ) -> Result<Part, Box<dyn Error + Send + Sync>>

Creates a file upload part for a multi-part upload operation.

This method reads the file at the given path, prepares it for uploading, and returns a Part that represents this file in the multi-part upload operation.

§Type Parameters
  • P: The type of the file path. Must implement the AsRef<Path> trait.
§Parameters
  • path: The path of the file to upload. This can be any type that implements AsRef<Path>.
§Returns

A Result which is:

  • Ok if the file was read successfully and the Part was created, carrying the Part.
  • Err if there was an error reading the file or creating the Part, carrying the error of type Box<dyn Error + Send + Sync>.
§Errors

This method will return an error if there was an error reading the file at the given path, or if there was an error creating the Part (for example, if the MIME type was not recognized).

§Example
use aionic::openai::{OpenAI, Chat};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    match client.create_file_upload_part("path/to/file.txt").await {
        Ok(part) => println!("Part created successfully."),
        Err(e) => println!("Error: {}", e),
    }
    Ok(())
}
§Note

This method is async and needs to be awaited.

Source

pub async fn handle_api_errors( &mut self, res: Response, ) -> Result<Response, Box<dyn Error + Send + Sync>>

A helper function to handle potential errors from OpenAI API responses.

§Arguments
  • res - A Response object from the OpenAI API call.
§Returns

Result<Response, Box<dyn std::error::Error + Send + Sync>>: Returns the original Response object if the status code indicates success. If the status code indicates an error, it attempts to deserialize the response into an OpenAIError and returns a std::io::Error constructed from the error message.

Source§

impl OpenAI<Chat>

Source

pub fn set_model<S: Into<String>>(self, model: S) -> Self

Sets the model of the AI assistant.

§Arguments
  • model: A string that specifies the model name to be used by the AI assistant.
§Returns

This function returns the instance of the AI assistant with the specified model.

Examples found in repository?
examples/prompt.rs (line 9)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub fn set_max_tokens(self, max_tokens: u64) -> Self

Sets the maximum number of tokens that the AI model can generate in a single response.

§Arguments
  • max_tokens: An unsigned 64-bit integer that specifies the maximum number of tokens that the AI model can generate in a single response.
§Returns

This function returns the instance of the AI assistant with the specified maximum number of tokens.

Examples found in repository?
examples/prompt.rs (line 11)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub fn set_messages(self, messages: Vec<Message>) -> Self

Allows setting the chat history to a specific state.

§Arguments
  • messages: A vector of Message structs.
§Returns

This function returns the instance of the AI assistant with the specified messages.
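§Example

A sketch of restoring a previous conversation. The Message import path and the From<&str> conversion are assumptions, mirroring the Into<Message> bound on ask.

```rust
use aionic::openai::{Chat, Message, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // Assumption: `Message: From<&str>`, matching the `Into<Message>` bound on `ask`.
    let history: Vec<Message> = vec![
        Message::from("What is the capital of France?"),
        Message::from("The capital of France is Paris."),
    ];
    let mut client = OpenAI::<Chat>::new().set_messages(history);
    client.ask("And of Germany?", true).await?;
    Ok(())
}
```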

Source

pub fn set_temperature(self, temperature: f64) -> Self

Sets the temperature of the AI model’s responses.

The temperature setting adjusts the randomness of the AI’s responses. Higher values produce more random responses, while lower values produce more deterministic responses. The allowed range of values is between 0.0 and 2.0, with 0.0 being the most deterministic and 2.0 the most random.

§Arguments
  • temperature: A float that specifies the temperature.
§Returns

This function returns the instance of the AI assistant with the specified temperature.

Examples found in repository?
examples/prompt.rs (line 10)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub fn set_stream_responses(self, streamed: bool) -> Self

Sets the streaming configuration of the AI assistant.

If streaming is enabled, the AI assistant will fetch and process the AI’s responses as they arrive. If it’s disabled, the assistant will collect all of the AI’s responses at once and return them as a single response.

§Arguments
  • streamed: A boolean that specifies whether streaming should be enabled.
§Returns

This function returns the instance of the AI assistant with the specified streaming setting.

Examples found in repository?
examples/prompt.rs (line 12)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub fn set_primer<S: Into<String>>(self, primer_msg: S) -> Self

Sets a primer message for the AI assistant.

The primer message is inserted at the beginning of the messages vector in the config struct. This can be used to prime the AI model with a certain context or instruction.

§Arguments
  • primer_msg: A string that specifies the primer message.
§Returns

This function returns the instance of the AI assistant with the specified primer message.

Examples found in repository?
examples/chat.rs (line 8)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6
7    OpenAI::<Chat>::new()
8        .set_primer(context_primer)
9        .chat()
10        .await?;
11    Ok(())
12}
examples/prompt_state.rs (line 8)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let mut client = OpenAI::<Chat>::new().set_primer(context_primer);
9
10    client
11        .ask(message, true) // <-- notice the change here
12        .await?;
13
14    client.ask("What did I just ask you earlier?", true).await?;
15    Ok(())
16}
examples/prompt.rs (line 13)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub fn get_last_message(&self) -> Option<&Message>

Returns the last message in the AI assistant’s configuration.

§Returns

This function returns an Option that contains a reference to the last Message in the config struct if it exists, or None if it doesn’t.
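§Example

A sketch of inspecting the last persisted message (assumes Message implements Debug):

```rust
use aionic::openai::{Chat, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    client.ask("Hello!", true).await?; // persist_state = true keeps the exchange
    if let Some(last) = client.get_last_message() {
        println!("Last message: {:?}", last);
    }
    Ok(())
}
```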

Source

pub fn clear_state(self) -> Self

Clears the messages in the AI assistant’s configuration to start from a clean state. This is only necessary in very specific cases.

§Returns

This function returns the instance of the AI assistant with no messages in its configuration.
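§Example

A sketch of discarding accumulated history before switching topics:

```rust
use aionic::openai::{Chat, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    client.ask("Remember the number 42.", true).await?;

    // `clear_state` consumes the instance and returns it with an empty history.
    let mut client = client.clear_state();
    client.ask("What number did I mention?", true).await?;
    Ok(())
}
```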

Source

pub async fn ask<P: Into<Message> + Send>( &mut self, prompt: P, persist_state: bool, ) -> Result<String, Box<dyn Error + Send + Sync>>

Makes a request to OpenAI’s GPT model and retrieves a response based on the provided prompt.

This function accepts a prompt, converts it into a string, and sends a request to the OpenAI API. Depending on the streaming configuration (is_streamed), the function either collects all of the AI’s responses at once, or fetches and processes them as they arrive.

§Arguments
  • prompt: A value that implements Into<Message>. It will be converted into a Message and sent to the API as the prompt for the AI model.

  • persist_state: If true, the function will push the AI’s response to the messages vector in the config struct. If false, it will remove the last message from the messages vector.

§Returns
  • Ok(String): A success value containing the AI’s response as a string.

  • Err(Box<dyn std::error::Error + Send + Sync>): An error value. This is a dynamic error, meaning it could represent various kinds of failures. The function will return an error if any step in the process fails, such as making the HTTP request, parsing the JSON response, or if there’s an issue with the streaming process.

§Errors

This function will return an error if the HTTP request fails, the JSON response from the API cannot be parsed, or if an error occurs during streaming.

§Examples
  
use aionic::openai::chat::Chat;
use aionic::openai::OpenAI;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let prompt = "Hello, world!";
    let mut client = OpenAI::<Chat>::new();
    let result = client.ask(prompt, true).await;
    match result {
        Ok(response) => println!("{}", response),
        Err(e) => println!("Error: {}", e),
    }
    Ok(())
 }
§Note

This function is async and must be awaited when called.

Examples found in repository?
examples/prompt_state.rs (line 11)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let mut client = OpenAI::<Chat>::new().set_primer(context_primer);
9
10    client
11        .ask(message, true) // <-- notice the change here
12        .await?;
13
14    client.ask("What did I just ask you earlier?", true).await?;
15    Ok(())
16}
examples/prompt.rs (line 14)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6    let message = "What is the meaning of life?";
7
8    let _res = OpenAI::<Chat>::new()
9        .set_model(Chat::get_default_model())
10        .set_temperature(Chat::get_default_temperature())
11        .set_max_tokens(Chat::get_default_max_tokens())
12        .set_stream_responses(Chat::get_default_stream())
13        .set_primer(context_primer)
14        .ask(message, false)
15        .await?;
16    Ok(())
17}
Source

pub async fn chat(&mut self) -> Result<(), Box<dyn Error + Send + Sync>>

Starts a chat session with the AI assistant.

This function uses a Readline-style interface for input and output. The user types a message at the >>> prompt, and the message is sent to the AI assistant using the ask function. The AI’s response is then printed to the console.

If the user enters CTRL-C or CTRL-D, the function prints the corresponding key combination and exits the chat session.

If there’s an error during readline, the function prints the error message and exits the chat session.

§Returns
  • Ok(()): A success value indicating that the chat session ended normally.

  • Err(Box<dyn std::error::Error + Send + Sync>): An error value. This is a dynamic error, meaning it could represent various kinds of failures. The function will return an error if any step in the process fails, such as reading a line from the console, or if there’s an error in the ask function.

§Errors

This function will return an error if the readline fails or if there’s an error in the ask function.

§Examples
use aionic::openai::chat::Chat;
use aionic::openai::OpenAI;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Chat>::new();
    let result = client.chat().await;
    match result {
        Ok(()) => println!("Chat session ended."),
        Err(e) => println!("Error during chat session: {}", e),
    }
    Ok(())
}
§Note

This function is async and must be awaited when called.

Examples found in repository?
examples/chat.rs (line 9)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let context_primer = "Answer as if you were Yoda";
6
7    OpenAI::<Chat>::new()
8        .set_primer(context_primer)
9        .chat()
10        .await?;
11    Ok(())
12}
Source§

impl OpenAI<Image>

Source

pub fn set_response_format(self, response_format: &ResponseDataType) -> Self

Allows setting the return format of the response. ResponseDataType is an enum with the following variants:

  • Url: The response will be a vector of URLs to the generated images.
  • Base64Json: The response will be a vector of base64 encoded images.
Source

pub fn set_max_images(self, number_of_images: u64) -> Self

Allows setting the number of images to be generated.

Source

pub fn set_size(self, size: &Size) -> Self

Allows setting the dimensions of the generated images.
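§Example

A sketch combining the image builder methods before a create call. The import path and the Size variant name are assumptions; check the crate’s image module for the exact identifiers.

```rust
use aionic::openai::{Image, OpenAI};
use aionic::openai::image::{ResponseDataType, Size};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // Assumption: `Size::S1024x1024` is illustrative; use the variant your version defines.
    let images = OpenAI::<Image>::new()
        .set_max_images(2)
        .set_size(&Size::S1024x1024)
        .set_response_format(&ResponseDataType::Url)
        .create("A watercolor fox in a snowy forest")
        .await?;
    println!("Image list: {:?}", images);
    Ok(())
}
```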

Source

pub async fn create<S: Into<String> + Send>( &mut self, prompt: S, ) -> Result<Vec<String>, Box<dyn Error + Send + Sync>>

Generates an image based on a textual description.

This function sets the prompt to the given string and sends a request to the OpenAI API to create an image. The function then parses the response and returns a vector of image URLs.

§Arguments
  • prompt: A string that describes the image to be generated.
§Returns

This function returns a Result with a vector of strings on success, each string being a URL to an image. If there’s an error, it returns a dynamic error.

Examples found in repository?
examples/create_image.rs (line 7)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let image_prompt = "Create an image that represents the meaning of life?";
6
7    let image_list = OpenAI::<Image>::new().create(image_prompt).await?;
8    println!("Image list: {:?}", image_list);
9
10    Ok(())
11}
Source

pub async fn edit<S: Into<String> + Send>( &mut self, prompt: S, image_file_path: S, mask: Option<S>, ) -> Result<Vec<String>, Box<dyn Error + Send + Sync>>

Modifies an existing image based on a textual description.

This function sets the image and optionally the mask, then sets the prompt to the given string and sends a request to the OpenAI API to modify the image. The function then parses the response and returns a vector of image URLs.

§Arguments
  • prompt: A string that describes the modifications to be made to the image.
  • image_file_path: A string that specifies the path to the image file to be modified.
  • mask: An optional string that specifies the path to a mask file. If the mask is not provided, it is set to None.
§Returns

This function returns a Result with a vector of strings on success, each string being a URL to an image. If there’s an error, it returns a dynamic error.

Examples found in repository?
examples/edit_image.rs (line 6)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let image_list = OpenAI::<Image>::new()
6        .edit("Invert the colors", "./img/logo.png", None)
7        .await?;
8    println!("Image list: {:?}", image_list);
9
10    Ok(())
11}
Source

pub async fn variation<S: Into<String> + Send>( &mut self, image_file_path: S, ) -> Result<Vec<String>, Box<dyn Error + Send + Sync>>

Generates variations of an existing image.

This function sets the image and sends a request to the OpenAI API to create variations of the image. The function then parses the response and returns a vector of image URLs.

§Arguments
  • image_file_path: A string that specifies the path to the image file.
§Returns

This function returns a Result with a vector of strings on success, each string being a URL to a new variation of the image. If there’s an error, it returns a dynamic error.
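§Example

A sketch modeled on the edit_image example; the image path is a placeholder.

```rust
use aionic::openai::{Image, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let variations = OpenAI::<Image>::new()
        .variation("./img/logo.png")
        .await?;
    println!("Variations: {:?}", variations);
    Ok(())
}
```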

Source§

impl OpenAI<Embedding>

Source

pub fn set_model<S: Into<String>>(self, model: S) -> Self

Sets the model of the AI assistant.

§Arguments
  • model: A string that specifies the model name to be used by the AI assistant.
§Returns

This function returns the instance of the AI assistant with the specified model.

Source

pub async fn embed<S: Into<InputType> + Send>( &mut self, prompt: S, ) -> Result<EmbeddingResponse, Box<dyn Error + Send + Sync>>

Sends a POST request to the OpenAI API to get embeddings for the given prompt.

This method accepts a prompt of type S which can be converted into InputType (an enum that encapsulates the different types of possible inputs). The method converts the provided prompt into InputType and assigns it to the input field of the config instance variable. It then sends a POST request to the OpenAI API and attempts to parse the response as EmbeddingResponse.

§Type Parameters
  • S: The type of the prompt. Must implement the Into<InputType> trait.
§Parameters
  • prompt: The prompt for which to get embeddings. Can be a String, a Vec<String>, a Vec<u64>, or a &str that is converted into an InputType.
§Returns

A Result which is:

  • Ok if the request was successful, carrying the EmbeddingResponse which contains the embeddings.
  • Err if the request or the parsing failed, carrying the error of type Box<dyn std::error::Error + Send + Sync>.
§Errors

This method will return an error if the POST request fails, or if the response from the OpenAI API cannot be parsed into an EmbeddingResponse.

§Example
use aionic::openai::embeddings::Embedding;
use aionic::openai::OpenAI;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Embedding>::new();
    let prompt = "Hello, world!";
    match client.embed(prompt).await {
        Ok(response) => println!("Embeddings: {:?}", response),
        Err(e) => println!("Error: {}", e),
    }
    Ok(())
}
§Note

This method is async and needs to be awaited.

Examples found in repository?
examples/embedding.rs (line 7)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let mut client = OpenAI::<Embedding>::new();
6    let embedding = client
7        .embed("The food was delicious and the waiter...")
8        .await?;
9    println!("{:?}", embedding);
10    Ok(())
11}
Source§

impl OpenAI<Audio>

Source

pub fn set_model<S: Into<String>>(self, model: S) -> Self

Sets the model of the AI assistant.

§Arguments
  • model: A string that specifies the model name to be used by the AI assistant.
§Returns

This function returns the instance of the AI assistant with the specified model.

Source

pub fn set_prompt<S: Into<String>>(self, prompt: S) -> Self

Sets the optional prompt to guide the model’s style of response.

§Arguments
  • prompt: A string that specifies the prompt to guide the model’s style of response.
§Returns

This function returns the instance of the AI assistant with the specified prompt.

Source

pub fn set_response_format(&mut self, format: AudioResponseFormat) -> &mut Self

Sets the optional format of the returned audio response.

§Arguments
  • format: An enum value that specifies the format of the returned audio response. The default is AudioResponseFormat::Json.
§Returns

This function returns the instance of the AI assistant with the specified audio file format.
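§Example

A sketch combining the audio setters before transcription. The AudioResponseFormat import path is an assumption. Note that set_response_format borrows mutably rather than consuming self, so it is called on an existing binding.

```rust
use aionic::openai::{Audio, OpenAI};
use aionic::openai::audio::AudioResponseFormat;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // Assumption: the import path of `AudioResponseFormat` may differ.
    let mut client = OpenAI::<Audio>::new().set_prompt("The speaker may use German words.");
    let res = client
        .set_response_format(AudioResponseFormat::Json)
        .transcribe("examples/samples/sample-1.mp3")
        .await?;
    println!("Transcription: {:?}", res.text);
    Ok(())
}
```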

Source

pub async fn transcribe<P: AsRef<Path> + Sync + Send>( &mut self, audio_file: P, ) -> Result<AudioResponse, Box<dyn Error + Send + Sync>>

Transcribe an audio file.

§Arguments
  • audio_file - The path to the audio file to transcribe.
§Returns

Result<AudioResponse, Box<dyn std::error::Error + Send + Sync>>: An AudioResponse object representing the transcription of the audio file, or an error if the request fails.

Examples found in repository?
examples/transcribe.rs (line 6)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let transcribe = OpenAI::<Audio>::new()
6        .transcribe("examples/samples/sample-1.mp3")
7        .await?;
8    println!("Transcription: {:?}", transcribe.text);
9    Ok(())
10}
Source

pub async fn translate<P: AsRef<Path> + Send + Sync>( &mut self, audio_file: P, ) -> Result<AudioResponse, Box<dyn Error + Send + Sync>>

Translate an audio file. Currently only supports translating to English.

§Arguments
  • audio_file - The path to the audio file to translate.
§Returns

Result<AudioResponse, Box<dyn std::error::Error + Send + Sync>>: An AudioResponse object representing the translation of the audio file, or an error if the request fails.

Examples found in repository?
examples/translate.rs (line 6)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let translate = OpenAI::<Audio>::new()
6        .translate("examples/samples/colours-german.mp3")
7        .await?;
8    println!("Translation: {:?}", translate.text);
9    Ok(())
10}
Source§

impl OpenAI<Files>

Source

pub async fn list( &mut self, ) -> Result<FileResponse, Box<dyn Error + Send + Sync>>

List all files that have been uploaded.

§Returns

Result<FileResponse, Box<dyn std::error::Error + Send + Sync>>: A FileResponse object representing all uploaded files, or an error if the request fails.

Examples found in repository?
examples/list_files.rs (line 5)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let res = OpenAI::<Files>::new().list().await?;
6    println!("{:#?}", res);
7    Ok(())
8}
examples/upload.rs (line 6)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let mut client = OpenAI::<Files>::new();
6    let res = client.list().await?;
7    println!("current uploads: {:#?}", res);
8
9    let res = client.upload("examples/samples/test.jsonl").await?;
10    println!("uploaded: {:#?}", res);
11
12    println!("waiting for file to be processed...");
13    // sleep for 3 seconds to allow the file to be processed
14    tokio::time::sleep(tokio::time::Duration::from_secs(3)).await;
15
16    let res = client.list().await?;
17    println!("current uploads: {:#?}", res);
18
19    let res = client.delete(res.data[0].id.clone()).await?;
20    println!("deleted: {:#?}", res);
21    Ok(())
22}
Source

pub async fn retrieve<S: Into<String> + Display + Sync + Send>( &mut self, file_id: S, ) -> Result<FileData, Box<dyn Error + Send + Sync>>

Retrieve the details of a specific file.

§Arguments
  • file_id - A string that holds the unique id of the file.
§Returns

Result<FileData, Box<dyn std::error::Error + Send + Sync>>: A FileData object representing the file’s details, or an error if the request fails.
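§Example

A sketch of fetching a file’s details; the file id is a placeholder for an id returned by list or upload.

```rust
use aionic::openai::{Files, OpenAI};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let mut client = OpenAI::<Files>::new();
    // Placeholder id; obtain a real one from `list()` or `upload()`.
    let file = client.retrieve("file-abc123").await?;
    println!("{:#?}", file);
    Ok(())
}
```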

Source

pub async fn retrieve_content<S: Into<String> + Display + Send + Sync>( &mut self, file_id: S, ) -> Result<Vec<PromptCompletion>, Box<dyn Error + Send + Sync>>

Retrieve the content of a specific file.

§Arguments
  • file_id - A string that holds the unique id of the file.
§Returns

Result<Vec<PromptCompletion>, Box<dyn std::error::Error + Send + Sync>>: A vector of PromptCompletion objects representing the file’s content, or an error if the request fails.

Source

pub async fn upload<P: AsRef<Path> + Send + Sync>( &mut self, file: P, ) -> Result<FileData, Box<dyn Error + Send + Sync>>

Upload a file to the OpenAI API.

§Arguments
  • file - The path to the file to upload.
§Returns

Result<FileData, Box<dyn std::error::Error + Send + Sync>>: A FileData object representing the uploaded file’s details, or an error if the request fails.

Examples found in repository?
examples/upload.rs (line 9)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let mut client = OpenAI::<Files>::new();
6    let res = client.list().await?;
7    println!("current uploads: {:#?}", res);
8
9    let res = client.upload("examples/samples/test.jsonl").await?;
10    println!("uploaded: {:#?}", res);
11
12    println!("waiting for file to be processed...");
13    // sleep for 3 seconds to allow the file to be processed
14    tokio::time::sleep(tokio::time::Duration::from_secs(3)).await;
15
16    let res = client.list().await?;
17    println!("current uploads: {:#?}", res);
18
19    let res = client.delete(res.data[0].id.clone()).await?;
20    println!("deleted: {:#?}", res);
21    Ok(())
22}
Source

pub async fn delete<S: Into<String> + Display + Send + Sync>( &mut self, file_id: S, ) -> Result<DeleteResponse, Box<dyn Error + Send + Sync>>

Delete a specific file.

§Arguments
  • file_id - A string that holds the unique id of the file.
§Returns

Result<DeleteResponse, Box<dyn std::error::Error + Send + Sync>>: A DeleteResponse object representing the response from the delete request, or an error if the request fails.

Examples found in repository?
examples/upload.rs (line 19)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let mut client = OpenAI::<Files>::new();
6    let res = client.list().await?;
7    println!("current uploads: {:#?}", res);
8
9    let res = client.upload("examples/samples/test.jsonl").await?;
10    println!("uploaded: {:#?}", res);
11
12    println!("waiting for file to be processed...");
13    // sleep for 3 seconds to allow the file to be processed
14    tokio::time::sleep(tokio::time::Duration::from_secs(3)).await;
15
16    let res = client.list().await?;
17    println!("current uploads: {:#?}", res);
18
19    let res = client.delete(res.data[0].id.clone()).await?;
20    println!("deleted: {:#?}", res);
21    Ok(())
22}
Source§

impl OpenAI<FineTune>

Source

pub async fn create<S: Into<String> + Send + Sync>( &mut self, training_file: S, ) -> Result<FineTuneResponse, Box<dyn Error + Send + Sync>>

Create a fine-tune from an uploaded training_file.

§Arguments
  • training_file - A string that holds the unique id of the uploaded training file.
§Returns

Result<FineTuneResponse, Box<dyn std::error::Error + Send + Sync>>: A FineTuneResponse object representing the result of the fine-tune request, or an error if the request fails.

Source

pub async fn list( &mut self, ) -> Result<FineTuneListResponse, Box<dyn Error + Send + Sync>>

List all fine-tunes.

§Returns

Result<FineTuneListResponse, Box<dyn std::error::Error + Send + Sync>>: A FineTuneListResponse object representing the result of the list fine-tunes request, or an error if the request fails.

Source

pub async fn retrieve<S: Into<String> + Send + Sync + Display>( &mut self, fine_tune_id: S, ) -> Result<FineTuneResponse, Box<dyn Error + Send + Sync>>

Get a specific fine-tune by its id.

§Arguments
  • fine_tune_id - A string that holds the unique id of the fine-tune job.
§Returns

Result<FineTuneResponse, Box<dyn std::error::Error + Send + Sync>>: A FineTuneResponse object representing the result of the get fine-tune request, or an error if the request fails.

Source

pub async fn cancel<S: Into<String> + Send + Sync + Display>( &mut self, fine_tune_id: S, ) -> Result<FineTuneResponse, Box<dyn Error + Send + Sync>>

Immediately cancel a fine-tune job.

§Arguments
  • fine_tune_id - A string that holds the unique id of the fine-tune job.
§Returns

Result<FineTuneResponse, Box<dyn std::error::Error + Send + Sync>>: A FineTuneResponse object representing the result of the cancel fine-tune request, or an error if the request fails.

Source

pub async fn list_events<S: Into<String> + Send + Sync + Display>( &mut self, fine_tune_id: S, ) -> Result<FineTuneEventResponse, Box<dyn Error + Send + Sync>>

Get fine-grained status updates for a fine-tune job.

§Arguments
  • fine_tune_id - A string that holds the unique id of the fine-tune job.
§Returns

Result<FineTuneEventResponse, Box<dyn std::error::Error + Send + Sync>>: A FineTuneEventResponse object representing the result of the list events request, or an error if the request fails.

Source

pub async fn delete_model<S: Into<String> + Send + Sync + Display>( &mut self, model: S, ) -> Result<DeleteResponse, Box<dyn Error + Send + Sync>>

Delete a fine-tuned model. You must have the Owner role in your organization.

§Arguments
  • model - The name of the fine-tuned model to delete.
§Returns

Result<DeleteResponse, Box<dyn std::error::Error + Send + Sync>>: A DeleteResponse object representing the status of the delete request, or an error if the request fails.
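Unlike the file and moderation endpoints, the fine-tune methods above have no example in the repository. The following is a hypothetical end-to-end sketch, not code from the crate's examples: it assumes a valid OPENAI_API_KEY is set, that "file-abc123" stands in for the id of an already-uploaded training file, and that FineTuneResponse exposes an `id` field for the created job.

```rust
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // One client per endpoint family, as in the other examples.
    let mut client = OpenAI::<FineTune>::new();

    // Start a fine-tune from an uploaded training file id
    // ("file-abc123" is a placeholder, not a real id).
    let job = client.create("file-abc123").await?;
    println!("started: {:#?}", job);

    // Poll the job's event log for fine-grained status updates
    // (`job.id` is an assumed field name on FineTuneResponse).
    let events = client.list_events(job.id.clone()).await?;
    println!("events: {:#?}", events);

    // List all fine-tunes, then cancel the one we just started.
    let all = client.list().await?;
    println!("all jobs: {:#?}", all);
    let cancelled = client.cancel(job.id).await?;
    println!("cancelled: {:#?}", cancelled);
    Ok(())
}
```

Once a job completes instead of being cancelled, the resulting model name can be passed to delete_model to remove it (Owner role required).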

Source§

impl OpenAI<Moderation>

Source

pub async fn moderate<S: Into<String> + Send + Sync>( &mut self, input: S, ) -> Result<ModerationResponse, Box<dyn Error + Send + Sync>>

Create a moderation classifying whether the given text violates OpenAI’s Content Policy.

§Arguments
  • input - The text input to classify
§Returns

Result<ModerationResponse, Box<dyn std::error::Error + Send + Sync>>: A ModerationResponse object representing the result of the moderation request, or an error if the request fails.

Examples found in repository?
examples/moderation.rs (line 6)
4async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
5    let resp = OpenAI::<Moderation>::new()
6        .moderate("I want to kill you.")
7        .await?;
8    println!("Moderation: {:?}", resp);
9    Ok(())
10}

Trait Implementations§

Source§

impl<C: Clone + OpenAIConfig> Clone for OpenAI<C>

Source§

fn clone(&self) -> OpenAI<C>

Returns a copy of the value. Read more
1.0.0 · Source§

fn clone_from(&mut self, source: &Self)

Performs copy-assignment from source. Read more
Source§

impl<C: Debug + OpenAIConfig> Debug for OpenAI<C>

Source§

fn fmt(&self, f: &mut Formatter<'_>) -> Result

Formats the value using the given formatter. Read more
Source§

impl<C: OpenAIConfig + Serialize + Sync + Send + Debug> Default for OpenAI<C>

Source§

fn default() -> Self

Returns the “default value” for a type. Read more

Auto Trait Implementations§

§

impl<C> Freeze for OpenAI<C>
where C: Freeze,

§

impl<C> !RefUnwindSafe for OpenAI<C>

§

impl<C> Send for OpenAI<C>

§

impl<C> Sync for OpenAI<C>

§

impl<C> Unpin for OpenAI<C>
where C: Unpin,

§

impl<C> !UnwindSafe for OpenAI<C>

Blanket Implementations§

Source§

impl<T> Any for T
where T: 'static + ?Sized,

Source§

fn type_id(&self) -> TypeId

Gets the TypeId of self. Read more
Source§

impl<T> AsAny for T
where T: Any,

Source§

fn as_any(&self) -> &(dyn Any + 'static)

Source§

fn as_any_mut(&mut self) -> &mut (dyn Any + 'static)

Source§

impl<T> Borrow<T> for T
where T: ?Sized,

Source§

fn borrow(&self) -> &T

Immutably borrows from an owned value. Read more
Source§

impl<T> BorrowMut<T> for T
where T: ?Sized,

Source§

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value. Read more
Source§

impl<T> CloneToUninit for T
where T: Clone,

Source§

unsafe fn clone_to_uninit(&self, dest: *mut u8)

🔬This is a nightly-only experimental API. (clone_to_uninit)
Performs copy-assignment from self to dest. Read more
Source§

impl<T> From<T> for T

Source§

fn from(t: T) -> T

Returns the argument unchanged.

Source§

impl<T> Instrument for T

Source§

fn instrument(self, span: Span) -> Instrumented<Self>

Instruments this type with the provided Span, returning an Instrumented wrapper. Read more
Source§

fn in_current_span(self) -> Instrumented<Self>

Instruments this type with the current Span, returning an Instrumented wrapper. Read more
Source§

impl<T, U> Into<U> for T
where U: From<T>,

Source§

fn into(self) -> U

Calls U::from(self).

That is, this conversion is whatever the implementation of From<T> for U chooses to do.

Source§

impl<T> IntoEither for T

Source§

fn into_either(self, into_left: bool) -> Either<Self, Self>

Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise. Read more
Source§

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
where F: FnOnce(&Self) -> bool,

Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise. Read more
Source§

impl<T> Pointable for T

Source§

const ALIGN: usize

The alignment of the pointer.
Source§

type Init = T

The type for initializers.
Source§

unsafe fn init(init: <T as Pointable>::Init) -> usize

Initializes a pointer with the given initializer. Read more
Source§

unsafe fn deref<'a>(ptr: usize) -> &'a T

Dereferences the given pointer. Read more
Source§

unsafe fn deref_mut<'a>(ptr: usize) -> &'a mut T

Mutably dereferences the given pointer. Read more
Source§

unsafe fn drop(ptr: usize)

Drops the object pointed to by the given pointer. Read more
Source§

impl<T> ToOwned for T
where T: Clone,

Source§

type Owned = T

The resulting type after obtaining ownership.
Source§

fn to_owned(&self) -> T

Creates owned data from borrowed data, usually by cloning. Read more
Source§

fn clone_into(&self, target: &mut T)

Uses borrowed data to replace owned data, usually by cloning. Read more
Source§

impl<T, U> TryFrom<U> for T
where U: Into<T>,

Source§

type Error = Infallible

The type returned in the event of a conversion error.
Source§

fn try_from(value: U) -> Result<T, <T as TryFrom<U>>::Error>

Performs the conversion.
Source§

impl<T, U> TryInto<U> for T
where U: TryFrom<T>,

Source§

type Error = <U as TryFrom<T>>::Error

The type returned in the event of a conversion error.
Source§

fn try_into(self) -> Result<U, <U as TryFrom<T>>::Error>

Performs the conversion.
Source§

impl<T> WithSubscriber for T

Source§

fn with_subscriber<S>(self, subscriber: S) -> WithDispatch<Self>
where S: Into<Dispatch>,

Attaches the provided Subscriber to this type, returning a WithDispatch wrapper. Read more
Source§

fn with_current_subscriber(self) -> WithDispatch<Self>

Attaches the current default Subscriber to this type, returning a WithDispatch wrapper. Read more
Source§

impl<T> ErasedDestructor for T
where T: 'static,

Source§

impl<T> MaybeSendSync for T