Crate rust_gpt

OpenAI Completion/Chat Rust API

Provides a neat and rusty way of interacting with the OpenAI Completion/Chat API. You can find the documentation for the underlying OpenAI API at https://platform.openai.com/docs/api-reference.

Example

use rust_gpt::RequestBuilder;
use rust_gpt::CompletionModel;
use rust_gpt::SendRequest;

#[tokio::main]
async fn main() {
    let req = RequestBuilder::new(CompletionModel::TextDavinci003, "YOUR_API_KEY")
        .prompt("Write a sonnet about a crab named Ferris in the style of Shakespeare.")
        .build_completion();
    let response = req.send().await.unwrap();
    println!("My bot replied with: \"{:?}\"", response);
}

General Usage

You will most likely just use the RequestBuilder to create a request, then use the SendRequest trait to send it. Right now only the completion and chat endpoints are supported. Because these two endpoints require different parameters, you build requests with the build_completion and build_chat methods respectively.

RequestBuilder can take any type that implements ToString as the model input and any type that implements Display as the API key.
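
For example, because the model only needs to implement ToString and the API key only needs to implement Display, you can pass a plain model name and a key read from the environment. The following is a minimal sketch under those assumptions; the model name and environment variable are placeholders, not values required by the crate.

use rust_gpt::RequestBuilder;
use rust_gpt::SendRequest;

#[tokio::main]
async fn main() {
    // Any Display type works as the API key, e.g. a String read from the environment.
    let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");

    // Any ToString type works as the model, e.g. a plain &str with the model name.
    let req = RequestBuilder::new("text-davinci-003", api_key)
        .prompt("Write a limerick about Ferris the crab.")
        .build_completion();

    let response = req.send().await.expect("request failed");
    println!("{:?}", response);
}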

Completion

The completion endpoint requires a prompt parameter. You can set this with the prompt method which takes any type that implements ToString.

Chat

The chat endpoint is a little more complicated. It requires a messages parameter, which is a list of messages represented by the ChatMessage struct. You can create a ChatMessage with its new method.
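
A minimal chat sketch is shown below. The exact signature of ChatMessage::new and the name of the builder method that attaches the message list are not shown on this page, so the role strings and the messages call here are assumptions; check the ChatMessage and RequestBuilder documentation for the real signatures.

use rust_gpt::ChatMessage;
use rust_gpt::RequestBuilder;
use rust_gpt::SendRequest;

#[tokio::main]
async fn main() {
    // Assumed: ChatMessage::new(role, content). Verify against the ChatMessage docs.
    let messages = vec![
        ChatMessage::new("system", "You are a helpful assistant."),
        ChatMessage::new("user", "Introduce yourself in one sentence."),
    ];

    // The model is any ToString type, so a plain model name works here.
    let req = RequestBuilder::new("gpt-3.5-turbo", "YOUR_API_KEY")
        .messages(messages) // assumed builder method for the chat `messages` parameter
        .build_chat();

    let response = req.send().await.unwrap();
    println!("{:?}", response);
}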

Additional Notes

  • The API is still in development, so there may be some breaking changes in the future.
  • The API is also not fully tested, so there may be some bugs.
  • There is a little bit of error handling, but it is not very robust.
  • serde_json is used to serialize and deserialize the responses and messages. Since many of these implementations are derived, they may not match the exact JSON structure of the API responses.

Modules

Structs

Enums

Traits

  • SendRequest: A trait for abstracting sending requests between APIs.