Crate chat_gpt_lib_rs

§OpenAI Rust Client Library

This crate provides a Rust client library for the OpenAI API, implemented in an idiomatic Rust style. It aims to mirror the functionality of other official and community OpenAI client libraries, while leveraging Rust’s strong type system, async capabilities, and error handling.

§Getting Started

Add the following to your Cargo.toml:

[dependencies]
chat-gpt-lib-rs = "" // latest and greatest version

Then in your code:

use chat_gpt_lib_rs::{OpenAIClient, OpenAIError};

#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    // Create the client with an explicit API key; pass `None` to fall back to
    // the OPENAI_API_KEY environment variable.
    let client = OpenAIClient::new(Some("sk-...".to_string()))?;

    // Now you can make calls like:
    // let response = client.create_completion("text-davinci-003", "Hello, world!", 50, 0.7).await?;
    // println!("Completion: {}", response);

    Ok(())
}

§Environment Variables

By default, this library reads your OpenAI API key from the OPENAI_API_KEY environment variable. If you wish to pass in a key at runtime, use the OpenAIClient::new constructor with an explicit key.
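
For example, a minimal sketch covering both paths (the `None` branch is an assumption, based on the default behaviour described above, and the parameter name is illustrative):

use chat_gpt_lib_rs::{OpenAIClient, OpenAIError};

fn build_client(runtime_key: Option<String>) -> Result<OpenAIClient, OpenAIError> {
    match runtime_key {
        // An explicit key supplied at runtime (e.g. loaded from a secrets manager).
        Some(key) => OpenAIClient::new(Some(key)),
        // No key supplied: assumed to fall back to the OPENAI_API_KEY environment variable.
        None => OpenAIClient::new(None),
    }
}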

§Features

  • Async-first – Uses Tokio and Reqwest
  • JSON Serialization – Powered by Serde
  • Custom Error Handling – Utilizes thiserror for ergonomic error types (see the sketch after this list)
  • Configurable – Customize timeouts, organization IDs, or other settings
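
Because the errors are derived with thiserror, OpenAIError is expected to implement Display and std::error::Error, so failures can be reported or propagated without matching on specific variants. A minimal sketch:

use chat_gpt_lib_rs::OpenAIClient;

fn main() {
    // Any failure during construction (e.g. a missing or malformed API key)
    // surfaces as an OpenAIError, which can be printed via its Display impl.
    match OpenAIClient::new(Some("sk-...".to_string())) {
        Ok(_client) => println!("client ready"),
        Err(e) => eprintln!("could not create client: {e}"),
    }
}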

§Roadmap

  1. Implement all major endpoints (e.g. Completions, Chat, Embeddings, Files, Fine-tunes, Moderations).
  2. Provide detailed logging with log and env_logger (see the sketch after this list).
  3. Offer improved error handling by parsing OpenAI error fields directly.
  4. Provide thorough documentation and examples.
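
For the logging item above, the crate would emit records through the log facade, and a consuming application would choose a backend. A minimal sketch with env_logger (assumes the consumer adds log and env_logger to its own dependencies):

fn main() {
    // Initialize env_logger in the consuming binary; verbosity is controlled
    // with the RUST_LOG environment variable, e.g. RUST_LOG=chat_gpt_lib_rs=debug.
    env_logger::init();

    log::info!("logger initialized; library log records will appear once emitted");
}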

§Contributing

Contributions to this project are more than welcome! Feel free to open issues, submit pull requests, or suggest improvements. Please see our GitHub repository for more details.

Re-exports§

pub use config::OpenAIClient;
pub use error::OpenAIError;

Modules§

api
Low-level request and response logic built on reqwest. Handles authentication headers, organization headers, error parsing, and JSON (de)serialization for calls to the OpenAI API.
api_resources
API Resources Module
config
Central configuration for creating the OpenAIClient, including API keys, organization IDs, timeouts, base URLs, and environment-variable helpers.
error
Custom error types for the library, covering all errors that may arise when interacting with the OpenAI API.