§API Resources Module
This module groups together the various API resource modules that correspond to different OpenAI endpoints, such as models, completions, chat, embeddings, etc. Each sub-module provides high-level functions and data structures for interacting with a specific set of endpoints.
§Currently Implemented
- models: Retrieve and list available models
- completions: Generate text completions
- chat: Handle chat-based completions (ChatGPT)
- embeddings: Obtain vector embeddings for text
- moderations: Check text for policy violations
- fine_tunes: Manage fine-tuning jobs
- files: Upload and manage files
§Planned Modules
§Example
use chat_gpt_lib_rs::OpenAIClient;
use chat_gpt_lib_rs::api_resources::models;
use chat_gpt_lib_rs::error::OpenAIError;
#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    let client = OpenAIClient::new(None)?;
    // Example: list and retrieve models
    let model_list = models::list_models(&client).await?;
    // Indexing would panic on an empty list, so fail with a clear message instead.
    let first_model = model_list.first().expect("expected at least one available model");
    let model_details = models::retrieve_model(&client, &first_model.id).await?;
    println!("Model details: {:?}", model_details);
    Ok(())
}
Modules§
- chat: This module provides functionality for creating chat-based completions using the OpenAI Chat Completions API (sketch below).
- completions: This module provides functionality for creating text completions using the OpenAI Completions API.
- embeddings: This module provides functionality for creating embeddings using the OpenAI Embeddings API (sketch below).
- files: This module provides functionality for working with files using the OpenAI Files API.
- fine_tunes: This module provides functionality for working with fine-tuning jobs using the OpenAI Fine-tunes API.
- models: This module provides functionality for interacting with the OpenAI Models API.
- moderations: This module provides functionality for classifying text against OpenAI’s content moderation policies using the OpenAI Moderations API (sketch below).
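§Usage Sketches
The sketches below illustrate how the chat, embeddings, and moderations modules might be called, following the pattern of the models example above. The request types, field names, and functions they use (CreateChatCompletionRequest, chat::create_chat_completion, and so on) are assumptions for illustration, not confirmed parts of this crate's API; consult each module's documentation for the exact names.
use chat_gpt_lib_rs::OpenAIClient;
use chat_gpt_lib_rs::api_resources::chat::{self, ChatMessage, CreateChatCompletionRequest, Role};
use chat_gpt_lib_rs::error::OpenAIError;
#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    let client = OpenAIClient::new(None)?;
    // Hypothetical request type and fields; see the `chat` module docs for the real API.
    let request = CreateChatCompletionRequest {
        model: "gpt-3.5-turbo".into(),
        messages: vec![ChatMessage {
            role: Role::User,
            content: "Say hello in one short sentence.".into(),
            name: None,
        }],
        ..Default::default()
    };
    // Hypothetical function name, mirroring `models::list_models` above.
    let response = chat::create_chat_completion(&client, &request).await?;
    println!("First choice: {:?}", response.choices.first());
    Ok(())
}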
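An embeddings request might look like the following sketch; embeddings::create_embeddings, CreateEmbeddingsRequest, and the shape of the response's data field are likewise assumptions.
use chat_gpt_lib_rs::OpenAIClient;
use chat_gpt_lib_rs::api_resources::embeddings::{self, CreateEmbeddingsRequest};
use chat_gpt_lib_rs::error::OpenAIError;
#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    let client = OpenAIClient::new(None)?;
    // Hypothetical request type and fields; check the `embeddings` module docs.
    let request = CreateEmbeddingsRequest {
        model: "text-embedding-ada-002".into(),
        input: vec!["The food was delicious.".to_string()],
        ..Default::default()
    };
    let response = embeddings::create_embeddings(&client, &request).await?;
    println!("Received {} embedding vector(s)", response.data.len());
    Ok(())
}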
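Finally, a moderation check might look like this sketch; moderations::create_moderation, CreateModerationRequest, and the results/flagged fields are assumed names based on the shape of the OpenAI Moderations API response.
use chat_gpt_lib_rs::OpenAIClient;
use chat_gpt_lib_rs::api_resources::moderations::{self, CreateModerationRequest};
use chat_gpt_lib_rs::error::OpenAIError;
#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    let client = OpenAIClient::new(None)?;
    // Hypothetical request type; check the `moderations` module docs for the real names.
    let request = CreateModerationRequest {
        input: "Some user-provided text to screen.".into(),
        ..Default::default()
    };
    let response = moderations::create_moderation(&client, &request).await?;
    for result in &response.results {
        println!("Flagged: {}", result.flagged);
    }
    Ok(())
}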