Module embeddings

This module provides functionality for creating embeddings using the OpenAI Embeddings API.

The Embeddings API takes text or tokenized text as input and returns a vector representation (embedding) that can be used for tasks such as similarity search, clustering, or classification, typically in combination with a vector database.
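
For example, two returned vectors can be compared with cosine similarity to estimate how semantically close their source texts are. The helper below is a minimal, illustrative sketch of that comparison; it is not part of this crate and simply assumes the embedding vectors are equal-length slices of f32 values.

/// Cosine similarity between two embedding vectors (illustrative helper,
/// not part of this crate). Returns a value in [-1.0, 1.0]; higher means
/// the underlying texts are more semantically similar.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

fn main() {
    // Illustrative vectors only; real embeddings come from the API response.
    let a = vec![0.10_f32, 0.20, 0.30];
    let b = vec![0.10_f32, 0.25, 0.28];
    println!("cosine similarity: {:.4}", cosine_similarity(&a, &b));
}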

§Overview

The core usage involves calling create_embeddings with a CreateEmbeddingsRequest, which includes the model name (e.g., "text-embedding-ada-002") and the input text(s).

use chat_gpt_lib_rs::api_resources::embeddings::{create_embeddings, CreateEmbeddingsRequest, EmbeddingsInput};
use chat_gpt_lib_rs::error::OpenAIError;
use chat_gpt_lib_rs::OpenAIClient;

#[tokio::main]
async fn main() -> Result<(), OpenAIError> {
    let client = OpenAIClient::new(None)?; // Reads API key from OPENAI_API_KEY

    let request = CreateEmbeddingsRequest {
        model: "text-embedding-ada-002".into(),
        input: EmbeddingsInput::String("Hello world".to_string()),
        user: None,
    };

    let response = create_embeddings(&client, &request).await?;
    for (i, emb) in response.data.iter().enumerate() {
        println!("Embedding #{}: vector size = {}", i, emb.embedding.len());
    }
    println!("Model used: {:?}", response.model);
    if let Some(usage) = &response.usage {
        println!("Usage => prompt_tokens: {}, total_tokens: {}",
            usage.prompt_tokens, usage.total_tokens);
    }

    Ok(())
}

Structs§

CreateEmbeddingsRequest
A request struct for creating embeddings with the OpenAI API.
CreateEmbeddingsResponse
The response returned by the OpenAI Embeddings API.
EmbeddingData
The embedding result for a single input item.
EmbeddingsUsage
Usage statistics for an embeddings request, if provided by the API.

Enums§

EmbeddingsInput
Represents the different ways the input can be supplied for embeddings, e.g., a single string or a batch of strings.

Functions§

create_embeddings
Creates embeddings using the OpenAI Embeddings API.
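
As a minimal sketch of how create_embeddings might be wrapped for convenience, the helper below embeds a single string and returns its raw vector. It is not part of this crate's public API; it assumes the request and response shapes shown in the overview example, that response.data is a Vec, and that the embedding field is a Vec<f32> (consult the EmbeddingData docs for the exact types).

use chat_gpt_lib_rs::api_resources::embeddings::{
    create_embeddings, CreateEmbeddingsRequest, EmbeddingsInput,
};
use chat_gpt_lib_rs::error::OpenAIError;
use chat_gpt_lib_rs::OpenAIClient;

/// Illustrative convenience wrapper (not part of this crate): embeds a single
/// string and returns its vector, or an empty Vec if the API returned no data.
async fn embed_one(client: &OpenAIClient, text: &str) -> Result<Vec<f32>, OpenAIError> {
    let request = CreateEmbeddingsRequest {
        model: "text-embedding-ada-002".into(),
        input: EmbeddingsInput::String(text.to_string()),
        user: None,
    };
    let response = create_embeddings(client, &request).await?;
    Ok(response
        .data
        .into_iter()
        .next()
        .map(|d| d.embedding)
        .unwrap_or_default())
}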