Simple example:
```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.model("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```
Modules
- This module contains the implementation of the `Agent` struct and its builder (a usage sketch follows this list).
- This module contains the implementation of the completion functionality for the LLM (Large Language Model) chat interface. It provides traits, structs, and enums for generating completion requests, handling completion responses, and defining completion models.
- This module provides functionality for working with embeddings and embedding models (see the sketch after this list). Embeddings are numerical representations of documents or other objects, typically used in natural language processing (NLP) tasks such as text classification, information retrieval, and document similarity.
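A minimal sketch of the agent builder pattern described above. The `agent(...)`, `preamble(...)`, and `build()` calls are assumed from the builder-style API this page describes and may differ between crate versions; the resulting agent is prompted through the same `Prompt` trait as the model in the first example.

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    let openai_client = openai::Client::from_env();

    // Assumed builder API: configure an Agent with a preamble (system prompt).
    let comedian = openai_client
        .agent("gpt-4")
        .preamble("You are a comedian here to entertain the user.")
        .build();

    // Agents expose the same Prompt interface used in the example above.
    let response = comedian
        .prompt("Entertain me!")
        .await
        .expect("Failed to prompt the agent");

    println!("{response}");
}
```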
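A minimal sketch of batch-embedding documents with the embeddings module. The `embedding_model(...)` constructor, `EmbeddingsBuilder`, and the `simple_document(...)` helper are assumptions based on the builder conventions used elsewhere in the crate and are not confirmed by this page; names may differ by version.

```rust
use rig::{embeddings::EmbeddingsBuilder, providers::openai};

#[tokio::main]
async fn main() {
    let openai_client = openai::Client::from_env();

    // Assumed names: embedding model constructor and `simple_document`
    // helper are illustrative placeholders.
    let model = openai_client.embedding_model("text-embedding-ada-002");

    // Embed a small batch of documents in one call.
    let embeddings = EmbeddingsBuilder::new(model)
        .simple_document("doc0", "Rig is a Rust library for building LLM-powered applications.")
        .simple_document("doc1", "Embeddings are numerical representations of documents.")
        .build()
        .await
        .expect("Failed to embed documents");

    println!("Embedded {} documents", embeddings.len());
}
```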