---
description: "Implement a RAG pipeline with Rig (Rust)"
---

Implement a robust Retrieval-Augmented Generation (RAG) system using the `rig` crate (Rig.rs).
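A minimal dependency sketch for such a project (versions are illustrative assumptions; pin to the release you target — note the crate is published as `rig-core` but imported as `rig`):

```toml
# Illustrative Cargo.toml fragment; version numbers are assumptions.
[dependencies]
rig-core = "0.x"   # published as `rig-core`, imported as `rig`
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
anyhow = "1"
```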

## Requirements
{{args}}

## Implementation Steps
1.  **Document Loading:** Ingest raw data with crates like `pdf-extract` for PDFs, or plain `std::fs` for text files.
2.  **Chunking:** Implement a chunking strategy (e.g., fixed-size with overlap, semantic chunking) to preserve context.
3.  **Embeddings:** Initialize an `EmbeddingModel` (e.g., OpenAI, Cohere, or local via `candle`).
4.  **Vector Store:** Set up a vector store (e.g., Qdrant via `rig-qdrant`, LanceDB via `rig-lancedb`, or the built-in `InMemoryVectorStore` for prototypes).
5.  **Indexing:** Create an index from the vector store for efficient retrieval.
6.  **Agent Construction:** Build a `Rig` agent with:
    *   **Preamble:** Strict system instructions (grounding).
    *   **Dynamic Context:** Attach the index using `.dynamic_context()`.
7.  **Execution:** Run the agent with user queries.
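
The fixed-size-with-overlap strategy from step 2 can be sketched in plain Rust with no `rig` dependency; `chunk_text` and its character-based sizing are illustrative choices, not part of the rig API:

```rust
/// Split `text` into chunks of at most `chunk_size` characters,
/// with each chunk overlapping the previous one by `overlap` characters.
/// (Illustrative sketch; production code would chunk on tokens or sentences.)
fn chunk_text(text: &str, chunk_size: usize, overlap: usize) -> Vec<String> {
    assert!(overlap < chunk_size, "overlap must be smaller than chunk_size");
    let chars: Vec<char> = text.chars().collect();
    let step = chunk_size - overlap;
    let mut chunks: Vec<String> = Vec::new();
    let mut start = 0;
    while start < chars.len() {
        let end = (start + chunk_size).min(chars.len());
        chunks.push(chars[start..end].iter().collect());
        if end == chars.len() {
            break; // last chunk reached the end of the text
        }
        start += step;
    }
    chunks
}

fn main() {
    let chunks = chunk_text("abcdefghij", 4, 1);
    println!("{:?}", chunks); // → ["abcd", "defg", "ghij"]
}
```

Each chunk repeats the tail of its predecessor, so a sentence cut at a chunk boundary still appears intact in at least one chunk — the property the overlap exists to preserve.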

## Example Reference (Conceptual Rig 0.x)
```rust
use rig::{
    completion::Prompt,
    embeddings::EmbeddingsBuilder,
    providers::openai::{Client, GPT_4, TEXT_EMBEDDING_ADA_002},
    vector_store::{in_memory_store::InMemoryVectorStore, VectorStore},
};

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    // 1. Set up the client and embedding model
    let openai_client = Client::from_env();
    let embedding_model = openai_client.embedding_model(TEXT_EMBEDDING_ADA_002);

    // 2. Embed documents (each document needs a unique id)
    let embeddings = EmbeddingsBuilder::new(embedding_model.clone())
        .simple_document("doc0", "Rust is a systems language focused on safety.")
        .simple_document("doc1", "Rig is a library for building LLM apps in Rust.")
        .build()
        .await?;

    // 3. Create the vector store and an index over it for retrieval
    let mut vector_store = InMemoryVectorStore::default();
    vector_store.add_documents(embeddings).await?;
    let index = vector_store.index(embedding_model);

    // 4. Build the RAG agent: grounding preamble + dynamic context
    let agent = openai_client
        .agent(GPT_4)
        .preamble("You are a helpful assistant. Answer ONLY based on the context provided.")
        .dynamic_context(1, index) // retrieve the top-1 matching document per query
        .build();

    // 5. Query
    let response = agent.prompt("What is Rig?").await?;
    println!("Response: {}", response);

    Ok(())
}
```