#![cfg_attr(docsrs, feature(doc_cfg))]
#![cfg_attr(
    test,
    allow(
        clippy::expect_used,
        clippy::indexing_slicing,
        clippy::panic,
        clippy::unwrap_used,
        clippy::unreachable
    )
)]
//! Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.
//!
//! # Table of contents
//! - [High-level features](#high-level-features)
//! - [Simple example](#simple-example)
//! - [Core concepts](#core-concepts)
//! - [Integrations](#integrations)
//!
//! # High-level features
//! - Full support for LLM completion and embedding workflows
//! - Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, in-memory)
//! - Integrate LLMs in your app with minimal boilerplate
//!
//! # Simple example
//! ```
//! use rig::{client::{CompletionClient, ProviderClient}, completion::Prompt, providers::openai};
//!
//! #[tokio::main]
//! async fn main() -> Result<(), Box<dyn std::error::Error>> {
//!     // Create OpenAI client and agent.
//!     // This requires the `OPENAI_API_KEY` environment variable to be set.
//!     let openai_client = openai::Client::from_env()?;
//!
//!     let gpt4 = openai_client.agent("gpt-4").build();
//!
//!     // Prompt the model and print its response
//!     let response = gpt4
//!         .prompt("Who are you?")
//!         .await
//!         .expect("Failed to prompt GPT-4");
//!
//!     println!("GPT-4: {response}");
//!
//!     Ok(())
//! }
//! ```
//! Note: using `#[tokio::main]` requires you to enable tokio's `macros` and `rt-multi-thread` features,
//! or just `full` to enable all features (`cargo add tokio --features macros,rt-multi-thread`).
//!
//! # Core concepts
//! ## Completion and embedding models
//! Rig provides a consistent API for working with LLMs and embeddings. Specifically,
//! each provider (e.g. OpenAI, Cohere) has a `Client` struct that can be used to initialize completion
//! and embedding models. These models implement the [CompletionModel](crate::completion::CompletionModel)
//! and [EmbeddingModel](crate::embeddings::EmbeddingModel) traits respectively, which provide a common,
//! low-level interface for creating completion and embedding requests and executing them.
//!
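//! As a sketch, getting model handles from a provider client looks like the following
//! (hedged example: the model names and the `CompletionClient`/`EmbeddingsClient`
//! helper traits used here are assumptions about the client API):
//! ```no_run
//! use rig::client::{CompletionClient, EmbeddingsClient, ProviderClient};
//! use rig::providers::openai;
//!
//! // Requires the `OPENAI_API_KEY` environment variable to be set.
//! let openai_client = openai::Client::from_env().expect("failed to create client");
//!
//! // Low-level model handles; requests can be built and executed against these.
//! let completion_model = openai_client.completion_model("gpt-4");
//! let embedding_model = openai_client.embedding_model("text-embedding-ada-002");
//! ```
//!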
//! ## Agents
//! Rig also provides high-level abstractions over LLMs in the form of the [Agent](crate::agent::Agent) type.
//!
//! The [Agent](crate::agent::Agent) type can be used to create anything from simple agents that use vanilla models to full-blown
//! RAG systems that answer questions using a knowledge base.
//!
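//! For example, a minimal agent with a preamble (system prompt) can be built and
//! prompted as follows (a sketch; the preamble and prompt text are illustrative):
//! ```no_run
//! use rig::client::{CompletionClient, ProviderClient};
//! use rig::completion::Prompt;
//! use rig::providers::openai;
//!
//! # async fn run() -> Result<(), Box<dyn std::error::Error>> {
//! let openai_client = openai::Client::from_env()?;
//!
//! // Build an agent from a base model plus a system prompt.
//! let agent = openai_client
//!     .agent("gpt-4")
//!     .preamble("You are a concise assistant.")
//!     .build();
//!
//! let answer = agent.prompt("Why is the sky blue?").await?;
//! println!("{answer}");
//! # Ok(())
//! # }
//! ```
//!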
//! ## Vector stores and indexes
//! Rig provides a common interface for working with vector stores and indexes. Specifically, the library
//! provides the [VectorStoreIndex](crate::vector_store::VectorStoreIndex)
//! trait, which can be implemented to define vector store indices.
//! These can then be used as the knowledge base for a RAG-enabled [Agent](crate::agent::Agent), or
//! as a source of context documents in a custom architecture that uses multiple LLMs or agents.
//!
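//! As a sketch, a RAG agent can be assembled by attaching an index as dynamic context
//! (assumes an existing `openai_client` and an `index` value implementing
//! [VectorStoreIndex](crate::vector_store::VectorStoreIndex); the sample count of 1 is illustrative):
//! ```ignore
//! // Retrieve the top 1 matching document per prompt and inject it as context.
//! let rag_agent = openai_client
//!     .agent("gpt-4")
//!     .preamble("Answer the question using the attached context documents.")
//!     .dynamic_context(1, index)
//!     .build();
//! ```
//!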
//! # Integrations
//! ## Model providers
//! Rig natively supports the following completion and embedding model provider integrations:
//! - Anthropic
//! - Azure
//! - Cohere
//! - DeepSeek
//! - Galadriel
//! - Gemini
//! - Groq
//! - Hugging Face
//! - Hyperbolic
//! - Mira
//! - Mistral
//! - Moonshot
//! - Ollama
//! - OpenAI
//! - OpenRouter
//! - Perplexity
//! - Together
//! - Voyage AI
//! - xAI
//!
//! You can also implement your own model provider integration by defining types that
//! implement the [CompletionModel](crate::completion::CompletionModel) and [EmbeddingModel](crate::embeddings::EmbeddingModel) traits.
//!
//! Vector stores are available as separate companion crates:
//!
//! - MongoDB: [`rig-mongodb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-mongodb)
//! - LanceDB: [`rig-lancedb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-lancedb)
//! - Neo4j: [`rig-neo4j`](https://github.com/0xPlaygrounds/rig/tree/main/rig-neo4j)
//! - Qdrant: [`rig-qdrant`](https://github.com/0xPlaygrounds/rig/tree/main/rig-qdrant)
//! - SQLite: [`rig-sqlite`](https://github.com/0xPlaygrounds/rig/tree/main/rig-sqlite)
//! - SurrealDB: [`rig-surrealdb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-surrealdb)
//! - Milvus: [`rig-milvus`](https://github.com/0xPlaygrounds/rig/tree/main/rig-milvus)
//! - ScyllaDB: [`rig-scylladb`](https://github.com/0xPlaygrounds/rig/tree/main/rig-scylladb)
//! - AWS S3Vectors: [`rig-s3vectors`](https://github.com/0xPlaygrounds/rig/tree/main/rig-s3vectors)
//!
//! You can also implement your own vector store integration by defining types that
//! implement the [VectorStoreIndex](crate::vector_store::VectorStoreIndex) trait.
//!
//! The following model providers are also available as separate companion crates:
//!
//! - Fastembed: [`rig-fastembed`](https://github.com/0xPlaygrounds/rig/tree/main/rig-fastembed)
//! - Eternal AI: [`rig-eternalai`](https://github.com/0xPlaygrounds/rig/tree/main/rig-eternalai)
//!

extern crate self as rig;

pub mod agent;
#[cfg(feature = "audio")]
#[cfg_attr(docsrs, doc(cfg(feature = "audio")))]
pub mod audio_generation;
pub mod client;
pub mod completion;
pub mod embeddings;

#[cfg(feature = "experimental")]
#[cfg_attr(docsrs, doc(cfg(feature = "experimental")))]
pub mod evals;
pub mod extractor;
pub mod http_client;
#[cfg(feature = "image")]
#[cfg_attr(docsrs, doc(cfg(feature = "image")))]
pub mod image_generation;
pub mod integrations;
pub(crate) mod json_utils;
pub mod loaders;
pub mod markers;
pub mod model;
pub mod one_or_many;
pub mod pipeline;
pub mod prelude;
pub mod providers;

pub mod streaming;
pub mod tool;
pub mod tools;
pub mod transcription;
pub mod vector_store;
pub mod wasm_compat;

// Re-export commonly used types and traits
pub use completion::message;
pub use embeddings::Embed;
pub use extractor::ExtractionResponse;
pub use one_or_many::{EmptyListError, OneOrMany};

#[cfg(feature = "derive")]
#[cfg_attr(docsrs, doc(cfg(feature = "derive")))]
pub use rig_derive::{Embed, rig_tool as tool_macro};

pub mod telemetry;