# redis-vl

Rust implementation of the Redis Vector Library, providing vector search, semantic caching, message history, and routing on top of Redis.

Status: pre-release (`0.1.0`). The library is functional for core workflows but has not yet reached full parity with the Python `redisvl` package. See the Parity Matrix for current coverage.
## Features

- Schema – define index schemas in YAML or JSON with typed field attributes (`Tag`, `Text`, `Numeric`, `Geo`, `Timestamp`, `Vector`), stopwords, multi-prefix support, and Hash/JSON storage selection.
- Search Index – sync and async index lifecycle: create, delete, load, fetch, search, query, batch operations, pagination, hybrid search, aggregate queries, multi-vector queries, and `from_existing`.
- Filters – composable filter DSL: `Tag`, `Text`, `Num`, `Geo`, `GeoRadius`, `Timestamp` with boolean composition (`&`, `|`, `!`).
- Queries – `VectorQuery`, `VectorRangeQuery`, `TextQuery`, `FilterQuery`, `CountQuery`, `HybridQuery` (Redis 8.4+ `FT.HYBRID`), `AggregateHybridQuery` (`FT.AGGREGATE`), and `MultiVectorQuery`.
- SQL Queries – `SQLQuery` behind the `sql` feature flag: translates SQL `SELECT` statements to Redis Search queries with `WHERE`, `ORDER BY`, `LIMIT`/`OFFSET`, field projection, aggregate functions (`COUNT`, `SUM`, `AVG`, `GROUP BY`), vector search functions (`vector_distance()`, `cosine_distance()`), and geo functions (`geo_distance()` in `WHERE` and `SELECT` clauses).
- Vectorizers – `OpenAITextVectorizer`, `LiteLLMTextVectorizer`, `CustomTextVectorizer`, `AzureOpenAITextVectorizer`, `CohereTextVectorizer`, `VoyageAITextVectorizer`, `MistralAITextVectorizer`, `VertexAITextVectorizer`, `BedrockTextVectorizer`, `AnthropicTextVectorizer` (Voyage AI-backed), and `HuggingFaceTextVectorizer` (local ONNX via `fastembed`).
- Rerankers – `CohereReranker` behind the `rerankers` feature flag with sync and async support.
- Extensions – `EmbeddingsCache`, `SemanticCache`, `MessageHistory`, `SemanticMessageHistory`, and `SemanticRouter`, all Redis-backed.
- CLI (`rvl`) – `version`, `index create/delete/destroy/info/listall`, and `stats` commands.
- Benchmarks – Criterion micro-benchmarks for schema parsing, filter rendering, query building, and Redis-backed index/search/cache operations.
## Not yet implemented

- SQL date functions (`YEAR()`), `IS NULL`/`IS NOT NULL`, `HAVING`
- Richer CLI command/flag parity (`load`, query commands)
## Quick start
Add `redis-vl` to your project:
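A typical dependency addition (assuming the crate is published to crates.io under the name `redis-vl`):

```shell
cargo add redis-vl
```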
To use only the core library without vectorizer dependencies:
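Since the default feature set pulls in vectorizer dependencies (see Feature flags below), a core-only build uses the standard Cargo pattern:

```shell
cargo add redis-vl --no-default-features
```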
### Defining a schema (YAML)
```yaml
index:
  name: my-index
  prefix: doc
  storage_type: hash

fields:
  - name: title
    type: tag
  - name: content
    type: text
  - name: score
    type: numeric
  - name: embedding
    type: vector
    attrs:
      algorithm: flat
      dims: 128
      distance_metric: cosine
      datatype: float32
```
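Schemas can equally be defined in JSON; a field-for-field JSON mirror of the YAML above (a sketch — key names follow the YAML form):

```json
{
  "index": { "name": "my-index", "prefix": "doc", "storage_type": "hash" },
  "fields": [
    { "name": "title", "type": "tag" },
    { "name": "content", "type": "text" },
    { "name": "score", "type": "numeric" },
    {
      "name": "embedding",
      "type": "vector",
      "attrs": {
        "algorithm": "flat",
        "dims": 128,
        "distance_metric": "cosine",
        "datatype": "float32"
      }
    }
  ]
}
```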
### Creating an index and loading data

```rust
// Illustrative sketch — exact module paths and signatures may differ; see docs.rs.
use redis_vl::{IndexSchema, SearchIndex};
use serde_json::json;

let schema = IndexSchema::from_yaml_file("schema.yaml").unwrap();
let index = SearchIndex::new(schema, "redis://127.0.0.1:6379").unwrap();
index.create().unwrap();

let docs = vec![
    json!({ "title": "greeting", "content": "hello world", "score": 1.0 }),
];
index.load(&docs).unwrap();
```
### Running a vector query

```rust
// Illustrative sketch — exact module paths and signatures may differ; see docs.rs.
use redis_vl::{IndexSchema, SearchIndex, VectorQuery};

let schema = IndexSchema::from_yaml_file("schema.yaml").unwrap();
let index = SearchIndex::new(schema, "redis://127.0.0.1:6379").unwrap();

// Query vector; length must match the schema's `dims: 128`.
let vector: Vec<f32> = vec![0.1; 128];
let query = VectorQuery::new("embedding", &vector)
    .with_return_fields(["title", "score"]);

let result = index.search(&query).unwrap();
println!("total results: {}", result.total);
for doc in &result.docs {
    println!("{doc:?}");
}
```
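For intuition about the `distance_metric: cosine` configured in the schema, the cosine distance a vector query ranks by is 1 minus cosine similarity; a minimal standalone sketch (not the library's implementation):

```rust
// Cosine distance: dist(a, b) = 1 - (a·b) / (|a| * |b|).
// 0 means identical direction; 1 means orthogonal; 2 means opposite.
fn cosine_distance(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    1.0 - dot / (norm_a * norm_b)
}

fn main() {
    println!("{}", cosine_distance(&[1.0, 0.0], &[0.0, 1.0])); // orthogonal → 1
    println!("{}", cosine_distance(&[3.0, 4.0], &[3.0, 4.0])); // identical → 0
}
```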
### Composing filters

```rust
// Illustrative sketch — exact constructor names may differ; see docs.rs.
use redis_vl::filter::{Num, Tag, Text};

// Combine with & (AND), | (OR), and ! (NOT)
let filter = Tag::new("title").eq("greeting")
    & Num::new("score").between(0.0, 10.0);

let text_filter = Text::new("content").eq("hello")
    | Text::new("content").eq("world");
```
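Under the hood, such filters render to Redis Search query syntax. A standalone sketch of that rendering (illustrative only — the crate's actual output may differ in detail):

```rust
// Tag equality renders as @field:{value}.
fn tag_eq(field: &str, value: &str) -> String {
    format!("@{}:{{{}}}", field, value)
}

// Numeric range renders as @field:[lo hi].
fn num_between(field: &str, lo: f64, hi: f64) -> String {
    format!("@{}:[{} {}]", field, lo, hi)
}

// Conjunction is whitespace in Redis Search; parentheses group the clause.
fn and(lhs: &str, rhs: &str) -> String {
    format!("({} {})", lhs, rhs)
}

fn main() {
    let rendered = and(&tag_eq("title", "greeting"), &num_between("score", 0.0, 10.0));
    println!("{}", rendered); // (@title:{greeting} @score:[0 10])
}
```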
### Message history

```rust
// Illustrative sketch — exact signatures may differ; see docs.rs.
use redis_vl::extensions::MessageHistory;

let history = MessageHistory::new("my-session", "redis://127.0.0.1:6379").unwrap();
history.add_message("user", "What is a vector index?").unwrap();
history.add_message("assistant", "A data structure for similarity search.").unwrap();

let recent = history.get_recent(5).unwrap();
for msg in &recent {
    println!("{}: {}", msg.role, msg.content);
}
```
### Semantic router

```rust
// Illustrative sketch — exact signatures may differ; see docs.rs.
use redis_vl::extensions::{Route, RoutingConfig, SemanticRouter};

let routes = vec![
    Route::new("greeting", ["hello", "hi there"]),
    Route::new("farewell", ["goodbye", "see you later"]),
];
// Create with a vectorizer:
// let router = SemanticRouter::new(vectorizer, routes, "my-router", "redis://...", RoutingConfig::default());
// let result = router.route(Some("howdy!"), None).unwrap();
```
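Conceptually, semantic routing picks the route whose reference embedding is most similar to the query's embedding. A minimal standalone sketch of that selection step (not the library's implementation; embeddings hand-rolled for illustration):

```rust
// Cosine similarity between two embeddings.
fn cosine_sim(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

// Return the name of the route most similar to the query embedding.
fn best_route<'a>(query: &[f32], routes: &'a [(&'a str, Vec<f32>)]) -> &'a str {
    routes
        .iter()
        .max_by(|(_, a), (_, b)| {
            cosine_sim(query, a).partial_cmp(&cosine_sim(query, b)).unwrap()
        })
        .map(|(name, _)| *name)
        .unwrap()
}

fn main() {
    let routes = vec![("greeting", vec![1.0, 0.0]), ("farewell", vec![0.0, 1.0])];
    println!("{}", best_route(&[0.9, 0.1], &routes)); // greeting
}
```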
### Redis 8.4+ hybrid search

Redis 8.4 introduces `FT.HYBRID` for combined text + vector search:

```rust
// Illustrative sketch — exact signatures and variant names may differ; see docs.rs.
use redis_vl::query::{CombinationMethod, HybridQuery};

let vector: Vec<f32> = vec![0.1; 128];
let query = HybridQuery::new("content", "hello world", "embedding", &vector)
    .with_num_results(10)
    .with_combination_method(CombinationMethod::Rrf)
    .with_return_fields(["title", "score"]);
```

Note: Hybrid and multi-vector queries require Redis 8.4+. See the user guide for `AggregateHybridQuery` and `MultiVectorQuery` details.
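For intuition, a common way to combine the text and vector result lists in hybrid search is reciprocal rank fusion (RRF). A standalone sketch of the idea (not the library's implementation):

```rust
use std::collections::HashMap;

// Reciprocal rank fusion: score(doc) = Σ over result lists of 1 / (k + rank + 1).
// Documents ranked well in several lists accumulate the highest fused score.
fn rrf(lists: &[Vec<&str>], k: f64) -> Vec<(String, f64)> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in lists {
        for (rank, doc) in list.iter().enumerate() {
            *scores.entry((*doc).to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    let mut fused: Vec<(String, f64)> = scores.into_iter().collect();
    fused.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    fused
}

fn main() {
    // A text-search ranking and a vector-search ranking of the same corpus:
    let text_hits = vec!["a", "b", "c"];
    let vector_hits = vec!["b", "d", "a"];
    let fused = rrf(&[text_hits, vector_hits], 60.0);
    println!("{}", fused[0].0); // "b" fuses best: near the top of both lists
}
```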
## CLI
Install the `rvl` binary:
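Assuming `rvl` ships as the crate's binary target, a typical install looks like this (verify against the crate's publishing notes):

```shell
cargo install redis-vl
```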
Set `REDIS_URL` or pass `--redis-url` to override the default `redis://127.0.0.1:6379`.
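The commands listed under Features can then be invoked along these lines (a sketch — exact flags and arguments may vary):

```shell
rvl version
rvl index listall
rvl stats --redis-url redis://127.0.0.1:6379
```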
## Feature flags

| Flag | Default | Description |
|---|---|---|
| `openai` | ✓ | OpenAI-compatible vectorizer |
| `litellm` | ✓ | LiteLLM vectorizer (requires `openai`) |
| `azure-openai` | | Azure OpenAI vectorizer |
| `cohere` | | Cohere vectorizer |
| `voyageai` | | VoyageAI vectorizer |
| `mistral` | | Mistral vectorizer |
| `vertex-ai` | | Google Vertex AI vectorizer |
| `bedrock` | | AWS Bedrock vectorizer |
| `anthropic` | | Anthropic adapter (Voyage AI-backed; requires `voyageai`) |
| `hf-local` | | HuggingFace local ONNX embedding via `fastembed` |
| `sql` | | SQL query support (`SQLQuery`) |
| `rerankers` | | Reranker support (`CohereReranker`) |
## Examples

See the `examples/` directory for runnable code samples:

| Example | Description |
|---|---|
| `schema_basics` | Parse and validate index schemas from YAML |
| `filter_basics` | Build and compose filter expressions |
| `vector_search` | Create an index, load data, and run vector queries |
| `semantic_cache_basics` | Set up a semantic LLM response cache |
| `message_history_basics` | Store and retrieve conversation messages |
| `semantic_router_basics` | Route text to predefined categories |
| `sql_query_basics` | Translate SQL to Redis Search queries (requires `sql` feature) |
## Benchmarks
Criterion micro-benchmarks cover schema parsing, filter rendering, query building, and Redis-backed operations (index lifecycle, search, cache, history).
To include the Redis-backed benchmarks, run against a live server:

```shell
REDISVL_RUN_INTEGRATION=1 cargo bench
```
See `benches/README.md` for the full benchmark inventory.
## Development
Integration tests require a running Redis instance with the Search module (Redis 8+ or Redis Stack):
```shell
REDISVL_RUN_INTEGRATION=1 cargo test
```
Redis 8.4+ hybrid/aggregate/multi-vector tests additionally require a Redis 8.4 server.
## Documentation
- API Reference (docs.rs) – auto-generated Rustdoc
- User Guide (mdBook) – getting started, schema, queries, extensions, CLI
- Parity Matrix – feature-level tracking against Python RedisVL
- Publishing Guide – crates.io, docs, and release workflow notes
## License
MIT - see the LICENSE file for details.