# OpenMemory Rust SDK
[crates.io](https://crates.io/crates/openmemory) • [docs.rs](https://docs.rs/openmemory) • [MIT license](https://opensource.org/licenses/MIT)
[Report Bug](https://github.com/honeymaro/openmemory-rs/issues) • [Request Feature](https://github.com/honeymaro/openmemory-rs/issues)
Local-first long-term memory engine for AI apps and agents. **Self-hosted. Explainable. Scalable.**
A Rust port of the [OpenMemory JavaScript SDK](https://github.com/caviraOSS/openmemory) with native performance.
---
## Quick Start
Add to your `Cargo.toml`:
```toml
[dependencies]
openmemory = "0.1"
tokio = { version = "1", features = ["full"] }
```
```rust
use openmemory::{OpenMemory, AddOptions, QueryOptions};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mem = OpenMemory::new(None).await?;

    // Add a memory
    let result = mem.add(
        "I'm building a web app with OpenMemory",
        AddOptions::default(),
    ).await?;
    println!("Added memory: {}", result.id);

    // Query memories
    let results = mem.query("What am I building?", QueryOptions::default()).await?;
    for r in results {
        println!("[{:.2}] {}", r.score, r.content);
    }

    Ok(())
}
```
**That's it.** You're now running a fully local cognitive memory engine 🎉
---
## Features
✅ **Local-first** - Runs entirely on your machine, zero external dependencies
✅ **Multi-sector memory** - Episodic, Semantic, Procedural, Emotional, Reflective
✅ **Memory decay** - Adaptive forgetting with sector-specific rates
✅ **Waypoint graph** - Associative recall paths via BFS expansion
✅ **Hybrid search** - Vector similarity + keyword filtering
✅ **Zero config** - Works out of the box with sensible defaults
✅ **Native performance** - Rust-powered speed and memory safety
---
## Configuration
### Basic Configuration
```rust
use openmemory::{OpenMemory, Config, EmbeddingKind, Tier};
use std::path::PathBuf;
let config = Config::builder()
    .db_path(PathBuf::from("./data/memory.db"))
    .tier(Tier::Smart)
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();
let mem = OpenMemory::new(Some(config)).await?;
```
### Embedding Providers
#### Synthetic (Testing/Development)
```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Synthetic)
    .build();
```
#### OpenAI (Recommended for Production)
```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::OpenAI)
    .openai_key("sk-...".to_string())
    .openai_model("text-embedding-3-small".to_string())
    .build();
```
#### Gemini
```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Gemini)
    .gemini_key("your-api-key".to_string())
    .build();
```
#### Ollama (Fully Local)
```rust
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Ollama)
    .ollama_url("http://localhost:11434".to_string())
    .ollama_model("llama3".to_string())
    .build();
```
#### AWS Bedrock
Enable the `aws` feature in `Cargo.toml`:
```toml
[dependencies]
openmemory = { version = "0.1", features = ["aws"] }
```
```rust
// Uses AWS credentials from the environment or ~/.aws/credentials
let config = Config::builder()
    .embedding_kind(EmbeddingKind::Bedrock)
    .build();
```
### Performance Tiers
| Tier | Dimensions | Description |
|------|------------|-------------|
| `Fast` | 256 | Optimized for speed, lower precision |
| `Smart` | 384 | Balanced performance and accuracy (default) |
| `Deep` | 1536 | Maximum accuracy, slower |
| `Hybrid` | 384 | Adaptive with keyword filtering |
```rust
use openmemory::Tier;
let config = Config::builder()
    .tier(Tier::Hybrid)
    .build();
```
---
## API Reference
### `add(content, options)`
Store a new memory.
```rust
use openmemory::AddOptions;

let result = mem.add(
    "User prefers dark mode",
    AddOptions {
        tags: Some(vec!["preference".to_string(), "ui".to_string()]),
        salience: Some(0.8),
        ..Default::default()
    },
).await?;

println!("ID: {}", result.id);
println!("Sector: {:?}", result.primary_sector);
```
### `query(query, options)`
Search for relevant memories using the HSG (Hybrid Similarity Graph) algorithm.
```rust
use openmemory::QueryOptions;

let results = mem.query(
    "user preferences",
    QueryOptions {
        k: 10,
        min_salience: Some(0.5),
        ..Default::default()
    },
).await?;

for r in results {
    println!("[{:.3}] {} - {:?}", r.score, r.content, r.primary_sector);
}
```
### `get_all(limit, offset)`
Retrieve all memories with pagination.
```rust
let memories = mem.get_all(100, 0).await?;
println!("Total memories: {}", memories.len());
```
### `delete(id)`
Remove a memory by ID.
```rust
mem.delete(&memory_id).await?;
```
### `reinforce(id, boost)`
Boost a memory's salience score.
```rust
mem.reinforce(&memory_id, 0.2).await?;
```
### `run_decay()`
Process memory decay based on time elapsed.
```rust
let stats = mem.run_decay().await?;
println!("Processed: {}, Decayed: {}", stats.processed, stats.decayed);
```
---
## Cognitive Sectors
OpenMemory automatically classifies content into 5 cognitive sectors:

| Sector | Description | Example | Decay Rate |
|--------|-------------|---------|------------|
| **Episodic** | Time-bound events & experiences | "Yesterday I attended a conference" | Medium (0.015) |
| **Semantic** | Timeless facts & knowledge | "Paris is the capital of France" | Very Low (0.005) |
| **Procedural** | Skills, procedures, how-tos | "To deploy: build, test, push" | Low (0.008) |
| **Emotional** | Feelings, sentiment, mood | "I'm excited about this project!" | High (0.02) |
| **Reflective** | Meta-cognition, insights | "I learn best through practice" | Very Low (0.001) |
```rust
use openmemory::Sector;

// Query specific sectors
let results = mem.query(
    "how to deploy",
    QueryOptions {
        sectors: Some(vec![Sector::Procedural]),
        ..Default::default()
    },
).await?;
```
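The sector-specific decay rates in the table above can be pictured with a toy multiplicative model, where each decay cycle scales salience by `(1 - rate)`. This is an illustrative simplification, not the crate's exact decay formula:

```rust
// Toy model of sector-specific decay: each cycle multiplies
// salience by (1 - rate). Rates mirror the table above; the
// multiplicative form itself is an assumption for illustration.
fn decayed_salience(salience: f64, rate: f64, cycles: u32) -> f64 {
    salience * (1.0 - rate).powi(cycles as i32)
}

// After 30 cycles, an Emotional memory (rate 0.02) fades much
// faster than a Semantic one (rate 0.005):
//   decayed_salience(1.0, 0.02, 30)  ≈ 0.545
//   decayed_salience(1.0, 0.005, 30) ≈ 0.860
```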
---
## HSG Query Algorithm
The Hybrid Similarity Graph (HSG) combines multiple signals for retrieval:
```
final_score = sigmoid(
    0.40 × vector_similarity +
    0.20 × token_overlap +
    0.15 × waypoint_weight +
    0.15 × recency_score +
    0.10 × tag_match +
    keyword_boost            (Hybrid tier only)
)
```
Features:
- **Sector penalties** for cross-sector retrieval
- **BFS waypoint expansion** for associative recall
- **Feedback learning** with EMA score updates
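As a rough sketch of the blend above (the weights come from the formula; the function and parameter names are hypothetical, not the crate's internal API, and each signal is assumed to be normalized to [0, 1]):

```rust
// Illustrative sketch of the HSG scoring blend. Not the crate's
// actual implementation; inputs are assumed pre-normalized signals.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

fn hsg_score(
    vector_similarity: f64,
    token_overlap: f64,
    waypoint_weight: f64,
    recency_score: f64,
    tag_match: f64,
    keyword_boost: f64, // non-zero only for the Hybrid tier
) -> f64 {
    sigmoid(
        0.40 * vector_similarity
            + 0.20 * token_overlap
            + 0.15 * waypoint_weight
            + 0.15 * recency_score
            + 0.10 * tag_match
            + keyword_boost,
    )
}
```

The sigmoid squashes the weighted sum into (0, 1), so a memory with no matching signals still lands at 0.5 rather than 0, and strong multi-signal matches saturate toward 1.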
---
## Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `OM_DB_PATH` | Database file path | `:memory:` |
| `OM_TIER` | Performance tier | `smart` |
| `OM_EMBEDDING` | Embedding provider | `synthetic` |
| `OM_VEC_DIM` | Vector dimensions | Tier default |
| `OPENAI_API_KEY` | OpenAI API key | - |
| `GEMINI_API_KEY` | Gemini API key | - |
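For example, a fully local, on-disk setup could be configured like this (the values are illustrative):

```bash
# Persist the database to disk instead of the in-memory default
export OM_DB_PATH=./data/memory.db

# Balanced tier with the synthetic (local) embedder
export OM_TIER=smart
export OM_EMBEDDING=synthetic
```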
---
## Examples
Run the basic usage example:
```bash
cd openmemory-rs
cargo run --example basic_usage
```
---
## Performance
Benchmarks on Apple M1:

| Operation | Time |
|-----------|------|
| Synthetic embed | ~0.5ms |
| Add memory | ~2ms |
| Query (1k memories) | ~15ms |
| Decay batch (1k) | ~50ms |
Run benchmarks:
```bash
cargo bench
```
---
## Feature Flags
| Flag | Description |
|------|-------------|
| `aws` | Enable AWS Bedrock embedding provider |
```toml
[dependencies]
openmemory = { version = "0.1", features = ["aws"] }
```
---
## Minimum Supported Rust Version
Rust 1.70 or later.
---
## License
MIT License - see [LICENSE](LICENSE) for details.
---
## Contributing
Contributions are welcome! Please read our contributing guidelines before submitting PRs.
```bash
# Run tests
cargo test
# Run clippy
cargo clippy
# Format code
cargo fmt
```