# meme
Long-term memory for AI agents.
meme is a production-grade memory pipeline written in Rust. It persists
knowledge extracted from conversations (or raw facts) to disk via
LanceDB and tracks every change in a SQLite audit log.
## Pipeline

- Semantic Structured Compression — dialogues are windowed and sent to
  an LLM, which extracts atomic, self-contained [Memory] entries with
  structured metadata (timestamps, locations, persons, keywords, …).
- Lifecycle Reconciliation — each new entry is compared against existing
  memories in a single LLM call that decides ADD / UPDATE / DELETE / NOOP,
  preventing duplicates and resolving contradictions.
- Intent-Aware Hybrid Retrieval — queries are analyzed by an LLM to
  produce a retrieval plan that drives parallel semantic (ANN), lexical
  (FTS), and structured-metadata searches. Optional reflection rounds
  iteratively refine coverage.
## Quick Start

A minimal sketch of the intended flow. The builder and facade come from the
Public API below, but the method names and signatures shown here are
illustrative, not guaranteed:

```rust
use meme::{Meme, MemeBuilder};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Configure API key, model, and storage path (builder methods
    // shown are illustrative).
    let meme = MemeBuilder::new()
        .api_key("...")
        .storage_path("./meme_data")
        .build()
        .await?;

    // Ingest a raw fact, then query it back.
    meme.put("Alice moved to Berlin in 2024.").await?;
    let hits = meme.search("Where does Alice live?").await?;
    println!("{hits:?}");
    Ok(())
}
```
## Public API

| Entry point | Purpose |
|---|---|
| [MemeBuilder] | Fluent builder — configure API key, model, storage path |
| [Meme] | Runtime facade — add, flush, put, search, ask, CRUD, consolidate |
| [Memory] | A single self-contained unit of knowledge |
| [Dialogue] | Speaker + content input for conversation ingestion |
| [Event] / [EventType] | Change-history audit records |
| [ConsolidationParams] | Parameters for [Meme::consolidate] |
| [ConsolidationStats] | Summary returned by [Meme::consolidate] |
## Crate Layout

| Module | Visibility | Contents |
|---|---|---|
| [config] | pub | Pure data configuration structs with validation |
| [error] | pub | [MemeError] enum and Result alias |
| [model] | pub | Domain types ([Memory], [Dialogue], [Event], …) |
| [store] | pub | LanceDB vector store, SQLite history store, consolidation |
| embedding | pub(crate) | API and optional ONNX embedding providers |
| llm | pub(crate) | OpenAI-compatible LLM client, prompts, JSON schemas |
| pipeline | pub(crate) | Extractor, reconciler, hybrid retriever, answer generator |