# LLMBrain: Rust Memory Layer
LLMBrain is a Rust library providing a foundational memory layer inspired by concepts from neuroscience. It is particularly suited to building Retrieval-Augmented Generation (RAG) systems and other AI applications that require structured memory.
It focuses on core functionality for storing, managing, and recalling information fragments, leveraging vector embeddings for semantic search.
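Semantic search of this kind is typically implemented by ranking stored embeddings by cosine similarity to the query embedding. As a dependency-free sketch of that metric (not LLMBrain's internal implementation, which lives behind `recall()`):

```rust
/// Cosine similarity between two embedding vectors: dot(a, b) / (|a| * |b|).
/// Returns 0.0 for empty or mismatched-length vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    if a.len() != b.len() || a.is_empty() {
        return 0.0;
    }
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

fn main() {
    let query = [1.0, 0.0];
    let close = [0.9, 0.1];
    let far = [0.0, 1.0];
    // A fragment whose embedding points in a similar direction scores higher.
    assert!(cosine_similarity(&query, &close) > cosine_similarity(&query, &far));
}
```

A recall operation then amounts to scoring every stored embedding against the query embedding and returning the top-scoring fragments.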
## Core Concepts

- `LLMBrainStruct`: The main entry point for interacting with the memory system. It manages configuration, storage, embedding generation, and recall operations.
- `MemoryFragment`: The basic unit of information stored, containing content (text), metadata (a flexible JSON value for properties, relationships, types, etc.), and an embedding vector.
- Storage: Currently utilizes SurrealDB as the backend for persistent storage (enabled via feature flags).
- Embeddings: Integrates with OpenAI (via `OpenAiClient` and feature flags) to generate text embeddings for semantic recall.
- Recall: Provides the `recall()` method for retrieving relevant `MemoryFragment`s based on semantic similarity to a query string.
- Integrations: Includes a `ConceptNetClient` for optional knowledge enrichment (requires separate setup).
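The concepts above can be pictured with a simplified, in-memory stand-in for `MemoryFragment` and `recall()` (field names and signatures are illustrative assumptions, not the crate's actual API; metadata is reduced to a `String` here, whereas the real type holds a flexible JSON value):

```rust
/// Simplified stand-in for MemoryFragment: content, metadata, embedding.
#[allow(dead_code)]
struct Fragment {
    content: String,
    metadata: String,    // the real type stores a flexible JSON value
    embedding: Vec<f32>, // assumed unit-normalized, so dot product = cosine similarity
}

/// Naive recall: return the content of the fragment whose embedding has the
/// highest dot product with the query embedding, or None if the store is empty.
fn recall<'a>(fragments: &'a [Fragment], query: &[f32]) -> Option<&'a str> {
    fragments
        .iter()
        .max_by(|a, b| {
            let sa: f32 = a.embedding.iter().zip(query).map(|(x, y)| x * y).sum();
            let sb: f32 = b.embedding.iter().zip(query).map(|(x, y)| x * y).sum();
            sa.partial_cmp(&sb).unwrap()
        })
        .map(|f| f.content.as_str())
}

fn main() {
    let fragments = vec![
        Fragment { content: "cats".into(), metadata: "{}".into(), embedding: vec![1.0, 0.0] },
        Fragment { content: "dogs".into(), metadata: "{}".into(), embedding: vec![0.0, 1.0] },
    ];
    // The query embedding is closest to the "cats" fragment.
    assert_eq!(recall(&fragments, &[0.9, 0.1]), Some("cats"));
}
```

The real library performs this ranking against a persistent SurrealDB backend rather than an in-memory `Vec`, but the shape of the operation is the same.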
## Installation

Add `llm-brain` as a dependency in your `Cargo.toml`:
```toml
[dependencies]
llm-brain = { path = "/path/to/llm-brain/crate" } # Or use version = "x.y.z" if published
# Enable features as needed, e.g., for SurrealDB storage and OpenAI embeddings:
# llm-brain = { path = "...", features = ["storage-surrealdb", "llm-openai"] }
```
## Quick Start
```rust
use llm_brain::LLMBrainStruct; // the crate is named `llm-brain` in Cargo.toml
use serde_json::json;          // assuming serde_json for the flexible metadata values

#[tokio::main]                 // assuming a tokio async runtime
async fn main() {
    // Initialize an LLMBrainStruct, store MemoryFragments, then recall() them;
    // see the crate documentation for the concrete constructor and method signatures.
}
```
## Development

- Build: `cargo build`
- Check: `cargo check` (or `cargo check --tests`)
- Test: `cargo test` (use `cargo test -- --ignored --nocapture` to run all tests; requires setup such as `OPENAI_API_KEY`)
- Lint/Format: `cargo fmt` and `cargo clippy`
## License

This library is licensed under the Unlicense. See the root `LICENSE` file for details.