omega-memory
12-tier cosmic memory system with automatic consolidation spanning from instant (milliseconds) to omega (universal) timescales.
Part of the ExoGenesis-Omega cognitive architecture.
Overview
omega-memory implements a revolutionary hierarchical memory architecture that spans 12 temporal tiers—from immediate sensory memory lasting milliseconds to cosmic memory operating at the scale of the universe's lifetime. The system automatically consolidates memories between tiers based on importance, recency, and access patterns, mimicking biological memory consolidation.
This architecture enables AI systems to operate coherently across vastly different timescales, from real-time interactions to multi-generational learning, while efficiently managing memory resources.
Features
- 12 Memory Tiers: Instant, Session, Episodic, Semantic, Collective, Evolutionary, Architectural, Substrate, Civilizational, Temporal, Physical, Omega
- Automatic Consolidation: Intelligent migration of memories between tiers
- Time Decay: Tier-appropriate decay functions for memory relevance
- Multi-Modal Storage: Text, embeddings, structured data, sensory data
- Access Tracking: Automatic tracking of memory access patterns
- Importance Scoring: Combined importance + recency + frequency scoring
- Async-First: Full Tokio support for concurrent memory operations
- Type-Safe Queries: Strongly typed query builder with filters
Installation
Add this to your Cargo.toml:
[dependencies]
omega-memory = "0.1.0"
Quick Start
// Minimal sketch: the module path and method names below are assumptions, not the published API.
use omega_memory::{CosmicMemory, Memory, MemoryTier};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let memory = CosmicMemory::new().await?;
    memory.store(Memory::new("first contact with omega-memory", MemoryTier::Session)).await?;
    Ok(())
}
Core Concepts
12-Tier Memory Hierarchy
The memory system is organized into three scales (a rough tier sketch in Rust follows the list):
Individual Scale (Tiers 1-4):
1. Instant (milliseconds) - Sensory buffers, reflexive responses
2. Session (hours) - Working memory, current context
3. Episodic (days) - Event memories, experiences
4. Semantic (weeks) - Factual knowledge, concepts
Species Scale (Tiers 5-8):
5. Collective (months) - Shared knowledge, culture
6. Evolutionary (years) - Learned behaviors, adaptations
7. Architectural (decades) - Structural patterns, designs
8. Substrate (centuries) - Fundamental principles
Cosmic Scale (Tiers 9-12):
9. Civilizational (millennia) - Cultural knowledge
10. Temporal (millions of years) - Temporal patterns
11. Physical (billions of years) - Physical laws
12. Omega (age of universe) - Universal constants
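To make the hierarchy concrete, the tiers could be modeled roughly as follows. This is an illustrative sketch only: the enum name, variants, and nominal durations are assumptions, not the crate's confirmed types.

// Illustrative tier enumeration mirroring the list above.
use std::time::Duration;

#[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]
enum MemoryTier {
    // Individual scale
    Instant, Session, Episodic, Semantic,
    // Species scale
    Collective, Evolutionary, Architectural, Substrate,
    // Cosmic scale
    Civilizational, Temporal, Physical, Omega,
}

impl MemoryTier {
    /// Nominal retention horizon per tier (order-of-magnitude placeholders).
    fn nominal_horizon(self) -> Duration {
        match self {
            MemoryTier::Instant => Duration::from_millis(500),
            MemoryTier::Session => Duration::from_secs(8 * 60 * 60),           // hours
            MemoryTier::Episodic => Duration::from_secs(3 * 24 * 60 * 60),     // days
            MemoryTier::Semantic => Duration::from_secs(4 * 7 * 24 * 60 * 60), // weeks
            _ => Duration::MAX, // higher tiers: months, years, and beyond
        }
    }
}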
Memory Consolidation
Memories automatically migrate between tiers based on the following factors (a heuristic sketch follows the list):
- Importance: High-importance memories consolidate faster
- Access Frequency: Frequently accessed memories are retained
- Recency: Recent access prevents decay
- Tier Policies: Each tier has specific consolidation rules
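Putting these factors together, a consolidation check might look roughly like the sketch below. The struct, fields, and threshold values are illustrative assumptions rather than the crate's actual consolidation policy.

// Hypothetical promotion heuristic: move a memory up a tier once its
// combined importance/recency/frequency score clears a per-tier threshold.
struct MemoryRecord {
    importance: f64,   // base importance, 0.0..=1.0
    access_count: u64, // how often the memory has been recalled
    age_secs: f64,     // seconds since the last access
}

fn should_promote(m: &MemoryRecord, tier_half_life_secs: f64, threshold: f64) -> bool {
    let recency = (-m.age_secs / tier_half_life_secs).exp(); // recent access slows decay
    let frequency = (1.0 + m.access_count as f64).ln();      // frequent access aids retention
    let score = m.importance * recency + 0.1 * frequency;    // importance dominates the score
    score >= threshold                                        // each tier sets its own threshold
}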
Memory Content Types
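The feature list above names four payload kinds: text, embeddings, structured data, and sensory data. A hypothetical shape for such a payload, purely for illustration (the crate's real content type may differ):

// Illustrative payload enum; serde_json is assumed here only for the example.
enum MemoryContent {
    Text(String),                  // natural-language content
    Embedding(Vec<f32>),           // vector representation for similarity search
    Structured(serde_json::Value), // arbitrary structured data
    Sensory(Vec<u8>),              // raw sensory bytes (audio, image, telemetry)
}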
Relevance Scoring
Memory relevance is computed as:
relevance = (importance × time_decay) + access_boost
where:
- importance: Base importance score (0.0-1.0)
- time_decay: Tier-specific exponential decay
- access_boost: Logarithmic boost from access frequency
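Translated directly into code, the scoring could look like the sketch below; the half-life and boost weight are placeholder constants, not values taken from the crate.

/// relevance = (importance × time_decay) + access_boost
fn relevance(importance: f64, secs_since_access: f64, tier_half_life_secs: f64, access_count: u64) -> f64 {
    let time_decay = (-secs_since_access / tier_half_life_secs).exp(); // tier-specific exponential decay
    let access_boost = 0.1 * (1.0 + access_count as f64).ln();        // logarithmic boost from access frequency
    importance * time_decay + access_boost
}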
Use Cases
1. Multi-Scale Agent Memory
// Sketch: type and method names (CosmicMemory, Memory, MemoryTier, store) are assumptions.
use omega_memory::{CosmicMemory, Memory, MemoryTier};

let memory = CosmicMemory::new().await?;

// Store immediate sensory data
let sensory = Memory::new("door sensor triggered", MemoryTier::Instant);
memory.store(sensory).await?;

// Store episodic event
let event = Memory::new("user completed onboarding today", MemoryTier::Episodic);
memory.store(event).await?;

// Store semantic knowledge
let knowledge = Memory::new("users prefer concise summaries", MemoryTier::Semantic);
memory.store(knowledge).await?;
2. Knowledge Base with Automatic Pruning
// Sketch: the query-builder methods mirror the feature list; exact names are assumptions.
use omega_memory::{CosmicMemory, Memory, MemoryQuery, MemoryTier};

let memory = CosmicMemory::new().await?;

// Store many facts
for fact in facts {
    memory.store(Memory::new(fact, MemoryTier::Semantic)).await?;
}

// Low-importance, rarely accessed memories automatically decay;
// high-importance, frequently accessed memories consolidate to higher tiers.
memory.auto_consolidate().await?;

// Query returns only relevant, non-expired memories
let query = MemoryQuery::new()
    .with_text("memory consolidation")
    .with_min_importance(0.5)
    .build();
let results = memory.recall(query).await?;
3. Long-Term Learning System
// Sketch: API names are assumptions; consolidate() moves qualifying memories between tiers.
use omega_memory::{ConsolidationCriteria, CosmicMemory, Memory, MemoryQuery, MemoryTier};

let memory = CosmicMemory::new().await?;

// Short-term learning (session tier)
for observation in recent_observations {
    memory.store(Memory::new(observation, MemoryTier::Session)).await?;
}

// Consolidate important patterns to the semantic tier
memory.consolidate(MemoryTier::Session, MemoryTier::Semantic, ConsolidationCriteria::default()).await?;

// Over time, foundational knowledge reaches the evolutionary tier
memory.consolidate(MemoryTier::Semantic, MemoryTier::Evolutionary, ConsolidationCriteria::default()).await?;

// Architectural patterns emerge at higher tiers
let query = MemoryQuery::new().with_tier(MemoryTier::Architectural).build();
let architectural_memories = memory.recall(query).await?;
4. Multi-Agent Shared Memory
// Sketch: sharing the handle via Arc is an assumption about how agents hold the memory.
use std::sync::Arc;
use omega_memory::{CosmicMemory, Memory, MemoryQuery, MemoryTier};

let collective_memory = Arc::new(CosmicMemory::new().await?);

// Individual agent stores to the collective tier
async fn contribute(memory: Arc<CosmicMemory>, insight: &str) {
    let mem = Memory::new(insight, MemoryTier::Collective);
    memory.store(mem).await.expect("store failed");
}

// All agents can query collective knowledge
let query = MemoryQuery::new()
    .with_text("coordination strategies")
    .build();
let shared_knowledge = collective_memory.recall(query).await?;
5. Hierarchical Recall Across Tiers
// Sketch: builder methods and result fields are assumptions.
use omega_memory::{CosmicMemory, MemoryQuery};

let memory = CosmicMemory::new().await?;

// Build query with embedding (`embed` stands in for your embedding model)
let embedding: Vec<f32> = embed("how do agents share knowledge?");
let query = MemoryQuery::new()
    .with_embedding(embedding)
    .with_min_importance(0.3)
    .with_max_results(20)
    .build();

// Search across multiple tiers simultaneously
let results = memory.recall(query).await?;

// Results are automatically sorted by relevance
for mem in results {
    println!("{:.2}  {}", mem.relevance, mem.content);
}
Examples
Memory Statistics
// Sketch: stats() and the fields printed below are assumptions.
let memory = CosmicMemory::new().await?;

// Store various memories...

let stats = memory.stats().await;
println!("Total memories:     {}", stats.total_memories);
println!("Active tiers:       {}", stats.active_tiers);
println!("Consolidations run: {}", stats.consolidations);
println!("Expired memories:   {}", stats.expired);
println!("Average importance: {:.2}", stats.avg_importance);
Custom Consolidation Logic
// Sketch: consolidate() and ConsolidationCriteria are assumed names and shapes.
use omega_memory::{ConsolidationCriteria, CosmicMemory, MemoryTier};
use std::time::Duration;

let memory = CosmicMemory::new().await?;

// Consolidate session → episodic for memories more than 1 hour old
let criteria = ConsolidationCriteria { min_age: Duration::from_secs(3600), min_importance: 0.0 };
memory.consolidate(MemoryTier::Session, MemoryTier::Episodic, criteria).await?;

// Consolidate episodic → semantic for important, old memories
let criteria = ConsolidationCriteria { min_age: Duration::from_secs(86_400), min_importance: 0.7 };
memory.consolidate(MemoryTier::Episodic, MemoryTier::Semantic, criteria).await?;

// Or use automatic consolidation with built-in heuristics
memory.auto_consolidate().await?;
Architecture
The memory system is structured in three layers:
┌─────────────────────────────────────────┐
│           CosmicMemory (API)            │
│  - Unified interface                    │
│  - Query routing                        │
│  - Consolidation orchestration          │
└────────┬────────────┬────────────┬──────┘
         │            │            │
         ▼            ▼            ▼
┌────────────┐┌────────────┐┌────────────┐
│ Individual ││  Species   ││   Cosmic   │
│   Memory   ││   Memory   ││   Memory   │
│   (1-4)    ││   (5-8)    ││   (9-12)   │
└────────┬───┘└─────┬──────┘└──────┬─────┘
         │          │              │
         ▼          ▼              ▼
┌─────────────────────────────────────────┐
│          Memory Consolidator            │
│  - Importance-based migration           │
│  - Time decay application               │
│  - Access pattern analysis              │
└─────────────────────────────────────────┘
Performance
Memory system performance characteristics:
- Store: O(1) - Constant time insertion
- Recall: O(log n) - Logarithmic search with indexes
- Consolidation: O(n) - Linear scan with filtering
- Memory Usage: ~200 bytes per memory + embedding size (worked example below)
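As a rough worked example of these figures (assuming 384-dimensional f32 embeddings, which is an assumption for illustration rather than a crate default): one million memories would occupy about 1,000,000 × (200 B + 384 × 4 B) ≈ 1.7 GB, dominated by the embeddings.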
Optimization Strategies
- Tier Separation: Keeps hot (recent) and cold (old) data separate
- Lazy Consolidation: Only consolidates when needed
- Access Tracking: Minimal overhead with atomic counters (sketched below)
- Embedding Compression: Optional quantization for large datasets
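To illustrate the access-tracking point above, here is a sketch of the general technique (not the crate's actual internals); the type and method names are hypothetical.

use std::sync::atomic::{AtomicU64, Ordering};

// Hypothetical per-memory access tracker: increments are lock-free,
// so recording a recall adds almost no overhead to the hot path.
struct AccessTracker {
    hits: AtomicU64,
}

impl AccessTracker {
    fn record_access(&self) {
        self.hits.fetch_add(1, Ordering::Relaxed);
    }

    /// Logarithmic boost fed into the relevance formula above.
    fn access_boost(&self) -> f64 {
        (1.0 + self.hits.load(Ordering::Relaxed) as f64).ln()
    }
}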
Related Crates
- omega-core - Core memory types and traits
- omega-agentdb - Vector search backend
- omega-persistence - SQLite storage layer
- omega-loops - Temporal loop integration
- omega-meta-sona - Architecture evolution
- omega-runtime - Runtime orchestration
- omega-hippocampus - Hippocampal memory circuits
- omega-sleep - Memory consolidation during sleep
License
Licensed under the MIT License. See LICENSE for details.