In-memory cache storage with LRU eviction.
This module provides a thread-safe, concurrent cache storage layer using:
- DashMap: sharded hash map for lock-free concurrent reads
- LRU / TinyLFU: eviction framework (LRU by default; adaptive policies wired later)
- Size tracking: maintains a total byte count for eviction
§Architecture
┌─────────────────────────────────────┐
│ CacheStorage │
├─────────────────────────────────────┤
│ DashMap<CacheKey, CacheEntry> │ ← Lock-free reads
│ Mutex<LruCache<CacheKey, ()>> │ ← Protected LRU ordering
│ Total size counter (atomic) │ ← Enforce cap
└─────────────────────────────────────┘
§Concurrency
- Reads: lock-free via DashMap sharding
- Writes: per-key locks (DashMap handles this)
- LRU updates: brief lock on the LRU cache to record access
- Eviction: lock-protected to prevent race conditions
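The locking scheme above can be sketched with std types alone: a `Mutex<HashMap>` standing in for DashMap's sharded map, a `VecDeque` standing in for the LRU cache, and an `AtomicUsize` byte counter enforcing the cap. `MiniCache` and its methods are illustrative names for this sketch, not the crate's actual API, and the std mutex sacrifices the lock-free reads DashMap provides:

```rust
use std::collections::{HashMap, VecDeque};
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Mutex;

/// Illustrative stand-in for CacheStorage: std::Mutex<HashMap> replaces
/// DashMap, and a VecDeque replaces the LRU cache.
struct MiniCache {
    map: Mutex<HashMap<String, Vec<u8>>>,
    order: Mutex<VecDeque<String>>, // front = least recently used
    total_bytes: AtomicUsize,       // size counter enforcing the cap
    cap_bytes: usize,
}

impl MiniCache {
    fn new(cap_bytes: usize) -> Self {
        Self {
            map: Mutex::new(HashMap::new()),
            order: Mutex::new(VecDeque::new()),
            total_bytes: AtomicUsize::new(0),
            cap_bytes,
        }
    }

    fn get(&self, key: &str) -> Option<Vec<u8>> {
        let value = self.map.lock().unwrap().get(key).cloned()?;
        // Brief lock on the recency list to record this access.
        let mut order = self.order.lock().unwrap();
        if let Some(pos) = order.iter().position(|k| k == key) {
            let k = order.remove(pos).unwrap();
            order.push_back(k);
        }
        Some(value)
    }

    fn insert(&self, key: String, value: Vec<u8>) {
        let size = value.len();
        // Evict least-recently-used entries until the new value fits.
        while self.total_bytes.load(Ordering::Relaxed) + size > self.cap_bytes {
            let victim = self.order.lock().unwrap().pop_front();
            let victim = match victim {
                Some(k) => k,
                None => break,
            };
            if let Some(old) = self.map.lock().unwrap().remove(&victim) {
                self.total_bytes.fetch_sub(old.len(), Ordering::Relaxed);
            }
        }
        self.order.lock().unwrap().push_back(key.clone());
        if let Some(old) = self.map.lock().unwrap().insert(key, value) {
            self.total_bytes.fetch_sub(old.len(), Ordering::Relaxed);
        }
        self.total_bytes.fetch_add(size, Ordering::Relaxed);
    }
}

fn main() {
    let cache = MiniCache::new(8); // tiny cap to force eviction
    cache.insert("a".into(), vec![0; 4]);
    cache.insert("b".into(), vec![0; 4]);
    let _ = cache.get("a"); // "a" becomes most recently used
    cache.insert("c".into(), vec![0; 4]); // evicts "b", the LRU entry
    assert!(cache.get("a").is_some());
    assert!(cache.get("b").is_none());
    assert!(cache.get("c").is_some());
    println!("ok");
}
```

Note that eviction holds locks on both structures in a fixed order (recency list, then map), which is the race-prevention point the bullet above makes.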
§Example
use sqry_core::cache::{CacheStorage, CacheKey, GraphNodeSummary};

let storage = CacheStorage::new(50 * 1024 * 1024); // 50 MB cap
storage.insert(key, vec![summary1, summary2]);
if let Some(summaries) = storage.get(&key) {
    // Cache hit
}
Structs§
- CacheStats: Cache statistics for telemetry and diagnostics.
- CacheStorage: Thread-safe in-memory cache storage with LRU eviction.