modo/cache/mod.rs

//! # modo::cache
//!
//! In-memory LRU cache.
//!
//! This module is always available; no feature flag is required.
//!
//! ## Provides
//!
//! - [`LruCache`] — fixed-capacity least-recently-used cache backed by a
//!   [`HashMap`](std::collections::HashMap) for O(1) key-value lookup and a
//!   [`VecDeque`](std::collections::VecDeque) for recency tracking.
//!
//! ## Performance
//!
//! `get` and `put` are O(n) in the number of cached entries, because updating
//! the recency order requires a linear scan of the deque. For small caches
//! (up to a few thousand entries) this cost is negligible in practice.
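//!
//! The effect of that linear scan can be sketched with a hypothetical
//! stand-alone type built from the same two `std` containers (an
//! illustration only, not this module's actual implementation):
//!
//! ```
//! use std::collections::{HashMap, VecDeque};
//!
//! struct MiniLru {
//!     cap: usize,
//!     map: HashMap<String, u32>,   // O(1) key -> value lookup
//!     order: VecDeque<String>,     // front = most recently used
//! }
//!
//! impl MiniLru {
//!     fn new(cap: usize) -> Self {
//!         Self { cap, map: HashMap::new(), order: VecDeque::new() }
//!     }
//!
//!     fn put(&mut self, key: &str, value: u32) {
//!         if self.map.insert(key.to_string(), value).is_some() {
//!             // Key already present: drop its old slot (the O(n) scan).
//!             let pos = self.order.iter().position(|k| k == key).unwrap();
//!             self.order.remove(pos);
//!         } else if self.map.len() > self.cap {
//!             // Over capacity: evict the least recently used key (back of deque).
//!             if let Some(lru) = self.order.pop_back() {
//!                 self.map.remove(&lru);
//!             }
//!         }
//!         self.order.push_front(key.to_string());
//!     }
//!
//!     fn get(&mut self, key: &str) -> Option<&u32> {
//!         // Another O(n) scan to move the key to the front.
//!         let pos = self.order.iter().position(|k| k == key)?;
//!         self.order.remove(pos);
//!         self.order.push_front(key.to_string());
//!         self.map.get(key)
//!     }
//! }
//!
//! let mut c = MiniLru::new(2);
//! c.put("a", 1);
//! c.put("b", 2);
//! assert_eq!(c.get("a"), Some(&1));
//! c.put("c", 3); // evicts "b"
//! assert!(c.get("b").is_none());
//! ```
//!
//! The classic O(1) formulation replaces the deque with a linked list indexed
//! from the map, removing the scan at the cost of more bookkeeping.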
18//!
//! ## Thread safety
//!
//! `LruCache` performs no internal synchronization. Wrap it in
//! [`std::sync::Mutex`] when sharing across threads; because `get` requires
//! `&mut self`, even read-only lookups need exclusive access, so a
//! [`std::sync::RwLock`] read guard is never sufficient.
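//!
//! For example, sharing a cache behind a `Mutex` (using the same `new`/`put`/
//! `get` API as the quick start below) might look like:
//!
//! ```
//! use std::num::NonZeroUsize;
//! use std::sync::Mutex;
//! use modo::cache::LruCache;
//!
//! let cache = Mutex::new(LruCache::new(NonZeroUsize::new(8).unwrap()));
//! cache.lock().unwrap().put("answer", 42);
//! assert_eq!(cache.lock().unwrap().get(&"answer"), Some(&42));
//! ```
//!
//! Each statement takes and releases the lock; hold the guard in a local
//! binding instead when several operations must be atomic.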
24//!
//! ## Quick start
//!
//! ```
//! use std::num::NonZeroUsize;
//! use modo::cache::LruCache;
//!
//! let mut cache: LruCache<&str, u32> = LruCache::new(NonZeroUsize::new(2).unwrap());
//! cache.put("a", 1);
//! cache.put("b", 2);
//! assert_eq!(cache.get(&"a"), Some(&1));
//!
//! // Inserting a third entry evicts "b" (least recently used).
//! cache.put("c", 3);
//! assert!(cache.get(&"b").is_none());
//! ```

mod lru;

pub use lru::LruCache;