§Cache
A collection of high-performance, memory-efficient cache implementations supporting various eviction policies.

This crate provides cache implementations optimized for speed and memory usage that can be used in both `std` and `no_std` environments. All cache operations (`get`, `get_mut`, `put`, and `remove`) have O(1) time complexity.
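For instance, here is a minimal sketch of those four operations on an `LruCache`. The exact signatures of `get_mut` and `remove` are assumptions based on common cache APIs (this page only lists the operation names), so treat the snippet as illustrative rather than definitive:

```rust
use cache_rs::LruCache;
use core::num::NonZeroUsize;

// Capacity must be non-zero, so it is expressed as NonZeroUsize.
let mut cache = LruCache::new(NonZeroUsize::new(2).unwrap());

// put: insert or update an entry.
cache.put("a", 1);

// get: read an entry (and, for LRU, mark it as recently used).
assert_eq!(cache.get(&"a"), Some(&1));

// get_mut: modify a cached value in place (assumed to return Option<&mut V>).
if let Some(v) = cache.get_mut(&"a") {
    *v += 10;
}
assert_eq!(cache.get(&"a"), Some(&11));

// remove: drop an entry; afterwards the key is no longer present.
// The return value is discarded here because its type is an assumption.
let _ = cache.remove(&"a");
assert_eq!(cache.get(&"a"), None);
```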
§Available Cache Algorithms
Algorithm | Description | Best Use Case |
---|---|---|
LruCache | Least Recently Used | General purpose, recency-based access patterns |
SlruCache | Segmented LRU | Mixed access patterns with both hot and cold items |
LfuCache | Least Frequently Used | Frequency-based access patterns |
LfudaCache | LFU with Dynamic Aging | Long-running caches with changing popularity |
GdsfCache | Greedy Dual Size Frequency | CDNs and size-aware caching |
§Performance Characteristics
Algorithm | Space Overhead | Hit Rate for Recency | Hit Rate for Frequency | Scan Resistance |
---|---|---|---|---|
LRU | Low | High | Low | Poor |
SLRU | Medium | High | Medium | Good |
LFU | Medium | Low | High | Excellent |
LFUDA | Medium | Medium | High | Excellent |
GDSF | High | Medium | High | Good |
§When to Use Each Algorithm
- LRU: Use for general-purpose caching where recent items are likely to be accessed again.
- SLRU: Use when you have a mix of frequently and occasionally accessed items.
- LFU: Use when access frequency is more important than recency (see the sketch after this list).
- LFUDA: Use for long-running caches where item popularity changes over time.
- GDSF: Use when items have different sizes and you want to optimize for both size and popularity.
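As a sketch of the frequency-based behavior, the snippet below assumes `LfuCache` exposes the same `new(NonZeroUsize)` / `put` / `get` surface shown for `LruCache` in the example further down; the constructor and exact eviction details are assumptions, not confirmed by this page:

```rust
use cache_rs::LfuCache;
use core::num::NonZeroUsize;

// Hypothetical sketch: a capacity-2 LFU cache.
let mut cache = LfuCache::new(NonZeroUsize::new(2).unwrap());
cache.put("hot", 1);
cache.put("cold", 2);

// Touch "hot" several times so its frequency count grows past "cold"'s.
for _ in 0..3 {
    let _ = cache.get(&"hot");
}

// When a third key is inserted, an LFU policy evicts the least frequently
// used entry, so "hot" should survive while "cold" is dropped.
cache.put("new", 3);
assert_eq!(cache.get(&"hot"), Some(&1));
```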
§Feature Flags
- `hashbrown`: Uses the hashbrown crate for the HashMap implementation (enabled by default)
- `nightly`: Enables nightly-only optimizations for improved performance
- `std`: Enables standard library features (disabled by default to support `no_std`)
§No-std Support
This crate works in `no_std` environments by default. Enable the `std` feature for additional functionality.
§Example with no_std
use cache_rs::LruCache;
use core::num::NonZeroUsize;
// Create a cache in a no_std environment
let mut cache = LruCache::new(NonZeroUsize::new(100).unwrap());
cache.put("key", "value");
assert_eq!(cache.get(&"key"), Some(&"value"));
§Modules
- `lru`: A Least Recently Used (LRU) cache implementation
- `slru`: A Segmented LRU (SLRU) cache implementation
- `lfu`: A Least Frequently Used (LFU) cache implementation
- `lfuda`: An LFU with Dynamic Aging (LFUDA) cache implementation
- `gdsf`: A Greedy Dual Size Frequency (GDSF) cache implementation
- `config`: Configuration structures for all cache algorithm implementations
- `metrics`: Metrics collection for cache performance monitoring
Re-exports§
pub use gdsf::GdsfCache;
pub use lfu::LfuCache;
pub use lfuda::LfudaCache;
pub use lru::LruCache;
pub use slru::SlruCache;
Modules§
- config: Cache configuration structures.
- gdsf: Greedy Dual-Size Frequency (GDSF) cache implementation.
- lfu: Least Frequently Used (LFU) cache implementation.
- lfuda: Least Frequently Used with Dynamic Aging (LFUDA) cache implementation.
- lru: Least Recently Used (LRU) cache implementation.
- metrics: Cache metrics system.
- slru: Segmented LRU (SLRU) cache implementation.