Least Recently Used (LRU) cache implementation.
Provides a fixed-size cache that evicts the least recently used item when capacity is reached, with O(1) time for all common cache operations. LRU is one of the most widely used cache eviction algorithms due to its simplicity and good performance on workloads with temporal locality.
§Algorithm
The LRU cache maintains items in order of recency of use, evicting the least recently used item when capacity is reached. This works on the principle of temporal locality: items that have been accessed recently are likely to be accessed again soon.
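The recency-ordering behavior can be sketched with a minimal, self-contained cache (this is not the module's actual type, and names like `SimpleLru` are illustrative). For clarity it tracks recency with a `VecDeque`, which makes a touch O(n); the O(1) bounds quoted below require a doubly linked list instead.

```rust
use std::collections::{HashMap, VecDeque};
use std::hash::Hash;

// Minimal LRU sketch: a HashMap for storage plus a VecDeque tracking
// recency order. Front of the deque = least recently used.
struct SimpleLru<K: Hash + Eq + Clone, V> {
    capacity: usize,
    map: HashMap<K, V>,
    order: VecDeque<K>,
}

impl<K: Hash + Eq + Clone, V> SimpleLru<K, V> {
    fn new(capacity: usize) -> Self {
        Self { capacity, map: HashMap::new(), order: VecDeque::new() }
    }

    // Move `key` to the most-recently-used position.
    fn touch(&mut self, key: &K) {
        if let Some(pos) = self.order.iter().position(|k| k == key) {
            let k = self.order.remove(pos).unwrap();
            self.order.push_back(k);
        }
    }

    fn get(&mut self, key: &K) -> Option<&V> {
        if self.map.contains_key(key) {
            self.touch(key);
            self.map.get(key)
        } else {
            None
        }
    }

    fn put(&mut self, key: K, value: V) {
        if self.map.contains_key(&key) {
            self.touch(&key);
        } else {
            if self.map.len() == self.capacity {
                // At capacity: evict the least recently used key.
                if let Some(lru) = self.order.pop_front() {
                    self.map.remove(&lru);
                }
            }
            self.order.push_back(key.clone());
        }
        self.map.insert(key, value);
    }
}

fn main() {
    let mut cache = SimpleLru::new(2);
    cache.put("a", 1);
    cache.put("b", 2);
    cache.get(&"a");   // "a" becomes most recently used
    cache.put("c", 3); // evicts "b", the least recently used
    assert!(cache.map.contains_key(&"a"));
    assert!(!cache.map.contains_key(&"b"));
}
```

Note how the `get(&"a")` call changes which key is evicted: without it, "a" would have been the eviction victim instead of "b".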
§Performance Characteristics
- Time complexity:
  - Get: O(1)
  - Put: O(1)
  - Remove: O(1)
- Space complexity:
  - O(n), where n is the capacity of the cache
  - Memory overhead is approximately 48 bytes per entry, plus the size of keys and values
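Where a per-entry overhead estimate like this comes from can be illustrated with a hypothetical intrusive-list node layout (the module's real entry type may differ): two list pointers per node, plus the hash-map entry that indexes it.

```rust
use std::mem::size_of;

// Hypothetical doubly-linked-list node for an LRU cache entry.
// On a typical 64-bit target the two raw pointers contribute 16 bytes;
// the hash-map entry (key copy or pointer plus bucket metadata) adds
// more, which is how a figure in the neighborhood of 48 bytes arises.
struct Node<K, V> {
    key: K,
    value: V,
    prev: *mut Node<K, V>,
    next: *mut Node<K, V>,
}

fn main() {
    // The node is exactly the two payload fields plus two pointers;
    // u64 fields and pointers share alignment, so there is no padding.
    let node = size_of::<Node<u64, u64>>();
    let expected = 2 * size_of::<u64>() + 2 * size_of::<*mut Node<u64, u64>>();
    assert_eq!(node, expected);
    println!("node size: {} bytes", node);
}
```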
§When to Use
LRU caches are ideal for:
- General-purpose caching where access patterns exhibit temporal locality
- Situations that call for a simple implementation with predictable performance
- Caching with a fixed memory budget
They are less suitable for:
- Workloads where frequency of access is more important than recency
- Scanning patterns where a large set of items is accessed once in sequence
§Thread Safety
This implementation is not thread-safe. For concurrent access, you should
wrap the cache with a synchronization primitive such as Mutex or RwLock.
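A sketch of that wrapping pattern, using a plain `HashMap` to stand in for the (non-thread-safe) cache type; an `Arc<Mutex<...>>` around the actual cache would work the same way:

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared, mutex-protected stand-in for the cache.
    let cache = Arc::new(Mutex::new(HashMap::new()));

    let handles: Vec<_> = (0..4)
        .map(|i| {
            let cache = Arc::clone(&cache);
            thread::spawn(move || {
                // Each access holds the lock only for the duration
                // of the call, so threads cannot observe a torn state.
                cache.lock().unwrap().insert(i, i * 10);
            })
        })
        .collect();

    for h in handles {
        h.join().unwrap();
    }

    assert_eq!(cache.lock().unwrap().len(), 4);
}
```

A `Mutex` serializes all access; if the workload is read-heavy, an `RwLock` allows concurrent readers at the cost of slightly more expensive writes.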