This is a Rust implementation of HashiCorp's golang-lru.
This crate contains three LRU-based caches: LRUCache, TwoQueueCache and AdaptiveCache.
See Introduction, Trade-Off and Usages for more details.
Introduction
The MSRV for this crate is 1.55.0.
LRU-based caches are O(1) for read, write and delete.

- LRUCache or RawLRU is a fixed-size LRU cache.
- AdaptiveCache is a fixed-size Adaptive Replacement Cache (ARC). ARC is an enhancement over the standard LRU cache in that it tracks both frequency and recency of use. This avoids a burst of access to new entries from evicting frequently used older entries.
- TwoQueueCache is a fixed-size 2Q cache. 2Q is an enhancement over the standard LRU cache in that it tracks frequently used and recently used entries separately.
Trade-Off
In theory, AdaptiveCache and TwoQueueCache add some additional tracking overhead to a plain LRUCache: computationally they cost roughly 2x as much, and the extra memory overhead grows linearly with the size of the cache. AdaptiveCache and TwoQueueCache have similar computational cost; ARC has been patented by IBM, while TwoQueueCache (2Q) needs reasonable parameters to be chosen.
However, the RawLRU implementation uses Box and a raw pointer for each entry to work around the limitations of the Rust language (it does not use Rc, because Rc is slower). Thus, in practice, TwoQueueCache is about 2.5x computationally slower than LRUCache and AdaptiveCache is about 3x slower, because TwoQueueCache and AdaptiveCache have to do more boxing and re-boxing than LRUCache, even though I try my best to avoid boxing and re-boxing and use mem::swap to avoid allocating and deallocating memory.
Hence, it is better to understand your situation (whether your project needs a cache with a higher hit ratio or faster computational performance), and then choose the reasonable cache for your project.
(For more performance details, you can clone the project and run cargo bench. The benchmark source code is in the benches directory; I am also looking forward to anyone's help with writing more reasonable benchmark cases.)
Usages
RawLRU and LRUCache
RawLRU is the basic data structure of this crate. It has a generic type parameter E: OnEvictCallback, which lets users write and apply their own callback policy on eviction.
LRUCache is a type alias for RawLRU<K, V, DefaultOnEvictCallback, S>, so it does not support a custom on_evict callback.
For more methods and examples, please see the documentation.
Default No-op Callback
Use RawLRU with the default no-op callback.
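A minimal sketch of the basic usage; it assumes that the Cache trait and RawLRU can be imported from the crate root, that new(capacity) is fallible, and that put/get follow the usual LRU API (please check the documentation for the exact paths and signatures).

```rust
use caches::{Cache, RawLRU};

fn main() {
    // A RawLRU with the default no-op eviction callback and capacity 2
    // (`new` is assumed to return a Result because a zero capacity is invalid).
    let mut cache = RawLRU::new(2).unwrap();

    cache.put(1, "a");
    cache.put(2, "b");

    // Reading an entry marks it as recently used.
    assert_eq!(cache.get(&1), Some(&"a"));

    // Inserting a third entry evicts the least recently used one (key 2).
    cache.put(3, "c");
    assert!(cache.get(&2).is_none());
    assert_eq!(cache.get(&3), Some(&"c"));
}
```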
Custom Callback
Use RawLRU with a custom callback. The example below uses an EvictedCounter callback to record the number of evicted entries.
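A sketch of the idea: EvictedCounter wraps an atomic counter and bumps it every time an entry is evicted. The on_evict signature and the with_on_evict_cb constructor used here are assumptions about the crate's API, so consult the documentation for the exact trait definition and constructor name.

```rust
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;

use caches::{Cache, OnEvictCallback, RawLRU};

// EvictedCounter is a callback which is used to record the number of evicted entries.
#[derive(Clone, Debug, Default)]
struct EvictedCounter {
    ctr: Arc<AtomicU64>,
}

// Assumed trait shape: a generic on_evict hook invoked for every evicted entry.
impl OnEvictCallback for EvictedCounter {
    fn on_evict<K, V>(&self, _key: &K, _val: &V) {
        self.ctr.fetch_add(1, Ordering::SeqCst);
    }
}

fn main() {
    let counter = EvictedCounter::default();

    // Assumed constructor that takes a capacity and the custom callback.
    let mut cache = RawLRU::with_on_evict_cb(2, counter.clone()).unwrap();

    cache.put(1, "a");
    cache.put(2, "b");
    cache.put(3, "c"); // evicts the least recently used entry (key 1)

    assert_eq!(counter.ctr.load(Ordering::SeqCst), 1);
}
```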
AdaptiveCache (Adaptive Replacement Cache)
For more methods and examples, please see the documentation.
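A minimal sketch, assuming AdaptiveCache::new(capacity) is fallible and the Cache trait provides put/get (see the documentation for the exact API).

```rust
use caches::{AdaptiveCache, Cache};

fn main() {
    // An ARC cache holding at most 2 entries (the constructor is assumed to be fallible).
    let mut cache = AdaptiveCache::new(2).unwrap();

    cache.put(1, "a");
    cache.put(2, "b");

    // Accessing key 1 repeatedly marks it as frequently used, so when key 3 forces
    // an eviction, ARC prefers to drop a recently-but-not-frequently used entry.
    assert_eq!(cache.get(&1), Some(&"a"));
    assert_eq!(cache.get(&1), Some(&"a"));

    cache.put(3, "c");
    assert_eq!(cache.get(&3), Some(&"c"));
}
```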
TwoQueueCache
For more methods and examples, please see the documentation.
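A minimal sketch, assuming TwoQueueCache::new(capacity) is fallible, builds a cache with the default 2Q parameters, and that the Cache trait provides put/get (see the documentation for the exact API and for tuning the queue sizes).

```rust
use caches::{Cache, TwoQueueCache};

fn main() {
    // A 2Q cache holding at most 5 entries with the default recent/frequent ratios.
    let mut cache = TwoQueueCache::new(5).unwrap();

    cache.put(1, "a");
    cache.put(2, "b");

    // Accessing an entry promotes it towards the frequently-used queue, which
    // protects it from being flushed out by a burst of newly inserted keys.
    assert_eq!(cache.get(&1), Some(&"a"));

    cache.put(3, "c");
    assert_eq!(cache.get(&3), Some(&"c"));
}
```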
Acknowledgments
- The implementation of RawLRU is highly inspired by Jerome Froelich's LRU implementation and the std::collections library of Rust.
- Thanks to HashiCorp's golang-lru for providing the amazing Go implementation.