The lockable library offers thread-safe HashMap (see LockableHashMap), LruCache (see LockableLruCache) and LockPool (see LockPool) types. In all of these data types, individual keys can be locked/unlocked, even if there is no entry for this key in the map or cache.
This can be very useful for synchronizing access to an underlying key-value store or for building cache data structures on top of such a key-value store.
§LRU cache example
This example builds a simple LRU cache and locks some entries.
use lockable::{AsyncLimit, LockableLruCache};
let lockable_cache = LockableLruCache::<i64, String>::new();
// Insert an entry
lockable_cache.async_lock(4, AsyncLimit::no_limit())
.await?
.insert(String::from("Value"));
// Hold a lock on a different entry
let guard = lockable_cache.async_lock(5, AsyncLimit::no_limit())
.await?;
// This next line would wait until the lock gets released,
// which in this case would cause a deadlock because we're
// on the same thread
// let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit())
// .await?;
// After dropping the corresponding guard, we can lock it again
std::mem::drop(guard);
let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit())
.await?;
§Lockpool example
This example builds a simple lock pool using the LockPool data structure. A lock pool is a pool of keyable locks. This can be used if you don’t need a cache but just some way to synchronize access to an underlying resource.
use lockable::LockPool;
let lockpool = LockPool::new();
let guard1 = lockpool.async_lock(4).await;
let guard2 = lockpool.async_lock(5).await;
// This next line would wait until the lock gets released,
// which in this case would cause a deadlock because we're
// on the same thread.
// let guard3 = lockpool.async_lock(4).await;
// After dropping the corresponding guard, we can lock it again
std::mem::drop(guard1);
let guard3 = lockpool.async_lock(4).await;
§HashMap example
If you need a lockable key-value store but don’t need the LRU ordering, you can use LockableHashMap.
use lockable::{AsyncLimit, LockableHashMap};
let lockable_map = LockableHashMap::<i64, String>::new();
// Insert an entry
lockable_map.async_lock(4, AsyncLimit::no_limit())
.await?
.insert(String::from("Value"));
// Hold a lock on a different entry
let guard = lockable_map.async_lock(5, AsyncLimit::no_limit())
.await?;
// This next line would wait until the lock gets released,
// which in this case would cause a deadlock because we're
// on the same thread
// let guard2 = lockable_map.async_lock(5, AsyncLimit::no_limit())
// .await?;
// After dropping the corresponding guard, we can lock it again
std::mem::drop(guard);
let guard2 = lockable_map.async_lock(5, AsyncLimit::no_limit())
.await?;
§WARNING: Deadlocks
These data structures are powerful, and with great power comes great danger (or something like that). Having concurrent threads or tasks lock keys in an arbitrary order can easily lead to deadlocks, where one thread waits for a lock held by another thread while that thread waits for a lock held by the first. Be careful and apply common deadlock prevention strategies, e.g. always locking keys in the same order.
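One common strategy, sketched below using the LockPool from the example above, is to always acquire multiple keys in a fixed (here: ascending) order, so that no two tasks can wait on each other in a cycle. This is only an illustrative sketch, not an API of this crate.
use lockable::LockPool;
let lockpool = LockPool::new();
// Two keys that need to be held at the same time.
let (key_a, key_b) = (7, 3);
// Always lock the smaller key first; every task that follows this rule
// acquires the same pair of keys in the same order, so no cycle can form.
let (first, second) = if key_a <= key_b { (key_a, key_b) } else { (key_b, key_a) };
let guard_first = lockpool.async_lock(first).await;
let guard_second = lockpool.async_lock(second).await;
// ... access the resources identified by both keys ...
// Release in reverse order of acquisition (dropping the guards also unlocks them).
std::mem::drop(guard_second);
std::mem::drop(guard_first);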
§Crate Features
- lru: Enables the LockableLruCache type, which adds a dependency on the lru crate.
- slow_assertions: Enables slow assertions. Don't use this in production code. It is very slow. This is useful to assert invariants and search for bugs within the lockable crate. It is not helpful in finding bugs in user code. If you do enable this and encounter an assertion failing, please report it in a GitHub issue.
Structs§
- Guard - A RAII implementation of a scoped lock for locks from a LockableHashMap or LockableLruCache. When this instance is dropped (falls out of scope), the lock will be unlocked.
- LockPool - A pool of locks where individual locks can be locked/unlocked by key. It initially considers all keys as “unlocked”, but they can be locked, and if a second thread tries to acquire a lock for the same key, it will have to wait.
- LockableHashMap - A thread-safe hash map where individual keys can be locked/unlocked, even if there is no entry for this key in the map. It initially considers all keys as “unlocked”, but they can be locked, and if a second thread tries to acquire a lock for the same key, it will have to wait.
- LockableLruCache - A thread-safe LRU cache where individual keys can be locked/unlocked, even if there is no entry for this key in the cache. It initially considers all keys as “unlocked”, but they can be locked, and if a second thread tries to acquire a lock for the same key, it will have to wait.
Enums§
- AsyncLimit - An instance of this enum defines a limit on the number of entries in a LockableLruCache or a LockableHashMap. It can be used to cause old entries to be evicted if a limit on the number of entries is exceeded in a call to the locking functions.
- Never - A type that can never be instantiated. This can be used in a Result<T, Never> to indicate that an operation cannot return an error.
- SyncLimit - An instance of this enum defines a limit on the number of entries in a LockableLruCache or a LockableHashMap. It can be used to cause old entries to be evicted if a limit on the number of entries is exceeded in a call to the locking functions.
- TryInsertError - This error is thrown by Guard::try_insert if the entry already exists (see the sketch after this list).
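To illustrate TryInsertError, here is a minimal sketch building on the HashMap example above. It assumes that Guard::try_insert reports an already-existing entry through a Result whose error is TryInsertError; the exact return type is not reproduced here.
use lockable::{AsyncLimit, LockableHashMap};
let lockable_map = LockableHashMap::<i64, String>::new();
let mut guard = lockable_map.async_lock(4, AsyncLimit::no_limit())
    .await?;
// The first try_insert succeeds because there is no entry for key 4 yet.
assert!(guard.try_insert(String::from("Value")).is_ok());
// A second try_insert for the same key fails because the entry now exists,
// reporting the conflict as a TryInsertError (assumption, see above).
assert!(guard.try_insert(String::from("Other")).is_err());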
Traits§
- InfallibleUnwrap - Extension trait for Result<T, Never> that adds infallible_unwrap(), an infallible version of unwrap() (see the sketch after this list).
- Lockable - A common trait for both LockableHashMap and LockableLruCache that offers some common functionality.
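To illustrate InfallibleUnwrap, here is a minimal sketch that is independent of the map and cache types. The always_succeeds function is hypothetical and only exists to produce a Result<T, Never>.
use lockable::{InfallibleUnwrap, Never};
// Hypothetical function whose error type is Never, i.e. it can never fail.
fn always_succeeds() -> Result<u32, Never> {
    Ok(42)
}
// Because the error type can never be instantiated, infallible_unwrap()
// returns the value without introducing a panic path.
let value = always_succeeds().infallible_unwrap();
assert_eq!(42, value);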