//! The [lockable](https://crates.io/crates/lockable) library offers thread-safe
//! HashMap (see [LockableHashMap](crate::lockable_hash_map::LockableHashMap)),
//! LruCache (see [LockableLruCache](crate::lockable_lru_cache::LockableLruCache))
//! and LockPool (see [LockPool](crate::lockpool::LockPool)) types. In all of these
//! data types, individual keys can be locked and unlocked, even if there is no
//! entry for that key in the map or cache.
//!
//! This can be very useful for synchronizing access to an underlying key-value
//! store or for building cache data structures on top of such a key-value store.
//!
//! ## LRU cache example
//! This example builds a simple LRU cache and locks some entries.
//! ```
//! use lockable::{AsyncLimit, LockableLruCache};
//!
//! let lockable_cache = LockableLruCache::<i64, String>::new();
//! # tokio::runtime::Runtime::new().unwrap().block_on(async {
//!
//! // Insert an entry
//! lockable_cache.async_lock(4, AsyncLimit::no_limit())
//!     .await?
//!     .insert(String::from("Value"));
//!
//! // Hold a lock on a different entry
//! let guard = lockable_cache.async_lock(5, AsyncLimit::no_limit())
//!     .await?;
//!
//! // This next line would wait until the lock gets released,
//! // which in this case would cause a deadlock because we're
//! // on the same thread
//! // let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit())
//! //     .await?;
//!
//! // After dropping the corresponding guard, we can lock it again
//! std::mem::drop(guard);
//! let guard2 = lockable_cache.async_lock(5, AsyncLimit::no_limit())
//!     .await?;
//! # Ok::<(), lockable::Never>(())}).unwrap();
//! ```
//!
//! ## Lockpool example
//! This example builds a simple lock pool using the [LockPool](crate::lockpool::LockPool)
//! data structure. A lock pool is a pool of keyed locks. It is useful if you
//! don't need a cache but just a way to synchronize access to an underlying
//! resource.
//! ```
//! use lockable::LockPool;
//!
//! let lockpool = LockPool::new();
//! # tokio::runtime::Runtime::new().unwrap().block_on(async {
//! let guard1 = lockpool.async_lock(4).await;
//! let guard2 = lockpool.async_lock(5).await;
//!
//! // This next line would wait until the lock gets released,
//! // which in this case would cause a deadlock because we're
//! // on the same thread.
//! // let guard3 = lockpool.async_lock(4).await;
//!
//! // After dropping the corresponding guard, we can lock it again
//! std::mem::drop(guard1);
//! let guard3 = lockpool.async_lock(4).await;
//! # Ok::<(), lockable::Never>(())}).unwrap();
//! ```
//!
//! ## HashMap example
//! If you need a lockable key-value store but don't need the LRU ordering,
//! you can use [LockableHashMap](crate::lockable_hash_map::LockableHashMap).
//! ```
//! use lockable::{AsyncLimit, LockableHashMap};
//!
//! let lockable_map = LockableHashMap::<i64, String>::new();
//! # tokio::runtime::Runtime::new().unwrap().block_on(async {
//!
//! // Insert an entry
//! lockable_map.async_lock(4, AsyncLimit::no_limit())
//!     .await?
//!     .insert(String::from("Value"));
//!
//! // Hold a lock on a different entry
//! let guard = lockable_map.async_lock(5, AsyncLimit::no_limit())
//!     .await?;
//!
//! // This next line would wait until the lock gets released,
//! // which in this case would cause a deadlock because we're
//! // on the same thread
//! // let guard2 = lockable_map.async_lock(5, AsyncLimit::no_limit())
//! //     .await?;
//!
//! // After dropping the corresponding guard, we can lock it again
//! std::mem::drop(guard);
//! let guard2 = lockable_map.async_lock(5, AsyncLimit::no_limit())
//!     .await?;
//! # Ok::<(), lockable::Never>(())}).unwrap();
//! ```
//!
//! ## WARNING: Deadlocks
//! These data structures are powerful, and with great power comes great danger (or something like that).
//! If concurrent threads or tasks lock keys in arbitrary order, you can easily get deadlocks
//! where one thread is waiting for a lock held by another thread, while that second thread is
//! waiting for a lock held by the first.
//! Be careful and apply common deadlock prevention strategies, e.g. always lock keys in the same order.
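//!
//! The "always lock in the same order" strategy can be sketched with plain
//! `std::sync::Mutex` instead of this crate's API (`lock_two` is a made-up
//! helper for this illustration, not part of the crate):
//! ```
//! use std::sync::{Mutex, MutexGuard};
//!
//! // Acquire the locks for two distinct keys in ascending key order, so that
//! // concurrent tasks locking overlapping key sets can never wait on each
//! // other in a cycle.
//! fn lock_two(locks: &[Mutex<u32>], a: usize, b: usize)
//!     -> (MutexGuard<'_, u32>, MutexGuard<'_, u32>)
//! {
//!     assert_ne!(a, b, "locking the same key twice would self-deadlock");
//!     let (first, second) = if a < b { (a, b) } else { (b, a) };
//!     (locks[first].lock().unwrap(), locks[second].lock().unwrap())
//! }
//!
//! let locks = vec![Mutex::new(0), Mutex::new(0), Mutex::new(0)];
//! // Requested as (2, 0), but acquired in the order 0, then 2:
//! let (mut g1, mut g2) = lock_two(&locks, 2, 0);
//! *g1 += 1;
//! *g2 += 1;
//! drop((g1, g2));
//! assert_eq!(*locks[0].lock().unwrap(), 1);
//! assert_eq!(*locks[2].lock().unwrap(), 1);
//! ```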
//!
//! ## Crate Features
//! - `lru`: Enables the [LockableLruCache](crate::lockable_lru_cache::LockableLruCache)
//!   type, which adds a dependency on the [lru](https://crates.io/crates/lru) crate.
//! - `slow_assertions`: Enables slow assertions. Don't use this in production code; it is *very* slow.
//!   This is useful for asserting invariants and searching for bugs within the `lockable` crate
//!   itself, but not for finding bugs in user code. If you enable this and see an assertion
//!   fail, please report it in a GitHub issue.
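//!
//! Enabling a feature happens in the dependent crate's `Cargo.toml`. For the
//! `lru` feature, that might look like this (the version number is illustrative):
//! ```toml
//! [dependencies]
//! lockable = { version = "0.1", features = ["lru"] }
//! ```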
// TODO Figure out which functions actually should or shouldn't be #[inline]
// We need to add explicit links because our `gen_readme.sh` script requires them.
// NOTE: The module paths below are assumptions inferred from the doc links and
// example code above (which use `lockable::AsyncLimit` and `lockable::Never`);
// adjust them if the source modules are named differently.
pub use crate::limit::AsyncLimit;
pub use crate::lockable_hash_map::LockableHashMap;
pub use crate::lockable_lru_cache::LockableLruCache;
pub use crate::lockable_trait::Lockable;
pub use crate::lockpool::LockPool;
pub use crate::never::Never;