//! # Reinhardt Cache
//!
//! Caching framework for Reinhardt.
//!
//! ## Features
//!
//! - **InMemoryCache**: Simple in-memory cache backend with optional layered cleanup
//!   - Naive cleanup: traditional O(n) full scan (simple, suitable for small caches)
//!   - Layered cleanup: Redis 6.0-inspired O(1) amortized strategy (100-1000x faster for large caches)
//! - **LayeredCacheStore**: Standalone layered cache storage with optimized TTL cleanup
//! - **FileCache**: File-based persistent cache backend
//! - **RedisCache**: Redis-backed cache (requires redis-backend feature)
//! - **MemcachedCache**: Memcached-backed cache (requires memcached-backend feature)
//! - **HybridCache**: Multi-tier caching (memory + distributed)
//! - **RedisSentinelCache**: Redis Sentinel support (requires redis-sentinel feature)
//! - **Pub/Sub**: Cache invalidation via Redis channels (requires redis-backend feature)
//! - **Cache Warming**: Pre-populate cache on startup
//! - **Cache Tags**: Tag-based invalidation for related entries
//! - TTL support for automatic expiration
//! - Async-first API
//!
//! ## Examples
//!
//! ### Basic Usage (Naive Cleanup)
//!
//! ```rust
//! use reinhardt_utils::cache::{Cache, InMemoryCache};
//!
//! # async fn example() {
//! let cache = InMemoryCache::new();
//!
//! // Set a value
//! cache.set("key", &"value".to_string(), None).await.unwrap();
//!
//! // Get a value
//! let value: Option<String> = cache.get("key").await.unwrap();
//! assert_eq!(value, Some("value".to_string()));
//!
//! // Delete a value
//! cache.delete("key").await.unwrap();
//! # }
//! ```
//!
//! ### Optimized Usage (Layered Cleanup for Large Caches)
//!
//! ```rust
//! use reinhardt_utils::cache::{Cache, InMemoryCache};
//! use std::time::Duration;
//!
//! # async fn example() {
//! // Use layered cleanup for better performance (100-1000x faster for large caches)
//! let cache = InMemoryCache::with_layered_cleanup();
//!
//! // Or customize sampling parameters
//! let cache = InMemoryCache::with_custom_layered_cleanup(50, 0.30);
//!
//! // Same API as naive strategy
//! cache.set("key", &"value", Some(Duration::from_secs(60))).await.unwrap();
//! let value: Option<String> = cache.get("key").await.unwrap();
//! # }
//! ```
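//!
//! ### How Layered Cleanup Works (Sketch)
//!
//! The sampling loop behind the layered strategy can be sketched in plain
//! Rust. This is an illustrative, hypothetical helper (`sample_and_evict` is
//! not part of this crate's API): sample a bounded number of entries, evict
//! the expired ones, and repeat only while the expired fraction stays above a
//! threshold, giving amortized O(1) work per call, as in Redis 6.0's active
//! expiry cycle.
//!
//! ```rust
//! use std::collections::HashMap;
//! use std::time::Instant;
//!
//! // Hypothetical sketch, not this crate's implementation. `sample_size`
//! // and `threshold` mirror the two parameters accepted by
//! // `with_custom_layered_cleanup` above.
//! fn sample_and_evict(
//!     store: &mut HashMap<String, (String, Instant)>,
//!     sample_size: usize,
//!     threshold: f64,
//! ) {
//!     loop {
//!         let now = Instant::now();
//!         // HashMap iteration order is arbitrary, which serves as a
//!         // cheap pseudo-random sample here.
//!         let sampled: Vec<String> = store.keys().take(sample_size).cloned().collect();
//!         if sampled.is_empty() {
//!             return;
//!         }
//!         let mut expired = 0;
//!         for key in &sampled {
//!             if store[key].1 <= now {
//!                 store.remove(key);
//!                 expired += 1;
//!             }
//!         }
//!         // Stop once the sampled expired ratio drops to the threshold;
//!         // each pass removes what it found, so the loop terminates.
//!         if (expired as f64) / (sampled.len() as f64) <= threshold {
//!             return;
//!         }
//!     }
//! }
//! ```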
//!
//! ### Memcached Backend
//!
//! Memcached support is available with the `memcached-backend` feature:
//!
//! ```toml
//! [dependencies]
//! reinhardt-utils = { version = "0.1", features = ["memcached-backend"] }
//! ```
//!
//! ```rust,ignore
//! use reinhardt_utils::cache::{Cache, MemcachedCache, MemcachedConfig};
//! use std::time::Duration;
//!
//! # async fn example() {
//! let config = MemcachedConfig {
//! servers: vec!["127.0.0.1:11211".to_string()],
//! pool_size: 10,
//! timeout_ms: 1000,
//! };
//!
//! let cache = MemcachedCache::new(config).await.unwrap();
//! cache.set("key", b"value", Some(Duration::from_secs(3600))).await.unwrap();
//! # }
//! ```
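//!
//! ### Hybrid (Multi-Tier) Lookup Sketch
//!
//! The read path of a multi-tier cache such as `HybridCache` can be
//! illustrated with a generic read-through lookup. This is a hypothetical,
//! std-only sketch (not this crate's API): check the fast local tier first,
//! fall back to the distributed tier, and backfill the local tier on a hit.
//!
//! ```rust
//! use std::collections::HashMap;
//!
//! // Hypothetical sketch: `local` stands in for the in-memory tier and
//! // `remote` for the distributed tier (e.g. Redis or Memcached).
//! fn read_through(
//!     local: &mut HashMap<String, String>,
//!     remote: &HashMap<String, String>,
//!     key: &str,
//! ) -> Option<String> {
//!     if let Some(v) = local.get(key) {
//!         return Some(v.clone()); // tier-1 hit: no network round-trip
//!     }
//!     let v = remote.get(key)?.clone(); // tier-2 lookup (None on full miss)
//!     local.insert(key.to_string(), v.clone()); // backfill tier 1
//!     Some(v)
//! }
//! ```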
//!
// NOTE: the module paths (and feature-gate attributes) below are assumed
// from the feature list above; adjust them to the actual crate layout.

// Re-export exception types
pub use crate::exceptions::Result;

// Re-export core items
pub use crate::core::{Cache, CacheKeyBuilder};
pub use crate::memory::InMemoryCache;
pub use crate::layered::LayeredCacheStore;
#[cfg(feature = "redis-backend")]
pub use crate::redis::RedisCache;
#[cfg(feature = "memcached-backend")]
pub use crate::memcached::{MemcachedCache, MemcachedConfig};
pub use crate::hybrid::HybridCache;
#[cfg(feature = "redis-sentinel")]
pub use crate::sentinel::RedisSentinelCache;

// Re-export file backend
pub use crate::file::FileCache;

// Re-export cache warming
pub use crate::warming::CacheWarmer; // item name assumed

// Re-export cache tags
pub use crate::tags::CacheTags; // item name assumed