§Cachelito
A lightweight, thread-safe caching library for Rust that provides automatic memoization through procedural macros.
§Features
- Easy to use: Simply add the #[cache] attribute to any function or method
- Global scope by default: Cache shared across all threads (use scope = "thread" for thread-local)
- High-performance synchronization: Uses parking_lot::RwLock for global caches
- Thread-local option: Optional thread-local storage for maximum performance
- Multiple eviction policies: FIFO, LRU, LFU, ARC, Random, and TLRU
- TLRU with frequency_weight: Fine-tune recency vs frequency balance (v0.15.0)
- Flexible key generation: Supports custom cache key implementations
- Result-aware: Intelligently caches only successful Result::Ok values
- Cache limits: Control size with limit (entry count) or max_memory (memory-based); see the combined sketch after this list
- TTL support: Time-to-live expiration for automatic cache invalidation
- Statistics: Track hit/miss rates via the stats feature
- Smart invalidation: Tag-based, event-driven, and conditional invalidation
- Conditional caching: Cache only valid results with cache_if predicates
- Type-safe: Full compile-time type checking
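Several of these options can be combined on one function. The following is a minimal sketch, assuming the macro accepts the scope and limit parameters exactly as they are named in the list above; the precise attribute syntax may differ, so treat it as an illustration rather than the canonical form:
use cachelito::cache;
// Hypothetical example: a thread-local cache capped at 100 entries.
// `scope = "thread"` and `limit` are the parameter names quoted in the
// feature list; check the #[cache] macro docs for the exact syntax.
#[cache(scope = "thread", limit = 100)]
fn expensive_lookup(id: u64) -> String {
    format!("value for {}", id)
}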
§Quick Start
Add the #[cache] attribute to any function you want to memoize:
use cachelito::cache;
#[cache]
fn fibonacci(n: u32) -> u64 {
    if n <= 1 {
        return n as u64;
    }
    fibonacci(n - 1) + fibonacci(n - 2)
}
// First call computes the result
let result1 = fibonacci(10);
// Second call returns cached result instantly
let result2 = fibonacci(10);
assert_eq!(result1, result2);
§Custom Cache Keys
For complex types, you can implement custom cache key generation:
use cachelito::cache;
use cachelito_core::{CacheableKey, DefaultCacheableKey};
#[derive(Debug, Clone)]
struct User {
    id: u64,
    name: String,
}
// Option 1: Use default Debug-based key
impl DefaultCacheableKey for User {}
// Note: You can also implement CacheableKey directly instead of DefaultCacheableKey
// for better performance, but not both at the same time.
Or with a custom implementation:
use cachelito::cache;
use cachelito_core::CacheableKey;
#[derive(Debug, Clone)]
struct UserId {
    id: u64,
    name: String,
}
// Custom key implementation (more efficient than Debug-based)
impl CacheableKey for UserId {
    fn to_cache_key(&self) -> String {
        format!("user:{}", self.id)
    }
}
§Caching with Methods
The #[cache] attribute also works with methods:
use cachelito::cache;
use cachelito_core::DefaultCacheableKey;
#[derive(Debug, Clone)]
struct Calculator;
impl DefaultCacheableKey for Calculator {}
impl Calculator {
    #[cache]
    fn add(&self, a: i32, b: i32) -> i32 {
        a + b
    }
}
§Error Handling
Functions returning Result<T, E> only cache successful results:
use cachelito::cache;
#[cache]
fn divide(a: i32, b: i32) -> Result<i32, String> {
    if b == 0 {
        Err("Division by zero".to_string())
    } else {
        Ok(a / b)
    }
}
// Ok results are cached
let _ = divide(10, 2);
// Err results are NOT cached
let _ = divide(10, 0);
Modules§
- invalidation - Cache Invalidation
- stats_registry
- utils
Structs§
- AsyncGlobalCache - A thread-safe async global cache with configurable eviction policies and TTL support.
- CacheEntry - Internal wrapper that tracks when a value was inserted into the cache. Used for TTL expiration support.
- CacheStats - Cache statistics for monitoring hit/miss rates and performance.
- GlobalCache - A thread-safe global cache that can be shared across multiple threads.
- InvalidationMetadata - Metadata about cache invalidation configuration
- InvalidationRegistry - Registry for managing cache invalidation
- ThreadLocalCache - Core cache abstraction that stores values in a thread-local HashMap with configurable limits.
Enums§
- CacheScope - Cache scope: thread-local or global
- EvictionPolicy - Represents the policy used for evicting elements from a cache when it reaches its limit.
- InvalidationStrategy - Strategy for cache invalidation
Traits§
- CacheableKey - Trait defining how to generate a cache key for a given type.
- DefaultCacheableKey - Marker trait for types that want to use the default cache key behavior.
- MemoryEstimator - Trait for estimating the memory size of cached values.
Functions§
- invalidate_all_with - Invalidate entries across all caches based on a check function
- invalidate_by_dependency - Global convenience function to invalidate all dependent caches
- invalidate_by_event - Global convenience function to invalidate all caches listening to an event
- invalidate_by_tag - Global convenience function to invalidate all caches with a given tag (see the sketch after this list)
- invalidate_cache - Invalidate a specific cache by its name
- invalidate_with - Invalidate entries in a specific cache based on a check function
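A rough illustration of how these helpers are meant to be called (a sketch only: the arguments are assumed to be string identifiers based on the descriptions above, so verify each signature in the function docs):
use cachelito::{invalidate_by_tag, invalidate_cache};
fn clear_user_caches() {
    // Assumed usage: drop every entry in caches tagged "users" ...
    invalidate_by_tag("users");
    // ... and clear the cache registered under the name "fibonacci".
    invalidate_cache("fibonacci");
}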
Attribute Macros§
- cache - A procedural macro that adds automatic memoization to functions and methods.