clache 0.2.0

Small utilities for caching data

Caching Utilities

Clache provides lightweight, easy-to-integrate caching utilities indexed by arbitrary strings, with both globally-shared and strongly-typed cache types.

Note: The keys to the cache are commonly referred to as "paths", but they are arbitrary strings: a key could be a filesystem path, a URI, a UUID, or any other identifying string.

There are two types of caches in Clache, GlobalCache and LocalCache.

You should use the GlobalCache when:

  • You are using dynamic types.
  • You want to share data across the whole program.
  • You want to ensure cache keys are unique.

But be careful: key conflicts can cause problems with downcasting and execution ordering, and dynamic typing is slower.

You should use a LocalCache when:

  • You are using a single, static type.
  • You want to control visibility, e.g. a cache private to a subsystem or to a single value type.
  • You want maximum performance.

But be careful: a LocalCache requires much more boilerplate, and two LocalCaches may contain the same key but hold different stored values.

Multi-threading and async support

Both types of caches can be shared across threads and share their cached values via [Arc]. Cache hits are very fast, but cache misses are costly due to locking.

Both GlobalCache and LocalCache support async versions of their get_or_else methods that perform nearly as well as their blocking counterparts.

Minimal usage example

Only the first access pays a large performance penalty; subsequent cache hits are significantly faster, since they skip the loading logic entirely.

use clache::GlobalCache;
use std::sync::Arc;

// Slow cache miss and load. Returns Arc<String>
let config = GlobalCache::get_or_else("data://config.json", || {
    std::fs::read_to_string("config.json").unwrap()
});

// Returns the same Arc<String>, much faster. You could also use `get_or_else`.
let config: Arc<String> = GlobalCache::get("data://config.json");

LocalCache works the same way, except that each instance has its own storage of paths, so be careful.

use clache::LocalCache;
use std::sync::Arc;

let cache = LocalCache::new();
// Points to the same cache as 'cache'
let clone = cache.clone();

// On another thread, via 'clone'...
let value_clone = clone.get_or("value:", Arc::new(10));

// On main thread...
let value_cache = cache.get_or("value:", Arc::new(15));

assert!(Arc::ptr_eq(&value_clone, &value_cache));
// Both are the same allocation, but it could be either 10 or 15, depending on which `get_or` executed first