cache_loader_async

Crates.io: cache_loader_async
lib.rs: cache_loader_async
version: 0.2.1
source: src
created_at: 2021-05-05 17:33:42.201667
updated_at: 2022-09-09 11:57:00.778662
description: A thread safe loading cache with async loader functions based on tokio
homepage:
repository: https://github.com/ZeroTwo-Bot/cache-loader-async-rs
max_upload_size:
id: 393500
size: 86,386
owner: Alex (ByteAlex)
documentation:

README

cache-loader-async

The goal of this crate is to provide a thread-safe and easy way to access data structures that may be stored in a database, hitting the underlying store at most once per key and keeping the result in a cache for further requests.

This library is based on tokio-rs and futures.

Usage

Using this library is as easy as this:

use std::collections::HashMap;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let static_db: HashMap<String, u32> =
        vec![("foo".into(), 32), ("bar".into(), 64)]
            .into_iter()
            .collect();

    // The loader clones the "database" handle and resolves the key asynchronously.
    let cache = LoadingCache::new(move |key: String| {
        let db_clone = static_db.clone();
        async move {
            db_clone.get(&key).cloned().ok_or("error-message")
        }
    });

    let result = cache.get("foo".to_owned()).await.unwrap().0;

    assert_eq!(result, 32);
}

The LoadingCache first tries to look up the result in an internal HashMap; if it's not found and no load is ongoing, it fires the load request and queues any further get requests for that key until the load request finishes.
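
In other words, concurrent lookups of the same key share a single load. The following is a minimal sketch of that behaviour, assuming the same LoadingCache API as in the example above; the AtomicUsize counter and the key/value names are purely illustrative.

use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    // Counts how many times the loader actually runs.
    let load_count = Arc::new(AtomicUsize::new(0));
    let counter = load_count.clone();

    let cache = LoadingCache::new(move |key: String| {
        let counter = counter.clone();
        async move {
            counter.fetch_add(1, Ordering::SeqCst);
            Ok::<_, String>(format!("value-for-{}", key))
        }
    });

    // Both lookups race for the same key; the second one is queued behind the
    // in-flight load instead of triggering a second load.
    let (a, b) = tokio::join!(
        cache.get("foo".to_owned()),
        cache.get("foo".to_owned())
    );

    assert_eq!(a.unwrap().0, b.unwrap().0);
    assert_eq!(load_count.load(Ordering::SeqCst), 1);
}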

Features & Cache Backings

The cache-loader-async library currently supports two additional built-in backings: LRU and TTL. LRU evicts keys once the cache reaches its maximum size, while TTL evicts keys automatically after their time-to-live expires.

LRU Backing

You can use a simple pre-built LRU cache from the lru-rs crate by enabling the lru-cache feature.

To create a LoadingCache with an LRU cache backing, use the with_backing method on the LoadingCache.

use cache_loader_async::backing::LruCacheBacking;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let size: usize = 10;
    let cache = LoadingCache::with_backing(LruCacheBacking::new(size), move |key: String| {
        async move {
            // Explicit error type so the compiler can infer the cache's error parameter.
            Ok::<_, String>(key.to_lowercase())
        }
    });
}

TTL Backing

You can use a simple pre-built TTL cache by enabling the ttl-cache feature. This does not require any additional dependencies.

To create a LoadingCache with a TTL cache backing, use the with_backing method on the LoadingCache.

use std::time::Duration;
use cache_loader_async::backing::TtlCacheBacking;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let duration: Duration = Duration::from_secs(30);
    let cache = LoadingCache::with_backing(TtlCacheBacking::new(duration), move |key: String| {
        async move {
            Ok::<_, String>(key.to_lowercase())
        }
    });
}

You can also provide a custom TTL per key by using the with_meta_loader method. The example below overrides the global 30s TTL with a 10s TTL. Overriding the TTL for every key like this is not useful in practice; you would normally apply the override conditionally, as sketched after the example.

use std::time::Duration;
use cache_loader_async::backing::{TtlCacheBacking, TtlMeta};
use cache_loader_async::cache_api::LoadingCache;
// Note: the `with_meta` helper used below comes from the crate's meta-loader API
// and must also be in scope.

#[tokio::main]
async fn main() {
    let duration: Duration = Duration::from_secs(30);
    let cache = LoadingCache::with_meta_loader(TtlCacheBacking::new(duration), move |key: String| {
        async move {
            Ok::<_, String>(key.to_lowercase())
                .with_meta(Some(TtlMeta::from(Duration::from_secs(10))))
        }
    });
}
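
A conditional override could look like the sketch below, assuming the same with_meta_loader API as above. The "temp:" key prefix is purely illustrative, and the fallback behaviour for a None meta value is an assumption noted in the comments.

use std::time::Duration;
use cache_loader_async::backing::{TtlCacheBacking, TtlMeta};
use cache_loader_async::cache_api::LoadingCache;
// As above, the `with_meta` helper from the crate's meta-loader API must be in scope.

#[tokio::main]
async fn main() {
    let cache = LoadingCache::with_meta_loader(
        TtlCacheBacking::new(Duration::from_secs(30)),
        move |key: String| {
            async move {
                // Hypothetical rule: short-lived entries only for "temp:" keys;
                // None presumably falls back to the backing's global 30s TTL.
                let ttl = if key.starts_with("temp:") {
                    Some(TtlMeta::from(Duration::from_secs(10)))
                } else {
                    None
                };
                Ok::<_, String>(key.to_lowercase()).with_meta(ttl)
            }
        },
    );
}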

Additionally, the TTL backing allows you to customize the underlying backing. By default, it uses the HashMapBacking.

use std::time::Duration;
use cache_loader_async::backing::{LruCacheBacking, TtlCacheBacking, TtlMeta};
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let duration: Duration = Duration::from_secs(30);
    // TTL eviction layered on top of an LRU backing limited to 10 entries.
    let cache = LoadingCache::with_meta_loader(TtlCacheBacking::with_backing(LruCacheBacking::new(10), duration), move |key: String| {
        async move {
            Ok::<_, String>(key.to_lowercase())
                .with_meta(Some(TtlMeta::from(Duration::from_secs(10))))
        }
    });
}

Own Backing

To implement your own cache backing, implement the public CacheBacking trait from the backing module.

pub trait CacheBacking<K, V>
    where K: Eq + Hash + Sized + Clone + Send,
          V: Sized + Clone + Send {
    type Meta: Clone + Send;

    fn get_mut(&mut self, key: &K) -> Result<Option<&mut V>, BackingError>;
    fn get(&mut self, key: &K) -> Result<Option<&V>, BackingError>;
    fn set(&mut self, key: K, value: V, meta: Option<Self::Meta>) -> Result<Option<V>, BackingError>;
    fn remove(&mut self, key: &K) -> Result<Option<V>, BackingError>;
    fn contains_key(&mut self, key: &K) -> Result<bool, BackingError>;
    fn remove_if(&mut self, predicate: Box<dyn Fn((&K, &V)) -> bool + Send + Sync + 'static>) -> Result<Vec<(K, V)>, BackingError>;
    fn clear(&mut self) -> Result<(), BackingError>;
}
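
As an illustration, here is a minimal sketch of a custom backing that wraps a plain HashMap and carries no per-entry metadata. The SimpleMapBacking name is hypothetical, and the import paths for CacheBacking and BackingError are assumed to be the crate's backing module.

use std::collections::HashMap;
use std::hash::Hash;
// Import paths assumed: both types are expected in the crate's backing module.
use cache_loader_async::backing::{BackingError, CacheBacking};

/// Hypothetical backing that stores entries in a plain HashMap.
pub struct SimpleMapBacking<K, V> {
    map: HashMap<K, V>,
}

impl<K, V> SimpleMapBacking<K, V> {
    pub fn new() -> Self {
        Self { map: HashMap::new() }
    }
}

impl<K, V> CacheBacking<K, V> for SimpleMapBacking<K, V>
    where K: Eq + Hash + Sized + Clone + Send,
          V: Sized + Clone + Send {
    type Meta = (); // this backing has no per-entry metadata

    fn get_mut(&mut self, key: &K) -> Result<Option<&mut V>, BackingError> {
        Ok(self.map.get_mut(key))
    }

    fn get(&mut self, key: &K) -> Result<Option<&V>, BackingError> {
        Ok(self.map.get(key))
    }

    fn set(&mut self, key: K, value: V, _meta: Option<Self::Meta>) -> Result<Option<V>, BackingError> {
        Ok(self.map.insert(key, value))
    }

    fn remove(&mut self, key: &K) -> Result<Option<V>, BackingError> {
        Ok(self.map.remove(key))
    }

    fn contains_key(&mut self, key: &K) -> Result<bool, BackingError> {
        Ok(self.map.contains_key(key))
    }

    fn remove_if(&mut self, predicate: Box<dyn Fn((&K, &V)) -> bool + Send + Sync + 'static>) -> Result<Vec<(K, V)>, BackingError> {
        // Collect matching keys first, then remove them and return the evicted pairs.
        let matching: Vec<K> = self.map.iter()
            .filter(|&(k, v)| predicate((k, v)))
            .map(|(k, _)| k.clone())
            .collect();
        Ok(matching.into_iter()
            .filter_map(|k| self.map.remove(&k).map(|v| (k, v)))
            .collect())
    }

    fn clear(&mut self) -> Result<(), BackingError> {
        self.map.clear();
        Ok(())
    }
}

Such a backing can then be passed to LoadingCache::with_backing in the same way as the built-in LRU and TTL backings shown above.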