Cachey: Caching for Analytic Computations

0.2.1 · maintenance · verified Thu Apr 16

Cachey is a Python library from the Dask ecosystem designed for caching in analytic computations, where the cost of recomputing a value and the space needed to store it can vary by orders of magnitude. Unlike traditional caching policies such as LRU, which treat all entries alike, Cachey weighs how expensive a value was to compute against how much space it occupies when deciding what to keep. The latest PyPI version is 0.2.1, released in March 2020. The project README cautions that the library is 'new and not robust'.

Install

pip install cachey

Imports

from cachey import Cache

Quickstart

This example initializes a `Cache` with a 2 GB budget and uses the core `put`/`get` API to cache the result of an expensive function. The `cost` argument tells Cachey how expensive the value was to compute, which its scorer weighs against the value's size when deciding what to evict.

from cachey import Cache
from time import sleep

def my_expensive_function(x):
    print(f"Computing for {x}...")
    sleep(0.1)  # Simulate expensive computation
    return x + 1

# Initialize a cache with a 2 GB limit
cache = Cache(2 * 10**9)

def cached_call(x):
    key = ('my-expensive-function', x)
    if key in cache:
        return cache.get(key)
    result = my_expensive_function(x)
    cache.put(key, result, cost=100)  # High cost: worth keeping around
    return result

result1 = cached_call(1)  # First call computes and caches
result2 = cached_call(1)  # Same argument: retrieved from the cache
result3 = cached_call(2)  # Different argument: computes and caches separately
print(result1, result2, result3)
