{"id":14483,"library":"cachey","title":"Cachey: Caching for Analytic Computations","description":"Cachey is a Python library from Dask designed for caching in analytic computations where the costs of recomputation and storage can vary significantly. Unlike traditional caching policies (e.g., LRU), Cachey weighs these varying costs when deciding what to keep. The latest PyPI version is 0.2.1, released in March 2020. The project README states it is 'new and not robust'.","status":"maintenance","version":"0.2.1","language":"en","source_language":"en","source_url":"https://github.com/dask/cachey/","tags":["cache","caching","dask","analytics","performance"],"install":[{"cmd":"pip install cachey","lang":"bash","label":"Install latest version"}],"dependencies":[],"imports":[{"symbol":"Cache","correct":"from cachey import Cache"}],"quickstart":{"code":"from cachey import Cache\nfrom time import sleep\n\n# Initialize a cache with a 2 GB limit (available bytes)\ncache = Cache(2 * 10**9)\n\n@cache.memoize\ndef my_expensive_function(x):\n    print(f\"Computing for {x}...\")\n    sleep(0.1)  # Simulate expensive computation\n    return x + 1\n\n# First call computes and caches the result\nresult1 = my_expensive_function(1)\nprint(f\"Result 1: {result1}\")\n\n# Second call with the same argument is served from the cache\nresult2 = my_expensive_function(1)\nprint(f\"Result 2: {result2}\")\n\n# A call with a different argument computes and caches separately\nresult3 = my_expensive_function(2)\nprint(f\"Result 3: {result3}\")\n\n# Values can also be stored directly with an explicit cost\ncache.put('x', 123, cost=1)\nprint(cache.get('x'))","lang":"python","description":"This example demonstrates how to initialize a `Cache` with a memory limit in bytes and wrap an expensive function with the `cache.memoize` decorator. Cachey scores entries by their recomputation cost and storage size to decide which to retain. Values can also be stored directly via `cache.put`, which accepts an explicit `cost`."},"warnings":[{"fix":"Thoroughly test `cachey` in your specific use case. If stability is a primary concern, consider more mature caching libraries for production environments.","message":"The official GitHub README for Cachey explicitly states: 'Cachey is new and not robust.' The library may have unaddressed issues or lack full stability, and it may not be suitable for critical production systems without thorough testing.","severity":"gotcha","affected_versions":"All versions (0.1.0 - 0.2.1)"},{"fix":"Review the project's GitHub repository for any recent activity not reflected in PyPI releases. For new projects, evaluate more actively maintained caching solutions. If using Cachey, be prepared to address potential compatibility issues or maintain the code yourself.","message":"The last release of Cachey (v0.2.1) was on March 11, 2020. This indicates a lack of active development, which may lead to unpatched bugs, security vulnerabilities, or incompatibility with newer Python versions or related libraries.","severity":"deprecated","affected_versions":"<=0.2.1"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Ensure the cache key is derived from all relevant function inputs. `cache.memoize` accepts an optional `key` function that maps a call's arguments to a cache key; if you supply one, make sure it incorporates every differentiating argument (e.g., combine them into a tuple). For values stored via `cache.put`, re-evaluate the `cost` argument so it accurately reflects the relative expense of recomputation.","cause":"A custom `key` function passed to `cache.memoize` may not fully represent all differentiating inputs to the function, leading to cache collisions or incorrect retrieval. Alternatively, the eviction policy may be removing items prematurely because their measured cost or size scores them as cheap to recompute.","error":"Unexpected cache misses or stale data when using cache.memoize"},{"fix":"Reduce the `available_bytes` limit passed to the `Cache` constructor. Profile the actual memory usage of the objects being cached; for values stored via `cache.put`, set the optional `nbytes` argument to reflect their true footprint. Monitor system memory usage to identify whether other processes are competing for resources.","cause":"The `Cache` object is initialized with an `available_bytes` limit that is too large for the available system memory, or the size of cached items is underestimated, causing the cache to hold more data than the system can provide.","error":"MemoryError: Cannot allocate memory for cache"}],"ecosystem":"pypi"}