{"id":8819,"library":"aioprocessing","title":"aioprocessing","description":"aioprocessing is a Python 3.5+ library that provides asynchronous, asyncio-compatible versions of many blocking instance methods found in Python's standard `multiprocessing` module. It allows seamless integration of multiprocessing objects within `asyncio` coroutines without blocking the event loop. The library is currently at version 2.0.1; the last major release (2.0.0) introduced `dill` support and moved to `async/await` internally.","status":"active","version":"2.0.1","language":"en","source_language":"en","source_url":"https://github.com/dano/aioprocessing","tags":["asyncio","multiprocessing","concurrency","parallelism","inter-process communication"],"install":[{"cmd":"pip install aioprocessing","lang":"bash","label":"Default Install"},{"cmd":"pip install aioprocessing[dill]","lang":"bash","label":"Install with dill for universal pickling"}],"dependencies":[{"reason":"Optional dependency for universal pickling across processes, useful for more complex object serialization.","package":"dill","optional":true}],"imports":[{"symbol":"AioProcess","correct":"from aioprocessing import AioProcess"},{"symbol":"AioPool","correct":"from aioprocessing import AioPool"},{"symbol":"AioQueue","correct":"from aioprocessing import AioQueue"},{"symbol":"AioLock","correct":"from aioprocessing import AioLock"},{"symbol":"AioEvent","correct":"from aioprocessing import AioEvent"},{"note":"All Aio-prefixed multiprocessing primitives are typically exposed directly under the top-level `aioprocessing` package.","wrong":"from aioprocessing.manager import AioManager","symbol":"AioManager","correct":"from aioprocessing import AioManager"}],"quickstart":{"code":"import asyncio\nimport time\nimport aioprocessing\n\ndef worker_func(queue, event, lock, items):\n    \"\"\"Demo worker function for a separate process.\"\"\"\n    with lock:\n        event.set()\n    for item in items:\n        time.sleep(0.1)  # Simulate work\n        queue.put(item + 5)\n    queue.put(None)  # Signal completion\n\nasync def example():\n    queue = aioprocessing.AioQueue()\n    lock = aioprocessing.AioLock()\n    event = aioprocessing.AioEvent()\n    items_to_process = [1, 2, 3, 4, 5]\n\n    print(\"Starting worker process...\")\n    p = aioprocessing.AioProcess(target=worker_func, args=(queue, event, lock, items_to_process))\n    p.start()\n\n    # Wait for the worker to signal it's ready\n    await event.coro_wait()\n    print(\"Worker is ready.\")\n\n    async with lock:  # Acquire lock in async context\n        print(\"Lock acquired in main process (async).\")\n        # Demonstrate a put from the main process\n        await queue.coro_put(78)\n        print(\"Put 78 into queue from main process.\")\n\n    print(\"Collecting results from queue...\")\n    while True:\n        result = await queue.coro_get()\n        if result is None:\n            break\n        print(f\"Got result: {result}\")\n\n    await p.coro_join()  # Asynchronously wait for the process to finish\n    print(\"Worker process finished.\")\n\nif __name__ == \"__main__\":\n    asyncio.run(example())","lang":"python","description":"This quickstart demonstrates how to use `AioProcess`, `AioQueue`, `AioLock`, and `AioEvent` from `aioprocessing` within an `asyncio` application. A worker function runs in a separate process, interacting with shared `aioprocessing` primitives, while the main `asyncio` loop communicates with it without blocking. Coroutine versions of blocking methods are prefixed with `coro_` (e.g., `coro_get`, `coro_put`, `coro_wait`, `coro_acquire`)."},"warnings":[{"fix":"Be mindful of your `multiprocessing` start method, especially on Unix-like systems where 'fork' is the default. If you encounter issues, consider explicitly setting the start method to 'spawn' with `multiprocessing.set_start_method('spawn')` at the beginning of your program, although this has its own implications (see 'RuntimeError' below). Ensure any objects passed to child processes are picklable.","message":"Mixing threads with forked processes can lead to unexpected behavior, because `aioprocessing` often uses a `ThreadPoolExecutor` internally to make blocking `multiprocessing` calls asynchronous. This caveat applies particularly if the underlying `multiprocessing.Pool` (used by `AioPool`) uses threads.","severity":"gotcha","affected_versions":"All versions"},{"fix":"If experiencing pickling issues, ensure `dill` is installed (`pip install aioprocessing[dill]`). If you need to force standard-library `multiprocessing` pickling behavior, set the environment variable `AIOPROCESSING_DILL_DISABLED=1`.","message":"Version 2.0.0 introduced support for universal pickling using `dill` and also moved to using `async/await` internally. This might affect applications relying on specific pickling behaviors or internal event loop interactions, though the external API remains largely consistent with `coro_` prefixes.","severity":"breaking","affected_versions":">=2.0.0"},{"fix":"Investigate the lifecycle of tasks when using `maxtasksperchild`. Ensure all resources (e.g., file handles, network connections) acquired by a worker are properly closed before the task completes or the worker process is terminated. Review resource management within your worker functions.","message":"Using `AioPool` with `maxtasksperchild` can sometimes lead to `OSError: [Errno 24] Too many open files` if worker processes are not properly cleaned up before new ones are spawned, especially under heavy load.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-16T00:00:00.000Z","next_check":"2026-07-15T00:00:00.000Z","problems":[{"fix":"Wrap all code that spawns new processes within an `if __name__ == '__main__':` block. This ensures the code only runs in the main process and not in newly spawned child processes during bootstrapping. Example: `if __name__ == '__main__': asyncio.run(main())`.","cause":"This error typically occurs on Windows and macOS (Python 3.8+), where the default multiprocessing start method is 'spawn'. It happens when process-creation code is placed directly at the top level of a script, leading to re-import issues in child processes.","error":"RuntimeError: An attempt has been made to start a new process before the current process has finished its bootstrapping phase."},{"fix":"Ensure that any functions or classes passed as `target` to `AioProcess` or used within `AioPool.map` are defined at the top level of a module, making them globally accessible and therefore picklable. Avoid lambda functions or nested function definitions as multiprocessing targets.","cause":"When using the 'spawn' or 'forkserver' start methods (default on Windows/macOS), objects (including functions) passed to child processes must be importable from the child's context. Nested or locally defined functions cannot be pickled and sent to other processes.","error":"AttributeError: Can't pickle local object 'example.<locals>.worker_func'"},{"fix":"Always use the `coro_`-prefixed methods provided by `aioprocessing` (e.g., `queue.coro_put()`, `event.coro_wait()`, `lock.coro_acquire()`) when interacting with `aioprocessing` objects from within `asyncio` coroutines. The non-`coro_` methods are blocking.","cause":"Attempting to use a `multiprocessing` object directly in an `asyncio` coroutine without its `aioprocessing` (coroutine-friendly) wrapper will block the event loop, as the underlying `multiprocessing` methods are synchronous.","error":"BlockingIOError: [Errno 11] Resource temporarily unavailable"}]}