asyncio-pool
asyncio-pool is a Python library that provides a pool for asyncio coroutines, with a familiar interface modeled on `multiprocessing.Pool`. It keeps a controlled number of coroutines running concurrently, handling task scheduling and result collection. The most recent release is 0.6.0 (May 2022); development has since stalled, but the library remains functional.
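The core idea, a fixed cap on in-flight coroutines, can be sketched with plain asyncio. `run_with_limit` and `job` below are hypothetical names, not part of the library's API:

```python
import asyncio

async def run_with_limit(coros, size: int):
    # Cap concurrency the way a coroutine pool does: at most `size`
    # coroutines run at once; the rest wait for a free slot.
    sem = asyncio.Semaphore(size)

    async def slot(coro):
        async with sem:
            return await coro

    return await asyncio.gather(*(slot(c) for c in coros))

async def job(i: int) -> int:
    await asyncio.sleep(0)
    return i + 1

print(asyncio.run(run_with_limit((job(i) for i in range(4)), size=2)))
# prints [1, 2, 3, 4]
```

`asyncio.gather` preserves input order, which is also what the pool's `map` does; the semaphore only bounds how many coroutines are active at once.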
Common errors
- RuntimeWarning: coroutine 'my_async_function' was never awaited
  Cause: An asynchronous function was called, creating a coroutine object, but the object was never scheduled on the event loop, typically because `await` was forgotten or the object was not passed to a pool method.
  Fix: If `my_async_function` should run in the pool, pass the coroutine object to a pool method, e.g. `future = await pool.spawn(my_async_function())` (`spawn` is itself a coroutine and must be awaited). If it is a top-level entry point, run it with `asyncio.run(my_async_function())`.
- Task was destroyed but it is pending!
  Cause: An `asyncio.Task` was garbage-collected while still running or pending. This often happens when the event loop is closed before all tasks have completed or been properly cancelled.
  Fix: Ensure all tasks managed by `AioPool` are allowed to complete. `async with AioPool(...) as pool:` is the recommended pattern, since it handles proper shutdown. Alternatively, call `await pool.join()` before the event loop is stopped, or let `pool.map` finish collecting all results.
- TypeError: An asyncio.Future, a coroutine or an awaitable is required
  Cause: An object that is not awaitable (such as a coroutine, `asyncio.Task`, or `asyncio.Future`) was passed to an `await` expression or to a function expecting an awaitable.
  Fix: Pass actual coroutine objects (the result of calling an `async def` function: `my_coro_func()`, not `my_coro_func`) to methods like `pool.spawn()`, and only `await` valid awaitables.
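As a minimal standard-library illustration of the first error and its fix (`fetch` and `main` are hypothetical names):

```python
import asyncio

async def fetch(n: int) -> int:
    await asyncio.sleep(0)
    return n * 2

async def main() -> list[int]:
    # fetch(1) alone would create a coroutine object that is never
    # awaited and trigger the RuntimeWarning at garbage collection;
    # scheduling it as a Task (or awaiting it directly) prevents that.
    task = asyncio.create_task(fetch(1))
    return [await task, await fetch(2)]

print(asyncio.run(main()))  # prints [2, 4]
```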
Warnings
- breaking In version 0.5.0, the `spawn_n` and `map_n` methods were changed to be synchronous. They now spawn all tasks upfront and manage their execution, which is a breaking change from their previous asynchronous behavior.
- gotcha Using `spawn_n`, `map_n`, or `itermap` with an extremely large number of tasks (e.g., 10^6+) can lead to high memory consumption and performance degradation. These methods spawn all tasks into the event loop, which can exhaust system memory if not managed carefully.
- gotcha Despite using `asyncio-pool`, it's still possible to encounter `RuntimeWarning: coroutine '...' was never awaited`. This typically occurs when an `async def` function is called, returning a coroutine object, but that object is not explicitly `await`ed or passed to an `AioPool` method (like `spawn`) for scheduling.
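For the large-input gotcha above, one workaround is to feed a huge iterable in chunks rather than handing everything to a single call at once. A stdlib sketch, with the hypothetical helper `process_in_chunks` (assumes Python 3.8+ for the walrus operator):

```python
import asyncio
from itertools import islice

async def work(i: int) -> int:
    await asyncio.sleep(0)
    return i * i

async def process_in_chunks(func, items, chunk_size: int) -> list:
    # Only `chunk_size` coroutine objects exist at any moment,
    # keeping memory flat even for very large inputs.
    it = iter(items)
    results = []
    while chunk := list(islice(it, chunk_size)):
        results.extend(await asyncio.gather(*(func(i) for i in chunk)))
    return results

print(asyncio.run(process_in_chunks(work, range(10), chunk_size=3)))
# prints [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The trade-off is reduced pipelining: each chunk must finish before the next starts, so pick a chunk size large enough to keep the pool busy.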
Install
-
pip install asyncio-pool
Imports
- AioPool
from asyncio_pool.pool import AioPool
from asyncio_pool import AioPool
Quickstart
import asyncio
from asyncio_pool import AioPool

async def worker(n: int) -> int:
    """A dummy async worker that simulates some work."""
    print(f"Worker {n}: Starting...")
    await asyncio.sleep(1 / (n + 1))  # Simulate work, avoid division by zero
    print(f"Worker {n}: Done.")
    return n * 2

async def main():
    pool_size = 5
    todo_items = range(10)
    print(f"Creating a pool with size {pool_size}")
    async with AioPool(size=pool_size) as pool:
        print("Mapping tasks to the pool...")
        # Map worker over todo_items, collecting results in input order
        results = await pool.map(worker, todo_items)
        print(f"All tasks completed. Results: {results}")

        # Example of spawning individual tasks
        print("Spawning individual tasks...")
        futures = []
        for i in range(3):
            # pool.spawn is a coroutine: it waits for a free slot,
            # then returns a future for the spawned task
            futures.append(await pool.spawn(worker(i + 10)))
        # Await spawned futures explicitly when not using map/itermap
        final_results = await asyncio.gather(*futures)
        print(f"Individual tasks completed. Results: {final_results}")

if __name__ == "__main__":
    asyncio.run(main())
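When results should be consumed as soon as each worker finishes, which is the streaming behavior `itermap` offers, the underlying pattern resembles `asyncio.as_completed` from the standard library. A sketch with hypothetical names (`work`, `stream_results`):

```python
import asyncio

async def work(n: int) -> int:
    # Later items finish sooner, so completion order differs from input order
    await asyncio.sleep(0.01 * (5 - n))
    return n

async def stream_results() -> list[int]:
    out = []
    # as_completed yields each future as soon as it resolves,
    # instead of waiting for the whole batch like map/gather
    for fut in asyncio.as_completed([work(n) for n in range(5)]):
        out.append(await fut)
    return out

print(asyncio.run(stream_results()))
```

Here the printed list reflects completion order rather than input order, so downstream code must not assume the two match.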