Multiprocess

version 0.70.19 · verified Tue May 12 · auth: no · python install: verified · quickstart: verified

Multiprocess is a friendly fork of Python's standard `multiprocessing` module that swaps the stdlib `pickle` serializer for `dill`, giving more robust serialization of complex objects such as lambdas, closures, and interactively defined functions. It aims to be a drop-in replacement for `multiprocessing` in most scenarios. The library is actively maintained, with several releases per year tracking new Python versions; the current version is 0.70.19.

pip install multiprocess
error TypeError: can't pickle _thread.lock objects
cause This error occurs because low-level threading synchronization primitives, like `_thread.lock` objects, cannot be serialized (pickled) and transferred between separate processes. This often happens when passing `threading.Lock` or `queue.Queue` (from the `queue` module, which is thread-safe) directly to `multiprocess` workers, rather than using the process-safe equivalents from the `multiprocess` module itself.
fix
Use the process-safe synchronization primitives provided by the multiprocess library, such as multiprocess.Lock() or multiprocess.Queue(), instead of their threading or standard queue module counterparts. Ensure any shared state or objects are designed to be picklable or are managed via multiprocess.Manager. For instances where you are creating threads within a child process, create the _thread.lock object *inside* the child process function, not in the parent.
error RuntimeError: An attempt has been made to start a new process before the current process has finished its bootstrapping phase. This probably means that you are not using fork to start your child processes and you have forgotten to use the proper idiom in the main module: if __name__ == '__main__': freeze_support()
cause This error is common on Windows and macOS (which default to the 'spawn' start method) when code that creates child processes (like `multiprocess.Process` or `multiprocess.Pool`) is executed directly at the top level of a script, rather than being protected within an `if __name__ == '__main__':` block. Child processes must import the main module to run, and if the process creation code is at the top level, it leads to an infinite recursion of process spawning.
fix
Wrap all code that creates new processes within an if __name__ == '__main__': block. If you distribute a frozen executable, also call multiprocess.freeze_support() as the first statement inside that block; for ordinary (unfrozen) scripts the call is harmless but usually unnecessary.
error AttributeError: Can't pickle local object '...' (e.g., 'lambda', 'MyClass.<locals>.my_function')
cause Although `multiprocess` uses `dill` for enhanced serialization, which is generally more capable than `pickle` at handling local objects, closures, and lambda functions, this error can still occur in specific complex scenarios. It typically means that `dill` was unable to serialize an object (often a function or a method) that was defined locally within another function or is not accessible in the global scope of the module when the child process tries to unpickle it.
fix
Ensure that functions or methods intended for multiprocessing are defined at the top level of a module, or within a class whose instances are properly structured for serialization. If a lambda function is causing the issue, refactor it into a regular, globally defined function. For class methods, ensure the class itself is picklable and the method does not rely on non-picklable internal state that isn't handled by __getstate__ and __setstate__ methods.
error ModuleNotFoundError: No module named 'multiprocess'
cause This error indicates that the `multiprocess` library is either not installed in the Python environment being used or Python cannot find it in its `sys.path`.
fix
Install the multiprocess library using pip: pip install multiprocess. Verify the installation by running python -c "import multiprocess; print(multiprocess.__version__)" in your terminal. If using a virtual environment, ensure it's activated before installation.
breaking Older Python versions are no longer supported: version 0.70.19 requires Python >=3.9. Earlier releases had lower minimums (e.g., 0.70.18 required >=3.8; 0.70.16 dropped 3.7).
fix Ensure your environment uses Python 3.9 or newer. Upgrade your Python interpreter if necessary.
breaking The minimum required version for the `dill` dependency has steadily increased across `multiprocess` releases. Version 0.70.19 requires `dill >=0.4.1`. Older `multiprocess` versions may have different `dill` requirements.
fix Upgrade your `dill` package to the latest version (`pip install --upgrade dill`) or at least to the version specified by your `multiprocess` installation.
gotcha When using `multiprocess` (or standard `multiprocessing`), the code that spawns child processes must be protected inside an `if __name__ == '__main__':` block. This is crucial for Windows compatibility and to prevent infinite recursion when child processes import the main script.
fix Always wrap your process-spawning logic within `if __name__ == '__main__':`.
gotcha For 'frozen executables' (e.g., created with PyInstaller, cx_Freeze), `multiprocess` requires explicit early import of `multiprocess.forking` to set up proper process spawning mechanisms, otherwise processes may fail to start or behave unexpectedly.
fix Include `import multiprocess.forking` at the very beginning of your main script when creating frozen executables.
gotcha While `multiprocess` aims to be a 'drop-in replacement' and offers 'better serialization' compared to the standard `multiprocessing` module, users migrating or relying on very specific `pickle` behaviors might encounter subtle differences. `multiprocess` uses `dill` which has broader serialization capabilities, but this can also lead to different behavior for edge cases or non-standard objects.
fix Thoroughly test your application after switching from `multiprocessing` to `multiprocess`, especially if you pass complex or custom objects between processes. Review `dill`'s documentation for its serialization specifics.
python  os / libc      status  wheel install  import  disk
3.9     alpine (musl)  -       -              0.21s   19.2M
3.9     slim (glibc)   -       -              0.22s   20M
3.10    alpine (musl)  -       -              0.24s   19.7M
3.10    slim (glibc)   -       -              0.18s   20M
3.11    alpine (musl)  -       -              0.37s   22.3M
3.11    slim (glibc)   -       -              0.28s   23M
3.12    alpine (musl)  -       -              0.27s   14.1M
3.12    slim (glibc)   -       -              0.28s   15M
3.13    alpine (musl)  -       -              0.26s   13.8M
3.13    slim (glibc)   -       -              0.35s   14M

This quickstart demonstrates how to create and manage processes using `multiprocess.Process`. Each worker runs in its own process, executing `worker_function` with a given name. The `if __name__ == "__main__":` block is essential for proper process spawning on certain operating systems (especially Windows) and to prevent recursive imports.

import multiprocess
import os
import time

def worker_function(name):
    """A function to be run in a separate process."""
    print(f"Process {os.getpid()}: Hello, {name}!")
    time.sleep(0.5)
    print(f"Process {os.getpid()}: Goodbye, {name}!")

if __name__ == "__main__":
    print(f"Main process: {os.getpid()}")
    processes = []
    names = ["Alice", "Bob", "Charlie"]

    for name in names:
        p = multiprocess.Process(target=worker_function, args=(name,))
        processes.append(p)
        p.start()

    for p in processes:
        p.join()

    print("Main process: All workers finished.")