jobflow

0.3.1 · active · verified Thu Apr 16

jobflow is a free, open-source Python library (v0.3.1) for writing and executing computational workflows. It enables defining complex workflows using simple Python functions and executing them locally or on remote resources via managers like `jobflow-remote` or `FireWorks`. Key features include dynamic workflows, easy composition and nesting of workflows, and the ability to store workflow outputs across various databases (MongoDB, S3, GridFS, etc.) through the `Maggma` package. The library is actively maintained with regular updates.

Common errors

Warnings

Install

Imports

Quickstart

This quickstart defines two job functions using the `@job` decorator, creates three `Job` objects from them, and chains the jobs in a `Flow`, where the output of one job serves as the input for the next. The `run_locally` function executes the entire flow in your local environment. For persistent storage and more advanced execution, a `JobStore` (e.g., MongoDB, S3) should be configured, typically via a `~/.jobflow.yaml` file.

from jobflow import job, Flow
from jobflow.managers.local import run_locally

@job
def add(a, b):
    return a + b

@job
def multiply(a, b):
    return a * b

# Create Job objects
job1 = add(1, 2)
job2 = multiply(job1.output, 3)
job3 = add(job2.output, 10)

# Create a Flow from the jobs
flow = Flow([job1, job2, job3], name="my_first_flow")

# Run the Flow locally
# For persistent storage, configure a JobStore in ~/.jobflow.yaml
# (e.g. a MongoDB-backed docs_store; see the jobflow configuration docs)
responses = run_locally(flow)

# Access results: run_locally returns {uuid: {index: Response}}
final_result = responses[job3.uuid][1].output
print(f"Final result: {final_result}")  # (1 + 2) * 3 + 10 = 19
