{"id":5172,"library":"databricks-bundles","title":"Databricks Bundles (Declarative Automation Bundles)","description":"Databricks Bundles, recently renamed to Declarative Automation Bundles, provides Python support for defining, dynamically creating, and modifying Databricks jobs and pipelines. It extends the core bundle functionality, allowing users to apply software engineering best practices such as source control, code review, testing, and CI/CD to their data and AI projects. The library is currently at version 0.296.0 and is actively maintained, with a focus on streamlining deployments and enabling programmatic configuration through Python and YAML files, orchestrated via the Databricks CLI.","status":"active","version":"0.296.0","language":"en","source_language":"en","source_url":"https://github.com/databricks/cli","tags":["databricks","mlops","ci-cd","infrastructure-as-code","data-engineering","python","yaml"],"install":[{"cmd":"pip install databricks-bundles","lang":"bash","label":"Install Python package (for Python-defined bundles)"},{"cmd":"curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh\ndatabricks -v # Verify installation\ndatabricks auth login --host https://<your-workspace-url>","lang":"bash","label":"Install and Configure Databricks CLI (required)"}],"dependencies":[{"reason":"Required to initialize, validate, deploy, and run bundles. 
Version 0.296.0 or above is recommended.","package":"Databricks CLI","optional":false},{"reason":"Used by the Python support for Declarative Automation Bundles to create virtual environments and install dependencies (venv can be configured as an alternative).","package":"uv","optional":true}],"imports":[{"note":"User code defines resources (e.g., jobs, pipelines) in Python files that the CLI then interprets, rather than importing and calling classes/functions from `databricks-bundles` directly within a running Python application.","symbol":"Not applicable for direct application-level imports","correct":"The 'databricks-bundles' Python package is used by the Databricks CLI internally when processing Python-defined bundle resources; end-user application code does not typically contain direct 'from pkg import ClassName' statements."}],"quickstart":{"code":"# 1. Initialize a new bundle project from the jobs-as-code template (Python support)\ndatabricks bundle init --template experimental-jobs-as-code\n\n# 2. Navigate into the new project directory\ncd <your-bundle-project-name>\n\n# 3. Create a Python file for a job task (e.g., src/my_job.py)\n#    Content for src/my_job.py:\n#    print(\"Hello from my Databricks Bundle!\")\n\n# 4. 
Define a simple job in databricks.yml (or a Python resource definition if using Python bundles)\n#    Example databricks.yml snippet defining a job running my_job.py (ensure the 'dev' target matches your workspace)\n#    bundle:\n#      name: my-first-bundle\n#    resources:\n#      jobs:\n#        my_example_job:\n#          name: MyExampleJob\n#          tasks:\n#            - task_key: run_script\n#              spark_python_task:\n#                python_file: src/my_job.py\n#              new_cluster:\n#                spark_version: 13.3.x-scala2.12\n#                node_type_id: Standard_DS3_v2\n#                num_workers: 1\n#    targets:\n#      dev:\n#        default: true\n#        workspace:\n#          host: https://<your-workspace-url>\n\n# 5. Validate the bundle configuration\ndatabricks bundle validate\n\n# 6. Deploy the bundle to your Databricks workspace\ndatabricks bundle deploy --target dev\n\n# 7. Run the deployed job\ndatabricks bundle run my_example_job --target dev","lang":"bash","description":"To get started with Databricks Bundles, first ensure the Databricks CLI is installed and authenticated. Then, initialize a new bundle project using a template, typically `experimental-jobs-as-code` for Python support. Define your Databricks resources (like jobs or pipelines) in `databricks.yml` or dedicated Python files within the bundle structure. Finally, use the CLI commands `databricks bundle validate`, `databricks bundle deploy`, and `databricks bundle run` to manage and execute your project on Databricks."},"warnings":[{"fix":"No breaking changes to existing CLI commands or configurations are expected, but be aware of the new terminology in documentation and communications.","message":"The product name changed from 'Databricks Asset Bundles' to 'Declarative Automation Bundles'. 
While the `bundle` CLI command remains the same, the rename reflects a conceptual shift rather than a change in behavior.","severity":"gotcha","affected_versions":"All versions (name change occurred March 2026)"},{"fix":"Always make changes in your local bundle project files (Python or YAML) and then `databricks bundle deploy`. For debugging, copy the notebook to a scratch folder outside the bundle path or debug locally.","message":"Directly editing deployed notebooks or jobs in the Databricks workspace UI can lead to configuration drift and unexpected behavior during subsequent bundle deployments. The local bundle repository is considered the source of truth.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Explicitly grant all required permissions to the deploying identity (Service Principal for CI/CD, human user for local development). Ensure Unity Catalog objects (catalogs, schemas) exist and have appropriate access controls.","message":"Permission denied errors (e.g., `CAN MANAGE`, `USE CATALOG`) are common if the service principal or user deploying the bundle lacks the necessary permissions on jobs, Unity Catalog, or other resources.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Update bundle configurations to remove explicit `/Workspace` prefixes. Paths should start with `${workspace.root_path}/...` or relative paths within the bundle structure. Validate your bundle to catch warnings.","message":"Workspace paths in bundle configurations are now automatically prefixed with `/Workspace` (Databricks CLI 0.230.0+). 
Using path strings like `/Workspace/${workspace.root_path}/...` will generate a warning, and the redundant prefix will be replaced.","severity":"breaking","affected_versions":"Databricks CLI >= 0.230.0"},{"fix":"Review and update bundle configurations that rely on implicit fallback paths, ensuring explicit and unambiguous path definitions for all resources and overrides.","message":"The fallback path resolution behavior for resources defined in one file and overridden in another was removed in Databricks CLI 0.266.0. The old behavior made path resolution confusing and error-prone, and configurations that still rely on it may break.","severity":"breaking","affected_versions":"Databricks CLI >= 0.266.0"},{"fix":"Clearly define your target environments in `databricks.yml`, mapping them explicitly to your desired Databricks workspaces and configuration settings. Avoid assumptions about the 'dev' target's behavior.","message":"The default 'dev' target created by `databricks bundle init` might not automatically align with your development Git branch or expected development environment; it requires careful manual configuration.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-13T00:00:00.000Z","next_check":"2026-07-12T00:00:00.000Z"}