{"id":474,"library":"databricks-sdk","title":"Databricks SDK for Python","description":"The Databricks SDK for Python is the official client library for interacting with the Databricks Lakehouse platform. It provides a comprehensive interface to all public Databricks REST API operations, with robust automatic retries built in, and is designed to accelerate development with Python. As of April 2026, it is actively maintained with frequent releases; it is currently in Beta but supported for production use cases.","status":"active","version":"0.126.0","language":"python","source_language":"en","source_url":"https://github.com/databricks/databricks-sdk-py","tags":["databricks","sdk","cloud","api","data-lakehouse","platform-tools"],"install":[{"cmd":"pip install databricks-sdk","lang":"bash","label":"Install latest version"},{"cmd":"%pip install databricks-sdk","lang":"python","label":"Install in Databricks Notebook"}],"dependencies":[{"reason":"Required for Google Cloud authentication support.","package":"google-auth","optional":false},{"reason":"Only needed for optional LangChain/OpenAI integrations; not a core runtime dependency of the SDK.","package":"langchain-openai","optional":true},{"reason":"Only needed by optional integrations that use Protocol Buffers; the SDK's core HTTP client does not require it.","package":"protobuf","optional":true},{"reason":"HTTP client for API communication.","package":"requests","optional":false},{"reason":"Required for efficient Arrow-based data exchange with the Databricks SQL Connector.","package":"pyarrow","optional":true}],"imports":[{"note":"The primary client for interacting with Databricks APIs.","symbol":"WorkspaceClient","correct":"from databricks.sdk import WorkspaceClient"}],"quickstart":{"code":"import os\nfrom databricks.sdk import WorkspaceClient\n\n# The Databricks SDK automatically handles authentication via environment variables\n# (DATABRICKS_HOST, DATABRICKS_TOKEN) or a .databrickscfg file.\n# Ensure these are set in your environment or Databricks workspace.\n# For local development, you might set:\n# 
os.environ['DATABRICKS_HOST'] = 'https://<your-databricks-instance>'\n# os.environ['DATABRICKS_TOKEN'] = os.environ.get('DATABRICKS_TOKEN', 'YOUR_DATABRICKS_PAT')\n\nw = WorkspaceClient()\n\nprint(\"Listing clusters:\")\nfor c in w.clusters.list():\n    print(f\"  - {c.cluster_name} (ID: {c.cluster_id})\")\n","lang":"python","description":"This quickstart initializes the Databricks SDK's WorkspaceClient and lists available clusters. It assumes Databricks authentication is configured, typically via environment variables (DATABRICKS_HOST, DATABRICKS_TOKEN) or a .databrickscfg file."},"warnings":[{"fix":"Uninstall `databricks-api` and install `databricks-sdk`: `pip uninstall databricks-api && pip install databricks-sdk`. Update imports from `databricks_api` to `databricks.sdk`.","message":"The older `databricks-api` package is deprecated. Users should switch to the official `databricks-sdk` for all new and existing projects to leverage modern features and continued support.","severity":"deprecated","affected_versions":"<=0.9.0 of databricks-api"},{"fix":"Always consult the official GitHub CHANGELOG or PyPI release history before upgrading to understand API changes and necessary code modifications. Pinning to a specific minor version (`pip install databricks-sdk==X.Y.Z`) is recommended for production environments.","message":"The `databricks-sdk` is currently in Beta, meaning its interface is subject to change across versions. Numerous breaking changes have been documented in release changelogs.","severity":"breaking","affected_versions":"All Beta versions (0.x.x)"},{"fix":"Explicitly define library versions in your `requirements.txt` or Databricks job configurations. Regularly test upgrades in a staging environment.","message":"When deploying Python libraries as part of Databricks jobs, it is highly recommended to specify the exact library version (e.g., `databricks-sdk==0.126.0`). 
Failing to do so can lead to non-reproducible environments or unexpected breaking changes if a new library version is released between job runs.","severity":"gotcha","affected_versions":"All versions, especially with Databricks Runtime 15.1 and above"},{"fix":"Verify network access, ensure adequate cluster resources, confirm library compatibility with your Databricks Runtime. For private PyPI, use init scripts to configure `pip config` with `extra-index-url` and secure credentials via Databricks Secrets. For persistent issues, clear pip cache using `--no-cache-dir`.","message":"Common issues with installing PyPI libraries on Databricks clusters include network connectivity problems, insufficient cluster resources (disk, memory, CPU), incompatibility with the Databricks Runtime version, issues with private PyPI repository configurations (e.g., authentication, index URLs), or pip caching conflicts.","severity":"gotcha","affected_versions":"All versions, general Databricks environment"},{"fix":"Upload all libraries to workspace files, Unity Catalog volumes, or use supported library package repositories (PyPI, Maven, CRAN).","message":"Storing library files in the DBFS root is deprecated and disabled by default in Databricks Runtime 15.1 and above due to security concerns.","severity":"deprecated","affected_versions":"Databricks Runtime 15.1 and above"}],"env_vars":null,"last_verified":"2026-05-12T14:06:14.253Z","next_check":"2026-07-11T00:00:00.000Z","problems":[{"fix":"Verify that your Databricks host URL is correctly set and that the authentication method (e.g., personal access token, Azure AD credentials, environment variables, or `.databrickscfg` profile) has the required permissions for the API operation. 
Ensure no firewalls or private link configurations are redirecting API traffic to login pages.","cause":"This error typically indicates an issue with the Databricks SDK for Python's authentication configuration: the SDK received a response (often an HTML login page rather than JSON) that it cannot parse as a Databricks API response.","error":"Error: Unable to parse response"},{"fix":"Install the Databricks SDK for Python using pip: `pip install databricks-sdk`. If running in a Databricks notebook, ensure the library is attached to the cluster or upgrade to a Databricks Runtime version where it's pre-installed, then use `%pip install --upgrade databricks-sdk` and `dbutils.library.restartPython()` if necessary.","cause":"This error occurs when the `databricks-sdk` package is not installed in the Python environment where the code is being executed, or the environment is not correctly configured to find the installed package.","error":"ModuleNotFoundError: No module named 'databricks.sdk'"},{"fix":"Upgrade the `databricks-sdk` to the latest version using `pip install --upgrade databricks-sdk`, or downgrade to a specific compatible version if your code relies on an older API. Consult the `databricks-sdk` changelog or documentation for breaking changes related to the imported object.","cause":"This error usually indicates a version mismatch: the object or function you are importing has been renamed, moved, or removed in the installed `databricks-sdk` version relative to the version your code was written against.","error":"ImportError: cannot import name '...' from 'databricks.sdk.core'"},{"fix":"Instead of passing a dictionary, instantiate the appropriate Databricks SDK data class (e.g., `clusters.CreateCluster`, `statement_execution.Statement`, or relevant service-specific request objects) and populate it with your data. 
Convert your dictionary to the expected SDK object type.","cause":"This error occurs when you pass a plain Python dictionary to a Databricks SDK method that expects a specific SDK data class; during request serialization the SDK calls `as_dict()` on the argument, and plain dictionaries do not define that method.","error":"AttributeError: 'dict' object has no attribute 'as_dict'"},{"fix":"Verify that the identity used for authentication has the correct and sufficient permissions (e.g., `CAN USE` on clusters, `EXECUTE` on models, or appropriate access on files/paths) configured in Databricks. For service principals, ensure Git credentials and job permissions are also correctly set.","cause":"This `PermissionDenied` error indicates that the authenticated user or service principal lacks the necessary access rights (e.g., 'CAN USE', 'CAN MANAGE', 'EXECUTE') to perform the requested operation on the specified resource or path within Databricks.","error":"databricks.sdk.errors.mapping.PermissionDenied: No operations allowed on this path..."}],"ecosystem":"pypi","meta_description":null,"install_score":95,"install_tag":"verified","quickstart_score":0,"quickstart_tag":"stale","pypi_latest":null,"install_checks":{"last_tested":"2026-05-12","tag":"verified","tag_description":"installs cleanly on critical runtimes, fast import, recently tested","results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":3.3,"mem_mb":40,"disk_size":"52.8M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.27,"mem_mb":39,"disk_size":"54M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine 
(musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":6.25,"mem_mb":45.6,"disk_size":"58.5M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":5.05,"mem_mb":44.7,"disk_size":"59M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":5.22,"mem_mb":46,"disk_size":"49.8M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":5.05,"mem_mb":45.1,"disk_size":"51M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":4.97,"mem_mb":47.7,"disk_size":"49.5M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":4.61,"mem_mb":46.8,"disk_size":"50M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.96,"mem_mb":40.6,"disk_size":"53.1M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.55,"mem_mb":39.6,"disk_size":"54M"}]},"quickstart_checks":{"last_tested":"2026-04-23","tag":"stale","tag_description":"widespread failures or data too old to 
trust","results":[{"runtime":"python:3.10-alpine","exit_code":1},{"runtime":"python:3.10-slim","exit_code":1},{"runtime":"python:3.11-alpine","exit_code":1},{"runtime":"python:3.11-slim","exit_code":1},{"runtime":"python:3.12-alpine","exit_code":1},{"runtime":"python:3.12-slim","exit_code":1},{"runtime":"python:3.13-alpine","exit_code":1},{"runtime":"python:3.13-slim","exit_code":1},{"runtime":"python:3.9-alpine","exit_code":1},{"runtime":"python:3.9-slim","exit_code":1}]}}