{"id":987,"library":"google-cloud-dataplex","title":"Google Cloud Dataplex","description":"Google Cloud Dataplex is a unified data governance platform that provides an intelligent data fabric to centrally manage, monitor, and govern data across data lakes, data warehouses, and data marts. It enables consistent controls, trusted data access, and powers analytics at scale. The Python client library is actively maintained with frequent releases.","status":"active","version":"2.17.0","language":"python","source_language":"en","source_url":"https://github.com/googleapis/google-cloud-python/tree/main/packages/google-cloud-dataplex","tags":["google-cloud","dataplex","data governance","data lake","data catalog"],"install":[{"cmd":"pip install google-cloud-dataplex","lang":"bash","label":"Install stable version"}],"dependencies":[],"imports":[{"note":"The v1 suffix indicates the API version. Always import the versioned client.","symbol":"DataplexServiceClient","correct":"from google.cloud import dataplex_v1"}],"quickstart":{"code":"import os\nfrom google.cloud import dataplex_v1\n\ndef list_lakes(project_id: str, location: str):\n    \"\"\"Lists Dataplex lakes in a given project and location.\"\"\"\n    try:\n        client = dataplex_v1.DataplexServiceClient()\n        parent = f\"projects/{project_id}/locations/{location}\"\n\n        print(f\"Listing lakes in {parent}:\")\n        # API calls often return an iterable (pager) for list methods\n        for lake in client.list_lakes(parent=parent):\n            print(f\"- {lake.name} (State: {lake.state.name})\")\n        print(\"Lakes listed successfully.\")\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n        print(\"Ensure 'gcloud auth application-default login' has been run or GOOGLE_APPLICATION_CREDENTIALS is set.\")\n        print(\"Also, verify that the Dataplex API is enabled for your project and the service account has necessary permissions.\")\n\nif 
__name__ == \"__main__\":\n    PROJECT_ID = os.environ.get(\"GOOGLE_CLOUD_PROJECT\", \"your-gcp-project-id\")\n    LOCATION = \"us-central1\" # Or your desired region, e.g., \"global\" for some resources\n\n    if PROJECT_ID == \"your-gcp-project-id\":\n        print(\"Please set the 'GOOGLE_CLOUD_PROJECT' environment variable or replace 'your-gcp-project-id' with your actual GCP project ID.\")\n    else:\n        list_lakes(PROJECT_ID, LOCATION)","lang":"python","description":"This quickstart demonstrates how to instantiate the Dataplex client and list existing lakes within a specified Google Cloud project and location. Ensure your Google Cloud project ID and an appropriate location are set."},"warnings":[{"fix":"Review release notes and documentation for specific metadata changes and update your code to reflect the new structure or consistency with source systems.","message":"Some metadata stored in Dataplex Universal Catalog changed on January 12, 2026, to align with original source systems (e.g., Vertex AI, Bigtable, Spanner). Workloads that depend on the specific structure or content of this metadata will need to be adjusted to preserve continuity.","severity":"breaking","affected_versions":"All versions consuming Dataplex Catalog metadata after 2026-01-12"},{"fix":"Ensure that the location of your Dataplex zones and the underlying data assets (GCS buckets, BigQuery datasets) are compatible and correctly aligned according to Dataplex's strict location hierarchy rules.","message":"Dataplex enforces strict location constraints for resources. Zones (regional or multi-regional) and their associated assets (e.g., GCS buckets, BigQuery datasets) must strictly match the zone's location type. 
Attempting to add an asset that violates these constraints (e.g., an 'EU' multi-region BigQuery dataset to a 'europe-west1' regional zone) will result in asset attachment failures.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Migrate any existing Dataplex Explore workloads or functionalities to BigQuery Studio as per the official migration instructions.","message":"Dataplex Explore was deprecated on July 22, 2024. Functionality provided by Dataplex Explore is now expected to be handled by BigQuery Studio.","severity":"deprecated","affected_versions":"All versions after 2024-07-22"},{"fix":"To retrieve the full Aspect values, ensure you set the `view` parameter (e.g., `EntryView.FULL` or `EntryView.ALL`) when calling methods like `CatalogServiceClient.get_entry` or `CatalogServiceClient.list_entries`.","message":"When programmatically querying Dataplex Catalog Entries using the Python client, you might only retrieve custom Aspect *names* but not their corresponding *values* by default.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Upgrade your Python environment to version 3.10 or higher, and then ensure `google-cloud-dataplex` and its dependencies (e.g., `google-api-core`, `google-auth`) are updated to their latest compatible versions.","message":"Running `google-cloud-dataplex` on Python 3.9 or older versions will trigger `FutureWarning` messages because these Python versions are unsupported by the library and its core dependencies. 
Google will not post any further updates for these older Python versions, potentially leading to critical bug fixes or features being missed.","severity":"gotcha","affected_versions":"All versions of `google-cloud-dataplex` when used with Python 3.9 or older."},{"fix":"Ensure the 'GOOGLE_CLOUD_PROJECT' environment variable is set for code that reads it, and always pass a fully qualified resource name (e.g., `parent = f'projects/{project_id}/locations/{location}'`) to API calls; note that the `DataplexServiceClient` constructor does not accept a `project` argument.","message":"The Dataplex client library requires a Google Cloud project ID for most operations. The project ID is supplied as part of the resource name (e.g., the `parent` path) in each request rather than in the client configuration; omitting it or using the wrong project ID will cause API calls to fail.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-05-12T22:15:24.136Z","next_check":"2026-06-27T00:00:00.000Z","problems":[{"fix":"Grant the appropriate IAM roles (e.g., `roles/dataplex.admin`, `roles/dataplex.viewer`, `roles/dataplex.editor`, or specific granular permissions like `dataplex.metadataFeeds.create`, `bigquery.dataOwner`) to the service account or user on the relevant Google Cloud project, lake, zone, or resource via the Google Cloud Console or `gcloud` CLI.","cause":"The service account or user account performing the operation lacks the necessary IAM permissions for the requested Dataplex resource or action.","error":"Permission denied"},{"fix":"Enable the required API(s) through the Google Cloud Console by navigating to 'APIs & Services' > 'Library', searching for the specific API (e.g., 'Dataplex API'), and clicking 'Enable'. 
Allow a few minutes for the API to fully activate before retrying.","cause":"The Google Cloud Dataplex API, or a related required API such as the Data Lineage API, has not been enabled in your Google Cloud project.","error":"API not enabled or Service unavailable"},{"fix":"Either manually delete all nested resources (zones, assets) within the lake before attempting to delete the lake, or, if using the Python client library, set the `force=True` parameter in the `DeleteLakeRequest` to perform a cascading deletion.","cause":"You are attempting to delete a Dataplex lake that still contains dependent nested resources (such as zones or assets) without specifying a cascading delete operation.","error":"Resource 'projects/<project_id>/locations/<location>/lakes/<lake_name>' has nested resources. If the API supports cascading delete, set 'force' to true to delete it and its nested resources."},{"fix":"Ensure the `DataplexServiceClient` is initialized with the correct region where your Dataplex resources are located. You can specify the endpoint by setting `client_options={'api_endpoint': 'dataplex.<REGION>.googleapis.com'}` during client initialization. 
Also, verify the project ID is correct and the service account has permissions in that specific region.","cause":"This error typically indicates that the `DataplexServiceClient` is attempting to access Dataplex resources in an incorrect, unsupported, or misconfigured Google Cloud region.","error":"google.api_core.exceptions.MethodNotImplemented: 501 Received http2 header with status: 404"}],"ecosystem":"pypi","meta_description":null,"install_score":95,"install_tag":"verified","quickstart_score":null,"quickstart_tag":null,"pypi_latest":"2.19.0","cli_name":"","install_checks":{"last_tested":"2026-05-12","tag":"verified","tag_description":"installs cleanly on critical runtimes, fast import, recently tested","results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":null,"import_time_s":3.23,"mem_mb":46.2,"disk_size":"78.5M"},{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.97,"mem_mb":45.7,"disk_size":"77.4M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":6.3,"import_time_s":1.29,"mem_mb":31.9,"disk_size":"76M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":1.37,"mem_mb":31.5,"disk_size":"75M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":null,"import_time_s":3.69,"mem_mb":48.3,"disk_size":"84.7M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine 
(musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":4.01,"mem_mb":47.9,"disk_size":"83.6M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":5.6,"import_time_s":2.26,"mem_mb":35.3,"disk_size":"82M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2,"mem_mb":35,"disk_size":"81M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":null,"import_time_s":3.58,"mem_mb":47.8,"disk_size":"76.0M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":3.68,"mem_mb":47.4,"disk_size":"74.9M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":4.7,"import_time_s":2.32,"mem_mb":35,"disk_size":"74M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.5,"mem_mb":34.7,"disk_size":"73M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":null,"import_time_s":3.17,"mem_mb":48.3,"disk_size":"75.5M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine 
(musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":3.5,"mem_mb":47.9,"disk_size":"74.2M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":4.8,"import_time_s":2.25,"mem_mb":35.3,"disk_size":"73M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.41,"mem_mb":34.9,"disk_size":"72M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":null,"import_time_s":2.82,"mem_mb":46.5,"disk_size":"78.8M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":2.98,"mem_mb":46.2,"disk_size":"77.7M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"default","exit_code":0,"wheel_type":"wheel","failure_reason":null,"install_time_s":7.3,"import_time_s":1.59,"mem_mb":32.2,"disk_size":"77M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim 
(glibc)","variant":"default","exit_code":0,"wheel_type":null,"failure_reason":null,"install_time_s":null,"import_time_s":1.41,"mem_mb":32,"disk_size":"75M"}]},"quickstart_checks":{"last_tested":"2026-04-24","tag":null,"tag_description":null,"results":[{"runtime":"python:3.10-alpine","exit_code":0},{"runtime":"python:3.10-slim","exit_code":0},{"runtime":"python:3.11-alpine","exit_code":0},{"runtime":"python:3.11-slim","exit_code":0},{"runtime":"python:3.12-alpine","exit_code":0},{"runtime":"python:3.12-slim","exit_code":0},{"runtime":"python:3.13-alpine","exit_code":0},{"runtime":"python:3.13-slim","exit_code":0},{"runtime":"python:3.9-alpine","exit_code":0},{"runtime":"python:3.9-slim","exit_code":0}]}}
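Several of the gotchas and problems in this record (the missing-project-ID gotcha and the nested-resources delete error) hinge on Dataplex's fully-qualified resource names. Below is a minimal sketch of that pattern: pure string formatting with no API call, where `lake_name` is a hypothetical helper introduced here for illustration; the cascading-delete call from the "nested resources" fix is shown only as a comment because it requires credentials and an enabled Dataplex API.

```python
def lake_name(project_id: str, location: str, lake_id: str) -> str:
    """Build the fully-qualified lake resource name used by Dataplex.

    The project ID travels inside the resource name passed to each API
    call; the DataplexServiceClient constructor does not take a project
    argument.
    """
    return f"projects/{project_id}/locations/{location}/lakes/{lake_id}"

name = lake_name("my-project", "us-central1", "my-lake")
print(name)  # projects/my-project/locations/us-central1/lakes/my-lake

# With a real client (requires credentials and the Dataplex API enabled),
# a cascading delete of a lake and its nested zones/assets would be:
#   from google.cloud import dataplex_v1
#   client = dataplex_v1.DataplexServiceClient()
#   client.delete_lake(request={"name": name, "force": True})
```

Without `force`, deleting a lake that still contains zones or assets fails with the "has nested resources" error listed above.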