{"id":761,"library":"google-cloud-bigquery-storage","title":"Google Cloud BigQuery Storage Client Library","description":"The Google Cloud BigQuery Storage API client library enables high-throughput data transfer from BigQuery tables. It leverages a binary serialization format (Apache Arrow or Avro) for efficient data transfer and is ideal for analytical workloads requiring large-scale data extraction. The library is currently at version 2.36.2 and is released as part of the `google-cloud-python` monorepo, following a frequent release cadence.","status":"active","version":"2.36.2","language":"python","source_language":"en","source_url":"https://github.com/googleapis/google-cloud-python/tree/main/packages/google-cloud-bigquery-storage","tags":["google-cloud","bigquery","data-streaming","etl","data-warehouse","cloud-storage"],"install":[{"cmd":"pip install google-cloud-bigquery-storage","lang":"bash","label":"Install stable version"},{"cmd":"pip install google-cloud-bigquery-storage[fastavro,pandas,pyarrow]","lang":"bash","label":"Install with common data processing dependencies"}],"dependencies":[{"reason":"Required for `to_arrow()` method to read data in Apache Arrow format, which is highly efficient.","package":"pyarrow","optional":true},{"reason":"Required for `to_dataframe()` method to convert read data directly into a Pandas DataFrame. Requires `pyarrow` or `fastavro` as well.","package":"pandas","optional":true},{"reason":"Required for the `rows()` method to decode Avro records into Python dictionaries when reading in AVRO format. Useful when `pandas` or `pyarrow` are not installed.","package":"fastavro","optional":true},{"reason":"Often used alongside this library to manage BigQuery tables, run standard SQL queries, or perform operations not related to high-throughput data transfer. 
Not a direct dependency for `bigquery-storage` itself.","package":"google-cloud-bigquery","optional":true}],"imports":[{"note":"The canonical import path changed in version 2.0.0. The `_v1` suffix is no longer recommended.","wrong":"from google.cloud.bigquery_storage_v1 import BigQueryReadClient","symbol":"BigQueryReadClient","correct":"from google.cloud.bigquery_storage import BigQueryReadClient"},{"note":"The canonical import path changed in version 2.0.0. The `_v1` suffix is no longer recommended.","wrong":"from google.cloud.bigquery_storage_v1 import BigQueryWriteClient","symbol":"BigQueryWriteClient","correct":"from google.cloud.bigquery_storage import BigQueryWriteClient"},{"note":"Enum types are now accessed via the `types` module (e.g., `types.DataFormat.ARROW`). Direct import from `enums` or via the client (`BigQueryReadClient.enums`) is deprecated since v2.0.0.","wrong":"from google.cloud.bigquery_storage_v1 import enums","symbol":"types","correct":"from google.cloud.bigquery_storage import types"}],"quickstart":{"code":"import os\nfrom google.cloud.bigquery_storage import BigQueryReadClient, types\n\n# Your Google Cloud project ID. 
Read from the GOOGLE_CLOUD_PROJECT\n# environment variable; the script fails fast if it is missing.\nproject_id = os.environ.get('GOOGLE_CLOUD_PROJECT', '')\nif not project_id:\n    raise ValueError(\"GOOGLE_CLOUD_PROJECT environment variable must be set\")\n\n# Public BigQuery dataset and table to read from\ndataset_id = \"google_trends\"\ntable_id = \"international_top_rising_terms\"\ntable = f\"projects/bigquery-public-data/datasets/{dataset_id}/tables/{table_id}\"\n\ndef read_bigquery_table_storage(project_id, table):\n    \"\"\"Reads data from a BigQuery table using the BigQuery Storage Read API.\"\"\"\n    client = BigQueryReadClient()\n\n    # The read session is created under (and billed to) your own project\n    parent = f\"projects/{project_id}\"\n\n    # Specify the table and desired data format (Arrow recommended for performance)\n    read_options = types.ReadSession.TableReadOptions(selected_fields=[\"country_name\", \"region_name\"])\n    requested_session = types.ReadSession(\n        table=table,\n        data_format=types.DataFormat.ARROW,  # Or types.DataFormat.AVRO\n        read_options=read_options,\n    )\n\n    # Create a read session\n    read_session = client.create_read_session(\n        parent=parent,\n        read_session=requested_session,\n        max_stream_count=1,  # Increase for parallelism if needed\n    )\n\n    print(f\"Read session created: {read_session.name}\")\n\n    # Read from the first stream (assuming max_stream_count=1)\n    stream_name = read_session.streams[0].name\n    reader = client.read_rows(stream_name)\n\n    # Convert to a Pandas DataFrame (requires pandas plus pyarrow or fastavro)\n    try:\n        dataframe = reader.to_dataframe()\n        print(\"Successfully read data into Pandas DataFrame.\")\n        print(dataframe.head())\n        return dataframe\n    except ImportError as e:\n        print(f\"Could not convert to DataFrame: {e}. Iterating rows instead.\")\n        for row_message in reader.rows():\n            # Each row is a dict-like mapping of column name to value; decoding\n            # requires pyarrow (ARROW format) or fastavro (AVRO format)\n            print(row_message)\n            break  # Print first row and exit\n\nif __name__ == \"__main__\":\n    # Ensure you have authenticated to GCP (e.g., `gcloud auth application-default login`)\n    # and enabled the BigQuery Storage API for your project.\n    # The example reads a public dataset, so you mainly need read access to your billing project.\n    df = read_bigquery_table_storage(project_id, table)\n","lang":"python","description":"This quickstart demonstrates how to use `BigQueryReadClient` to read data from a public BigQuery table using the Storage API. It configures a read session, reads data in Apache Arrow format, and attempts to convert it to a Pandas DataFrame. Remember to set the `GOOGLE_CLOUD_PROJECT` environment variable and ensure the BigQuery Storage API is enabled for your project."},"warnings":[{"fix":"Update import statements to `from google.cloud.bigquery_storage import ...` and access enums via `from google.cloud.bigquery_storage import types`.","message":"Major breaking changes occurred in version 2.0.0. The primary import path for clients changed from `google.cloud.bigquery_storage_v1` to `google.cloud.bigquery_storage`. Enum types moved from direct import or client access (e.g., `BigQueryReadClient.enums`) to the `types` module (e.g., `types.DataFormat.ARROW`). Existing code using the `_v1` suffix or direct enum access will fail.","severity":"breaking","affected_versions":">=2.0.0"},{"fix":"Remove these parameters from client instantiation. 
Customize retry and timeout settings directly when invoking methods (e.g., `client.create_read_session(..., timeout=60)`).","message":"The `client_config` and `channel` parameters for client constructors have been removed.","severity":"deprecated","affected_versions":">=2.0.0"},{"fix":"Use the `google-cloud-bigquery` client library for standard SQL queries, small data extractions, or metadata operations. Reserve `google-cloud-bigquery-storage` for high-volume data ingestion or extraction workflows.","message":"The BigQuery Storage API is optimized for high-throughput, large-scale data transfer, not for small, interactive queries or single-row lookups. Using it for small datasets may introduce unnecessary overhead compared to the standard BigQuery client library.","severity":"gotcha","affected_versions":"All"},{"fix":"Install `google-cloud-bigquery-storage` with appropriate extras (e.g., `pip install google-cloud-bigquery-storage[fastavro,pandas,pyarrow]`) and use methods like `reader.to_dataframe()` or `reader.rows()`.","message":"Data read from the Storage API is returned in a binary format (Protobuf or Apache Arrow). To easily work with this data in Python, you typically need to convert it. 
This often requires additional dependencies like `pandas` and `pyarrow` (for `to_dataframe()`) or `fastavro` (for `rows()` to get dicts from AVRO).","severity":"gotcha","affected_versions":"All"},{"fix":"For optimal parallel performance when reading multiple streams, consider using Python's `multiprocessing` module or an asynchronous framework if your I/O operations are truly non-blocking across network requests.","message":"While `BigQueryReadClient.create_read_session` allows specifying `max_stream_count` for parallelism, achieving true concurrent data processing in Python often requires using the `multiprocessing` module rather than simple threading due to Python's Global Interpreter Lock (GIL).","severity":"gotcha","affected_versions":"All"},{"fix":"Upgrade your Python environment to 3.9 or higher.","message":"The `google-cloud-bigquery` client library, which often complements `google-cloud-bigquery-storage`, has ended support for Python 3.7 and 3.8. Although `google-cloud-bigquery-storage` officially supports Python >=3.7, it is highly recommended to upgrade to Python 3.9+ to maintain compatibility with the broader Google Cloud client ecosystem and ensure ongoing support.","severity":"gotcha","affected_versions":"All (especially for Python 3.7, 3.8 users)"},{"fix":"Ensure the `GOOGLE_CLOUD_PROJECT` environment variable is set to your Google Cloud project ID (e.g., `export GOOGLE_CLOUD_PROJECT='your-project-id'`) before running your application. Alternatively, explicitly pass project credentials to the client constructor, or ensure `GOOGLE_APPLICATION_CREDENTIALS` points to a valid service account key file.","message":"Most Google Cloud client libraries, including `google-cloud-bigquery-storage`, require proper authentication. In many development environments, this is achieved by setting the `GOOGLE_CLOUD_PROJECT` environment variable to specify the project ID, or by providing explicit credentials (e.g., via `GOOGLE_APPLICATION_CREDENTIALS`). 
Failure to do so will result in authentication errors or errors like 'GOOGLE_CLOUD_PROJECT environment variable must be set'.","severity":"gotcha","affected_versions":"All"},{"fix":"Ensure the `GOOGLE_CLOUD_PROJECT` environment variable is set to your Google Cloud project ID, or explicitly pass the `project` argument to the client constructor (e.g., `BigQueryReadClient(project='your-project-id')`).","message":"Google Cloud client libraries often require a project ID for operations. If not explicitly provided during client instantiation or via default credentials (e.g., service account JSON), the library typically looks for the `GOOGLE_CLOUD_PROJECT` environment variable. Failure to set this will result in a `ValueError`.","severity":"gotcha","affected_versions":"All"}],"env_vars":null,"last_verified":"2026-05-12T18:44:14.797Z","next_check":"2026-06-27T00:00:00.000Z","problems":[{"fix":"Install the library using pip: `pip install google-cloud-bigquery-storage`","cause":"The `google-cloud-bigquery-storage` library is not installed in the Python environment, or the Python interpreter cannot find it in its path.","error":"ModuleNotFoundError: No module named 'google.cloud.bigquery_storage'"},{"fix":"Update your import statements to use the current API version or the top-level namespace: `from google.cloud.bigquery_storage import BigQueryReadClient, types` or `from google.cloud import bigquery_storage`","cause":"This error typically occurs when trying to import an older, deprecated version of the BigQuery Storage API client (`v1beta1`) after upgrading the `google-cloud-bigquery-storage` library to a newer version (2.x or later), which primarily uses the `v1` or top-level `bigquery_storage` namespace.","error":"ImportError: cannot import name 'bigquery_storage_v1beta1' from 'google.cloud'"},{"fix":"Install the `pyarrow` library: `pip install pyarrow` or install with the BigQuery client extra: `pip install google-cloud-bigquery[bqstorage,pandas]`","cause":"The 
`google-cloud-bigquery-storage` library relies on `pyarrow` for efficient data serialization when working with Apache Arrow format or converting results to Pandas DataFrames, but `pyarrow` is not installed as a dependency.","error":"ValueError: The pyarrow library is not installed, please install pyarrow to use the to_arrow() function."},{"fix":"Ensure that the query or read session is expected to return data. If no data is expected, handle the case where the `ReadRowsIterable` might be empty before attempting to convert to a DataFrame. This might involve checking for data presence or wrapping the conversion in a `try-except` block. Additionally, verify `fastavro` is installed as it's required for Avro serialization, `pip install fastavro`.","cause":"This error can occur when using `reader.to_dataframe()` on a `ReadRowsIterable` object that returns no rows, or when there's an issue parsing the Avro schema, sometimes related to specific data types or filters on the table.","error":"AttributeError: 'NoneType' object has no attribute '_parse_avro_schema'"}],"ecosystem":"pypi","meta_description":null,"install_score":95,"install_tag":"verified","quickstart_score":0,"quickstart_tag":"stale","pypi_latest":"2.38.0","cli_name":"","cli_version":null,"install_checks":{"last_tested":"2026-05-12","tag":"verified","tag_description":"installs cleanly on critical runtimes, fast import, recently tested","installed_version":null,"pypi_latest":"2.38.0","is_stale":null,"results":[{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.65,"mem_mb":22.8,"disk_size":"70.2M"},{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine 
(musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.76,"mem_mb":49.3,"disk_size":"400.0M"},{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.54,"mem_mb":22.9,"disk_size":"69.0M"},{"runtime":"python:3.10-alpine","python_version":"3.10","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.65,"mem_mb":49.3,"disk_size":"393.8M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":6.5,"import_time_s":1.04,"mem_mb":20.2,"disk_size":"68M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":14.8,"import_time_s":1.98,"mem_mb":46.7,"disk_size":"369M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.02,"mem_mb":20.3,"disk_size":"67M"},{"runtime":"python:3.10-slim","python_version":"3.10","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.83,"mem_mb":46.7,"disk_size":"364M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine 
(musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.23,"mem_mb":24.7,"disk_size":"74.9M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":3.83,"mem_mb":56.2,"disk_size":"418.3M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.54,"mem_mb":24.8,"disk_size":"73.7M"},{"runtime":"python:3.11-alpine","python_version":"3.11","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":4.32,"mem_mb":57.3,"disk_size":"412.0M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":5.1,"import_time_s":1.57,"mem_mb":22.3,"disk_size":"73M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":13.9,"import_time_s":2.74,"mem_mb":54.1,"disk_size":"387M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.55,"mem_mb":22.4,"disk_size":"71M"},{"runtime":"python:3.11-slim","python_version":"3.11","os_libc":"slim 
(glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.84,"mem_mb":54.9,"disk_size":"382M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.38,"mem_mb":24,"disk_size":"66.3M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":3.51,"mem_mb":54.3,"disk_size":"403.0M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.53,"mem_mb":24.6,"disk_size":"65.2M"},{"runtime":"python:3.12-alpine","python_version":"3.12","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":4.74,"mem_mb":56.4,"disk_size":"396.7M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":4.6,"import_time_s":1.87,"mem_mb":21.6,"disk_size":"64M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":12.6,"import_time_s":3.19,"mem_mb":51.9,"disk_size":"371M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim 
(glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.28,"mem_mb":22.3,"disk_size":"63M"},{"runtime":"python:3.12-slim","python_version":"3.12","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":4.18,"mem_mb":54,"disk_size":"366M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.35,"mem_mb":25,"disk_size":"66.0M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":3.28,"mem_mb":55.3,"disk_size":"401.8M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.59,"mem_mb":25.4,"disk_size":"64.8M"},{"runtime":"python:3.13-alpine","python_version":"3.13","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":4.56,"mem_mb":57.5,"disk_size":"395.4M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":5,"import_time_s":1.78,"mem_mb":22.6,"disk_size":"64M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim 
(glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":12.9,"import_time_s":2.98,"mem_mb":52.9,"disk_size":"370M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.15,"mem_mb":22.9,"disk_size":"62M"},{"runtime":"python:3.13-slim","python_version":"3.13","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":4.22,"mem_mb":55,"disk_size":"365M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.58,"mem_mb":22.7,"disk_size":"70.2M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.77,"mem_mb":49.6,"disk_size":"389.5M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.42,"mem_mb":22.6,"disk_size":"69.1M"},{"runtime":"python:3.9-alpine","python_version":"3.9","os_libc":"alpine (musl)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.44,"mem_mb":49.4,"disk_size":"388.6M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim 
(glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":7,"import_time_s":1.3,"mem_mb":20.1,"disk_size":"68M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":"wheel","failure_reason":null,"import_side_effects":null,"install_time_s":17.2,"import_time_s":2.32,"mem_mb":47,"disk_size":"367M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"google-cloud-bigquery-storage","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":1.11,"mem_mb":20.4,"disk_size":"67M"},{"runtime":"python:3.9-slim","python_version":"3.9","os_libc":"slim (glibc)","variant":"fastavro,pandas,pyarrow","exit_code":0,"wheel_type":null,"failure_reason":null,"import_side_effects":null,"install_time_s":null,"import_time_s":2.03,"mem_mb":46.8,"disk_size":"366M"}]},"quickstart_checks":{"last_tested":"2026-04-24","tag":"stale","tag_description":"widespread failures or data too old to trust","results":[{"runtime":"python:3.10-alpine","exit_code":1},{"runtime":"python:3.10-slim","exit_code":1},{"runtime":"python:3.11-alpine","exit_code":1},{"runtime":"python:3.11-slim","exit_code":1},{"runtime":"python:3.12-alpine","exit_code":1},{"runtime":"python:3.12-slim","exit_code":1},{"runtime":"python:3.13-alpine","exit_code":1},{"runtime":"python:3.13-slim","exit_code":1},{"runtime":"python:3.9-alpine","exit_code":1},{"runtime":"python:3.9-slim","exit_code":1}]}}