{"library":"adlfs","title":"adlfs","description":"adlfs provides an fsspec-compatible interface for accessing Azure Blob Storage, Azure Data Lake Storage Gen2, and the now-retired Azure Data Lake Storage Gen1. It allows Python users to interact with Azure storage as if it were a local filesystem, integrating seamlessly with data processing libraries like Dask, Pandas, and Ray. The library is currently at version 2026.2.0 and maintains an active release schedule.","status":"active","version":"2026.2.0","language":"en","source_language":"en","source_url":"https://github.com/fsspec/adlfs","tags":["azure","blob storage","data lake","adls gen2","fsspec","dask","cloud storage"],"install":[{"cmd":"pip install adlfs","lang":"bash","label":"Install with pip"},{"cmd":"conda install -c conda-forge adlfs","lang":"bash","label":"Install with conda"}],"dependencies":[{"reason":"Core filesystem interface; adlfs is an fsspec implementation.","package":"fsspec","optional":false},{"reason":"Internal implementation for Azure Blob Storage operations.","package":"azure-storage-blob","optional":false},{"reason":"Provides `DefaultAzureCredential` for authenticated access.","package":"azure-identity","optional":true},{"reason":"For Dask integration and distributed data processing.","package":"dask","optional":true}],"imports":[{"note":"This is the primary class for interacting with Azure Blob Storage and ADLS Gen2.","symbol":"AzureBlobFileSystem","correct":"from adlfs import AzureBlobFileSystem"},{"note":"The `AzureDatalakeFileSystem` class and `adl://` protocol were for ADLS Gen1, which is deprecated and has been removed. For ADLS Gen2, use `AzureBlobFileSystem` with the `abfs://` or `az://` protocol.","wrong":"from adlfs import AzureDatalakeFileSystem","symbol":"AzureDatalakeFileSystem","correct":"from adlfs import AzureBlobFileSystem"}],"quickstart":{"code":"import os\nfrom adlfs import AzureBlobFileSystem\n\n# Recommended: use environment variables for credentials,\n# e.g. AZURE_STORAGE_ACCOUNT_NAME, AZURE_STORAGE_ACCOUNT_KEY, AZURE_STORAGE_SAS_TOKEN.\n# For DefaultAzureCredential, ensure AZURE_STORAGE_ACCOUNT_NAME is set and pass anon=False.\n\n# 'azureopendatastorage' is the public Azure Open Datasets storage account\naccount_name = os.environ.get('AZURE_STORAGE_ACCOUNT_NAME', 'azureopendatastorage')\n# For demonstration, use anonymous access to a public account.\n# In real scenarios, provide proper credentials (account_key, sas_token, or anon=False).\nfs = AzureBlobFileSystem(account_name=account_name, anon=True)\n\n# Example: list the contents of a public container\ncontainer_name = \"nyctlc\"  # a public container in the Azure Open Datasets account\npath_to_list = f\"az://{container_name}/\"\n\ntry:\n    print(f\"Listing contents of {path_to_list}:\")\n    contents = fs.ls(path_to_list, detail=False)\n    for item in contents[:5]:  # print the first 5 items\n        print(item)\n    if not contents:\n        print(\"Container is empty or access was denied (check credentials/permissions).\")\nexcept Exception as e:\n    print(f\"An error occurred: {e}\")\n    print(\"Ensure 'AZURE_STORAGE_ACCOUNT_NAME' is set, or provide valid credentials for private containers.\")\n\n# Example: read a file (requires appropriate permissions).\n# Replace with a real path if you have authenticated access.\n# file_path = f\"az://{container_name}/path/to/your/file.txt\"\n# try:\n#     with fs.open(file_path, 'rb') as f:\n#         data = f.read()\n#         print(f\"\\nContent of {file_path}: {data.decode()[:100]}...\")\n# except Exception as e:\n#     print(f\"\\nCould not read {file_path}: {e}\")","lang":"python","description":"This quickstart demonstrates how to initialize `adlfs.AzureBlobFileSystem` and list the contents of a public Azure Blob Storage container. For private containers, authentication relies on `storage_options` parameters (such as `account_key`, `sas_token`, `connection_string`, or service principal details) or automatic credential resolution via `DefaultAzureCredential` (set `anon=False` and ensure `AZURE_STORAGE_ACCOUNT_NAME` is set)."},"warnings":[{"fix":"Migrate to the `az://` or `abfs://` protocols and use the `adlfs.AzureBlobFileSystem` class for Azure Blob Storage and ADLS Gen2.","message":"ADLS Gen1 (the `adl://` protocol and `AzureDatalakeFileSystem` class) has been officially retired; operations using these interfaces no longer work.","severity":"breaking","affected_versions":"All versions since 2026.2.0 (deprecation warning in earlier releases, full removal planned)"},{"fix":"If your application relies on anonymous access, explicitly set `anon=True` when creating `adlfs.AzureBlobFileSystem` instances.","message":"Starting in a future release, adlfs will require explicit credentials by default; anonymous access will no longer be the implicit default.","severity":"breaking","affected_versions":"Future releases (warning present since 2026.2.0)"},{"fix":"Review the `adlfs` and Azure SDK documentation for the authentication method you intend to use, and ensure all required credentials are provided with sufficient permissions. For `DefaultAzureCredential`, ensure `AZURE_STORAGE_ACCOUNT_NAME` is set and pass `anon=False`.","message":"Authentication can be complex due to the multiple available methods (account key, SAS token, service principal, `DefaultAzureCredential`, connection string). Misconfiguration and insufficient permissions are common sources of errors.","severity":"gotcha","affected_versions":"All versions"},{"fix":"Upgrade your Python environment to Python 3.10 or newer.","message":"Support for Python 3.9 has been removed.","severity":"breaking","affected_versions":"2026.2.0 and later"},{"fix":"Plan your data ingestion strategy around BlockBlobs for most writes. If appending is critical, verify your storage account configuration and consider alternatives if hierarchical namespaces prevent AppendBlobs.","message":"By default, write operations create BlockBlobs, which cannot be appended to. AppendBlobs (using `mode=\"ab\"`) are also unavailable if hierarchical namespaces are enabled on the storage account.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-06T19:12:56.809Z","next_check":"2026-07-05T00:00:00.000Z"}