{"id":2463,"library":"dbt-fabric","title":"dbt-fabric adapter for Microsoft Fabric Synapse Data Warehouse","description":"dbt-fabric is a dbt adapter plugin that enables dbt to connect to and manage data models within Microsoft Fabric Synapse Data Warehouses. It extends dbt-core with materializations, incremental strategies, and connection logic tailored to Fabric. The library is actively maintained, with version 1.9.9 being the latest, and sees frequent minor releases incorporating new features and compatibility updates.","status":"active","version":"1.9.9","language":"en","source_language":"en","source_url":"https://github.com/microsoft/dbt-fabric","tags":["dbt","data warehousing","microsoft fabric","synapse","analytics"],"install":[{"cmd":"pip install dbt-fabric","lang":"bash","label":"Install `dbt-fabric`"}],"dependencies":[{"reason":"Core dbt utilities and shared components.","package":"dbt-common","optional":false},{"reason":"Base classes and interfaces for dbt adapters.","package":"dbt-adapters","optional":false},{"reason":"The core dbt framework.","package":"dbt-core","optional":false},{"reason":"ODBC driver interface for connecting to SQL-based data sources.","package":"pyodbc","optional":false},{"reason":"Microsoft Authentication Library for Python, used for Active Directory authentication.","package":"msal","optional":false},{"reason":"Azure Identity client library for authentication against Azure services.","package":"azure-identity","optional":false}],"imports":[{"note":"Users typically interact with dbt adapters through the `dbt` command-line interface and project configuration, not via direct Python imports in their models or scripts.","symbol":"dbt-fabric","correct":"dbt-fabric is primarily used via the dbt CLI and profiles.yml configuration. Direct Python imports of classes from `dbt_fabric` are uncommon for standard dbt operations."}],"quickstart":{"code":"import os\n\n# Example profiles.yml content for dbt-fabric\nprofiles_yaml_content = f'''\nfabric:\n  target: dev\n  outputs:\n    dev:\n      type: fabric\n      method: odbc\n      driver: \"ODBC Driver 18 for SQL Server\" # Ensure this driver is installed\n      server: \"{os.environ.get('DBT_FABRIC_SERVER', 'your_workspace_name.datawarehouse.fabric.microsoft.com')}\"\n      port: 1433\n      database: \"{os.environ.get('DBT_FABRIC_DATABASE', 'your_data_warehouse_name')}\"\n      schema: \"{{{{ env_var('DBT_FABRIC_SCHEMA', 'dbt_schema') }}}}\"\n      authentication: \"ActiveDirectoryInteractive\" # Or ServicePrincipal, CLI, ManagedIdentity, etc.\n      client_id: \"{os.environ.get('FABRIC_CLIENT_ID', '')}\" # Required for ServicePrincipal\n      client_secret: \"{os.environ.get('FABRIC_CLIENT_SECRET', '')}\" # Required for ServicePrincipal\n      tenant_id: \"{os.environ.get('FABRIC_TENANT_ID', '')}\" # Required for ServicePrincipal\n      host_name_in_certificate: \"*.datawarehouse.fabric.microsoft.com\" # Recommended\n      query_timeout: 300\n'''\n\n# In a real scenario, this content would be saved to ~/.dbt/profiles.yml\n# or a file referenced by DBT_PROFILES_DIR. Then you would run dbt commands:\n# dbt debug --target dev\n# dbt run\n\nprint(\"Generated profiles.yml content (replace placeholders and ensure the ODBC driver is installed):\\n\")\nprint(profiles_yaml_content)\nprint(\"\\nTo use: save this to your profiles.yml and run 'dbt debug --target dev' or 'dbt run' from your dbt project directory.\")\n","lang":"python","description":"To use `dbt-fabric`, configure your `profiles.yml` with the `type: fabric` adapter. Ensure the required ODBC driver (e.g., 'ODBC Driver 18 for SQL Server') is installed and configured on your system. Provide connection details such as server, database, and authentication method (e.g., ActiveDirectoryInteractive or ServicePrincipal). For sensitive credentials, always use environment variables. After configuration, run `dbt debug --target dev` to test the connection and `dbt run` to execute your models."},"warnings":[{"fix":"Re-implement any custom schema generation logic directly within your dbt project's macros, or adjust your schema configuration to use standard dbt functionality.","message":"The `generate_custom_schema` macro was removed in dbt-fabric v1.9.3. Projects relying on this macro for custom schema generation will encounter errors.","severity":"breaking","affected_versions":">=1.9.3"},{"fix":"If you encounter issues, refactor your ephemeral models to avoid nested CTEs, or materialize them as tables instead of views where your project's constraints allow. Simplify complex logic or break it into multiple models.","message":"Ephemeral models with nested Common Table Expressions (CTEs) are not fully supported when materialized as views, due to limitations within Microsoft Fabric Synapse Data Warehouse.","severity":"gotcha","affected_versions":">=1.9.1 (inherent platform limitation)"},{"fix":"Exercise caution with complex or deeply nested CTEs, especially in ephemeral models. Test thoroughly and simplify query logic where possible. Refer to the latest dbt-fabric documentation and GitHub issues for updates on CTE support.","message":"Nested CTE support has seen multiple changes and reversions (e.g., changes introduced in v1.9.4 were later reverted), indicating ongoing instability or limitations with complex CTE structures in Fabric through the adapter.","severity":"gotcha","affected_versions":"All versions"}],"env_vars":null,"last_verified":"2026-04-10T00:00:00.000Z","next_check":"2026-07-09T00:00:00.000Z"}