dbt-fabric adapter for Microsoft Fabric Synapse Data Warehouse

1.9.9 · active · verified Fri Apr 10

dbt-fabric is a dbt adapter plugin that lets dbt connect to and manage data models in Microsoft Fabric Synapse Data Warehouses. It extends dbt-core with materializations, incremental strategies, and connection logic tailored to Fabric. The library is actively maintained: 1.9.9 is the latest version, and minor releases land frequently with new features and compatibility updates.

Warnings

The adapter requires a system-level Microsoft ODBC driver (e.g., 'ODBC Driver 18 for SQL Server'); `pip install dbt-fabric` does not install it for you.

Install

pip install dbt-fabric

Imports

dbt adapters are not imported directly in Python code. Once installed, dbt-core discovers `dbt-fabric` automatically, and you select it with `type: fabric` in `profiles.yml`.

Quickstart

To use `dbt-fabric`, configure your `profiles.yml` with the `type: fabric` adapter. Ensure you have the necessary ODBC driver (e.g., 'ODBC Driver 18 for SQL Server') installed and configured on your system. Connection details like server, database, and authentication method (e.g., ActiveDirectoryInteractive, ServicePrincipal) should be provided. For sensitive credentials, always use environment variables. After configuration, you can run `dbt debug --target dev` to test the connection and `dbt run` to execute your models.

import os

# Example profiles.yml content for dbt-fabric
profiles_yaml_content = f'''
fabric:
  target: dev
  outputs:
    dev:
      type: fabric
      method: odbc
      driver: "ODBC Driver 18 for SQL Server" # Ensure this driver is installed on the system
      server: "{os.environ.get('DBT_FABRIC_SERVER', 'your_workspace_name.datawarehouse.fabric.microsoft.com')}"
      port: 1433
      database: "{os.environ.get('DBT_FABRIC_DATABASE', 'your_data_warehouse_name')}"
      schema: "{{{{ env_var('DBT_FABRIC_SCHEMA', 'dbt_schema') }}}}"
      authentication: "ActiveDirectoryInteractive" # Or ServicePrincipal, CLI, ManagedIdentity, etc.
      client_id: "{os.environ.get('FABRIC_CLIENT_ID', '')}" # Required for ServicePrincipal
      client_secret: "{os.environ.get('FABRIC_CLIENT_SECRET', '')}" # Required for ServicePrincipal
      tenant_id: "{os.environ.get('FABRIC_TENANT_ID', '')}" # Required for ServicePrincipal
      host_name_in_certificate: "*.datawarehouse.fabric.microsoft.com" # Recommended
      query_timeout: 300
'''

# In a real scenario, this content would be saved to ~/.dbt/profiles.yml
# or a file referenced by DBT_PROFILES_DIR. Then you would run dbt commands:
# dbt debug --target dev
# dbt run

print("Generated profiles.yml content (replace placeholders and ensure ODBC driver is installed):\n")
print(profiles_yaml_content)
print("\nTo use: Save this to your profiles.yml and run 'dbt debug --target dev' or 'dbt run' from your dbt project directory.")
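In practice, the generated content has to land where dbt looks for it. Below is a minimal sketch of that step, assuming the standard lookup order (`DBT_PROFILES_DIR`, falling back to `~/.dbt`); the helper name `write_profiles` is ours for illustration, not part of dbt.

```python
# Sketch: persist generated profiles.yml content to the dbt profiles directory.
# The DBT_PROFILES_DIR -> ~/.dbt fallback mirrors dbt's documented lookup order.
import os
import tempfile
from pathlib import Path
from typing import Optional

def write_profiles(content: str, profiles_dir: Optional[str] = None) -> Path:
    """Write profiles.yml into profiles_dir, defaulting to DBT_PROFILES_DIR or ~/.dbt."""
    base = Path(profiles_dir or os.environ.get("DBT_PROFILES_DIR") or Path.home() / ".dbt")
    base.mkdir(parents=True, exist_ok=True)
    target = base / "profiles.yml"
    target.write_text(content, encoding="utf-8")
    return target

# Point at a temporary directory here so no real profiles.yml is overwritten.
with tempfile.TemporaryDirectory() as tmp:
    path = write_profiles("fabric:\n  target: dev\n", profiles_dir=tmp)
    print(path.name)  # profiles.yml
```

After writing the file, `dbt debug --target dev` run from the dbt project directory will confirm whether dbt picks up the profile and can reach the warehouse.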
