dbt-fabric adapter for Microsoft Fabric Synapse Data Warehouse
dbt-fabric is a dbt adapter plugin that enables dbt to connect to and manage data models in Microsoft Fabric Synapse Data Warehouse. It extends dbt-core with materializations, incremental strategies, and connection logic tailored to Fabric. The adapter is actively maintained, with frequent minor releases adding features and compatibility updates; version 1.9.9 is the latest at the time of writing.
Warnings
- breaking The `generate_custom_schema` macro was removed in dbt-fabric v1.9.3. Projects relying on this macro for custom schema generation will encounter errors.
- gotcha Ephemeral models with nested Common Table Expressions (CTEs) are not fully supported when materialized as views, due to limitations within Microsoft Fabric Synapse Data Warehouse.
- gotcha There have been multiple changes and reversions related to nested CTE support (e.g., the nested CTE changes were reverted in v1.9.4), indicating that support for complex CTE structures in Fabric remains unstable through the adapter.
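For projects broken by the removal of `generate_custom_schema`, one common replacement is overriding dbt-core's built-in `generate_schema_name` macro in your project's macros/ directory. This is the standard dbt-core dispatch pattern, not a dbt-fabric-specific API; the sketch below reproduces dbt's default behavior and should be adjusted to match whatever logic the removed macro provided for you:

```sql
-- macros/generate_schema_name.sql (standard dbt-core override pattern)
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {{ default_schema }}
    {%- else -%}
        {{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```

Because dbt resolves project-level macros before built-ins, dropping this file into macros/ changes schema naming for every model that sets a `schema` config.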
Install
- pip install dbt-fabric
Imports
- dbt-fabric
dbt-fabric is primarily used via dbt CLI and profiles.yml configuration. Direct Python imports of classes from `dbt_fabric` are uncommon for standard dbt operations.
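That said, dbt-core (1.5 and later) does expose a supported programmatic entry point, `dbtRunner`, which works with any installed adapter, including dbt-fabric. A minimal sketch, with the import guarded so it degrades gracefully when dbt-core is not installed:

```python
# Sketch: programmatic dbt invocation via dbt-core's dbtRunner (dbt-core >= 1.5).
# The import is guarded because dbt-core may not be present in every environment.
try:
    from dbt.cli.main import dbtRunner
    HAVE_DBT = True
except ImportError:
    HAVE_DBT = False

def run_dbt(args):
    """Run a dbt command, e.g. run_dbt(["run", "--target", "dev"]).

    Returns True if the invocation succeeded.
    """
    if not HAVE_DBT:
        raise RuntimeError("dbt-core is not installed; run 'pip install dbt-fabric'")
    result = dbtRunner().invoke(args)
    return result.success
```

This is equivalent to running the same arguments on the dbt CLI; profile resolution (profiles.yml, DBT_PROFILES_DIR) works the same way in both cases.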
Quickstart
import os
# Example profiles.yml content for dbt-fabric
profiles_yaml_content = f'''
fabric:
  target: dev
  outputs:
    dev:
      type: fabric
      method: odbc
      driver: "ODBC Driver 18 for SQL Server"  # Ensure this ODBC driver is installed
      server: "{os.environ.get('DBT_FABRIC_SERVER', 'your_workspace_name.datawarehouse.fabric.microsoft.com')}"
      port: 1433
      database: "{os.environ.get('DBT_FABRIC_DATABASE', 'your_data_warehouse_name')}"
      schema: "{{{{ env_var('DBT_FABRIC_SCHEMA', 'dbt_schema') }}}}"
      authentication: "ActiveDirectoryInteractive"  # Or ServicePrincipal, CLI, ManagedIdentity, etc.
      client_id: "{os.environ.get('FABRIC_CLIENT_ID', '')}"  # Required for ServicePrincipal
      client_secret: "{os.environ.get('FABRIC_CLIENT_SECRET', '')}"  # Required for ServicePrincipal
      tenant_id: "{os.environ.get('FABRIC_TENANT_ID', '')}"  # Required for ServicePrincipal
      host_name_in_certificate: "*.datawarehouse.fabric.microsoft.com"  # Recommended
      query_timeout: 300
'''
# In a real scenario, this content would be saved to ~/.dbt/profiles.yml
# or a file referenced by DBT_PROFILES_DIR. Then you would run dbt commands:
# dbt debug --target dev
# dbt run
print("Generated profiles.yml content (replace placeholders and ensure ODBC driver is installed):\n")
print(profiles_yaml_content)
print("\nTo use: Save this to your profiles.yml and run 'dbt debug --target dev' or 'dbt run' from your dbt project directory.")
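Instead of copying the output by hand, the generated content can be written to disk and dbt pointed at it through the standard DBT_PROFILES_DIR environment variable. A stdlib-only sketch; the helper name `write_profiles` is our own for illustration, not part of dbt-fabric:

```python
import os
import tempfile
from pathlib import Path
from typing import Optional

def write_profiles(content: str, directory: Optional[str] = None) -> Path:
    """Write profiles.yml content to a directory and point dbt at it.

    `write_profiles` is a hypothetical helper, not a dbt-fabric API;
    DBT_PROFILES_DIR is the standard dbt environment variable for
    locating profiles.yml.
    """
    target_dir = Path(directory) if directory else Path(tempfile.mkdtemp())
    target_dir.mkdir(parents=True, exist_ok=True)
    path = target_dir / "profiles.yml"
    path.write_text(content, encoding="utf-8")
    os.environ["DBT_PROFILES_DIR"] = str(target_dir)
    return path

# Usage sketch: write_profiles(profiles_yaml_content), then run
# 'dbt debug --target dev' from your dbt project directory.
```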