Configures Astronomer Cosmos 1.11+ for dbt Fusion projects on Snowflake, Databricks, BigQuery, Redshift with local execution. Covers binary install, RenderConfig, ProfileConfig setup.
```shell
npx claudepluginhub astronomer/agents --plugin astronomer-data
```
This skill uses the workspace's default tool permissions.
Execute steps in order. This skill covers Fusion-specific constraints only.
Version note: dbt Fusion support was introduced in Cosmos 1.11.0. Requires Cosmos ≥1.11.
Reference: See reference/cosmos-config.md for ProfileConfig, operator_args, and Airflow 3 compatibility details.
Before starting, confirm: (1) the dbt engine is Fusion (not Core → use cosmos-dbt-core), (2) the warehouse is Snowflake, Databricks, BigQuery, or Redshift only.
| Constraint | Details |
|---|---|
| No async | AIRFLOW_ASYNC not supported |
| No virtualenv | Fusion is a binary, not a Python package |
| Warehouse support | Only Snowflake, Databricks, BigQuery, and Redshift are supported while in preview |
CRITICAL: Cosmos 1.11.0 introduced dbt Fusion compatibility.
```shell
# Check installed version
pip show astronomer-cosmos

# Install/upgrade if needed
pip install "astronomer-cosmos>=1.11.0"
```
Validate: `pip show astronomer-cosmos` reports version ≥ 1.11.0.
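If you want to enforce the minimum version in code (for example in a CI check), here is a minimal sketch using only the standard library; the helper names are illustrative, not part of Cosmos:

```python
from importlib.metadata import PackageNotFoundError, version

MIN_VERSION = (1, 11, 0)  # dbt Fusion support landed in Cosmos 1.11.0

def parse_version(v: str) -> tuple:
    """Turn a version string like '1.11.0' into (1, 11, 0), ignoring suffixes."""
    parts = []
    for piece in v.split(".")[:3]:
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break  # stop at pre-release suffixes like the 'a1' in '0a1'
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def cosmos_supports_fusion() -> bool:
    """True if astronomer-cosmos >= 1.11.0 is installed in this environment."""
    try:
        return parse_version(version("astronomer-cosmos")) >= MIN_VERSION
    except PackageNotFoundError:
        return False
```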
dbt Fusion is NOT bundled with Cosmos or dbt Core. Install it into the Airflow runtime/image.
Determine where to install the Fusion binary (Dockerfile / base image / runtime).
```Dockerfile
USER root
RUN apt-get update && apt-get install -y curl
ENV SHELL=/bin/bash
RUN curl -fsSL https://public.cdn.getdbt.com/fs/install/install.sh | sh -s -- --update
USER astro
```
| Environment | Typical path |
|---|---|
| Astro Runtime | /home/astro/.local/bin/dbt |
| System-wide | /usr/local/bin/dbt |
Validate: the dbt binary exists at the chosen path and `dbt --version` succeeds.
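Because the binary location differs per environment, a DAG file can resolve it at parse time. A minimal sketch, with candidate paths taken from the table above (the function name is illustrative, not a Cosmos API):

```python
import os
import shutil

# Typical Fusion binary locations (from the table above)
CANDIDATE_PATHS = [
    "/home/astro/.local/bin/dbt",  # Astro Runtime
    "/usr/local/bin/dbt",          # system-wide install
]

def find_dbt_executable() -> "str | None":
    """Return the first executable dbt binary found, falling back to PATH."""
    for path in CANDIDATE_PATHS:
        if os.path.isfile(path) and os.access(path, os.X_OK):
            return path
    return shutil.which("dbt")
```

The result can be passed straight to `ExecutionConfig(dbt_executable_path=...)`, failing the DAG import early if it is `None`.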
Parsing strategy is the same as for dbt Core. Pick ONE:
| Load mode | When to use | Required inputs |
|---|---|---|
| `dbt_manifest` | Large projects; fastest parsing | `ProjectConfig.manifest_path` |
| `dbt_ls` | Complex selectors; need dbt-native selection | Fusion binary accessible to scheduler |
| `automatic` | Simple setups; let Cosmos pick | (none) |
```python
from cosmos import RenderConfig, LoadMode

_render_config = RenderConfig(
    load_method=LoadMode.AUTOMATIC,  # or DBT_MANIFEST, DBT_LS
)
```
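If you choose `dbt_manifest`, pair the load mode with `manifest_path` on `ProjectConfig`. A sketch, with illustrative paths:

```python
from cosmos import ProjectConfig, RenderConfig, LoadMode

# Manifest-based parsing: fastest option for large projects (paths illustrative)
_project_config = ProjectConfig(
    dbt_project_path="/usr/local/airflow/dbt/my_project",
    manifest_path="/usr/local/airflow/dbt/my_project/target/manifest.json",
)
_render_config = RenderConfig(load_method=LoadMode.DBT_MANIFEST)
```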
Reference: See reference/cosmos-config.md for full ProfileConfig options and examples.
```python
from cosmos import ProfileConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)
```
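If you manage a `profiles.yml` yourself rather than mapping an Airflow connection, `ProfileConfig` also accepts a file path instead of a profile mapping. A sketch, with an illustrative path:

```python
from cosmos import ProfileConfig

# Alternative: point Cosmos at an existing profiles.yml (path illustrative)
_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profiles_yml_filepath="/usr/local/airflow/dbt/my_project/profiles.yml",
)
```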
CRITICAL: dbt Fusion with Cosmos requires `ExecutionMode.LOCAL` with `dbt_executable_path` pointing to the Fusion binary.
```python
from cosmos import ExecutionConfig
from cosmos.constants import InvocationMode

_execution_config = ExecutionConfig(
    invocation_mode=InvocationMode.SUBPROCESS,
    dbt_executable_path="/home/astro/.local/bin/dbt",  # REQUIRED: path to Fusion binary
    # execution_mode is LOCAL by default - do not change
)
```
```python
from cosmos import ProjectConfig

_project_config = ProjectConfig(
    dbt_project_path="/path/to/dbt/project",
    # manifest_path="/path/to/manifest.json",  # for dbt_manifest load mode
    # install_dbt_deps=False,  # if deps precomputed in CI
)
```
```python
from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig, RenderConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping
from pendulum import datetime

_project_config = ProjectConfig(
    dbt_project_path="/usr/local/airflow/dbt/my_project",
)
_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_default",
    ),
)
_execution_config = ExecutionConfig(
    dbt_executable_path="/home/astro/.local/bin/dbt",  # Fusion binary
)
_render_config = RenderConfig()

my_fusion_dag = DbtDag(
    dag_id="my_fusion_cosmos_dag",
    project_config=_project_config,
    profile_config=_profile_config,
    execution_config=_execution_config,
    render_config=_render_config,
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
)
```
```python
from airflow.sdk import dag, task  # Airflow 3.x
# from airflow.decorators import dag, task  # Airflow 2.x
from airflow.models.baseoperator import chain
from cosmos import DbtTaskGroup, ProjectConfig, ProfileConfig, ExecutionConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping
from pendulum import datetime

_project_config = ProjectConfig(dbt_project_path="/usr/local/airflow/dbt/my_project")
_profile_config = ProfileConfig(
    profile_name="default",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(conn_id="snowflake_default"),
)
_execution_config = ExecutionConfig(dbt_executable_path="/home/astro/.local/bin/dbt")

@dag(start_date=datetime(2025, 1, 1), schedule="@daily")
def my_dag():
    @task
    def pre_dbt():
        return "some_value"

    dbt = DbtTaskGroup(
        group_id="dbt_fusion_project",
        project_config=_project_config,
        profile_config=_profile_config,
        execution_config=_execution_config,
    )

    @task
    def post_dbt():
        pass

    chain(pre_dbt(), dbt, post_dbt())

my_dag()
```
Before finalizing, verify:
If the user reports dbt Core regressions after enabling Fusion, set:
```shell
AIRFLOW__COSMOS__PRE_DBT_FUSION=1
```