Initializes and configures Astro CLI Airflow projects: creates structure with astro dev init, adds Python/OS dependencies, sets up connections/variables in airflow_settings.yaml.
npx claudepluginhub astronomer/agents --plugin astronomer-data

This skill uses the workspace's default tool permissions.
This skill helps you initialize and configure Airflow projects using the Astro CLI.
Related skills:

- deploying-airflow: Deploys Airflow DAGs and projects via the Astro CLI (full, DAG-only, image-only, dbt) or open-source Docker Compose/Kubernetes Helm. Guides CI/CD strategies and GitHub integration.
- authoring-dags: Builds production-ready Apache Airflow DAGs with patterns for operators, sensors, testing, and deployment. For data pipelines, workflow orchestration, and batch jobs.
To run the local environment, see the managing-astro-local-env skill. To write DAGs, see the authoring-dags skill. For deployment strategies, use the deploying-airflow skill. Open-source alternative: if the user isn't on Astro, guide them to Apache Airflow's Docker Compose quickstart for local development and the Helm chart for production.
Initialize a new project:

astro dev init
Don't pass --airflow-version or --runtime-version unless the user explicitly asks for a specific pin. Plain astro dev init resolves to the latest Astro Runtime, which is the right default. Specifying a version risks pinning to a stale value from training data. If the user wants to know what was installed, read the generated Dockerfile afterward instead of guessing.
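One way to read the resolved pin is to look at the Dockerfile's FROM line. A minimal sketch, using a hypothetical Dockerfile (in a real project the file already exists after astro dev init):

```shell
# Read the runtime pin from the generated Dockerfile.
# The Dockerfile contents below are a made-up example for illustration.
printf 'FROM quay.io/astronomer/astro-runtime:12.4.0\n' > Dockerfile
grep '^FROM' Dockerfile | cut -d: -f2   # prints the tag, here 12.4.0
```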
Creates this structure:
project/
├── dags/ # DAG files
├── include/ # SQL, configs, supporting files
├── plugins/ # Custom Airflow plugins
├── tests/ # Unit tests
├── Dockerfile # Image customization
├── packages.txt # OS-level packages
├── requirements.txt # Python packages
└── airflow_settings.yaml # Connections, variables, pools
Python packages go in requirements.txt:

apache-airflow-providers-snowflake==5.3.0
pandas==2.1.0
requests>=2.28.0
OS-level packages go in packages.txt:

gcc
libpq-dev
For complex setups (private PyPI, custom scripts):
FROM quay.io/astronomer/astro-runtime:12.4.0
RUN pip install --extra-index-url https://pypi.example.com/simple my-package
After modifying dependencies, run astro dev restart to rebuild the image with the changes.
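The edit-then-restart loop can be sketched as follows. The package names are illustrative, and the restart command is shown as a comment so the snippet stands alone outside a real Astro project:

```shell
# Sketch of the dependency-update loop inside a project directory.
echo 'apache-airflow-providers-postgres' >> requirements.txt   # Python dep
echo 'libpq-dev' >> packages.txt                               # OS dep
# astro dev restart   # rebuilds the image so the new deps are installed
```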
airflow_settings.yaml is loaded automatically when the local environment starts:

airflow:
  connections:
    - conn_id: my_postgres
      conn_type: postgres
      host: host.docker.internal
      port: 5432
      login: user
      password: pass
      schema: mydb
  variables:
    - variable_name: env
      variable_value: dev
  pools:
    - pool_name: limited_pool
      pool_slot: 5
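airflow_settings.yaml is plain text, so avoid putting real credentials in it. For local-only secrets, Airflow's standard AIRFLOW_CONN_&lt;CONN_ID&gt; environment variable is an alternative; in an Astro project such a line would typically go in the project's .env file. A sketch mirroring the my_postgres connection above (credentials are the same placeholder values):

```shell
# Define the same my_postgres connection as a URI-style env var,
# Airflow's standard AIRFLOW_CONN_<CONN_ID> mechanism.
export AIRFLOW_CONN_MY_POSTGRES='postgres://user:pass@host.docker.internal:5432/mydb'
echo "$AIRFLOW_CONN_MY_POSTGRES"
```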
# Export from running environment
astro dev object export --connections --file connections.yaml
# Import to environment
astro dev object import --connections --file connections.yaml
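Exported files contain plaintext credentials, so scrub them before committing to version control. A minimal sketch; the file contents here are stand-ins for a real export:

```shell
# Redact password fields in an exported connections file.
# The sample contents are hypothetical.
printf 'conn_id: my_postgres\npassword: pass\n' > connections.yaml
sed -i.bak 's/password:.*/password: REDACTED/' connections.yaml
grep 'password' connections.yaml   # prints: password: REDACTED
```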
Parse DAGs to catch errors without starting the full environment:
astro dev parse