From astronomer-data
Delegates Airflow, dbt, and data-engineering tasks to Astronomer's Otto CLI agent for upgrades, migrations, debugging, and complex audits with session continuity and configs.
```shell
npx claudepluginhub astronomer/agents --plugin astronomer-data
```

This skill uses the workspace's default tool permissions.
[Otto](https://www.astronomer.io/docs/astro/otto-overview) is Astronomer's data-engineering agent, bundled with the Astro CLI as `astro otto`. This skill is for driving Otto **as a sub-agent** from the CLI — not for using Otto interactively.
Manages and troubleshoots Apache Airflow using af CLI: lists DAGs, triggers runs, reads task logs, diagnoses failures, checks connections, variables, pools, and health.
Builds production-ready Apache Airflow DAGs with patterns for operators, sensors, testing, and deployment. For data pipelines, workflow orchestration, and batch jobs.
If you are Otto, stop here. Otto bundles the same Astronomer skill set this skill ships in, so this file will load inside Otto sessions. Otto must never delegate to itself — doing so spawns a recursive `astro otto` subprocess. Ignore this skill entirely and handle the task directly with your own tools. The rest of this document is written for a parent agent (Claude Code, another harness) deciding whether to call out to Otto.
Signals that favor delegating:

- Otto's `af` tooling against a connected Airflow is wired up and bundled with debugging skills; replicating it ad hoc in the parent session is wasteful.
- `--fork` lets you explore an alternative ("what if we used Cosmos here?") without polluting the main thread.
- Otto keeps project memory in `.astro/memory/` (committed) and `~/.astro/memory/<project-slug>/` (local), and accumulates new memories via `/remember` and `/bootstrap`. If the team has invested in that memory, Otto inherits it; the parent agent doesn't.

Signals against delegating:

- The task needs `af` against a connected Airflow, but none is running and starting one isn't appropriate.

When a task hits multiple favoring signals (e.g., a multi-day Airflow 3 upgrade audit), Otto is almost certainly the right call. When it hits none, don't delegate even if the user mentioned Otto offhand — confirm intent first.
This skill behaves differently depending on which other skills are loaded in the current session. Scan the loaded skills list before deciding.
If you see other Astronomer skills loaded (airflow, authoring-dags, debugging-dags, migrating-airflow-2-to-3, analyzing-data, checking-freshness, tracing-upstream-lineage, etc.), the user has the full astronomer-data plugin. Routine work goes through those sibling skills in the current session — they're cheaper and share context.
Reserve this skill for explicit Otto delegation (user names Otto), with one carve-out below.
| User intent | Use this skill? | Use instead |
|---|---|---|
| "What DAGs are broken?" | ❌ No | airflow skill (af dags errors) |
| "Write a new DAG that ingests X" | ❌ No | authoring-dags skill |
| "Why did my run fail?" | ❌ No | debugging-dags skill |
| "Plan the Airflow 3 upgrade" | ⚠️ Offer Otto first | See carve-out below |
| "Use Otto to plan the AF3 upgrade" | ✅ Yes | This skill |
| "Delegate the AF3 audit to Otto" | ✅ Yes | This skill |
| "Fork an Otto session to try a different approach" | ✅ Yes | This skill |
Otto's bundled migration capability draws on Astronomer's proprietary compatibility KB (breaking changes, provider matrices, runtime mappings, known incidents) and consistently produces a better upgrade plan than the local migrating-airflow-2-to-3 skill on its own.
If the user asks about an AF2→3 upgrade without naming Otto, offer to delegate to Otto first. Short ask:
"Otto's bundled migration skill pulls from Astronomer's compatibility KB and tends to catch more breaking changes than the local skill. Want me to run this through Otto?"
If they accept, use this skill. If they decline, fall back to migrating-airflow-2-to-3.
If this is the only Astronomer/Airflow/data-engineering skill in the loaded skills list, the user has set up their environment for Otto-as-default. Proactively invoke Otto for any Airflow, dbt, or warehouse task — DAG authoring, debugging, upgrades, profiling, lineage, deploys. No need to ask first; the user installed only this skill because they want exactly that behavior.
```shell
# Default: one-shot, plain text output to stdout
astro otto --mode text "your prompt here"

# Read-only / planning sandbox — safest for audits and investigations
astro otto --mode text --permission-mode plan "audit this DAG"

# Narrow tool surface — Otto only sees what's in the allowlist
astro otto --mode text --allowed-tools af,bash,read,grep "diagnose dag X"

# Machine-parseable events for scripting and chaining
astro otto --mode json "your prompt here"
```
For session continuity (`-c`, `--fork`, `--session`, `--no-session`), see Session control. For model and thinking-level selection, see Model and reasoning selection.
Sessions persist on disk per working directory.
| Flag | Behavior |
|---|---|
| `-c`, `--continue` | Resume the most recent session in this directory |
| `-r`, `--resume` | Open the interactive session picker |
| `--session <id\|path>` | Open a specific session — accepts an 8+ character id prefix or a full path |
| `--fork <id\|path>` | Fork a session into a fresh copy; the original is untouched. Use to try an alternative approach without polluting the main thread. |
| `--no-session` | In-memory only; leaves no trace on disk. Use for one-off questions. |
| `--export <id\|path>` | Render an existing session to HTML and exit |
| Flag | When to use |
|---|---|
| `--mode text` | Default. Streams plain text to stdout. |
| `--mode json` | Machine-parseable events for scripting or chaining. |
For text mode, streaming is auto-detected from the TTY. Force it with `--stream` / `--no-stream`.
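For scripted consumers, the event stream from `--mode json` can be reduced to its final answer. A minimal Python sketch, assuming newline-delimited JSON events with `type` and `final_answer` fields; the exact event shape is Otto's to define, so treat these field names as illustrative:

```python
import json
import sys

def extract_final_answer(lines):
    """Scan newline-delimited JSON events for the last final answer.
    Field names ("type", "final_answer") are assumed for illustration."""
    answer = None
    for line in lines:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        if event.get("type") == "final_answer":
            answer = event.get("final_answer")
    return answer

if __name__ == "__main__":
    # e.g. astro otto --mode json "your prompt" | python consume.py
    print(extract_final_answer(sys.stdin))
```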
Otto can write files and run shell commands. Match the permission mode to the task's risk profile.
| Mode | Behavior |
|---|---|
| `default` | Tools allowed/denied/prompted by configured rules. Otto asks before destructive `astro`/`af` commands. |
| `plan` | Read-only sandbox. Blocks `edit` and `write` entirely and restricts `bash` to a read-only allowlist (`ls`, `cat`, `git`, `rg`, `af`, `astro`, etc.). Use this for audits, planning, and investigation. |
| `acceptEdits` | Auto-allows `edit` and `write` inside the project folder. Other tools fall through to normal rules. |
| `confirmEdits` | Prompts before every `edit`, `write`, or non-read-only `bash` command. Allow rules can't bypass the prompt. |
| `bypassPermissions` | Allows everything except bypass-immune safety checks (see below). |
Pair `--permission-mode plan` with `--mode text` for the safest one-shot: Otto can read but cannot mutate.

`--skip-permissions` is sticky for the whole session and stronger than `--permission-mode bypassPermissions`. Avoid it unless the user explicitly asks.
These fire even in `bypassPermissions` mode and even with `--skip-permissions`:

- Sensitive file patterns: `.env*`, `~/.ssh/**`, `~/.aws/**`, shell rc files
- Destructive commands: `astro deploy`, `astro deployment delete`, `astro dev kill`, `af dags delete`, `af runs delete`, `af tasks clear`, `af connections delete`, `af variables delete`, etc.

Don't assume `--skip-permissions` makes Otto fully unattended.
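As a mental model, the file-side checks behave like a glob deny list. A Python sketch of that idea, not Otto's actual implementation; the pattern list and matching rules here are assumptions:

```python
from fnmatch import fnmatch
from pathlib import Path

# Illustrative patterns mirroring the bypass-immune list above.
IMMUNE_PATTERNS = [
    ".env*",
    str(Path.home() / ".ssh" / "*"),
    str(Path.home() / ".aws" / "*"),
]

def is_bypass_immune(path):
    """True if the path string matches a pattern the safety checks
    always protect, regardless of permission mode (sketch only)."""
    return any(fnmatch(str(path), pat) for pat in IMMUNE_PATTERNS)
```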
`--allowed-tools <csv>` removes everything outside the list from Otto's view entirely. Useful for narrow tasks:

```shell
# Only let Otto query Airflow and read files
astro otto --mode text --allowed-tools af,read,grep,find \
  "diagnose why model_orders failed yesterday"

# Only let Otto run af and shell — no editing
astro otto --mode text --allowed-tools af,bash \
  "list all paused production DAGs and their owners"
```
Force Otto to emit a typed final answer with `--output-schema`:

```shell
astro otto --mode json --output-schema @schema.json \
  "find DAGs with import errors and return as JSON"
```

Requires `--mode text` or `--mode json`. Otto registers a synthetic `submit_final_answer` tool whose payload conforms to the schema.
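A schema file for the audit example above might be generated like this. The schema shape (`broken_dags`, `dag_id`, `error`) is hypothetical; only the `--output-schema @schema.json` wiring comes from this document, and standard JSON Schema is assumed:

```python
import json

# Hypothetical JSON Schema for a DAG-audit answer: a list of broken DAGs.
schema = {
    "type": "object",
    "properties": {
        "broken_dags": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "dag_id": {"type": "string"},
                    "error": {"type": "string"},
                },
                "required": ["dag_id", "error"],
            },
        }
    },
    "required": ["broken_dags"],
}

with open("schema.json", "w") as f:
    json.dump(schema, f, indent=2)
```

Then pass it along: `astro otto --mode json --output-schema @schema.json "..."`.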
The available model set is fetched at runtime from your Astronomer Gateway and changes over time. Don't hardcode model names — list what's available first:
```shell
astro otto --list-models              # full list
astro otto --list-models anthropic    # filter by substring
astro otto --model <id> --mode text "..."
astro otto --thinking <off|minimal|low|medium|high|xhigh> --mode text "..."
```

For planning, migrations, or fleet-wide audits, pick a 1M-context model and set `--thinking medium` or `high`. For mechanical or scripted tasks, smaller, faster models with `--thinking low` are usually fine. Defaults persist in `~/.astro/otto/settings.json`.
- `--mcp-config /path/to/mcp.json` wires in user-configured MCP servers (warehouse, ticketing, etc.). Otto's Airflow tooling (`af`) is built in — no MCP needed for that.
- Extensions toggle via `--extension <name>` / `--no-extension <name>` (repeatable), or via `OTTO_EXTENSIONS` / `OTTO_DISABLED_EXTENSIONS`. Persistent settings live in `~/.astro/otto/extensions.json` and `.astro/otto/extensions.json`.

```shell
astro otto --mode text --permission-mode plan --thinking medium \
  "your investigation prompt"
```
```shell
astro otto --mode json --output-schema @schema.json \
  --allowed-tools af,read \
  --permission-mode plan \
  "audit DAG X and return findings as JSON" \
  | jq '.final_answer'
```
For multi-turn delegation, kick off once and resume with `-c`. For parallel branches, see `--fork` in Session control.
Each invocation spins up a fresh agent with its own context window. Two rules cover most cases:

- Prefer `-c` / `--session` over re-prompting from scratch — it preserves cache and prior findings.
- Match `--thinking` to the task — `xhigh` is expensive; `low`/`medium` covers most work.

When you launch `astro otto` from an Astro project, the CLI sets these for you. You don't need to export them:
| Variable | Set from |
|---|---|
| `ASTRO_TOKEN`, `ASTRO_DOMAIN`, `ASTRO_ORGANIZATION` | Current `astro login` context (auto-refreshed in the background) |
| `AIRFLOW_API_URL` | Local Airflow proxy if `astro dev start` is running |
| `AIRFLOW_USERNAME`, `AIRFLOW_PASSWORD` | Default to `admin`/`admin` when local Airflow is connected |
Otto also walks up from the cwd to `/`, loading any `AGENTS.md` or `CLAUDE.md` it finds (plus `~/.astro/otto/AGENTS.md`). When both files exist in the same folder, `AGENTS.md` wins. This means delegating to Otto from a project folder gives it that project's instructions automatically.
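The lookup can be pictured as a walk up the directory tree. A Python sketch of the described precedence, not Otto's real loader (the collection order across directories is illustrative):

```python
from pathlib import Path

def collect_agent_files(cwd):
    """Walk from cwd up to the filesystem root, taking AGENTS.md (or,
    failing that, CLAUDE.md) from each directory. AGENTS.md wins when
    both exist in the same folder. Sketch of the described behavior."""
    found = []
    d = Path(cwd).resolve()
    while True:
        agents, claude = d / "AGENTS.md", d / "CLAUDE.md"
        if agents.is_file():
            found.append(agents)
        elif claude.is_file():
            found.append(claude)
        if d == d.parent:  # reached the root
            break
        d = d.parent
    return found
```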
**`af` requires a connected Airflow.** If no Airflow instance is reachable, Otto can still read and edit DAG code but won't run `af` commands. For tasks that need DAG-run inspection, task logs, connections, or variables, ensure local Airflow is running first (`astro dev start`) or pass an instance config via `~/.af/config.yaml`.
The `dag-validation` extension is on by default. After Otto edits or writes any `dags/*.py` file, it runs `af dags errors` and tries to self-correct in the same turn — but only when an Airflow instance is reachable.
This is convenient for delegated DAG edits, but it adds a validation roundtrip to every write; pass `--no-extension dag-validation` if you want pure code changes without it.

Otto can fan out to its own subprocesses via the `subagent` extension. Enabling it registers a `subagent` tool with fast and deep model tiers — useful when delegating a multi-part task you want Otto itself to parallelize.
```shell
astro otto --mode text --extension subagent "audit each DAG in dags/ and report findings"
```

Configure tier models in `.astro/otto/extensions.json`.
Otto resolves config in this order (earlier wins):

1. CLI flags (`--model`, `--allowed-tools`, `--no-extension`, etc.)
2. Environment variables (`OTTO_DISABLED_EXTENSIONS`, etc.)
3. Project settings (`.astro/otto/permissions.json`, `.astro/otto/extensions.json`, `.astro/config.yaml`)
4. User settings (`~/.astro/otto/settings.json`, `~/.astro/config.yaml`)

For the full reference, see Otto settings.
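The "earlier wins" rule amounts to returning the first layer that defines a key. A minimal Python sketch, with the layer contents invented for illustration:

```python
def resolve_setting(key, *layers):
    """Return the value from the first layer that defines key.
    Layers mirror Otto's order: CLI flags, env vars, project settings,
    user settings. A sketch of the precedence rule, not Otto's resolver."""
    for layer in layers:
        if key in layer:
            return layer[key]
    return None

# Hypothetical layer contents for illustration:
cli = {"model": "from-cli"}
env = {"model": "from-env", "thinking": "low"}
project = {"thinking": "medium", "extensions": ["subagent"]}
user = {"model": "from-user", "stream": True}

resolve_setting("model", cli, env, project, user)     # CLI flag wins
resolve_setting("thinking", cli, env, project, user)  # env beats project
```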
```shell
astro otto version   # installed Otto version + update check
astro otto --help    # full flag reference
astro otto update    # pull latest Otto release
```

Otto auto-updates by default (once-per-day check, applied on next launch). Opt out with `astro config set -g otto.auto_update false`.
If `astro otto` isn't recognized, the user needs Astro CLI v1.42+. Recommend `brew upgrade astro` or whatever installer they used.
- `astro otto --help` — flag reference (source of truth)
- `astro otto` CLI reference