Databricks CLI operations: auth, profiles, data exploration, and bundles. Contains up-to-date guidelines for Databricks-related CLI tasks.
npx claudepluginhub databricks/databricks-agent-skills

This skill uses the workspace's default tool permissions.
Core skill for Databricks CLI, authentication, and data exploration.
Guides Databricks development using Python SDK for workspace/account ops, Connect for local Spark, CLI commands, and direct REST APIs.
Create, configure, validate, deploy, run, and manage DABs (Databricks Asset Bundles) for Databricks resources including dashboards, jobs, pipelines, alerts, volumes, and apps.
Creates minimal Databricks single-node cluster and Spark notebook via REST API, CLI, or Python SDK. For new projects, setup testing, and basic Delta Lake patterns.
For specific products, use dedicated skills:
CLI installed: run `databricks --version` to check.
Authenticated: run `databricks auth profiles` to list configured profiles.
NEVER auto-select a profile.
databricks auth profiles

Each Bash command runs in a separate shell session.
# WORKS: --profile flag
databricks apps list --profile my-workspace
# WORKS: chained with &&
export DATABRICKS_CONFIG_PROFILE=my-workspace && databricks apps list
# DOES NOT WORK: separate commands
export DATABRICKS_CONFIG_PROFILE=my-workspace
databricks apps list # profile not set!
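A third pattern that also works: an inline environment assignment is scoped to that single invocation, so nothing leaks into later commands. The sketch below uses `sh -c` as a stand-in for the `databricks` binary so it runs anywhere:

```shell
# WORKS: inline env assignment scoped to one command
# (`sh -c` stands in for `databricks` so the demo runs without the CLI)
DATABRICKS_CONFIG_PROFILE=my-workspace sh -c 'echo "profile=$DATABRICKS_CONFIG_PROFILE"'
# the variable does not persist in the current shell afterwards
echo "after=${DATABRICKS_CONFIG_PROFILE:-unset}"
```

With the real CLI this would read `DATABRICKS_CONFIG_PROFILE=my-workspace databricks apps list`.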
Use these instead of manually navigating catalogs/schemas/tables:
# discover table structure (columns, types, sample data, stats)
databricks experimental aitools tools discover-schema catalog.schema.table --profile <PROFILE>
# run ad-hoc SQL queries
databricks experimental aitools tools query "SELECT * FROM table LIMIT 10" --profile <PROFILE>
# find the default warehouse
databricks experimental aitools tools get-default-warehouse --profile <PROFILE>
See Data Exploration for details.
⚠️ CRITICAL: Some commands use positional arguments, not flags
# current user
databricks current-user me --profile <PROFILE>
# list resources
databricks apps list --profile <PROFILE>
databricks jobs list --profile <PROFILE>
databricks clusters list --profile <PROFILE>
databricks warehouses list --profile <PROFILE>
databricks pipelines list --profile <PROFILE>
databricks serving-endpoints list --profile <PROFILE>
# ⚠️ Unity Catalog — POSITIONAL arguments (NOT flags!)
databricks catalogs list --profile <PROFILE>
# ✅ CORRECT: positional args
databricks schemas list <CATALOG> --profile <PROFILE>
databricks tables list <CATALOG> <SCHEMA> --profile <PROFILE>
databricks tables get <CATALOG>.<SCHEMA>.<TABLE> --profile <PROFILE>
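The positional pattern composes naturally in scripts. A hedged sketch only: the profile name `my-workspace`, the catalog `main`, the availability of `jq`, and the `.name` field in the `--output json` response are all assumptions here:

```shell
#!/bin/sh
# Hedged sketch: walk every schema in a catalog using positional args.
# PROFILE, CATALOG, and the `.name` JSON field are assumptions.
PROFILE=my-workspace
CATALOG=main
if command -v databricks >/dev/null 2>&1; then
  databricks schemas list "$CATALOG" --profile "$PROFILE" --output json |
    jq -r '.[].name' |
    while read -r SCHEMA; do
      # positional args again: catalog first, then schema
      databricks tables list "$CATALOG" "$SCHEMA" --profile "$PROFILE"
    done
else
  echo "databricks CLI not found; install it first"
fi
```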
# ❌ WRONG: these flags/commands DON'T EXIST
# databricks schemas list --catalog-name <CATALOG> ← WILL FAIL
# databricks tables list --catalog <CATALOG> ← WILL FAIL
# databricks sql-warehouses list ← doesn't exist, use `warehouses list`
# databricks execute-statement ← doesn't exist, use `experimental aitools tools query`
# databricks sql execute ← doesn't exist, use `experimental aitools tools query`
# When in doubt, check help:
# databricks schemas list --help
# get details
databricks apps get <NAME> --profile <PROFILE>
databricks jobs get <JOB_ID> --profile <PROFILE>
databricks clusters get <CLUSTER_ID> --profile <PROFILE>
# bundles
databricks bundle init --profile <PROFILE>
databricks bundle validate --profile <PROFILE>
databricks bundle deploy -t <TARGET> --profile <PROFILE>
databricks bundle run <RESOURCE> -t <TARGET> --profile <PROFILE>
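The bundle commands above expect a `databricks.yml` at the project root. A minimal sketch of scaffolding one by hand, with the bundle name and target as placeholder assumptions (the workspace host is resolved from the profile):

```shell
# Hedged sketch: write a minimal databricks.yml
# (bundle name and target are placeholder assumptions)
cat > databricks.yml <<'EOF'
bundle:
  name: my_project
targets:
  dev:
    mode: development
    default: true
EOF
# then: databricks bundle validate --profile <PROFILE>
```

`databricks bundle init` generates a fuller template interactively; the heredoc above is just the smallest config that validates against a configured profile.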
| Error | Solution |
|---|---|
| `cannot configure default credentials` | Use `--profile` flag or authenticate first |
| `PERMISSION_DENIED` | Check workspace/UC permissions |
| `RESOURCE_DOES_NOT_EXIST` | Verify resource name/ID and profile |
| Task | READ BEFORE proceeding |
|---|---|
| First time setup | CLI Installation |
| Auth issues / new workspace | CLI Authentication |
| Exploring tables/schemas | Data Exploration |
| Deploying jobs/pipelines | Use /databricks-dabs |