From workflows
Manages Databricks CLI commands with intelligent parsing and safety validation. Use when managing clusters, jobs, workspace, Unity Catalog, SQL, or any Databricks resource via CLI.
npx claudepluginhub andercore-labs/claudes-kitchen --plugin workflows

This skill uses the workspace's default tool permissions.
| Operation | Command | Safe |
|---|---|---|
| List clusters | databricks clusters list | ✓ |
| Create cluster | databricks clusters create --json @config.json | ✓ |
| Delete cluster | databricks clusters delete --cluster-id CLUSTER_ID | ✗ |
| List jobs | databricks jobs list | ✓ |
| Run job | databricks jobs run-now --job-id JOB_ID | ✓ |
| List notebooks | databricks workspace list /path | ✓ |
| Sync workspace | databricks workspace sync --source local --dest /workspace | ✗ |
| List catalogs | databricks unity-catalog list catalogs | ✓ |
| Create table | databricks unity-catalog create table --catalog c1 --schema s1 --name t1 | ✓ |
| List secrets | databricks secrets list-scopes | ✓ |
| Store secret | databricks secrets put --scope SCOPE --key KEY | ✓ |
| Trigger | Resource |
|---|---|
| User mentions "databricks CLI", "databricks command" | This skill |
| Task requires 100+ CLI commands | This skill |
| Need workspace sync, git credentials, libraries | This skill |
| Complex workflows: multi-step operations | This skill |
| Direct API access preferred | databricks-mcp (MCP server) |
CLI available? → VALIDATE: databricks --version
Env vars set? → VALIDATE: DATABRICKS_HOST, DATABRICKS_TOKEN
Credentials valid? → TEST: databricks workspaces list
Ready → EXECUTE: User command
Validation early → prevents cascade failures
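The environment-variable step of this gate can be sketched as follows (the `missingEnvVars` helper name is illustrative, not part of the skill):

```typescript
// Minimal sketch of the pre-flight env-var check; names are illustrative.
function missingEnvVars(env: Record<string, string | undefined>): string[] {
  // Collect required variables that are unset or empty.
  return ["DATABRICKS_HOST", "DATABRICKS_TOKEN"].filter((k) => !env[k]);
}

// Example: only DATABRICKS_HOST is set, so DATABRICKS_TOKEN is reported missing.
const missing = missingEnvVars({
  DATABRICKS_HOST: "https://your-instance.cloud.databricks.com",
});
```

Failing fast here surfaces a one-line fix ("export DATABRICKS_TOKEN=...") instead of a cryptic 401 later in the flow.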
databricks <group> <command> [<subcommand>] [arguments] [--flags]
| Position | Example | Required |
|---|---|---|
| group | clusters, jobs, fs, unity-catalog, sql | ✓ |
| command | list, create, delete, run, get | ✓ |
| subcommand | Optional for nested commands | - |
| arguments | --cluster-id ABC, --job-id 123 | Context-dependent |
| flags | --json, --output json | Context-dependent |
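The anatomy above can be assembled mechanically into an argv list. A sketch, reusing the `CLIExecution` shape shown later in this document (the `buildCommand` helper itself is illustrative):

```typescript
// Sketch of assembling argv for: databricks <group> <command> [arguments] [--flags]
interface CLIExecution {
  group: string;
  command: string;
  args?: Record<string, string | number>;
  flags?: Record<string, string | boolean>;
}

function buildCommand(exec: CLIExecution): string[] {
  const argv = ["databricks", exec.group, exec.command];
  for (const [k, v] of Object.entries(exec.args ?? {})) {
    argv.push(`--${k}`, String(v)); // e.g. --cluster-id ABC
  }
  for (const [k, v] of Object.entries(exec.flags ?? {})) {
    if (v === true) argv.push(`--${k}`); // boolean flag
    else if (v !== false) argv.push(`--${k}`, String(v)); // valued flag
  }
  return argv;
}

const cmd = buildCommand({
  group: "clusters",
  command: "get",
  args: { "cluster-id": "ABC" },
  flags: { output: "json" },
});
```

Building argv as a list (rather than a shell string) avoids quoting and injection issues when the command is handed to a subprocess runner.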
| Format | Handling | Use Case |
|---|---|---|
| JSON | Parse JSON → structured dict | Programmatic output |
| Table | Parse rows → list of dicts | Human-readable display |
| Text | Return as-is | Status messages |
Pattern: (stdout | stderr) → format_detected → parsed → return
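A sketch of the format-detection step, assuming JSON is tried first and anything unparseable falls back to raw text (table parsing omitted for brevity):

```typescript
// Sketch of format detection: try JSON, fall back to plain text.
function parseOutput(stdout: string): { format: "json" | "text"; value: unknown } {
  const trimmed = stdout.trim();
  if (trimmed.startsWith("{") || trimmed.startsWith("[")) {
    try {
      return { format: "json", value: JSON.parse(trimmed) };
    } catch {
      // Looked like JSON but did not parse; treat as plain text.
    }
  }
  return { format: "text", value: trimmed };
}

const parsed = parseOutput('{"clusters": []}');
```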
| Exit Code | Meaning | Action |
|---|---|---|
| 0 | Success | Return parsed output |
| 1 | General error | Extract error message → suggest fix |
| 2 | Usage error | Display usage pattern → request retry |
| 127 | CLI not found | Suggest installing Databricks CLI v0.205+ (installer script or Homebrew; the pip package is the legacy CLI) |
Suggested fix: Check common errors in commands-reference.md
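The exit-code table maps directly to a dispatch. A sketch (the `exitAction` helper and its messages are illustrative):

```typescript
// Sketch of mapping CLI exit codes to a suggested next step.
function exitAction(code: number): string {
  switch (code) {
    case 0:
      return "return parsed output";
    case 1:
      return "extract error message and suggest a fix";
    case 2:
      return "display usage pattern and request retry";
    case 127:
      return "suggest installing the Databricks CLI";
    default:
      return `unrecognized exit code ${code}: surface raw stderr`;
  }
}
```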
UNSAFE (require confirmation):
clusters delete, jobs delete, workspace delete,
unity-catalog delete, secrets delete, policies delete,
instance-pools delete
Pattern:
User requests UNSAFE op
→ Display: operation details, impacted resources
→ Request: explicit "yes" confirmation
→ Execute ONLY after confirmation
Flow: safety-patterns.md
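The unsafe list reduces to a set lookup on `<group> <command>`. A sketch (helper names are illustrative):

```typescript
// Sketch of the unsafe-operation gate: match "<group> <command>" against the list above.
const UNSAFE_OPS = new Set([
  "clusters delete", "jobs delete", "workspace delete",
  "unity-catalog delete", "secrets delete", "policies delete",
  "instance-pools delete",
]);

function isUnsafe(group: string, command: string): boolean {
  return UNSAFE_OPS.has(`${group} ${command}`);
}
```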
export DATABRICKS_HOST=https://your-instance.cloud.databricks.com
export DATABRICKS_TOKEN=dapi123...
export DATABRICKS_CONFIG_PROFILE=default # optional
# Verify
databricks workspaces list
Store in ~/.databrickscfg for persistence:
[DEFAULT]
host = https://your-instance.cloud.databricks.com
token = dapi123...
| Group | Commands | Status |
|---|---|---|
| clusters | list, create, delete, start, stop, get, edit, resize | ✓ Core |
| jobs | list, create, delete, run-now, cancel, get, update | ✓ Core |
| fs, workspace | ls, cp, rm, mv, mkdir, export, import | ✓ Core |
| unity-catalog | list (catalogs/schemas/tables), create, delete, get | ✓ Core |
| sql | queries, warehouses, alerts | ✓ Core |
| secrets | put, get, delete, list-scopes, list-secrets | ✓ Core |
| policies | list, create, update, delete, get | ✓ Supported |
| users, groups, service-principals | CRUD operations | ✓ Supported |
| libraries, git-credentials | manage dependencies, credentials | ✓ Supported |
| experiments, models, serving-endpoints | ML lifecycle | ✓ Supported |
interface CLIExecution {
  group: string;
  command: string;
  args?: Record<string, string | number | boolean>;
  flags?: Record<string, string | boolean>;
  requiresConfirmation?: boolean; // Auto-detected from the unsafe list
}

interface CLIResult {
  status: "success" | "failed";
  output: unknown; // Parsed stdout (JSON, table rows, or raw text)
}

async function execute(exec: CLIExecution): Promise<CLIResult> {
  if (isUnsafe(exec)) {
    await requestConfirmation(exec); // Blocks until the user approves
  }
  validatePrerequisites(); // CLI installed, env vars set, credentials valid
  const cmd = buildCommand(exec);
  const result = await runProcess(cmd);
  return parseOutput(result);
}
| Workflow | Steps | Docs |
|---|---|---|
| Create cluster + wait | 1. Create 2. Poll status 3. Return cluster ID | examples.md |
| Sync workspace bidirectional | 1. Sync local→remote 2. Sync remote→local 3. Report changes | examples.md |
| Bulk job execution | 1. List jobs 2. Filter 3. Run each 4. Monitor | examples.md |
| Unity Catalog setup | 1. Create catalog 2. Create schema 3. Create table | examples.md |
| Secret rotation | 1. Generate new 2. Store in scope 3. Notify | examples.md |
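The "Create cluster + wait" workflow above can be sketched as a poll loop. `runCli` here is an assumed subprocess runner (not a real API), and the state names follow the Databricks cluster lifecycle:

```typescript
// Sketch of "create cluster + wait": create, then poll until RUNNING or timeout.
type RunCli = (args: string[]) => Promise<string>;

async function createClusterAndWait(
  runCli: RunCli,
  configPath: string,
  timeoutMs = 600_000,
): Promise<string> {
  const created = JSON.parse(await runCli(["clusters", "create", "--json", `@${configPath}`]));
  const clusterId: string = created.cluster_id;
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const info = JSON.parse(await runCli(["clusters", "get", "--cluster-id", clusterId]));
    if (info.state === "RUNNING") return clusterId;
    if (info.state === "ERROR" || info.state === "TERMINATED") {
      throw new Error(`cluster ${clusterId} entered state ${info.state}`);
    }
    await new Promise((r) => setTimeout(r, 15_000)); // poll interval
  }
  throw new Error(`timed out waiting for cluster ${clusterId}`);
}
```

Polling with a deadline keeps the workflow bounded; the timeout case maps to the "Error: Timed out" row in the troubleshooting table.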
| Problem | Cause | Fix |
|---|---|---|
| databricks: command not found | CLI not installed | Install Databricks CLI v0.205+ (installer script or Homebrew; the pip package is the legacy CLI) |
| Error: 401 Unauthorized | Invalid/missing token | Set DATABRICKS_TOKEN, verify in ~/.databrickscfg |
| Error: 404 Not Found | Wrong DATABRICKS_HOST | Set correct workspace URL |
| Error: Timed out | Long-running operation | Increase timeout, run async |
| Error: Permission denied | Insufficient permissions | Check user role/permissions |
| Large output truncated | Output exceeds buffer | Use pagination, filters, or save to file |
For more: commands-reference.md
✓ Databricks CLI v0.205+ installed locally
✓ DATABRICKS_HOST environment variable set
✓ DATABRICKS_TOKEN environment variable set
✓ Credentials validated via databricks workspaces list
See the referenced files (commands-reference.md, safety-patterns.md, examples.md) for details.
| Phase | Action |
|---|---|
| 1. SCOPE | Extract context: command, mode (informative/executive), sessionId |
| 2. VERIFY | Validate CLI installed, env vars set, credentials valid |
| 3. SAFETY | Check if destructive op → request confirmation |
| 4. EXECUTE | Run command via subprocess |
| 5. METRICS | Call mcp__agent-orchestrator__store-skill-metrics with execution stats |
| 6. PARSE | Detect format (JSON/table/text) → parse output |
| 7. OUTPUT | Return parsed result or error with suggestions |
Metrics Structure:
{
sessionId: string;
skill: "workflows:databricks-cli-recipe";
initialViolations: number; // Validation errors before execution
iterations: number; // Number of retry attempts
fixesApplied: number; // Fixes applied during execution
finalViolations: number; // Remaining validation errors
mode: "informative" | "executive";
duration: number; // Execution time in ms
}
Output Format:
{
status: "success" | "failed";
output: unknown; // Parsed result
error?: {
code: number;
message: string;
suggestion: string;
};
metrics: {
initialViolations: number;
finalViolations: number;
fixRate: number; // (initial - final) / initial
};
}
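The `fixRate` field above can be computed as follows; the guard for the zero-violation case is an assumption, since the formula divides by `initialViolations`:

```typescript
// Sketch of fixRate = (initial - final) / initial, with a zero-violation guard.
function fixRate(initialViolations: number, finalViolations: number): number {
  if (initialViolations === 0) return 1; // nothing to fix: treated as fully fixed (assumption)
  return (initialViolations - finalViolations) / initialViolations;
}
```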