Executes FABER plans by spawning faber-manager agents. Simple and reliable.
Executes FABER plans by spawning faber-manager agents for each plan item. Claude uses this when a plan file exists and needs execution, triggered by `/fractary-faber:execute` commands with a plan_id.
/plugin marketplace add fractary/claude-plugins
/plugin install fractary-faber@fractary
This skill inherits all available tools. When active, it can use any tool Claude has access to.
Your job is intentionally simple: read the plan, spawn a faber-manager for each item, and aggregate the results.
This simplicity is by design: there is very little here that can fail.
</CONTEXT>
<CRITICAL_RULES>
Plans: logs/fractary/plugins/faber/plans/{plan_id}.json
Runs: logs/fractary/plugins/faber/runs/{plan_id}/
</CRITICAL_RULES>
<WORKFLOW>
Security: Validate plan_id format to prevent path traversal attacks:
# plan_id must match pattern: {org}-{project}-{subproject}-{timestamp}
# Only alphanumeric, hyphens, and underscores allowed
if plan_id contains ".." or "/" or "\" or special characters:
ERROR: Invalid plan_id format
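A minimal sketch of this validation in Python; the exact segment pattern is an assumption based on the plan_id format above, so adjust it to the real format:

```python
import re

# Assumed pattern: org, project, subproject, and timestamp segments joined by
# hyphens; only alphanumerics, hyphens, and underscores are allowed.
PLAN_ID_RE = re.compile(r"^[A-Za-z0-9_]+(-[A-Za-z0-9_]+)*$")

def validate_plan_id(plan_id: str) -> None:
    """Reject plan_ids that could escape the plans/ directory."""
    if ".." in plan_id or "/" in plan_id or "\\" in plan_id:
        raise ValueError(f"Invalid plan_id (path traversal): {plan_id!r}")
    if not PLAN_ID_RE.fullmatch(plan_id):
        raise ValueError(f"Invalid plan_id format: {plan_id!r}")
```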
Read plan from logs/fractary/plugins/faber/plans/{plan_id}.json
If not found, error:
❌ Plan not found: {plan_id}
Check available plans:
ls logs/fractary/plugins/faber/plans/
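For illustration, plan loading with the not-found fallback might look like the following sketch (PLANS_DIR and load_plan are hypothetical names, not part of the skill):

```python
import json
from pathlib import Path

PLANS_DIR = Path("logs/fractary/plugins/faber/plans")

def load_plan(plan_id: str) -> dict:
    """Read the plan JSON, listing available plans if it is missing."""
    plan_path = PLANS_DIR / f"{plan_id}.json"
    if not plan_path.exists():
        available = sorted(p.stem for p in PLANS_DIR.glob("*.json"))
        raise FileNotFoundError(
            f"Plan not found: {plan_id}. Available plans: {available}"
        )
    return json.loads(plan_path.read_text())
```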
If items parameter provided:
items_to_run = plan.items.filter(i => items.includes(i.work_id))
Else:
items_to_run = plan.items
If resume is true:
For each item in items_to_run:
state_path = "logs/fractary/plugins/faber/runs/{plan_id}/items/{item.work_id}/state.json"
IF state file exists:
state = read(state_path)
item.resume_from = {
"phase": state.current_phase,
"step_index": state.current_step_index,
"steps_completed": state.steps_completed,
"run_id": state.run_id
}
item.is_resume = true
ELSE:
item.is_resume = false
LOG "⚠️ No state found for item {item.work_id}, starting fresh"
State structure (read from file):
{
"run_id": "fractary-claude-plugins-abc123",
"plan_id": "fractary-claude-plugins-csv-export-20251208T160000",
"work_id": 123,
"current_phase": "build",
"current_step_index": 2,
"steps_completed": ["generate-spec", "create-branch"],
"status": "in_progress"
}
Update plan file:
{
"execution": {
"status": "running",
"started_at": "2025-12-08T16:30:00Z",
"is_resume": {resume}
}
}
Spawn ALL managers in ONE message for parallel execution:
For each item in items_to_run:
Task(
subagent_type="fractary-faber:faber-manager",
description="Execute FABER for #{item.work_id}",
run_in_background=true,
prompt='{
"target": "{item.target}",
"work_id": "{item.work_id}",
"workflow_id": "{plan.workflow.id}",
"resolved_workflow": {plan.workflow},
"autonomy": "{plan.autonomy}",
"phases": {plan.phases_to_run},
"step_id": {plan.step_to_run},
"additional_instructions": "{plan.additional_instructions}",
"worktree": "{item.worktree}",
"is_resume": {item.branch.status == "resume"},
"resume_context": {item.branch.resume_from},
"issue_data": {item.issue},
"working_directory": "{working_directory}"
}'
)
Wait for all background agents to complete using the AgentOutputTool (retrieve results from Task agents spawned with run_in_background=true).
In serial mode (--serial), spawn managers one at a time and wait for each to complete before spawning the next.
Each manager returns a result of the form:
{
"work_id": "123",
"status": "success|failed",
"pr_url": "https://github.com/...",
"error": null
}
Update plan file:
{
"execution": {
"status": "completed|partial|failed",
"completed_at": "2025-12-08T17:00:00Z",
"results": [
{"work_id": "123", "status": "success", "pr_url": "..."},
{"work_id": "124", "status": "failed", "error": "..."}
]
}
}
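A small sketch of how the plan-level status could be derived from the per-item results (aggregate_status is an illustrative helper, not part of the skill):

```python
from typing import Any

def aggregate_status(results: list[dict[str, Any]]) -> str:
    """Map per-item results to the plan-level execution status."""
    successes = sum(1 for r in results if r.get("status") == "success")
    if results and successes == len(results):
        return "completed"   # every item produced a PR
    if successes > 0:
        return "partial"     # some items succeeded, some failed
    return "failed"          # nothing succeeded

# Example: two successes and one failure -> "partial"
results = [
    {"work_id": "123", "status": "success", "pr_url": "https://github.com/..."},
    {"work_id": "124", "status": "success", "pr_url": "https://github.com/..."},
    {"work_id": "125", "status": "failed", "error": "Tests failed (3 failures)"},
]
assert aggregate_status(results) == "partial"
```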
For each successful item:
Run /repo:worktree-remove if a worktree exists.
Output aggregated results.
</WORKFLOW>
<COMPLETION_CRITERIA>
This skill is complete when:
- Every spawned manager has finished (success or failure)
- The plan file has been updated with aggregated results
- A summary has been reported to the user
</COMPLETION_CRITERIA>
<OUTPUTS>
All items successful:
🎯 FABER Execution Complete
Plan: fractary-claude-plugins-csv-export-20251208T160000
Duration: 15m 32s
Results (3/3 successful):
✅ #123 Add CSV export → PR #150
✅ #124 Add PDF export → PR #151
✅ #125 Fix export bug → PR #152
All PRs ready for review.
Partial success:
🎯 FABER Execution Complete
Plan: fractary-claude-plugins-csv-export-20251208T160000
Duration: 12m 45s
Results (2/3 successful):
✅ #123 Add CSV export → PR #150
✅ #124 Add PDF export → PR #151
❌ #125 Fix export bug → Failed at evaluate:test
Error: Tests failed (3 failures)
To retry failed item:
/fractary-faber:execute {plan_id} --items 125
All items failed:
❌ FABER Execution Failed
Plan: fractary-claude-plugins-csv-export-20251208T160000
Results (0/3 successful):
❌ #123 Add CSV export → Failed at build:implement
❌ #124 Add PDF export → Failed at architect:generate-spec
❌ #125 Fix export bug → Failed at frame:fetch-issue
Check individual errors above for details.
Plan not found:
❌ Plan not found: invalid-plan-id
Check available plans:
ls logs/fractary/plugins/faber/plans/
Or create a new plan:
/fractary-faber:plan --work-id 123
</OUTPUTS>
<ERROR_HANDLING>
| Error | Action |
|---|---|
| Plan not found | Report error, abort |
| Plan already running | Report error, abort |
| Manager spawn failed | Mark item failed, continue others |
| Manager timeout | Mark item failed, continue others |
| All items failed | Report summary, don't abort mid-execution |
</ERROR_HANDLING>
<NOTES>
This executor is intentionally simple: it reads a plan, spawns managers, and aggregates results.
The complexity is in the planning phase (faber-planner).
Parallel (default): all managers are spawned in a single message and run concurrently in the background.
Serial (--serial): managers are spawned one at a time; each must finish before the next starts.
The --resume flag enables resuming execution from the exact step where it stopped:
How it works:
- The executor reads logs/fractary/plugins/faber/runs/{plan_id}/items/{work_id}/state.json for each item
- Execution resumes from the recorded current_phase and current_step_index
State tracking:
- State files record the plan_id for bidirectional linking
Example resume flow:
# Original execution stopped at build step 2
state.json:
current_phase: "build"
current_step_index: 2
steps_completed: ["generate-spec", "create-branch"]
# Resume execution
/fractary-faber:execute {plan_id} --resume
# Manager receives:
is_resume: true
resume_from: {phase: "build", step_index: 2, ...}
# Manager skips steps 0-1, starts at step 2
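A minimal sketch of the step-skipping a manager might perform on resume; steps_to_run and any step names beyond those shown above are illustrative assumptions:

```python
def steps_to_run(steps: list[str], resume_from: dict | None) -> list[str]:
    """Return the steps that still need to run, given resume context."""
    if not resume_from:
        return steps                      # fresh run: execute everything
    start = resume_from["step_index"]     # e.g. 2 in the example above
    return steps[start:]                  # skip steps 0..start-1

phase_steps = ["generate-spec", "create-branch", "implement", "write-tests"]
print(steps_to_run(phase_steps, {"phase": "build", "step_index": 2}))
# -> ['implement', 'write-tests']
```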
Worktrees are cleaned up automatically when items complete successfully.
This uses the existing /repo:pr-merge --worktree-cleanup behavior.
Plans: logs/fractary/plugins/faber/plans/{plan_id}.json
Runs: logs/fractary/plugins/faber/runs/{plan_id}/
- items/{work_id}/state.json
- aggregate.json
Lookup by issue:
- faber:plan={plan_id} (links the issue to its plan)
- runs/{plan_id}/items/{work_id}/ (per-issue run state)
Invoked by:
- /fractary-faber:execute command
- /fractary-faber:run command (after plan creation)
Invokes:
- faber-manager agent (via Task tool)
- /repo:worktree-remove (for cleanup)
Concurrent Plan Updates: No file locking for plan updates. If multiple executors
update the same plan simultaneously, data may be lost. Workaround: Use --serial mode
or ensure only one executor runs per plan.
Git Remote Parsing: Currently optimized for GitHub HTTPS URLs. SSH URLs and other platforms (GitLab, Bitbucket) may not parse correctly for metadata extraction.
</NOTES>