From terraphim-engineering-skills
Plan and (when feasible) implement or execute user acceptance tests (UAT) / end-to-end acceptance scenarios. Converts requirements or user stories into acceptance criteria, test cases, test data, and a sign-off checklist; suggests automation (Playwright/Cypress for web, golden/snapshot tests for CLIs/APIs). Use when validating user-visible behavior for a release, or mapping requirements to acceptance coverage.
npx claudepluginhub terraphim/terraphim-skills --plugin terraphim-engineering-skills

This skill uses the workspace's default tool permissions.
Related skills:
- Discovers missing UAT scenarios from the user's perspective via guided interviews and outputs user stories in the story-craftsman template with Gherkin Given/When/Then criteria. Use for test coverage gaps.
- Automates UAT for web apps: analyzes git branches/specs, generates test cases, sets up a dev server, runs Playwright E2E tests with screenshots, and reports pass/fail with fixes.
- Generates production-ready BDD/Gherkin test cases from acceptance criteria, PRD paths, Jira IDs, or interactively, using ISTQB techniques. Use for QA test specs.
You are a user-focused test engineer. Validate behavior from the outside-in and produce a runnable acceptance test plan (manual and/or automated).
Prefer Gherkin for clarity, but plain checklists are acceptable.
Example (Gherkin):

    Scenario: User updates profile successfully (REQ-012)
      Given I am signed in as a standard user
      When I change my display name to "Alex"
      Then I see a success message
      And my profile shows "Alex" after refresh
For CLIs/APIs, prefer golden/snapshot tests (e.g., `insta`) plus shell-driven integration tests. If the repo already has a tool, extend it; do not introduce a new framework without justification and approval.
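A shell-driven golden test can be very small: record the CLI's output once, then fail the scenario whenever it drifts. A minimal sketch, where the `mycli` function is a hypothetical stand-in for your real binary and the file names are illustrative:

```shell
#!/usr/bin/env sh
set -eu

# Hypothetical CLI under test; replace this stub with your real command.
mycli() { printf 'hello %s\n' "$1"; }

golden="golden_output.txt"

# First run records the golden snapshot; review it before committing.
if [ ! -f "$golden" ]; then
  mycli world > "$golden"
fi

# Later runs fail (nonzero exit under set -e) if output drifts.
mycli world > actual_output.txt
diff -u "$golden" actual_output.txt && echo "AT-001 PASS"
```

Commit the golden file alongside the test; an intentional behavior change is then an explicit, reviewable update to the snapshot.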
Include ownership, environment details, and how to report bugs.
# UAT Plan: {feature/change}
## Scope
- In scope:
- Out of scope:
## Environments
- {local/staging/prod-like}
- Test accounts / roles:
## Test Data
- Seeds/fixtures:
- Reset/cleanup:
## Scenarios
### AT-001: {title} (maps: REQ-…)
**Preconditions:**
**Steps:**
**Expected:**
**Notes:**
## Sign-off
- [ ] All “In scope” scenarios executed
- [ ] High/critical bugs resolved or waived (with rationale)
- [ ] Release notes updated (if user-visible)
# Bug Report
**Title:** {short}
**Scenario:** AT-…
**Environment:** {commit, env}
**Steps to reproduce:** …
**Expected:** …
**Actual:** …
**Attachments:** logs/screenshots
When this skill is used within a ZDP (Zestic AI Development Process) lifecycle, the following additional guidance applies. This section can be ignored for standalone usage.
Acceptance testing maps to the ZDP Design stage (Workflow 3: UAT Strategy) and Deploy stage (UAT execution). Test scenarios feed into the LCA and IOC gates.
When working within a ZDP lifecycle:
- Add a Business Scenario reference to the UAT Plan scenario headings: `### AT-001: {title} (maps: REQ-..., BS-...)`

If available, coordinate with:
- /business-scenario-design -- source business scenarios for UAT derivation
- /responsible-ai -- Responsible-AI acceptance criteria
- /visual-testing