Synthesize multiple session replays into a UX friction map identifying systemic usability issues. Uses mcp__Amplitude__list_session_replays, mcp__Amplitude__get_session_replay_events, mcp__Amplitude__get_session_replays.
`npx claudepluginhub kienbui1995/magic-powers --plugin magic-powers`

This skill uses the workspace's default tool permissions.
- A product team wants to understand why users struggle with a specific flow before redesigning it
Clarify the focus before sampling sessions. The audit should target one of: a specific flow (e.g. checkout), a single page, or a user segment.
Define the time window, the pages or flow steps in scope, and what counts as success versus abandonment.
Use mcp__Amplitude__list_session_replays to find sessions matching the scope:
Sampling strategy — always mix success and failure:
This balance is critical: studying only failures shows what's broken, but studying successful sessions reveals what workarounds users invented, which signals non-obvious friction.
Sample size guidance: review enough sessions for patterns to recur rather than appear as one-offs; the worked example below samples 50.
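The success/failure mix described above can be sketched as a stratified sample. The session dict shape (an `abandoned` flag) is a hypothetical assumption; adapt it to the metadata the replay tools actually return.

```python
import random

def stratified_sample(sessions, n=50, abandoned_ratio=0.5, seed=42):
    """Mix abandoned and successful sessions so that workarounds in
    successful sessions are captured alongside outright failures.

    `sessions`: list of dicts with a boolean "abandoned" key
    (hypothetical shape, not the real Amplitude payload).
    """
    rng = random.Random(seed)
    abandoned = [s for s in sessions if s["abandoned"]]
    successful = [s for s in sessions if not s["abandoned"]]
    n_abandoned = min(len(abandoned), int(n * abandoned_ratio))
    n_success = min(len(successful), n - n_abandoned)
    return rng.sample(abandoned, n_abandoned) + rng.sample(successful, n_success)
```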
For each session, use mcp__Amplitude__get_session_replay_events to extract the interaction timeline.
The 5 friction signals — what to look for:
**1. Rage Clicks**: rapid repeated clicks on the same element
- Signal: 3+ clicks within 2 seconds on the same element
- Meaning: the user expects an interaction that isn't happening
- Examples: disabled button that looks enabled, form submit that silently fails

**2. Dead Clicks**: clicks on non-interactive elements
- Signal: click with no page response or state change
- Meaning: the element looks clickable but isn't, or a link is broken
- Examples: text that looks like a link, icon without a hover state, grayed-out element

**3. U-Turns**: backward navigation immediately after forward navigation
- Signal: user goes to Page B, then immediately returns to Page A
- Meaning: Page B didn't have what they expected, or they made a mistake and noticed immediately
- Examples: clicking the wrong menu item, seeing an unexpected modal, wrong filter applied

**4. Hesitation**: a long pause before taking an action (>10s with cursor movement but no click)
- Signal: mouse movement without clicks, sustained time on an element
- Meaning: the user is unsure what to do next; decision anxiety or an unclear affordance
- Examples: ambiguous CTA labels, multiple similar options, unclear next step

**5. Form Abandonment**: the user starts form input, then leaves
- Signal: focus on a form field, then the session ends or the user navigates away
- Meaning: the form is too long, too confusing, requires unavailable information, or failed
- Examples: unexpected required fields, confusing validation errors, too many steps
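Two of the signals above (rage clicks and u-turns) can be detected mechanically from an interaction timeline. This is a sketch: the event shape (`t` in seconds, `type`, `target`) is a hypothetical assumption, not the actual format returned by `mcp__Amplitude__get_session_replay_events`.

```python
def detect_rage_clicks(events, window=2.0, threshold=3):
    """Flag elements receiving `threshold`+ clicks within `window` seconds.

    `events`: list of dicts like {"t": 0.5, "type": "click", "target": "#submit"}
    (hypothetical shape)."""
    clicks = [e for e in events if e["type"] == "click"]
    flagged = set()
    for i, first in enumerate(clicks):
        burst = [c for c in clicks[i:]
                 if c["target"] == first["target"]
                 and c["t"] - first["t"] <= window]
        if len(burst) >= threshold:
            flagged.add(first["target"])
    return sorted(flagged)

def detect_u_turns(page_views, max_dwell=10.0):
    """Flag A -> B -> A navigation where the return happens within
    `max_dwell` seconds of landing on B.

    `page_views`: ordered list of (timestamp_seconds, url) tuples."""
    u_turns = []
    for (_, a), (t1, b), (t2, c) in zip(page_views, page_views[1:], page_views[2:]):
        if a == c and a != b and (t2 - t1) <= max_dwell:
            u_turns.append(b)
    return u_turns
```

Dead clicks, hesitation, and form abandonment need page state or focus events in addition to clicks, so they follow the same pattern but against richer event types.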
After reviewing all sessions, build a structured map of where friction occurs:
Flow: Checkout → Payment → Confirmation

**Step 1: /checkout/cart** [friction: LOW]
- Clear path forward
- Minor: some users don't notice the "Apply Coupon" link (dead click on cart total)

**Step 2: /checkout/shipping** [friction: MEDIUM]
- 23% of sessions show hesitation on the "Same as billing" checkbox
- 12% of sessions show a u-turn (back to cart, then forward again)
- Form validation error on ZIP code triggers rage clicks in 18% of sessions

**Step 3: /checkout/payment** [friction: HIGH]
- Rage clicks on "Submit Payment" in 34% of sessions
- 41% of abandonment happens at this step
- Key signal: users click submit, wait 5s, click again (double-submit pattern)

**Step 4: /checkout/confirmation** [friction: NONE]
- Sessions that reach here show smooth behavior
- Concern: only 59% of sessions that start checkout reach this step
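One way to make the LOW/MEDIUM/HIGH labels repeatable across audits is a simple threshold heuristic. The cutoffs below are illustrative assumptions, not values from this skill; tune them to your product's baseline.

```python
def friction_level(signal_rate, abandon_rate):
    """Classify a flow step by the share of sampled sessions showing
    any friction signal and the share abandoning at that step.
    Thresholds are illustrative assumptions."""
    if abandon_rate >= 0.3 or signal_rate >= 0.3:
        return "HIGH"
    if signal_rate >= 0.1:
        return "MEDIUM"
    if signal_rate > 0:
        return "LOW"
    return "NONE"
```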
For each identified friction signal, calculate its frequency, recurrence, correlation with abandonment, and estimated impact:
Friction: rage clicks on "Submit Payment"
- Frequency: 34% of sessions (17/50 sampled)
- Recurrence: 71% of affected users click 3+ times
- Abandon correlation: 68% of rage-clicking users abandon the session
- Estimated impact: 34% × 68% ≈ 23% of checkout sessions lost to this friction
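The impact arithmetic above is just frequency times the abandonment correlation among affected users:

```python
def estimated_impact(frequency, abandon_correlation):
    """Estimated share of all sessions lost to one friction point:
    how often it occurs, times how often affected users then abandon."""
    return frequency * abandon_correlation

# 34% frequency x 68% abandonment among affected users ~= 23% of sessions
impact = estimated_impact(0.34, 0.68)
```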
Rank issues using a simple matrix:
| Friction Point | Frequency | Severity | Priority Score | Recommendation |
|---|---|---|---|---|
| Payment double-submit | 34% | High (abandonment) | 9/10 | Fix: show loading state on submit |
| ZIP validation rage click | 18% | Medium (frustration) | 6/10 | Fix: inline validation, clear format guide |
| Shipping checkbox hesitation | 23% | Low (slows down) | 4/10 | Improve: better label text |
| Coupon dead click | 8% | Low (confusion) | 2/10 | Improve: make coupon field more visible |
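The priority scores in the table are judgment calls; a formula like the one below makes the ranking repeatable, though it won't reproduce the table's exact numbers. The severity weights and scaling are assumptions for illustration.

```python
SEVERITY_WEIGHT = {"High": 3, "Medium": 2, "Low": 1}  # assumed weights

def priority_score(frequency, severity):
    """Illustrative heuristic: frequency scaled by severity weight
    onto a 0-10 scale (capped at 10)."""
    return min(10, round(frequency * SEVERITY_WEIGHT[severity] * 10))
```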
For each high-priority friction point, provide a specific recommendation. Each recommendation should name the issue, state the observed signal, hypothesize a root cause, propose a concrete fix, and define a success metric, mirroring the report template below.
Tool reference:

- `mcp__Amplitude__list_session_replays`: search and filter sessions for the audit scope (by page, user segment, time range, completion status)
- `mcp__Amplitude__get_session_replays`: get session metadata for the sampled set (duration, pages visited, device type)
- `mcp__Amplitude__get_session_replay_events`: extract the full interaction timeline per session to identify friction signals

## UX Audit: <flow or page name>
Sessions reviewed: N (XX successful, XX abandoned, XX partial)
Time window: <date range>
### Friction Map by Flow Step
<table or list of steps with friction level and key signals>
### Top Friction Issues (ranked by priority)
**Issue 1:** <name> [Priority: X/10]
Frequency: XX% of sessions
Signal: <specific behavior observed>
Root cause: <hypothesis>
Fix: <specific recommendation>
Success metric: <how to measure improvement>
**Issue 2:** ...
### Summary Metrics
- Steps with HIGH friction: N
- Estimated sessions lost to friction: XX%
- Top 3 improvement opportunities: <list>
### Recommended Next Actions
1. <highest impact fix — estimate: X% conversion lift>
2. <second fix>
3. <third fix>