MotionEyes
Diagnose, fix, or validate SwiftUI animation and scroll bugs by instrumenting views with MotionEyes, capturing console traces via XcodeBuildMCP or CLI, and comparing motion data to expectations.
npx claudepluginhub edwardsanchez/motioneyes --plugin motioneyes

This skill uses the workspace's default tool permissions.
Use MotionEyes as temporary observability for SwiftUI animation debugging and regression validation. Instrument targeted values and geometry, capture time-series logs, compare observed motion against expected motion, apply fixes or assertions, re-validate, and clean up all agent-added tracing.
Follow these rules on every run, in this exact order:
Add .motionTrace(...) instrumentation with Trace.value, Trace.geometry, and (for scroll issues) Trace.scrollGeometry metrics named after user intent. Choose the geometry mode that matches what the user actually cares about:

- On-screen motion as the user perceives it: Trace.geometry(..., space: .screen, source: .presentation)
- Layout position in the global SwiftUI space: Trace.geometry(..., space: .swiftUI(.global), source: .layout)
- Position relative to a local container: Trace.geometry(..., space: .swiftUI(.local), source: .layout) or a named coordinate space

If you are unsure, start with on-screen motion and add layout geometry only if you need relationships.

Map the complaint to the lightest instrumentation that can test it:

- Animated scalar values: Trace.value.
- Visible on-screen motion: Trace.geometry using space: .screen, source: .presentation.
- Layout relationships: Trace.geometry with space: .swiftUI(.global), source: .layout (or .local for local container relationships).
- Scroll behavior: Trace.scrollGeometry on the ScrollView container.

For example:

- A fade that misbehaves: Trace.value("opacity", opacity).
- A drift or offset bug: Trace.value("offset", CGPoint(x: offset.width, y: offset.height)).
- A view that jumps on screen: Trace.geometry in .screen + .presentation (add layout if needed).
- A matchedGeometryEffect handoff: Trace.geometry for both views in the same space that matches intent.
- A combined value-and-position bug: Trace.value plus one geometry metric.
- A scroll position or restoration issue: Trace.scrollGeometry with contentOffset and visibleRect metrics.
- Anything the user describes visually: Trace.geometry in .screen + .presentation.

Add instrumentation only to the minimum set of views needed to test the complaint. Default to on-screen motion (.screen + .presentation) when the user is describing what they see. For ScrollView offset, visible region, content size, insets, or restoration behavior, use Trace.scrollGeometry on the ScrollView container or an immediate descendant bound to the same scroll context.

Choose geometry mode based on intent:

- Global layout position: space: .swiftUI(.global), source: .layout
- Local container relationships: space: .swiftUI(.local), source: .layout (or a named coordinate space)
- Window-relative layout: space: .window, source: .layout
- What is actually rendered on screen: space: .screen, source: .presentation

import MotionEyes
import SwiftUI
struct CardMotionExample: View {
    @State private var opacity = 1.0
    @State private var offset = CGSize.zero

    var body: some View {
        RoundedRectangle(cornerRadius: 16)
            .fill(.orange)
            .frame(width: 180, height: 120)
            .opacity(opacity)
            .offset(offset)
            .motionTrace("Card Motion", fps: 30) {
                Trace.value("opacity", opacity)
                Trace.value("offset", CGPoint(x: offset.width, y: offset.height))
                Trace.geometry(
                    "cardFrame",
                    properties: [.minX, .minY, .width, .height],
                    space: .screen,
                    source: .presentation
                )
            }
            .onTapGesture {
                withAnimation(.easeInOut(duration: 0.6)) {
                    opacity = opacity == 1 ? 0.4 : 1
                    offset = offset == .zero ? CGSize(width: 0, height: 36) : .zero
                }
            }
    }
}
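With the card instrumented, the captured opacity series can be compared against the expected fade offline. A minimal sketch of such a check, assuming the scalar samples have already been extracted from the trace; the helper name and sample values are hypothetical, not part of MotionEyes:

```swift
/// Returns true if the samples move monotonically from `start` to `end`
/// within a small tolerance. Hypothetical post-processing helper for the
/// 0.6 s ease-in-out fade in the card example; not a MotionEyes API.
func fadesMonotonically(_ samples: [Double], from start: Double, to end: Double,
                        tolerance: Double = 0.01) -> Bool {
    guard let first = samples.first, let last = samples.last else { return false }
    guard abs(first - start) <= tolerance, abs(last - end) <= tolerance else { return false }
    let descending = start > end
    for i in samples.indices.dropFirst() {
        let prev = samples[i - 1], cur = samples[i]
        // Each frame must move toward the target, allowing tiny flat steps.
        if descending ? cur > prev + tolerance : cur < prev - tolerance { return false }
    }
    return true
}

let observed = [1.0, 0.92, 0.78, 0.61, 0.48, 0.40]  // hypothetical trace samples
print(fadesMonotonically(observed, from: 1.0, to: 0.4))  // true
```

A fade that overshoots or flickers produces a sample that reverses direction, which this check rejects.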
ScrollView {
    content
}
.motionTrace("Chat Scroll", fps: 30) {
    Trace.scrollGeometry(
        "scrollMetrics",
        properties: [.contentOffsetY, .visibleRectMinY, .visibleRectHeight]
    )
}
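For scroll validation, restoration bugs usually show up as one large frame-to-frame jump in the contentOffsetY series. A sketch under the assumption that the samples have been pulled out of the trace; the helper and threshold are illustrative, not a MotionEyes API:

```swift
/// Flags frame-to-frame jumps in a scroll offset series larger than
/// `maxJump` points. Illustrative post-processing, not part of MotionEyes.
func offsetJumps(in samples: [Double], maxJump: Double) -> [(index: Int, delta: Double)] {
    var jumps: [(index: Int, delta: Double)] = []
    for i in samples.indices.dropFirst() {
        let delta = samples[i] - samples[i - 1]
        if abs(delta) > maxJump {
            jumps.append((index: i, delta: delta))
        }
    }
    return jumps
}

let offsets = [0.0, 12.0, 25.0, 340.0, 352.0]  // hypothetical contentOffsetY samples
let jumps = offsetJumps(in: offsets, maxJump: 50)
// One jump detected: index 3, delta 315 (the restoration discontinuity).
```

A clean restoration path yields an empty result; a single oversized delta pinpoints the frame where the offset was reset.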
Prefer XcodeBuildMCP:
- Inspect defaults: mcp__XcodeBuildMCP__session_show_defaults.
- Set defaults: mcp__XcodeBuildMCP__session_set_defaults.
- Build and run: mcp__XcodeBuildMCP__build_run_sim if needed.
- Start capture: mcp__XcodeBuildMCP__start_sim_log_cap using captureConsole: true and subsystemFilter: "MotionEyes".
- Stop capture: mcp__XcodeBuildMCP__stop_sim_log_cap and inspect returned logs.

Fallback CLI if MCP is unavailable:
xcrun simctl spawn booted log stream \
  --style compact \
  --level debug \
  --predicate 'subsystem == "MotionEyes"'
Use these signatures:
[MotionEyes][View][Metric] key=value ...
[MotionEyes][View][Metric] -- Start timestamp --
[MotionEyes][View][Metric] -- End timestamp --

Analyze:
- Observed values and timing from Start to End. Do not force fixed thresholds globally; evaluate against the user's stated expectation.
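Given the signatures above, sample lines can be parsed mechanically and burst duration computed from the Start and End timestamps. A minimal sketch assuming the key=value shape shown; both helpers are hypothetical, not part of MotionEyes:

```swift
import Foundation

/// Parses "key=value" tokens from a MotionEyes sample line.
/// Assumes the "[MotionEyes][View][Metric] key=value ..." shape above.
func parseSample(_ line: String) -> [String: Double] {
    var values: [String: Double] = [:]
    for token in line.split(separator: " ") {
        let parts = token.split(separator: "=", maxSplits: 1)
        if parts.count == 2, let v = Double(String(parts[1])) {
            values[String(parts[0])] = v
        }
    }
    return values
}

/// Duration of one motion burst from its Start and End marker timestamps.
func burstDuration(start: TimeInterval, end: TimeInterval) -> TimeInterval {
    end - start
}

let sample = "[MotionEyes][Card Motion][opacity] t=1.250 value=0.73"
let values = parseSample(sample)
print(values["value"] ?? 0)  // 0.73
// burstDuration(start: 1.0, end: 1.6) is about 0.6, matching the expected
// 0.6 s animation in the card example.
```

Comparing the computed duration against the withAnimation duration the user expects is the core of the timing check.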
Use this mode when the user wants a testable pass or fail signal rather than a one-off debugging read.
- Extract the samples from Start to End for that specific motion burst.
- Check whether Start, End, or duration match expectation.
- Run MotionSmoothness on the extracted scalar samples.
- Treat MotionSmoothness as a local continuity heuristic only.
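The skill does not document MotionSmoothness internals, but as an illustration of what a local continuity heuristic can look like, one simple formulation scores the largest second difference in a uniformly sampled series; this is an assumption about the approach, not MotionEyes' actual algorithm:

```swift
/// A simple local-continuity score: the largest absolute second
/// difference in a uniformly sampled scalar series. Small values mean
/// locally smooth motion. Illustrative only; MotionEyes'
/// MotionSmoothness may compute something different.
func maxSecondDifference(_ samples: [Double]) -> Double {
    guard samples.count >= 3 else { return 0 }
    var worst = 0.0
    for i in 1..<(samples.count - 1) {
        // Discrete second derivative at sample i.
        let accel = samples[i + 1] - 2 * samples[i] + samples[i - 1]
        worst = max(worst, abs(accel))
    }
    return worst
}

let smooth = [0.0, 1.0, 2.0, 3.0, 4.0]  // linear ramp: zero curvature
let jumpy  = [0.0, 1.0, 2.0, 7.0, 8.0]  // discontinuity between frames
print(maxSecondDifference(smooth))  // 0.0
print(maxSecondDifference(jumpy))   // 4.0
```

Because this is a local heuristic, pair it with separate duration and monotonicity checks rather than treating one score as a pass/fail verdict.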
Threshold guidance:
- Confirm the Start and End markers align with the expected interaction timeline.

At the end of every run:
- Remove every agent-added .motionTrace(...) block, and remove import MotionEyes only if it was added solely for temporary tracing and is no longer needed.

Validation examples:

- Fade bugs: trace opacity and verify the fade begins and ends when expected.
- Scroll restoration: add Trace.scrollGeometry on the ScrollView and verify contentOffset and visibleRect progression through navigation and return paths.
- Smoothness complaints: use MotionSmoothness, plus separate timing or monotonicity checks when needed.

Load references/motioneyes-observability-patterns.md when choosing metrics, interpreting trace behavior, or defining MotionEyes-based assertions.