This skill should be used when sprint metrics, quality metrics, AI/ML model metrics, or governance metrics need to be collected, calculated against thresholds, or reported. It triggers when a user asks for a sprint metrics report, wants to check whether velocity is healthy, needs an AI monitoring report, or when a threshold breach requiring escalation is detected. It also applies when baselining metrics at the start of Phase 4.
From agile-lifecycle

npx claudepluginhub nsalvacao/nsalvacao-claude-code-plugins --plugin agile-lifecycle

This skill uses the workspace's default tool permissions.
Metrics provide objective evidence of delivery health, product quality, and AI/ML system performance. This skill defines which metrics to collect, how to calculate them, when to collect them, and how to report them for gate reviews and stakeholder reporting. Metrics without targets are observations; metrics with targets and thresholds are governance tools.
Determine which metric categories apply to the current context (sprint, quality, AI/ML model, or governance).
For each metric, identify the data source (see the metrics catalog for each metric's collection method).
Collect data at the defined cadence (see metrics catalog for each metric's collection cadence).
Use the formula from references/metrics-catalog.md. Do not estimate or approximate — use actual measurements. If data is unavailable, document the data gap as a risk and note the metric as "not available — data gap."
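The "use actual measurements, never estimate" rule above can be sketched as follows. This is a minimal illustration: the sprint-velocity formula shown is a hypothetical example, and the authoritative formulas live in references/metrics-catalog.md.

```python
# Illustrative sketch only: compute a metric from actual measurements,
# and record a data gap instead of estimating when data is missing.
# The velocity formula here is an assumed example, not the catalog's.

def sprint_velocity(completed_points):
    """Return the metric value, or the data-gap marker if data is unavailable."""
    if completed_points is None:  # data source unavailable
        return "not available — data gap"
    return sum(completed_points)

print(sprint_velocity([5, 8, 3]))  # 16
print(sprint_velocity(None))       # not available — data gap
```

When the data-gap marker is returned, the gap should also be documented as a risk, as described above.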
For each metric, compare the measured value against its target and thresholds to assign a status (On Target/Warning/Breached) and a trend (improving/stable/degrading).
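The status and trend classification for each metric can be sketched as below. The threshold values, the higher-is-better direction, and the 5% trend tolerance are illustrative assumptions; real targets and thresholds come from the metrics catalog.

```python
def metric_status(value, target, warning_threshold):
    """Classify a higher-is-better metric (direction is an assumption here)."""
    if value >= target:
        return "On Target"
    if value >= warning_threshold:
        return "Warning"
    return "Breached"

def metric_trend(previous, current, tolerance=0.05):
    """Label the trend from two consecutive measurements (tolerance assumed)."""
    if previous == 0:
        return "stable"
    change = (current - previous) / previous
    if change > tolerance:
        return "improving"
    if change < -tolerance:
        return "degrading"
    return "stable"

print(metric_status(42, target=40, warning_threshold=35))  # On Target
print(metric_trend(40, 44))                                # improving
```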
Generate the metrics report using the appropriate template:
- templates/phase-6/service-report.md.template
- templates/phase-6/ai-monitoring-report.md.template

The report must include: metric name, value, target, status (On Target/Warning/Breached), trend (improving/stable/degrading), and action (if any).
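A sketch of assembling one report row with the required fields listed above; the table layout and example values are hypothetical, and the real layout comes from the templates.

```python
def report_row(name, value, target, status, trend, action=""):
    """Format one metrics-report row with the required fields (layout assumed)."""
    return f"| {name} | {value} | {target} | {status} | {trend} | {action or '-'} |"

print("| Metric | Value | Target | Status | Trend | Action |")
print(report_row("Sprint velocity", 42, 40, "On Target", "improving"))
```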
If any metric is Breached, escalate per the governance process and record the required action in the report.
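The breach check above amounts to filtering the report for Breached metrics; the row shape here is an assumed illustration.

```python
def breached_metrics(rows):
    """Return the names of metrics whose status is Breached (row shape assumed)."""
    return [row["name"] for row in rows if row["status"] == "Breached"]

rows = [
    {"name": "Sprint velocity", "status": "On Target"},
    {"name": "Defect escape rate", "status": "Breached"},
]
names = breached_metrics(rows)
if names:
    print("Escalation required for:", ", ".join(names))
```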
After producing the metrics report, share it for gate reviews and stakeholder reporting.
- references/metrics-catalog.md — Complete catalog with formulas, targets, collection methods, and owners
- templates/phase-6/service-report.md.template — Operational metrics report template
- templates/phase-6/ai-monitoring-report.md.template — AI monitoring report template