From amplitude
Generates JSON instrumentation plans for priority-3 analytics events from event_candidates YAML. Analyzes codebase patterns, file scopes, properties, and exact tracking call insertion points.
npx claudepluginhub amplitude/mcp-marketplace --plugin amplitudeThis skill uses the workspace's default tool permissions.
You are step 3 of the analytics instrumentation workflow. You receive
Guides Payload CMS config (payload.config.ts), collections, fields, hooks, access control, APIs. Debugs validation errors, security, relationships, queries, transactions, hook behavior.
Builds scalable data pipelines, modern data warehouses, and real-time streaming architectures using Spark, dbt, Airflow, Kafka, and cloud platforms like Snowflake, BigQuery.
Builds production Apache Airflow DAGs with best practices for operators, sensors, testing, and deployment. For data pipelines, workflow orchestration, and batch job scheduling.
You are step 3 of the analytics instrumentation workflow. You receive
event_candidates YAML (from discover-event-surfaces) and produce a concrete
instrumentation plan that an engineer can implement line-by-line.
Think like a Software Architect reviewing a PR: you care about consistency with existing patterns, minimal footprint, and properties that actually power dashboards — not vanity fields nobody queries.
Read the taxonomy skill at ../taxonomy/SKILL.md to understand the core philosophy of analytics and event naming standards.
Parse the event_candidates YAML. Extract only candidates where priority: 3.
These are the events that would block a release — everything else is out of
scope for this skill.
If there are zero priority-3 events, tell the user and stop.
List the filtered events so the user can confirm scope before you proceed.
Work through each priority-3 event one at a time:
The event candidate has a file field pointing to where instrumentation likely
belongs. Read that file completely. Also read the instrumentation field — it
describes when the event fires and which function/handler to target.
If the file doesn't exist or the hint seems wrong (the function described in
instrumentation isn't in that file), search nearby files. The hint is a
starting point, not gospel.
Using the instrumentation hint, locate the specific function, handler, or
callback where the tracking call should go. Look for:
instrumentation fieldtrack() calls in the
same function, your new call should follow the same placement patternRecord the line number and note the function/block name as a stable anchor (line numbers shift; function names don't).
Look at what variables are in scope at the insertion point. These are your property candidates. For each one, ask:
Less is more. 2-4 properties per event is the sweet spot. Each property should unlock a specific chart axis or filter. If you can't describe the chart it enables in one sentence, drop it.
Invoke discover-analytics-patterns and use its
event_naming_convention and property_naming_convention outputs. That skill
owns the naming-resolution procedure and precedence order. Do not redefine it
here.
This applies only to event and property naming. Keep import paths, tracking functions, object shape, and placement aligned to the codebase.
Stay in scope. Only use variables available at the insertion point. If an important property exists elsewhere (e.g., in a parent component's state, in a different API response), note it in the reasoning but do not include it in the plan — the engineer can decide later whether to thread it through.
Compare your planned call against the examples you found in step 2:
If anything diverges, adjust to match. Consistency > cleverness.
Output the result as a JSON object following this exact shape:
{
"trackingRequired": true,
"reasoning": "Concise sentence explaining why these events are critical.",
"existingPattern": {
"trackingFunction": "the function name used (e.g., 'track', 'trackEvent')",
"importPath": "where it's imported from",
"exampleCall": "a real one-liner from the codebase showing the pattern"
},
"trackingPlan": [
{
"eventName": "Event Name Here",
"eventProperties": [
{
"name": "property_name",
"type": "string",
"description": "What it captures and how it's used in analysis."
}
],
"eventDescriptionAndReasoning": "What this event measures, why it's critical, and what PM question it answers. Include the analysis_recipe context.",
"implementationLocations": [
{
"filePath": "src/components/Foo/Bar.tsx",
"originalLineNumberPreChanges": 142,
"codeContext": "inside onSuccess callback of useExtract() hook",
"trackingCode": "track('Event Name Here', { property_name: variableInScope })"
}
]
}
]
}
eventDescriptionAndReasoning — merge the candidate's rationale and analysis_recipe into a coherent paragraph. This is the "why" an engineer reads before implementing.filePath — relative from repo root.originalLineNumberPreChanges — the line number where the tracking call should be inserted, based on the current file state.codeContext — a stable anchor: the function name, callback, or block where the call goes. This survives rebases; line numbers don't.trackingCode — the exact code to insert, matching the existing analytics pattern. Use real variable names from the file.Show the user the JSON tracking plan. Walk through each event briefly:
Ask if they want to adjust anything before an engineer implements it.