Open-source AI agent skills that make SaaS products data-ready for product analytics. Scan your codebase, design a tracking plan, generate instrumentation code.
npx claudepluginhub accoil/product-tracking-skills
Your analytics tool isn't the problem. Your product tracking is.
Most SaaS products have inconsistent events, missing context, and no real tracking plan. You're paying for Amplitude, Mixpanel, or PostHog — they work fine. You still can't answer basic questions because the instrumentation feeding them is broken.
Product Tracking Skills scans your codebase, audits what's tracked, and generates the instrumentation needed to make your analytics tools actually work — for 25+ platforms, in any AI agent tool.
Works in: Claude Code · Codex · VS Code · any tool with AI agent support
Open your codebase in any AI agent tool and start talking:
You: audit tracking
AI: [Finds every tracking call, identifies gaps and issues]
Found 14 events across 8 files. Saved to .telemetry/current-state.yaml
You: design tracking plan
AI: [Designs best-practice tracking plan, produces delta from current state]
22 events. Delta: add 10, rename 3, change 4, remove 1. Review and adjust.
You: implement tracking
AI: [Generates typed wrapper functions, delivery infrastructure, event constants]
Tracking code ready in tracking/
Seven skills and a tracking watchdog agent. Your analytics tools finally work.
Most B2B products are in one of three situations:
No tracking. You know you need it. It's been on the backlog for six months. It never happens.
Broken tracking. 14 events across 23 files. Some camelCase, some snake_case. No account context. Three events that do the same thing. Five that nobody uses.
Decayed tracking. Someone set it up 18 months ago. Twelve features have shipped since. None were instrumented. The tracking plan — if one exists — is a lie.
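As a concrete illustration of the "broken tracking" case, here is the kind of inconsistency an audit surfaces. The call names are invented for the example; `track` is a stand-in that records calls rather than a real analytics SDK:

```typescript
// Invented examples of the inconsistencies an audit flags.
// `track` is a recording stand-in, not a real analytics SDK.
const seen: string[] = [];
const track = (event: string, properties?: Record<string, unknown>): void => {
  seen.push(event);
};

// The same user action, tracked three different ways across the codebase:
track("reportCreated", { accountId: "acc_1" });   // camelCase event and property
track("report_created");                          // snake_case, no properties at all
track("Report Created", { source: "dashboard" }); // Title Case, no account context
```

Each call "works" in isolation; the problem only shows up downstream, when three event names that mean the same thing can't be queried as one.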
In all three cases, your CS team can't see which accounts are healthy. Your product team can't measure feature adoption. You can't give investors real usage numbers. The analytics tool you're paying for works fine — it just can't help when the tracking feeding it is missing, inconsistent, or broken.
These skills fix the instrumentation layer feeding your analytics tools — so your product is properly tracked and any analytics tool downstream can answer real questions about how customers use your product.
The focus is users, accounts, features, and lifecycle events. The raw signals your product emits. Not vanity pageviews. Not generic clicks. Semantic events with meaning, properties, and account attribution.
The boundary is deliberate. This produces instrumentation. What happens downstream — scoring, dashboards, alerts — belongs to tools like Amplitude, PostHog, Mixpanel, or Accoil.
These aren't thin prompts. Each skill includes a built-in reference library:
Naming conventions: object.action for events (report.created), snake_case for properties and traits (signup_source)
The skills encode the kind of knowledge that usually lives in a senior analytics engineer's head — except it doesn't walk out the door when they leave.
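Conventions like these can be checked mechanically. The regexes below are my own approximation of object.action event names (report.created) and snake_case property names (signup_source), not the skills' actual validation rules:

```typescript
// Approximate encodings of the naming conventions — my regexes,
// not the skills' actual validation rules.
const EVENT_NAME = /^[a-z]+(_[a-z]+)*\.[a-z]+(_[a-z]+)*$/; // object.action
const PROPERTY_NAME = /^[a-z]+(_[a-z]+)*$/;                // snake_case

export function isValidEventName(name: string): boolean {
  return EVENT_NAME.test(name);
}

export function isValidPropertyName(name: string): boolean {
  return PROPERTY_NAME.test(name);
}
```

A check like this is the sort of thing a background tracking watchdog can run against every new tracking call.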
Seven skills plus a background tracking watchdog. Each skill produces artifacts that feed the next. Everything version-controlled in your repo.
Business Case ──▶ Model ──▶ Audit ──▶ Design ──▶ Instrument ──▶ Implement ──▶ Maintain