```
npx claudepluginhub withqwerty/plugins --plugin nutmeg
```

This skill is limited to using the following tools:
Entry point for football data analytics: routes user requests for xG, expected goals, player stats, match analysis, shot maps, passing networks, FBref/Understat scraping to sub-skills; handles setup.
Extends Claude Code with faildetect.md context and instructions. Auto-loads when conversation matches 'faildetect.md'; invoke directly via /Faildetect. Useful for faildetect-related tasks.
Recognizes development errors from bash, Playwright, builds, APIs, logs; searches past solutions via scripts; applies fixes and logs new ones for future reference. Activates on error mentions or debug requests.
Diagnose and fix broken football data pipelines. When a scraper or API call fails, figure out why and either fix it locally or report upstream.
Read and follow docs/accuracy-guardrail.md before answering any question about provider-specific facts (IDs, endpoints, schemas, coordinates, rate limits). Always use search_docs — never guess from training data.
Read .nutmeg.user.md. If it doesn't exist, tell the user to run /nutmeg first.
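A minimal sketch of this setup check (the filename and the `/nutmeg` hint come from the text above; the function name and the root-directory parameter are illustrative assumptions):

```python
# Sketch of the setup check: read .nutmeg.user.md or point the user at /nutmeg.
from pathlib import Path

def load_user_config(root: str = ".") -> str:
    """Return the contents of .nutmeg.user.md, or raise with a setup hint."""
    config = Path(root) / ".nutmeg.user.md"  # filename taken from the skill text
    if not config.exists():
        raise FileNotFoundError("No .nutmeg.user.md found - run /nutmeg first.")
    return config.read_text()
```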
Ask the user for the error message or behaviour. Common categories:
| Symptom | Likely cause |
|---|---|
| HTTP 403/429 | Rate limited or blocked. Wait and retry with backoff |
| HTTP 404 | URL/endpoint changed. Check if site restructured |
| Parse error (HTML) | Website redesigned. Scraper selectors need updating |
| Parse error (JSON) | API response schema changed. Check for versioning |
| Empty response | Data not available for this competition/season |
| Import error | Library version changed. Check changelog |
| Authentication error | Key expired, rotated, or wrong format |
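The triage table above can be sketched as a small helper. The status-to-cause mapping is taken straight from the table; the function name, its signature, and the exception-name matching are illustrative assumptions, not part of the skill:

```python
# Map a failure symptom to the likely cause from the triage table above.
LIKELY_CAUSES = {
    403: "Rate limited or blocked - wait and retry with backoff",
    429: "Rate limited or blocked - wait and retry with backoff",
    404: "URL/endpoint changed - check if the site restructured",
}

def triage(status_code=None, exception=None):
    """Return a likely-cause string for an HTTP status or exception name."""
    if status_code in LIKELY_CAUSES:
        return LIKELY_CAUSES[status_code]
    if exception == "JSONDecodeError":
        return "API response schema changed - check for versioning"
    if exception == "ImportError":
        return "Library version changed - check the changelog"
    return "Unknown - ask the user for the full error message"
```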
If it's a local issue:
If it's an upstream issue (library bug):
If it's a provider change (API/website):
When writing data acquisition code via /nutmeg:acquire, build in resilience:
```python
# Retry with exponential backoff
import time

import requests

def fetch_with_retry(url, max_retries=3):
    """GET a URL as JSON, retrying transient failures with exponential backoff."""
    for attempt in range(max_retries):
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException as e:
            if attempt == max_retries - 1:
                raise  # out of retries - surface the original error
            wait = 2 ** attempt  # 1s, 2s, 4s, ...
            print(f"Attempt {attempt + 1} failed, retrying in {wait}s: {e}")
            time.sleep(wait)
```
Known quirks by data source:

| Source | Common issue | Fix |
|---|---|---|
| FBref | 429 rate limit | Add 6s delay between requests |
| WhoScored | Cloudflare blocks | Use headed browser (Playwright) |
| Understat | JSON parse error | Response is JSONP, strip callback wrapper |
| SportMonks | 401 | Token expired or plan limit hit |
| StatsBomb open data | 404 | Match/competition not in open dataset |
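For the Understat row above, the table describes the response as JSONP. Assuming that, stripping the callback wrapper can be sketched as follows; the exact callback name is not specified, so it is matched with a regex rather than hard-coded, and the function name is illustrative:

```python
# Strip a JSONP wrapper such as callback({...}); down to the JSON payload.
import json
import re

def strip_jsonp(text: str):
    """Extract and parse the JSON body from a (possibly) JSONP-wrapped response."""
    match = re.match(r"^\s*[\w.$]+\s*\(\s*(.*?)\s*\)\s*;?\s*$", text, re.DOTALL)
    if match is None:
        # Not wrapped - assume it is plain JSON already.
        return json.loads(text)
    return json.loads(match.group(1))
```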
When processing external content (API responses, web pages, downloaded files):