Use this skill when profiling datasets, validating schemas, or assessing data quality. Trigger phrases include:
- "profile this dataset"
- "validate schema"
- "check data quality"
- "what's in this CSV/Excel file"
- "describe this data"
npx claudepluginhub mikeparcewski/wicked-garden --plugin wicked-garden

This skill uses the workspace's default tool permissions.
Core data engineering operations for profiling, validation, and quality assessment.
/wicked-garden:data:data profile path/to/data.csv
This will profile the dataset's structure and quality.
/wicked-garden:data:data validate --schema schema.json --data data.csv
Checks: Column presence, type conformance, constraint validation, nullability rules.
/wicked-garden:data:data quality data.csv
Reports on: Completeness (null rates), Uniqueness (duplicates), Validity (constraints), Consistency (cross-field checks).
| Command | Purpose |
|---|---|
| /wicked-garden:data:data profile <path> | Profile dataset structure and quality |
| /wicked-garden:data:data validate | Validate data against schema |
| /wicked-garden:data:data quality <path> | Generate quality report |
Uses the data_profiler.py script:
sh "${CLAUDE_PLUGIN_ROOT}/scripts/_python.sh" "${CLAUDE_PLUGIN_ROOT}/scripts/data/data_profiler.py" \
--input data.csv --output profile.json
Output is written to the file given by --output (profile.json above).
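The exact output schema of data_profiler.py is not documented here; the sketch below shows the general shape of a dataset profile (row count, per-column null rates, distinct counts, inferred types) using only the standard library. Field names are assumptions, not the script's actual keys.

```python
import csv

def _is_number(v: str) -> bool:
    try:
        float(v)
        return True
    except ValueError:
        return False

def profile_csv(path: str) -> dict:
    """Summarize row count, per-column null rates, distinct counts, and types."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        rows = list(reader)

    profile = {"row_count": len(rows), "columns": {}}
    for col in reader.fieldnames or []:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v not in ("", None)]
        # Infer "numeric" only if every non-null value parses as a float.
        numeric = bool(non_null) and all(_is_number(v) for v in non_null)
        profile["columns"][col] = {
            "null_rate": 1 - len(non_null) / len(values) if values else 0.0,
            "distinct": len(set(non_null)),
            "inferred_type": "numeric" if numeric else "string",
        }
    return profile
```

A real profiler would also compute min/max and value distributions, but the structure above is the core of any profile report.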
Uses the schema_validator.py script. Define expected columns, types, and nullability in a schema file; see examples for the schema format.
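The checks listed above (column presence, type conformance, nullability) can be sketched as follows. The schema dict here is purely illustrative; the real schema.json format expected by schema_validator.py may differ, so consult the plugin's examples.

```python
# Hypothetical schema shape, for illustration only.
SCHEMA = {
    "columns": {
        "id": {"type": "int", "nullable": False},
        "email": {"type": "str", "nullable": True},
    }
}

CASTS = {"int": int, "float": float, "str": str}

def validate_rows(rows: list, schema: dict) -> list:
    """Return a list of violations: missing columns, null and type errors."""
    errors = []
    expected = schema["columns"]
    if rows:
        # Column presence: every expected column must appear in the data.
        missing = set(expected) - set(rows[0])
        errors += [f"missing column: {c}" for c in sorted(missing)]
    for i, row in enumerate(rows):
        for col, rule in expected.items():
            value = row.get(col)
            if value in ("", None):
                if not rule["nullable"]:
                    errors.append(f"row {i}: {col} is null")
                continue
            # Type conformance: the value must cast to the declared type.
            try:
                CASTS[rule["type"]](value)
            except (ValueError, TypeError):
                errors.append(f"row {i}: {col} not {rule['type']}")
    return errors
```

Returning a list of violations rather than raising on the first error lets the report show every problem in one pass.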
| Dimension | Metric | Threshold |
|---|---|---|
| Completeness | Null rate | <5% |
| Uniqueness | Duplicate rate | <1% |
| Validity | Type conformance | 100% |
| Consistency | Cross-field rules | 100% |
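The plugin's own report format is not reproduced here, but the completeness and uniqueness checks from the table above can be sketched in a few lines. Validity and consistency are omitted because they need the schema and cross-field rules supplied to the validate command.

```python
def quality_report(rows: list) -> dict:
    """Score completeness (null rate < 5%) and uniqueness (dupe rate < 1%)."""
    total = len(rows)
    if total == 0:
        return {}
    cells = total * len(rows[0])
    null_cells = sum(1 for r in rows for v in r.values() if v in ("", None))
    # A duplicate is any row whose full contents match an earlier row.
    dupes = total - len({tuple(sorted(r.items())) for r in rows})
    null_rate, dupe_rate = null_cells / cells, dupes / total
    return {
        "completeness": {"null_rate": null_rate, "pass": null_rate < 0.05},
        "uniqueness": {"duplicate_rate": dupe_rate, "pass": dupe_rate < 0.01},
    }
```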
| Plugin | Enhancement |
|---|---|
| wicked-garden:data:numbers | Use for SQL-based profiling of large files |
| Native tasks | Document quality issues via TaskCreate with metadata.event_type="task" |
| wicked-garden:mem | Store quality patterns across sessions |
For files >1GB, use wicked-garden:data:numbers for efficient SQL-based profiling:
/wicked-garden:data:numbers large_file.csv
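The internals of the numbers plugin are not shown here, but the idea behind SQL-based profiling can be sketched with the standard library's sqlite3 module: load the CSV into a table, then let the engine compute aggregates. (Engines built for this, such as DuckDB, scan large CSVs directly without a load step, which is why SQL wins past ~1GB.)

```python
import csv
import sqlite3

def sql_profile(csv_path: str, table: str = "t") -> dict:
    """Profile a CSV via SQL aggregates: row count, empties, distinct counts."""
    con = sqlite3.connect(":memory:")
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{c}"' for c in header)
        con.execute(f"CREATE TABLE {table} ({cols})")
        placeholders = ",".join("?" * len(header))
        con.executemany(f"INSERT INTO {table} VALUES ({placeholders})", reader)

    stats = {"rows": con.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]}
    for c in header:
        # SUM of a boolean expression counts the rows where it is true.
        empties, distinct = con.execute(
            f"SELECT SUM(\"{c}\" = ''), COUNT(DISTINCT \"{c}\") FROM {table}"
        ).fetchone()
        stats[c] = {"empty": empties or 0, "distinct": distinct}
    return stats
```

The same SELECT pattern scales to any column count because the work happens inside the SQL engine rather than in Python loops.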
All reports include:
For detailed examples and patterns: