jarvis
Use when Chadron wants to analyze ServiceNow incident/request data for patterns, resolution times, automation opportunities, or application rankings.
npx claudepluginhub chadronbryant/napa-cowork-plugins --plugin jarvis
This skill uses the workspace's default tool permissions.
Analyze ServiceNow ITSM data to surface patterns, opportunities, and insights.
Data sources:
- servicenow_data/napaanesthesiatest/ (JSON exports)
- ITSM Process and Standards/00 - Deliverables/NAPA_ITSM_Enhanced_Analysis.xlsx
- scripts/data-analysis/ and scripts/servicenow/

Ask Chadron what type of analysis they need. Present options:
ANALYSIS MENU
─────────────────────────────────
[1] Pattern Analysis - Top incident categories, repeat issues, clustering
[2] Resolution Times - Median, P90, by priority/group/application
[3] Automation Scan - Identify automatable workflows with ROI estimates
[4] Application Ranking - Volume by business app, trend analysis
[5] Assignment Groups - Workload distribution, reassignment rates
[6] Custom Query - Free-form analysis on any data dimension
─────────────────────────────────
Read the relevant JSON exports from servicenow_data/napaanesthesiatest/. Key files:
- incidents_*.json - Incident records
- request_items_*.json - RITM records
- catalog_items.json - Catalog item definitions
- assignment_groups.json - Group structure

If JSON files aren't found, check ITSM Process and Standards/00 - Deliverables/NAPA_ITSM_Enhanced_Analysis.xlsx for the compiled analysis.
Use Python (pandas) for data processing. Start a Python REPL process if needed:
import pandas as pd
import json
# Load and analyze
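The load-and-analyze step can be sketched as below. This is a minimal sketch, not the skill's fixed implementation: the field names (category, priority, opened_at, resolved_at) are assumptions about the export schema, and the inline records stand in for data that would normally be read from incidents_*.json.

```python
import pandas as pd

# Illustrative stand-ins for records loaded from incidents_*.json;
# field names are assumed, not confirmed by the export schema.
records = [
    {"number": "INC001", "category": "Access", "priority": "3",
     "opened_at": "2024-01-02 09:00:00", "resolved_at": "2024-01-02 13:00:00"},
    {"number": "INC002", "category": "Access", "priority": "3",
     "opened_at": "2024-01-03 10:00:00", "resolved_at": "2024-01-04 10:00:00"},
    {"number": "INC003", "category": "Hardware", "priority": "2",
     "opened_at": "2024-01-05 08:00:00", "resolved_at": "2024-01-05 09:30:00"},
]

df = pd.DataFrame(records)
for col in ("opened_at", "resolved_at"):
    df[col] = pd.to_datetime(df[col])

# Resolution time in hours, then median and P90 by priority
df["resolve_hours"] = (df["resolved_at"] - df["opened_at"]).dt.total_seconds() / 3600
stats = df.groupby("priority")["resolve_hours"].agg(
    median="median", p90=lambda s: s.quantile(0.9)
)
print(stats)
```

The same groupby pattern extends to assignment groups or applications by swapping the grouping column.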
For each analysis type, produce output in a structured, scannable format:
ANALYSIS: [Type] | [Date]
─────────────────────────────────
KEY FINDINGS
1. [Finding with number]
2. [Finding with number]
3. [Finding with number]
DETAIL TABLE
[Formatted table with data]
RECOMMENDED ACTIONS
- [Action] → [Expected impact in hours saved at $50/hr]
─────────────────────────────────
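The expected-impact line in the template can be backed by a simple estimate. A minimal sketch, using hypothetical ticket volume and handling-time inputs; only the $50/hr rate comes from the template above:

```python
# Hypothetical inputs for an Automation Scan ROI line:
monthly_tickets = 120     # automatable tickets per month (illustrative)
minutes_per_ticket = 15   # manual handling time per ticket (illustrative)
hourly_rate = 50          # $/hr, per the output template

# Hours saved per month and annualized dollar value
hours_saved = monthly_tickets * minutes_per_ticket / 60
annual_value = hours_saved * 12 * hourly_rate
print(f"{hours_saved:.0f} hrs/month, ${annual_value:,.0f}/yr")
```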
After presenting analysis, offer: