Root cause analysis plugin integrating Prometheus, GitHub, and GitLab for production incident investigation
npx claudepluginhub evangelosmeklis/thufir --plugin thufir
Analyze a GitHub issue reporting a production incident to perform root cause analysis
Analyze a GitLab issue reporting a production incident to perform root cause analysis
Analyze a Prometheus alert to perform root cause analysis
Interactive wizard for general root cause analysis investigation
This skill should be used when the user asks to "use git blame", "check git history", "find git commits", "use git log", "use git diff", "use git bisect", "trace code changes", "find who changed this code", or mentions using git commands for investigating code history and changes during root cause analysis.
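As a rough illustration of the kind of git commands this skill drives during an investigation, here is a minimal sketch. The repository path, file name, and line number are hypothetical placeholders, not part of the plugin:

```python
import subprocess

def blame_line(repo: str, path: str, line: int) -> list[str]:
    """Build a `git blame` command for a single line of a file."""
    return ["git", "-C", repo, "blame", "-L", f"{line},{line}", "--", path]

def recent_commits(repo: str, path: str, limit: int = 10) -> list[str]:
    """Build a `git log` command listing recent commits that touched a file."""
    return ["git", "-C", repo, "log", f"-{limit}", "--oneline", "--", path]

def run(cmd: list[str]) -> str:
    """Run a git command and return its stdout (requires git and a real repo)."""
    return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout

# Example (needs an actual checkout):
# print(run(blame_line(".", "src/database/connection.js", 45)))
```

The same pattern extends to `git diff` and `git bisect`; building the argument list explicitly keeps the commands easy to log alongside the RCA report.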
This skill should be used when the user asks to "fetch GitHub issue", "get GitLab issue", "analyze GitHub PR", "search GitHub repo", "check GitLab commits", "use GitHub API", "use GitLab API", or mentions fetching data from GitHub or GitLab for incident investigation. Provides guidance for integrating with GitHub and GitLab APIs for RCA.
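A minimal sketch of fetching an issue from the public GitHub and GitLab REST APIs, as this skill does. The repo, project, and token values are placeholders; only the standard endpoints are assumed:

```python
import urllib.parse
import urllib.request

def github_issue_request(repo: str, number: int, token: str) -> urllib.request.Request:
    """Build an authenticated request for a GitHub issue (REST API)."""
    url = f"https://api.github.com/repos/{repo}/issues/{number}"
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    })

def gitlab_issue_request(project: str, iid: int, token: str) -> urllib.request.Request:
    """Build an authenticated request for a GitLab issue (API v4)."""
    project_id = urllib.parse.quote(project, safe="")  # "group/project" must be URL-encoded
    url = f"https://gitlab.com/api/v4/projects/{project_id}/issues/{iid}"
    return urllib.request.Request(url, headers={"PRIVATE-TOKEN": token})

# To actually fetch (needs network access and a valid token):
# with urllib.request.urlopen(github_issue_request("owner/repo", 1, "ghp_...")) as r:
#     issue = json.load(r)
```

Note the differing auth conventions: GitHub takes a `Bearer` token in `Authorization`, while GitLab uses a `PRIVATE-TOKEN` header.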
This skill should be used when the user asks to "query Prometheus", "analyze Prometheus metrics", "check Prometheus alerts", "write PromQL", "interpret Prometheus data", "fetch metrics", or mentions Prometheus querying, alerting, or metrics analysis. Provides guidance for querying and interpreting Prometheus metrics for root cause analysis.
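To make the Prometheus side concrete, here is a minimal sketch of building a range query against the Prometheus HTTP API. The endpoint is the one configured above; the `http_requests_total` metric name in the example PromQL is an assumption about your instrumentation, not something the plugin requires:

```python
import urllib.parse

def range_query_url(endpoint: str, promql: str, start: int, end: int,
                    step: str = "60s") -> str:
    """Build a Prometheus HTTP API range-query URL (GET /api/v1/query_range)."""
    params = urllib.parse.urlencode({
        "query": promql, "start": start, "end": end, "step": step,
    })
    return f"{endpoint.rstrip('/')}/api/v1/query_range?{params}"

# Error rate as a fraction of all requests (assumes standard
# http_requests_total labelling; adjust for your own metrics):
ERROR_RATE = (
    'sum(rate(http_requests_total{status=~"5.."}[5m])) '
    "/ sum(rate(http_requests_total[5m]))"
)
```

Plotting this query around the incident window (e.g. 14:32 UTC in the sample report below) is typically the first step in correlating a spike with a deploy or commit.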
This skill should be used when the user asks to "perform root cause analysis", "investigate production issue", "analyze incident", "find root cause", "debug production error", "trace the cause", or mentions investigating production problems, alerts, or outages. Provides systematic RCA methodology and investigation workflows.
Autonomous root cause analysis for production incidents. Integrates with Prometheus, GitHub, and GitLab to investigate alerts, analyze metrics, search code, and generate comprehensive RCA reports.

# Add the marketplace
/plugin marketplace add evangelosmeklis/thufir

# Or run from a local clone
git clone https://github.com/evangelosmeklis/thufir.git
cc --plugin-dir ./thufir
Create .claude/thufir.local.md:

---
prometheus:
  endpoint: "https://prometheus.example.com"
github:
  token: "ghp_your_token"
  default_repo: "owner/repo"
gitlab:
  token: "glpat_your_token"
  default_project: "group/project"
---
Get tokens:
- GitHub: personal access token with the **repo** scope
- GitLab: personal access token with the **api** scope

**Just write this in Claude Code:**

/thufir

and a dropdown will appear
# Root Cause Analysis Report
Date: 2025-12-19
Alert: HighErrorRate - api-service
## Summary
95% error rate at 14:32 UTC due to database connection pool exhaustion
## Root Cause
File: src/database/connection.js:45
Commit: abc123 by John Doe on 2025-12-19
Issue: Connection pool reduced from 100 to 10 connections
## Fix
Revert pool size change or increase based on load testing
Made with ❤️ and Claude Code in Athens, Greece