datadog-log-analyst
Formats and delivers Datadog Prismatic log analysis results. Takes the analysis object from /tmp/dd-analysis/analysis.json and outputs it to the user's requested destination: chat (default), Slack, Notion, email, or docx file. Called internally by datadog-log-analysis as the final step — not triggered directly. Also use this skill whenever you need to format or re-deliver a previously completed analysis, or when the user says "send that to Slack", "put that in Notion", "DM me the results", or "write that up as a report".
Install:

```
npx claudepluginhub p3nj/p3nj-market --plugin datadog-log-analyst
```

This skill uses the workspace's default tool permissions.
Final skill in the pipeline. Reads the completed analysis object and formats it for the user's requested delivery channel.
**Input:** /tmp/dd-analysis/analysis.json (built by dd-analyse-core, optionally extended by dd-analyse-sap)
**Output:** Formatted report delivered to the requested channel
Detect the delivery channel from the user's original message:
| User says | Delivery |
|---|---|
| "send to Slack" / "DM me" / "post to #channel" | Slack via Slack MCP |
| "put it in Notion" / "create a page" | Notion page via Notion MCP |
| "write a report" / "save as doc" | .md or .docx file |
| "email the summary" / "send via email" | Outlook via Microsoft 365 MCP |
| (nothing specified) | Present in chat (default) |
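The routing table above can be sketched as a simple keyword check. This is a minimal sketch; the phrase lists are illustrative, not the skill's actual matching logic:

```python
def detect_channel(message: str) -> str:
    """Map the user's request to a delivery channel; default to chat."""
    text = message.lower()
    rules = [
        ("slack", ["slack", "dm me", "post to #"]),
        ("notion", ["notion", "create a page"]),
        ("file", ["write a report", "save as doc", "docx"]),
        ("email", ["email"]),
    ]
    for channel, phrases in rules:
        if any(p in text for p in phrases):
            return channel
    return "chat"  # nothing specified: present in chat
```

In practice an LLM reads the intent directly, so fuzzier phrasings ("shoot that over to the team channel") still resolve correctly.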
```python
import json

# Load the completed analysis object produced by dd-analyse-core
with open('/tmp/dd-analysis/analysis.json') as f:
    analysis = json.load(f)
```
**<CLIENT_CODE>** · <DD Mon> · <HH:MM – HH:MM AEST> (<duration>) · <Integration type> integration
**Health:** <total> logs — <error_rate> issues
Info <N> · Debug <N> · Warn <N> · Error <N>
**Fetch stats:** Phase 1 (error+warn): <N> · Phase 2 (info+debug): <N> · Pages: <N>
**What's happening:**
<2-4 sentence analyst interpretation>
**Top issues:**
1. <issue> — **<count>** hits
2. ...
**SAP / Integration:**
← INCLUDE THIS SECTION ONLY for SAP integrations where sap_integration exists.
← OMIT ENTIRELY for AMT, Maximo, generic, or when sap_integration is absent.
← Do NOT write "N/A" — simply leave the section out.
- <label> — <count> — WOs: ...
**Monitors:** (only if monitors is non-empty)
- <monitor name> — <status>
**Recent Events:** (only if events is non-empty)
- <event title> — <alert_type> — <date>
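Rendering the template above is straightforward string assembly; the key rule is that optional sections are omitted entirely, never written as "N/A". A minimal sketch, assuming illustrative key names like `client_code`, `top_issues`, and `sap_integration` on the analysis object (the real schema is defined by dd-analyse-core):

```python
def render_chat(analysis: dict) -> str:
    """Render the analysis object into the chat report format."""
    lines = [
        f"**{analysis['client_code']}** · {analysis['date']} · "
        f"{analysis['window']} · {analysis['integration_type']} integration",
        f"**Health:** {analysis['total']} logs — {analysis['error_rate']} issues",
        "**Top issues:**",
    ]
    for i, issue in enumerate(analysis["top_issues"], 1):
        lines.append(f"{i}. {issue['label']} — **{issue['count']}** hits")
    # Conditional sections: omit entirely when absent/empty, never write "N/A"
    if analysis.get("sap_integration"):
        lines.append("**SAP / Integration:**")
        lines.extend(f"- {s['label']} — {s['count']}" for s in analysis["sap_integration"])
    if analysis.get("monitors"):
        lines.append("**Monitors:**")
        lines.extend(f"- {m['name']} — {m['status']}" for m in analysis["monitors"])
    return "\n".join(lines)
```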
Same structure adapted for Slack mrkdwn. Important formatting rules:
- *bold* instead of **bold**
- :warning: :red_circle: :white_check_mark: emojis for severity
- Send via Slack:slack_send_message or Slack:slack_send_message_draft

*<CLIENT_CODE>* · <DD Mon> · <HH:MM – HH:MM AEST> · <Integration type>
*Health:* <total> logs — <error_rate> issues
Info <N> · Debug <N> · Warn <N> · Error <N>
*What's happening:*
<interpretation>
*Top issues:*
1. <issue> — *<count>* hits
2. ...
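Converting the chat report to mrkdwn is mostly a matter of downgrading the bold markers; a minimal sketch of that one transformation:

```python
import re

def to_mrkdwn(chat_text: str) -> str:
    """Convert markdown **bold** to Slack mrkdwn *bold*."""
    return re.sub(r"\*\*(.+?)\*\*", r"*\1*", chat_text)
```

Severity emojis (:warning:, :red_circle:, :white_check_mark:) are added when choosing wording per issue, not by mechanical substitution.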
Create a Notion page with:
- Title: <CLIENT_CODE> Log Analysis — <DD Mon YYYY>
- Use the Notion:notion-create-pages tool

Email:
Subject: [Datadog] <CLIENT_CODE> Log Analysis — <DD Mon>
Body: Same structure as chat format, formatted as HTML.
Use Microsoft 365 MCP tools to send.
File delivery:
- .md: Write the chat format to a file.
- .docx: Read the docx skill and produce a formatted Word document.

Execute delivery based on detected intent. For Slack and email, ask for user confirmation before sending.
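The .md path is a plain file write; a minimal sketch, with an illustrative output path:

```python
from pathlib import Path

def deliver_md(report: str, path: str = "/tmp/dd-analysis/report.md") -> str:
    """Write the chat-format report to a markdown file and return its path."""
    out = Path(path)
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(report)
    return str(out)
```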
For chat delivery, output the formatted text directly in the conversation.
If the user asks to re-deliver a previous analysis ("send that to Slack", "now put it in Notion"), check whether /tmp/dd-analysis/analysis.json still exists. If so, re-read it and format for the new channel; there is no need to re-run the pipeline.
If the file doesn't exist (new session), inform the user that the analysis needs to be re-run first.