Manages Datadog logs via pup CLI: search with advanced queries and filters, create processing pipelines with grok parsers, exclusion filters for cost control, S3 archives, and log-based metrics.
npx claudepluginhub datadog-labs/pup --plugin pup

This skill uses the workspace's default tool permissions.
Search, process, and archive logs with cost awareness.
Datadog Pup (dd-pup/pup) should already be installed; if not:
go install github.com/datadog-labs/pup@latest
pup auth login
# Basic search
pup logs search --query="status:error" --from="1h"
# With filters
pup logs search --query="service:api status:error" --from="1h" --limit 100
# JSON output is the default
pup logs search --query="@http.status_code:>=500" --from="1h"
| Query | Meaning |
|---|---|
| error | Full-text search |
| status:error | Tag equals |
| @http.status_code:500 | Attribute equals |
| @http.status_code:>=400 | Numeric range |
| service:api AND env:prod | Boolean AND |
| @message:*timeout* | Wildcard |
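For programmatic callers, clauses from the table above can be combined with AND before being passed to `--query` (the helper below is a hypothetical sketch, not part of pup):

```python
def build_query(clauses):
    # Join individual Datadog filter clauses (facet, attribute, wildcard)
    # into one boolean query string.
    return " AND ".join(clauses)

q = build_query(["service:api", "env:prod", "@http.status_code:>=400"])
print(q)  # service:api AND env:prod AND @http.status_code:>=400
```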
Process logs before indexing:
# List pipelines
pup obs-pipelines list
# Create pipeline (JSON)
pup obs-pipelines create --file pipeline.json
{
"name": "API Logs",
"filter": {"query": "service:api"},
"processors": [
{
"type": "grok-parser",
"name": "Parse nginx",
"source": "message",
"grok": {"match_rules": "access_rule %{IPORHOST:client_ip} %{DATA:method} %{DATA:path} %{NUMBER:status}"}
},
{
"type": "status-remapper",
"name": "Set severity",
"sources": ["level", "severity"]
},
{
"type": "attribute-remapper",
"name": "Remap user_id",
"sources": ["user_id"],
"target": "usr.id"
}
]
}
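To sanity-check the grok rule locally before creating the pipeline, a rough Python regex equivalent can be run against a sample line (approximate only — Datadog's grok engine has its own pattern semantics):

```python
import re

# Approximate stand-in for:
#   %{IPORHOST:client_ip} %{DATA:method} %{DATA:path} %{NUMBER:status}
pattern = re.compile(
    r"(?P<client_ip>\S+) (?P<method>\S+) (?P<path>\S+) (?P<status>\d+)"
)

m = pattern.match("203.0.113.7 GET /api/users 200")
print(m.groupdict())
# {'client_ip': '203.0.113.7', 'method': 'GET', 'path': '/api/users', 'status': '200'}
```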
Index only what matters:
{
"name": "Drop debug logs",
"filter": {"query": "status:debug"},
"is_enabled": true
}
# Find noisiest log sources
pup logs search --query="*" --from="1h" | jq 'group_by(.service) | map({service: .[0].service, count: length}) | sort_by(-.count)[:10]'
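The same top-sources aggregation as the jq pipeline, in Python, for when search results are already loaded as a list of dicts (sample data below is made up):

```python
from collections import Counter

logs = [
    {"service": "api"}, {"service": "api"}, {"service": "web"},
    {"service": "api"}, {"service": "worker"}, {"service": "web"},
]

# Count events per service and keep the ten noisiest.
top = Counter(log["service"] for log in logs).most_common(10)
print(top)  # [('api', 3), ('web', 2), ('worker', 1)]
```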
| Exclude | Query |
|---|---|
| Health checks | @http.url:"/health" OR @http.url:"/ready" |
| Debug logs | status:debug |
| Static assets | @http.url:*.css OR @http.url:*.js |
| Heartbeats | @message:*heartbeat* |
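A rough sense of what an exclusion filter saves (the per-million price below is an assumption for illustration, not Datadog's actual rate):

```python
monthly_events = 500_000_000     # events/month reaching the index
excluded_share = 0.40            # fraction dropped by the filters above
price_per_million = 1.70         # assumed $/million indexed events

saved = monthly_events * excluded_share / 1_000_000 * price_per_million
print(f"${saved:,.2f}/month saved")  # $340.00/month saved
```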
Store logs cheaply for compliance:
# List archives
pup logs archives list
# Archive config (S3 example)
{
"name": "compliance-archive",
"query": "*",
"destination": {
"type": "s3",
"bucket": "my-logs-archive",
"path": "/datadog"
},
"rehydration_tags": ["team:platform"]
}
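Archived logs are partitioned in the bucket by date and hour; a sketch of the resulting key prefix (exact object names vary by archive):

```python
from datetime import datetime, timezone

def archive_prefix(base_path: str, ts: datetime) -> str:
    # Datadog lays archives out roughly as <path>/dt=YYYYMMDD/hour=HH/...
    return f"{base_path}/dt={ts:%Y%m%d}/hour={ts:%H}/"

ts = datetime(2024, 5, 1, 14, tzinfo=timezone.utc)
print(archive_prefix("/datadog", ts))  # /datadog/dt=20240501/hour=14/
```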
Inspect log-based metrics:
# List existing log-based metrics
pup logs metrics list
⚠️ Cardinality warning: Group by bounded values only.
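Why the warning matters: each distinct combination of group-by values becomes its own time series, so one unbounded tag multiplies everything else (the numbers below are illustrative):

```python
services, endpoints, statuses = 20, 50, 5
bounded = services * endpoints * statuses
print(bounded)  # 5000 series: fine

user_ids = 1_000_000  # unbounded: one value per user
exploded = services * statuses * user_ids
print(exploded)  # 100000000 series: cardinality explosion
```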
Scrub PII before it is indexed, e.g. by hashing email fields:
{
"type": "hash-remapper",
"name": "Hash emails",
"sources": ["email", "@user.email"]
}
# In your app - sanitize before sending
import re

def sanitize_log(message: str) -> str:
    # Remove credit card numbers (four groups of four digits)
    message = re.sub(r'\b\d{4}[-\s]?\d{4}[-\s]?\d{4}[-\s]?\d{4}\b', '[REDACTED]', message)
    # Remove US Social Security numbers (###-##-####)
    message = re.sub(r'\b\d{3}-\d{2}-\d{4}\b', '[REDACTED]', message)
    return message
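A complementary client-side scrubber for emails (the regex is a simple approximation; the server-side hash remapper above still applies to anything that slips through):

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact_emails(message: str) -> str:
    # Replace anything email-shaped before the log line leaves the process.
    return EMAIL.sub("[REDACTED]", message)

print(redact_emails("login failed for bob@example.com"))
# login failed for [REDACTED]
```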
| Problem | Fix |
|---|---|
| Logs not appearing | Check agent status and pipeline/exclusion filters |
| High costs | Add exclusion filters |
| Search slow | Narrow time range, use indexes |
| Missing attributes | Check grok parser |