From agent-capability-standard
Ingest and parse incoming messages, events, or signals into structured form. Use when processing external inputs, handling API responses, parsing webhook payloads, or ingesting sensor data.
```
npx claudepluginhub synaptiai/synapti-marketplace --plugin agent-capability-standard
```

This skill is limited to using the following tools:
Execute **receive** to ingest incoming data from external sources and parse it into a structured, validated form for downstream processing.
Success criteria:
Compatible schemas:
- schemas/output_schema.yaml
- reference/event_schema.yaml

| Parameter | Required | Type | Description |
|---|---|---|---|
| input_source | Yes | string \| object | Source of incoming data: file path, URL, inline data, or stream identifier |
| expected_format | No | enum | Expected data format: json, xml, yaml, text, csv, binary. Default: auto-detect |
| validation_schema | No | string | Schema to validate parsed data against |
| encoding | No | string | Character encoding. Default: utf-8 |
| max_size | No | string | Maximum input size to accept (e.g., "10MB"). Default: "100MB" |
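As an illustration, the max_size limit can be normalized to bytes before any data is read. The helper below is hypothetical, not part of the skill's interface; it assumes binary (1024-based) units.

```python
import re

# Hypothetical helper: parse a human-readable size limit like "10MB"
# into a byte count, mirroring the max_size parameter (default "100MB").
_UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3}

def parse_max_size(value: str = "100MB") -> int:
    match = re.fullmatch(r"(\d+)\s*(B|KB|MB|GB)", value.strip(), re.IGNORECASE)
    if not match:
        raise ValueError(f"Unrecognized size limit: {value!r}")
    number, unit = match.groups()
    return int(number) * _UNITS[unit.upper()]
```

Inputs larger than the resolved limit would be rejected before parsing begins.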
1. Identify input source: Determine where data is coming from
2. Validate source accessibility: Verify source can be read
3. Detect format: Identify data format if not specified
4. Parse input: Transform raw data into structured form
5. Validate structure: Check parsed data against schema
6. Extract events/messages: Identify discrete units in the data
7. Ground evidence: Record provenance for ingested data
8. Format output: Structure results according to output contract
Return a structured object:
```yaml
received:
  source: string             # Where data came from
  source_type: file | url | inline | stream
  timestamp: string          # ISO timestamp of receipt
  format_detected: json | xml | yaml | csv | text | binary
  size_bytes: integer        # Size of received data
  encoding: string           # Character encoding used
messages:
  - id: string               # Unique message identifier
    type: string | null      # Event/message type if identifiable
    payload: object          # Parsed message content
    timestamp: string | null # Message timestamp if present
    metadata: object | null  # Additional message metadata
parsed_count: integer        # Number of messages/events parsed
validation:
  schema_ref: string | null  # Schema used for validation
  valid: boolean             # Whether all messages passed validation
  errors:
    - message_id: string
      field: string
      error: string
      severity: error | warning
conflicts:
  - type: string             # Type of conflict/anomaly
    description: string
    affected_messages: array[string]
confidence: number           # 0.0-1.0 based on parse success and validation
evidence_anchors: array[string]  # Source references
assumptions: array[string]       # Explicit assumptions
```
| Field | Type | Description |
|---|---|---|
| received.source | string | Origin of the data |
| messages | array[object] | Parsed messages/events |
| validation.valid | boolean | Whether validation passed |
| validation.errors | array | Validation error details |
| conflicts | array | Anomalies or conflicts in data |
| confidence | number | Lower if parse errors or validation failures |
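One plausible way to derive the confidence score, consistent with "lower if parse errors or validation failures". The weights here are an assumption; the contract does not prescribe a formula.

```python
def score_confidence(parsed_count: int, total_count: int,
                     error_count: int, warning_count: int) -> float:
    """0.0-1.0: start from the parse success ratio, then penalize
    validation errors more heavily than warnings."""
    if total_count == 0:
        return 0.0
    parse_ratio = parsed_count / total_count
    penalty = 0.2 * error_count + 0.05 * warning_count
    return round(max(0.0, parse_ratio - penalty), 2)
```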
Input:
```yaml
input_source: "https://api.example.com/events"
expected_format: json
validation_schema: "schemas/event_v2.yaml"
```
Output:
```yaml
received:
  source: "https://api.example.com/events"
  source_type: url
  timestamp: "2024-01-16T12:00:00Z"
  format_detected: json
  size_bytes: 4256
  encoding: "utf-8"
messages:
  - id: "evt_001"
    type: "order.created"
    payload:
      order_id: "ORD-12345"
      customer_id: "CUST-789"
      total: 129.99
      currency: "USD"
    timestamp: "2024-01-16T11:58:30Z"
    metadata:
      source_system: "ecommerce"
      version: "2.1"
  - id: "evt_002"
    type: "order.paid"
    payload:
      order_id: "ORD-12345"
      payment_method: "credit_card"
      transaction_id: "TXN-ABC123"
    timestamp: "2024-01-16T11:59:00Z"
    metadata:
      source_system: "payments"
      version: "1.0"
parsed_count: 2
validation:
  schema_ref: "schemas/event_v2.yaml"
  valid: true
  errors: []
conflicts: []
confidence: 1.0
evidence_anchors:
  - "url:https://api.example.com/events"
  - "schema:schemas/event_v2.yaml"
assumptions:
  - "API response is authoritative source of truth"
  - "Timestamps are in UTC"
```
Evidence pattern: Source URL recorded; schema validation confirmed; each message given unique ID.
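A sketch of how parsed events might be wrapped into the messages[] shape used above. The evt_ prefix mirrors the example but is otherwise an arbitrary choice, and the field-splitting convention is an assumption.

```python
def wrap_messages(events: list[dict]) -> list[dict]:
    """Assign sequential IDs and split each raw event into the
    type/payload/timestamp/metadata fields of the output contract."""
    wrapped = []
    for i, event in enumerate(events, start=1):
        wrapped.append({
            "id": f"evt_{i:03d}",
            "type": event.get("type"),
            "payload": {k: v for k, v in event.items()
                        if k not in ("type", "timestamp", "metadata")},
            "timestamp": event.get("timestamp"),
            "metadata": event.get("metadata"),
        })
    return wrapped
```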
Input:
```yaml
input_source: "/var/log/sensor_data.json"
expected_format: json
validation_schema: "schemas/sensor_reading.yaml"
```
Output:
```yaml
received:
  source: "/var/log/sensor_data.json"
  source_type: file
  timestamp: "2024-01-16T14:30:00Z"
  format_detected: json
  size_bytes: 1024
  encoding: "utf-8"
messages:
  - id: "msg_001"
    type: "sensor_reading"
    payload:
      sensor_id: "SENS-001"
      temperature: 23.5
      humidity: null
      timestamp: "2024-01-16T14:29:00Z"
    timestamp: "2024-01-16T14:29:00Z"
    metadata: null
  - id: "msg_002"
    type: "sensor_reading"
    payload:
      sensor_id: "SENS-002"
      temperature: "ERROR"
      humidity: 65.2
      timestamp: null
    timestamp: null
    metadata: null
parsed_count: 2
validation:
  schema_ref: "schemas/sensor_reading.yaml"
  valid: false
  errors:
    - message_id: "msg_001"
      field: "humidity"
      error: "Required field is null"
      severity: warning
    - message_id: "msg_002"
      field: "temperature"
      error: "Expected number, got string 'ERROR'"
      severity: error
    - message_id: "msg_002"
      field: "timestamp"
      error: "Required field is null"
      severity: error
conflicts:
  - type: "data_quality"
    description: "Sensor SENS-002 reporting error state instead of reading"
    affected_messages: ["msg_002"]
confidence: 0.5
evidence_anchors:
  - "file:/var/log/sensor_data.json"
  - "schema:schemas/sensor_reading.yaml"
assumptions:
  - "File contains most recent sensor batch"
  - "'ERROR' string indicates sensor malfunction"
next_actions:
  - "Investigate sensor SENS-002 for malfunction"
  - "Check why SENS-001 humidity is null"
```
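Validation errors like those above can be produced from a rule table. The rules dict below is a stand-in for a schema file such as schemas/sensor_reading.yaml, with severities chosen to match the example: a null optional field yields a warning, while a null required field or a type mismatch yields an error.

```python
# field -> (human-readable type name, accepted Python types, required)
SENSOR_RULES = {
    "temperature": ("number", (int, float), True),
    "humidity": ("number", (int, float), False),
    "timestamp": ("string", str, True),
}

def validate_payload(message_id: str, payload: dict) -> list[dict]:
    errors = []
    for field, (type_name, expected, required) in SENSOR_RULES.items():
        value = payload.get(field)
        if value is None:
            errors.append({
                "message_id": message_id,
                "field": field,
                "error": "Required field is null",
                "severity": "error" if required else "warning",
            })
        elif not isinstance(value, expected):
            errors.append({
                "message_id": message_id,
                "field": field,
                "error": f"Expected {type_name}, got {type(value).__name__} {value!r}",
                "severity": "error",
            })
    return errors
```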
Input:
```yaml
input_source:
  inline: |
    <?xml version="1.0"?>
    <notification type="alert">
      <severity>high</severity>
      <message>CPU usage exceeded threshold</message>
      <host>server-01</host>
    </notification>
expected_format: xml
```
Output:
```yaml
received:
  source: "inline"
  source_type: inline
  timestamp: "2024-01-16T15:00:00Z"
  format_detected: xml
  size_bytes: 198
  encoding: "utf-8"
messages:
  - id: "msg_001"
    type: "notification.alert"
    payload:
      severity: "high"
      message: "CPU usage exceeded threshold"
      host: "server-01"
    timestamp: null
    metadata:
      xml_root: "notification"
      xml_attributes:
        type: "alert"
parsed_count: 1
validation:
  schema_ref: null
  valid: true
  errors: []
conflicts: []
confidence: 0.95
evidence_anchors:
  - "inline:xml:notification"
assumptions:
  - "XML is well-formed and complete"
  - "No schema validation requested"
```
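A sketch of the XML-to-message mapping shown above, using the standard library's xml.etree.ElementTree: child elements become payload fields, and root attributes land in metadata. The type-derivation rule (root tag joined with its type attribute) is an assumption drawn from this example only.

```python
import xml.etree.ElementTree as ET

def xml_to_message(raw: str) -> dict:
    root = ET.fromstring(raw)
    msg_type = root.tag
    if "type" in root.attrib:
        msg_type = f"{root.tag}.{root.attrib['type']}"
    return {
        "type": msg_type,
        "payload": {child.tag: child.text for child in root},
        "metadata": {
            "xml_root": root.tag,
            "xml_attributes": dict(root.attrib),
        },
    }
```

Nested elements and namespaces would need more care; this handles only the flat, single-message case the example exercises.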
Verification tools: Read (to access file sources), Grep (to search for patterns in text data)
```yaml
mutation: false
requires_checkpoint: false
requires_approval: false
risk: low
```

Capability-specific rules:
Commonly follows:
- schedule - Scheduled data ingestion

Commonly precedes:
- transform - Normalize received data for processing
- validate - Deeper validation beyond schema
- integrate - Merge received data with existing state
- synchronize - Combine with other data sources
- recall - Check received data against prior context

Anti-patterns:
Workflow references:
- reference/composition_patterns.md#digital-twin-sync-loop for receive as entry point