From agent-capability-standard
Convert data between formats, schemas, or representations with explicit loss accounting and validation. Use when reformatting data, mapping between schemas, normalizing inputs, or translating structures.
```
npx claudepluginhub synaptiai/synapti-marketplace --plugin agent-capability-standard
```

This skill is limited to using the following tools:
Transform data from one format or schema to another while tracking what information is preserved, modified, or lost during conversion. Ensure output conforms to target schema with full provenance.
Success criteria:
Compatible schemas:
schemas/output_schema.yaml

| Parameter | Required | Type | Description |
|---|---|---|---|
| source | Yes | string or object | Input data to transform |
| target_schema | Yes | string or object | Schema or format specification for output |
| mapping | No | object | Explicit field mappings (source -> target) |
| preserve_unknown | No | boolean | Keep fields not in mapping (default: false) |
| strict | No | boolean | Fail on any mapping ambiguity (default: false) |
| default_values | No | object | Defaults for missing required fields |
1. Parse source data: load and validate the input
2. Analyze target schema: understand destination requirements
3. Build transformation map: match source fields to target fields
4. Execute transformation: apply mappings
5. Track losses: document what was not preserved
6. Validate output: confirm target schema conformance
7. Ground output: attach provenance
Return a structured object:
```yaml
transformed:
  input_type: string            # Format of source data
  output_type: string           # Format of output data
  content: object               # Transformed data
  mapping_applied: string       # Reference to mapping specification
validation:
  input_valid: boolean          # Was source valid?
  output_valid: boolean         # Does output match target schema?
  schema_ref: string            # Target schema reference
  errors: array[string]         # Validation errors if any
losses:
  - field: string               # Source field that was lost
    reason: string              # Why it was not preserved
    severity: info | warning | error  # Impact level
conversions:
  - source_field: string
    target_field: string
    conversion: string          # What transformation was applied
    reversible: boolean         # Can it be converted back?
confidence: 0..1                # Confidence in transformation correctness
evidence_anchors: ["source:field", "schema:rule"]
assumptions: []                 # Explicit assumptions
```
| Field | Type | Description |
|---|---|---|
| transformed | object | Result of transformation |
| validation | object | Input and output validation results |
| losses | array | Fields or data not preserved |
| conversions | array | Type/format conversions applied |
| confidence | number | 0.0-1.0 based on mapping completeness |
| evidence_anchors | array | Source references and schema rules |
| assumptions | array | Assumptions made during transformation |
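As a sketch, the envelope above could be modeled as typed Python structures. The class names mirror the YAML keys but are an assumption of this sketch, not a published API.

```python
from typing import TypedDict, Literal

class Loss(TypedDict):
    field: str        # Source field that was lost
    reason: str       # Why it was not preserved
    severity: Literal["info", "warning", "error"]

class Conversion(TypedDict):
    source_field: str
    target_field: str
    conversion: str   # What transformation was applied
    reversible: bool  # Can it be converted back?

class TransformResult(TypedDict):
    transformed: dict
    validation: dict
    losses: list[Loss]
    conversions: list[Conversion]
    confidence: float            # 0.0-1.0, based on mapping completeness
    evidence_anchors: list[str]
    assumptions: list[str]
```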
Input:
```yaml
source:
  user_id: 12345
  user_name: "john_doe"
  email_address: "john@example.com"
  created_timestamp: 1705312800
  metadata:
    source: "api"
    version: "2.1"
target_schema:
  type: object
  required: [id, name, email, created_at]
  properties:
    id: { type: integer }
    name: { type: string }
    email: { type: string, format: email }
    created_at: { type: string, format: date-time }
    tags: { type: array, items: { type: string } }
mapping:
  user_id: id
  user_name: name
  email_address: email
  created_timestamp: created_at
```
Output:
```yaml
transformed:
  input_type: "json"
  output_type: "yaml"
  content:
    id: 12345
    name: "john_doe"
    email: "john@example.com"
    created_at: "2024-01-15T10:00:00Z"
    tags: []
  mapping_applied: "explicit_mapping_v1"
validation:
  input_valid: true
  output_valid: true
  schema_ref: "user_schema_v2"
  errors: []
losses:
  - field: "metadata.source"
    reason: "No mapping to target schema"
    severity: info
  - field: "metadata.version"
    reason: "No mapping to target schema"
    severity: info
conversions:
  - source_field: "created_timestamp"
    target_field: "created_at"
    conversion: "unix_epoch to ISO8601 datetime"
    reversible: true
  - source_field: "tags"
    target_field: "tags"
    conversion: "default empty array added"
    reversible: false
confidence: 0.9
evidence_anchors:
  - "source:user_id -> target:id"
  - "schema:required_fields_present"
  - "conversion:timestamp_to_iso8601"
assumptions:
  - "Unix timestamp is in seconds, not milliseconds"
  - "Timezone is UTC"
```
Evidence pattern: Field-by-field mapping documented, schema validation confirms conformance.
Input:
```yaml
source: "name,age,city\nAlice,30,NYC\nBob,25,LA"
target_schema:
  type: array
  items:
    type: object
    properties:
      full_name: { type: string }
      age_years: { type: integer }
      location: { type: string }
mapping:
  name: full_name
  age: age_years
  city: location
```
Output:
```yaml
transformed:
  input_type: "csv"
  output_type: "json_array"
  content:
    - full_name: "Alice"
      age_years: 30
      location: "NYC"
    - full_name: "Bob"
      age_years: 25
      location: "LA"
  mapping_applied: "csv_to_object_mapping"
validation:
  input_valid: true
  output_valid: true
  schema_ref: "person_array_schema"
  errors: []
losses: []
conversions:
  - source_field: "age"
    target_field: "age_years"
    conversion: "string to integer"
    reversible: true
confidence: 0.95
evidence_anchors:
  - "source:row_count=2"
  - "schema:all_required_present"
assumptions:
  - "CSV uses comma delimiter"
  - "First row is header"
  - "No quoted fields with commas"
```
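Under the stated assumptions (comma delimiter, header row, no quoted commas), this CSV case can be sketched with the standard library; the digit check mirrors the "string to integer" conversion entry. The function name is illustrative:

```python
import csv
import io

def csv_to_objects(text: str, mapping: dict[str, str]) -> list[dict]:
    """Parse CSV text, renaming columns per the mapping.

    Assumes the first row is a header and the delimiter is a comma
    (both csv.DictReader defaults). Purely numeric values are coerced
    to int, matching the string-to-integer conversion in the example.
    """
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        out = {}
        for src, dst in mapping.items():
            value = row.get(src)
            out[dst] = int(value) if value is not None and value.isdigit() else value
        rows.append(out)
    return rows
```

Note that this sketch reports no losses because every source column appears in the mapping; any unmapped column would surface as a loss entry, as in the first example.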
Verification tools: Read (for schema validation), Grep (for pattern matching)
```yaml
mutation: false
requires_checkpoint: false
requires_approval: false
risk: low
```

Capability-specific rules:
Commonly follows:

- receive - Transform incoming messages to canonical form
- retrieve - Transform retrieved data to expected format
- inspect - Understand source before transformation

Commonly precedes:

- send - Format data before external transmission
- integrate - Prepare data for merging
- validate - Check transformed output

Anti-patterns:
Workflow references:
- reference/composition_patterns.md#digital-twin-sync-loop for transform in a data pipeline
- reference/composition_patterns.md#enrichment-pipeline for transform context