Push text or content into The Hoardinator's intelligence pipeline. Creates an input record and optionally triggers classification. Use for "/push", "push this to Hoardinator", "ingest this note", "send this to the pipeline", "store this for later analysis".
From the hoardinator plugin. Install with:

```
npx claudepluginhub jabberlockie/the-human-stack-plugins-public --plugin hoardinator
```

This skill uses the workspace's default tool permissions.
Push arbitrary text, notes, or content into The Hoardinator's inputs pipeline. The content will be stored and can be classified, searched, and routed by downstream agents.
Requirements:

- `SUPABASE_PROJECT_REF` environment variable set (hdhmwaldvzxwhimoemap)
- `SUPABASE_ACCESS_TOKEN` environment variable set

Accept content from:

- `/push <text>` — use the provided text

Also determine the input type:
- `note` — general text, observations, ideas (default)
- `call` — meeting/call content with participants
- `slack` — Slack thread or message
- `email` — Email content

Then insert the record:

```shell
curl -s -X POST \
  "https://api.supabase.com/v1/projects/${SUPABASE_PROJECT_REF}/database/query" \
  -H "Authorization: Bearer ${SUPABASE_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "INSERT INTO public.inputs (type, status, data) VALUES ('\''<TYPE>'\'', '\''ingested'\'', '\''{\"source_type\": \"manual\", \"content\": \"<ESCAPED_CONTENT>\", \"pushed_from\": \"claude-plugin\"}'\''::jsonb) RETURNING id, created_at"
  }'
```
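The `<ESCAPED_CONTENT>` placeholder must be escaped twice: once as a JSON string inside the jsonb payload, and once for the surrounding SQL string literal (single quotes doubled). A minimal sketch of such a helper — the function name `escape_for_insert` is illustrative, not part of the skill:

```python
import json

def escape_for_insert(text: str) -> str:
    # JSON-escape the content so it is safe inside the jsonb payload
    # (json.dumps adds surrounding quotes, which we strip), then double
    # any single quotes so the result survives SQL literal quoting.
    json_escaped = json.dumps(text)[1:-1]
    return json_escaped.replace("'", "''")
```

For example, `escape_for_insert("it's")` yields `it''s`, which is safe inside both the SQL literal and the JSON content field.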
Capture the returned id and created_at.
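The Management API's query endpoint returns the result rows as JSON. Assuming an array-of-rows response shape (an assumption, not confirmed here), extracting the returned values might look like:

```python
import json

# Hypothetical response body; the array-of-rows shape is an assumption.
body = '[{"id": 42, "created_at": "2026-02-03T04:05:06+00:00"}]'
rows = json.loads(body)
input_id = rows[0]["id"]
created_at = rows[0]["created_at"]
```

Both values feed directly into the confirmation message shown to the user below.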
After successful insert, ask the user:
Pushed to pipeline. Input ID: <id>
Want me to classify it now? (flags, summary, action items)
If yes, invoke the classify skill with the input ID, skipping steps 1-2 of classify and jumping directly to the classify-input Edge Function call with the existing input ID.
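Supabase Edge Functions are conventionally served at `https://<project-ref>.supabase.co/functions/v1/<name>`. A hedged sketch of the hand-off to classify-input — the `input_id` payload field name is an assumption; adjust it to whatever the function actually expects:

```python
import json
import os
import urllib.request

project_ref = os.environ.get("SUPABASE_PROJECT_REF", "<project-ref>")
access_token = os.environ.get("SUPABASE_ACCESS_TOKEN", "<token>")

# Conventional Edge Function URL pattern; payload field name is assumed.
url = f"https://{project_ref}.supabase.co/functions/v1/classify-input"
req = urllib.request.Request(
    url,
    data=json.dumps({"input_id": "<id>"}).encode(),
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually invoke the function
```

Keeping the request construction separate from the call makes it easy to log or dry-run the hand-off before invoking the function.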
## Pushed to Hoardinator
**Input ID:** <id>
**Type:** <type>
**Created:** <timestamp>
**Status:** ingested (ready for classification)
Content stored. Use /classify to extract flags and intelligence.
Note: `ingested` means the input is in the pipeline but not yet classified. For full ingestion workflows, use the `/ingest` skill instead.