Batch-sweep local Claude Code session transcripts to Supabase. Supports three scope modes: current repo (default), all repos, or filtered by date. Use for "/ingest", "/ingest all", "/ingest 2026-02-23", "upload sessions", "sync sessions".
From hoardinator: `npx claudepluginhub jabberlockie/the-human-stack-plugins-public --plugin hoardinator`. This skill uses the workspace's default tool permissions.
Batch-sweep local Claude Code JSONL session files, parse them into SQL-ready payloads, and upload to Supabase. Incremental by default -- only new or updated sessions are processed.
| Invocation | Scope |
|---|---|
| `/ingest` | All sessions for the current repo |
| `/ingest all` | All sessions across all repos |
| `/ingest 2026-02-23` | Sessions from a specific date only |
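The argument-to-scope mapping above can be sketched as follows. This is an illustrative helper, not part of the plugin's scripts; the function name and return shape are assumptions.

```python
import re

def resolve_scope(arg=None):
    """Map the /ingest argument to a scope descriptor (illustrative sketch)."""
    if not arg:
        return {"mode": "current-repo"}       # /ingest
    if arg == "all":
        return {"mode": "all"}                # /ingest all
    if re.fullmatch(r"\d{4}-\d{2}-\d{2}", arg):
        return {"mode": "date", "date": arg}  # /ingest 2026-02-23
    raise ValueError(f"unrecognized /ingest argument: {arg!r}")
```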
Prerequisites:

- `python3` available in PATH
- `SUPABASE_ACCESS_TOKEN` and `SUPABASE_PROJECT_REF` environment variables set
- `${CLAUDE_PLUGIN_ROOT}/scripts/parse-sessions.py` and `${CLAUDE_PLUGIN_ROOT}/scripts/upload-sessions.sh` present

Determine scope from the user's argument:

- No argument -- current repo. Derive `<project-dir>` from the current working directory.
- `all` -- all repos on this machine. No project-dir filter.
- A date (`YYYY-MM-DD`) -- sessions from that date only.

Derive the repo namespace from `git remote get-url origin` in the current working directory. This is used for the Supabase query in the next step. For `all` mode, skip namespace resolution.
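A minimal sketch of the namespace derivation, assuming an `owner/repo` namespace and the two common remote URL formats (SSH and HTTPS). The exact namespace convention stored in the `sessions` table may differ.

```python
import re
import subprocess

def namespace_from_url(url):
    """Extract 'owner/repo' from a git remote URL (SSH or HTTPS form)."""
    m = re.search(r"[:/]([^/:]+/[^/]+?)(?:\.git)?$", url)
    if not m:
        raise ValueError(f"cannot parse remote URL: {url}")
    return m.group(1)

def repo_namespace(cwd="."):
    """Run `git remote get-url origin` in cwd and derive the namespace."""
    url = subprocess.run(
        ["git", "remote", "get-url", "origin"],
        cwd=cwd, capture_output=True, text=True, check=True,
    ).stdout.strip()
    return namespace_from_url(url)
```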
Query the sessions table for what has already been ingested:
For current-repo and date modes:
```bash
curl -s -X POST \
  "https://api.supabase.com/v1/projects/${SUPABASE_PROJECT_REF}/database/query" \
  -H "Authorization: Bearer ${SUPABASE_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"query": "SELECT session_id, last_message_at, total_messages, total_tool_calls FROM sessions WHERE repo = '"'"'<current-repo>'"'"'"}'
```
For all mode:
```bash
curl -s -X POST \
  "https://api.supabase.com/v1/projects/${SUPABASE_PROJECT_REF}/database/query" \
  -H "Authorization: Bearer ${SUPABASE_ACCESS_TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"query": "SELECT session_id, last_message_at, total_messages, total_tool_calls FROM sessions"}'
```
Build a JSON object from the results, mapping session_id to its state:
```json
{
  "abc123-def456": {
    "last_message_at": "2026-02-23T14:30:00Z",
    "total_messages": 42,
    "total_tool_calls": 18
  }
}
```
If the query returns zero rows, pass an empty object {}. This means all
discovered sessions will be treated as new.
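Folding the query rows into this object is a straightforward mapping. A sketch, assuming the query endpoint returns a list of dicts keyed by column name:

```python
import json

def build_incremental(rows):
    """Fold Supabase query rows into the incremental-state object
    described above, keyed by session_id."""
    return {
        row["session_id"]: {
            "last_message_at": row["last_message_at"],
            "total_messages": row["total_messages"],
            "total_tool_calls": row["total_tool_calls"],
        }
        for row in rows
    }
```

`json.dumps(build_incremental(rows))` yields the string passed to `--incremental`; zero rows yields `{}`, so every discovered session is treated as new.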
```bash
python3 ${CLAUDE_PLUGIN_ROOT}/scripts/parse-sessions.py \
  --output /tmp/hoardinator-sql/ \
  --project-dir <project-dir> \
  --incremental '<json-from-step-2>'
```
Variations by scope:

- Current-repo mode: pass `--project-dir <project-dir>`.
- `all` mode: omit `--project-dir` to scan all projects.
- Date mode: add `--date <YYYY-MM-DD>` to filter by date.

Sessions are identified by the UUID extracted from the JSONL filename. The parser compares each session against the incremental JSON to decide whether it is new, updated, or unchanged.
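That comparison can be sketched as below, assuming the parser checks `last_message_at` and the two counts against the prior state. Names and category strings are illustrative; `parse-sessions.py` may differ in detail.

```python
def classify_session(session_id, last_message_at, total_messages,
                     total_tool_calls, incremental):
    """Decide whether a freshly parsed session is new, updated, or
    already current relative to the incremental state (sketch)."""
    prior = incremental.get(session_id)
    if prior is None:
        return "new"
    # ISO-8601 UTC timestamps in the same format compare correctly as strings.
    if (last_message_at > prior["last_message_at"]
            or total_messages != prior["total_messages"]
            or total_tool_calls != prior["total_tool_calls"]):
        return "updated"
    return "skipped-current"
```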
Read /tmp/hoardinator-sql/summary.json. It contains counts of new, updated,
skipped-current, and skipped-empty sessions.
Proceed to upload.
```bash
bash ${CLAUDE_PLUGIN_ROOT}/scripts/upload-sessions.sh \
  --input /tmp/hoardinator-sql/
```
Wait for completion. Check the exit code. If the upload fails, report the error and stop.
Present results to the user in this format:
```
Ingested: X new, Y updated, Z skipped (current), W skipped (empty)
Total: N messages, M tool calls across P sessions
```
Include the scope that was used (repo name, "all repos", or date filter) so the user knows exactly what was processed.
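A sketch of rendering this report from `summary.json`. The field names here are assumptions based on the categories listed above, not a documented schema:

```python
def format_report(summary, scope_label):
    """Render the two-line ingest report plus scope line (sketch;
    summary field names are assumed)."""
    line1 = (f"Ingested: {summary['new']} new, {summary['updated']} updated, "
             f"{summary['skipped_current']} skipped (current), "
             f"{summary['skipped_empty']} skipped (empty)")
    line2 = (f"Total: {summary['total_messages']} messages, "
             f"{summary['total_tool_calls']} tool calls across "
             f"{summary['total_sessions']} sessions")
    return f"{line1}\n{line2}\nScope: {scope_label}"
```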
Notes:

- Updated sessions are detected by comparing `last_message_at` and the message/tool-call counts to determine the delta.
- The repo namespace comes from `git remote get-url origin` in the project directory.
- The `/tmp/hoardinator-sql/` directory is used as a staging area. It is overwritten on each run.