Upload primitives for HuggingFace Soul persistence - file, folder, snapshot, JSONL append, and dataset card management with exponential backoff. Use when persisting agent learnings, snapshots, or semantic caches to HuggingFace.
Install via the plugin hub:

```shell
npx claudepluginhub richfrem/agent-plugins-skills --plugin huggingface-utils
```

Skill files:

- acceptance-criteria.md
- evals/evals.json
- evals/results.tsv
- fallback-tree.md
- references/acceptance-criteria.md
- references/fallback-tree.md
- requirements.txt
- scripts/hf_config.py
- scripts/hf_upload.py
This skill requires Python 3.8+ and standard library only. No external packages needed.
To install this skill's dependencies:

```shell
pip-compile ./requirements.in
pip install -r ./requirements.txt
```
See ./requirements.txt for the dependency lockfile (currently empty — standard library only).
Status: Active
Author: Richard Fremmerlid
Domain: HuggingFace Integration
Depends on: hf-init (credentials must be configured first)
Provides consolidated upload operations for all HF-consuming plugins (Primary Agent, Orchestrator, etc.). All uploads include exponential backoff for rate-limit handling.
| Function | Description | Remote Path |
|---|---|---|
| `upload_file()` | Upload a single file | Custom path |
| `upload_folder()` | Upload an entire directory | Custom prefix |
| `upload_soul_snapshot()` | Upload a sealed learning snapshot | `lineage/seal_<timestamp>_*.md` |
| `upload_semantic_cache()` | Upload the RLM semantic cache | `data/rlm_summary_cache.json` |
| `append_to_jsonl()` | Append records to soul traces | `data/soul_traces.jsonl` |
| `ensure_dataset_structure()` | Create ADR 081 folders | `lineage/`, `data/`, `metadata/` |
| `ensure_dataset_card()` | Create/verify tagged README.md | `README.md` |
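The snapshot row above shows a timestamped remote path. As an illustration only (the helper name and timestamp format are assumptions, not the skill's actual implementation), such a path could be built like this:

```python
from datetime import datetime, timezone
from pathlib import Path


def seal_remote_path(local_file: Path) -> str:
    """Build a lineage/seal_<timestamp>_<name>.md remote path (hypothetical helper)."""
    # A UTC timestamp keeps snapshot names sortable and collision-free
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    return f"lineage/seal_{stamp}_{local_file.stem}.md"


print(seal_remote_path(Path("snapshot.md")))  # e.g. lineage/seal_20250101120000_snapshot.md
```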
```python
from pathlib import Path

from hf_upload import upload_file, upload_soul_snapshot, append_to_jsonl

# Upload a single file
result = await upload_file(Path("my_file.md"), "lineage/my_file.md")

# Upload a sealed learning snapshot
result = await upload_soul_snapshot(Path("snapshot.md"), valence=-0.5)

# Append records to soul_traces.jsonl
result = await append_to_jsonl([{"type": "learning", "content": "..."}])
```
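JSONL appends serialize each record as one JSON object per line. A minimal local sketch of that serialization step (the function name is hypothetical; `append_to_jsonl`'s actual remote download/re-upload handling is not shown):

```python
import json
from pathlib import Path
from typing import Dict, List


def append_jsonl_local(path: Path, records: List[Dict]) -> int:
    """Append records to a local JSONL file, one JSON object per line."""
    with path.open("a", encoding="utf-8") as f:
        for rec in records:
            # json.dumps emits a single line; ensure_ascii=False keeps UTF-8 readable
            f.write(json.dumps(rec, ensure_ascii=False) + "\n")
    return len(records)
```

Opening the file in append mode means repeated calls accumulate records without rewriting earlier lines, which mirrors the append-only semantics of a trace log.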
Requirements:

- Run hf-init first to validate credentials and dataset structure
- huggingface_hub installed (`pip install huggingface_hub`)
- Environment variables: `HUGGING_FACE_USERNAME`, `HUGGING_FACE_TOKEN`

All operations return an HFUploadResult with:

- `success: bool` — whether the upload succeeded
- `repo_url: str` — HuggingFace dataset URL
- `remote_path: str` — path within the dataset
- `error: str` — error message if the upload failed

Rate-limited requests retry with exponential backoff (up to 5 attempts).
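A minimal sketch of the retry shape described above. The result fields follow the list; the delay schedule, the `base` parameter, and the wrapper name are assumptions, not the skill's actual code:

```python
import time
from dataclasses import dataclass
from typing import Callable


@dataclass
class HFUploadResult:
    success: bool
    repo_url: str = ""
    remote_path: str = ""
    error: str = ""


def with_backoff(op: Callable[[], HFUploadResult],
                 attempts: int = 5,
                 base: float = 1.0) -> HFUploadResult:
    """Retry op with exponential backoff (hypothetical wrapper, assumed 1s/2s/4s/8s schedule)."""
    last = HFUploadResult(success=False, error="not attempted")
    for i in range(attempts):
        last = op()
        if last.success:
            return last
        if i < attempts - 1:
            time.sleep(base * (2 ** i))  # double the delay after each failed attempt
    return last
```

Returning the last HFUploadResult (rather than raising) keeps the caller's error handling uniform: every code path checks `result.success` and reads `result.error` on failure.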