From Claude-Data-Wrangler
Push a prepared dataset to Hugging Face Hub as a Dataset repository, with dataset card (README.md), config, and data files (Parquet / JSONL / CSV). Use after the dataset is cleaned and packaged (ideally via the parquet-jsonl-package skill) and the user wants it published on HF.
```
npx claudepluginhub danielrosehill/claude-code-plugins --plugin Claude-Data-Wrangler
```

This skill uses the workspace's default tool permissions.
Publish a dataset to `huggingface.co/datasets/<user-or-org>/<name>`.
Before publishing:

- Check authentication with `hf auth login` (or `huggingface-cli login` on older versions); if not authenticated, stop and instruct the user to log in. Do not prompt for or store tokens.
- Ideally run `parquet-jsonl-package` first for better HF Datasets integration.

Gather card metadata:

- License as an SPDX identifier (e.g. `mit`, `apache-2.0`, `cc-by-4.0`, `cc0-1.0`).
- Task categories (e.g. `text-classification`, `tabular-classification`, `question-answering`).
- Size category (`n<1K`, `1K<n<10K`, `10K<n<100K`, `100K<n<1M`, `1M<n<10M`, `n>10M`).

Generate a dataset card (`README.md`) with YAML frontmatter:
```yaml
---
license: <spdx>
task_categories:
- <task>
language:
- en
size_categories:
- 10K<n<100K
tags:
- <tag>
---
```
Follow the frontmatter with these sections: Description, Source, Data fields (copied from the data dictionary), Splits (if any), Preprocessing / provenance (from the data dictionary's transformations log), Licensing, Citation.

Repository contents:

- `README.md` (dataset card)
- `data_dictionary.md` (copy of the data dictionary)
- `data/` folder: `train.parquet` / `test.parquet` / `validation.parquet` if the user has splits; otherwise a single `data.parquet` or `data.jsonl`. Sharded split files may use the `train-*.parquet` naming pattern.

Create the repo:

```
hf repo create <namespace>/<name> --type dataset [--private]
```
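The card's YAML frontmatter is mechanical enough to assemble programmatically. A minimal sketch, assuming the metadata has already been gathered; the helper name and its parameters are illustrative, not part of the skill:

```python
# Illustrative helper: builds the dataset-card YAML frontmatter from
# already-gathered metadata. Field names match the Hub card schema
# shown above; the function itself is hypothetical.
def card_frontmatter(license_id, tasks, languages, size_cat, tags):
    lines = ["---", f"license: {license_id}", "task_categories:"]
    lines += [f"- {t}" for t in tasks]
    lines.append("language:")
    lines += [f"- {lang}" for lang in languages]
    lines += ["size_categories:", f"- {size_cat}", "tags:"]
    lines += [f"- {t}" for t in tags]
    lines.append("---")
    return "\n".join(lines)


# Example: frontmatter for an MIT-licensed text-classification set.
fm = card_frontmatter(
    "mit", ["text-classification"], ["en"], "10K<n<100K", ["tabular"]
)
```

Write the returned string to the top of `README.md`, then append the prose sections listed above.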
Upload with `hf upload` (preferred) or `huggingface_hub.HfApi().upload_folder(...)`:

```
hf upload <namespace>/<name> ./local-folder --repo-type dataset
```
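The equivalent Python-API path can be sketched as follows; this is a sketch assuming `huggingface_hub` is installed and the user is already logged in, and `publish_dataset` is a hypothetical wrapper, not part of the skill:

```python
# Sketch of the create-and-upload step via the Python API.
# Assumes `hf auth login` has already been run; no tokens are
# handled in code, per the skill's policy.
from huggingface_hub import HfApi


def publish_dataset(repo_id: str, folder: str, private: bool = False) -> None:
    api = HfApi()
    # Create the dataset repo if it does not already exist.
    api.create_repo(repo_id, repo_type="dataset",
                    private=private, exist_ok=True)
    # Upload the prepared folder (README.md, data_dictionary.md, data/).
    api.upload_folder(folder_path=folder, repo_id=repo_id,
                      repo_type="dataset")
```

Called as e.g. `publish_dataset("my-org/my-dataset", "./local-folder")`, mirroring the CLI command above.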
For large files, `hf upload-large-folder` handles resumable multi-part uploads.

Install the Python client if needed:

```
pip install huggingface_hub
# The `hf` CLI is installed with huggingface_hub >= 0.24. Older installs use `huggingface-cli`.
```
Authenticate with `hf auth login`; do not handle tokens in code. For very large folders, use `hf upload-large-folder` or enable LFS.
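One piece of the metadata gathering above, the `size_categories` bucket, follows directly from the row count. A minimal illustrative helper (not part of the skill) using the Hub's standard bucket names:

```python
# Illustrative helper: maps a row count to the Hub's standard
# size_categories bucket names listed earlier in this page.
def size_category(n_rows: int) -> str:
    buckets = [
        (1_000, "n<1K"),
        (10_000, "1K<n<10K"),
        (100_000, "10K<n<100K"),
        (1_000_000, "100K<n<1M"),
        (10_000_000, "1M<n<10M"),
    ]
    for upper, label in buckets:
        if n_rows < upper:
            return label
    return "n>10M"


# Example: a 50,000-row dataset falls in the 10K<n<100K bucket.
print(size_category(50_000))  # → 10K<n<100K
```

The returned string goes straight into the `size_categories` list in the card frontmatter.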