From huggingface-skills
Builds reusable CLI scripts using Hugging Face API to fetch/enrich/process models and datasets. Ideal for chaining calls, piping, and automating repetitive tasks.
`npx claudepluginhub huggingface/skills --plugin huggingface-vision-trainer`

This skill uses the workspace's default tool permissions.
Your purpose is to create reusable command line scripts and utilities for using the Hugging Face API, allowing chaining, piping, and intermediate processing where helpful. You can access the API directly, as well as use the `hf` command line tool. Model and dataset cards can be accessed from repositories directly.
Make sure to follow these rules:
- Scripts should support a `--help` command line argument to describe their inputs and outputs.
- Use the `HF_TOKEN` environment variable as an Authorization header. For example: `curl -H "Authorization: Bearer ${HF_TOKEN}" https://huggingface.co/api/`. This provides higher rate limits and appropriate authorization for data access.
- Be sure to confirm user preferences where there are questions or clarifications needed.
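Taken together, those rules suggest a script skeleton along these lines (a minimal sketch — the script name, default behavior, and endpoint choice are illustrative, not one of this skill's reference scripts):

```shell
#!/usr/bin/env bash
# trending_models.sh — hypothetical skeleton following the rules above:
# a --help flag, optional HF_TOKEN auth, and raw JSON on stdout for piping.

usage() {
  cat <<'EOF'
Usage: trending_models.sh LIMIT
Fetch LIMIT models (sorted by downloads) as raw JSON on stdout.
Uses the HF_TOKEN environment variable, when set, for higher rate limits.
EOF
}

main() {
  # With no arguments (or --help), describe inputs and outputs.
  if [ "$#" -eq 0 ] || [ "${1:-}" = "--help" ]; then
    usage
    return 0
  fi

  local auth=()
  if [ -n "${HF_TOKEN:-}" ]; then
    # Authenticated requests get higher rate limits and gated-content access.
    auth=(-H "Authorization: Bearer ${HF_TOKEN}")
  fi

  curl -sf "${auth[@]}" "https://huggingface.co/api/models?limit=${1}&sort=downloads"
}

main "$@"
```

Because the output is plain JSON on stdout, the script composes with `jq` and the other utilities below.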
Paths below are relative to this skill directory.
Reference examples:
- `references/hf_model_papers_auth.sh` — uses HF_TOKEN automatically and chains trending → model metadata → model card parsing with fallbacks; it demonstrates multi-step API usage plus auth hygiene for gated/private content.
- `references/find_models_by_paper.sh` — optional HF_TOKEN usage via `--token`, consistent authenticated search, and a retry path when arXiv-prefixed searches are too narrow; it shows resilient query strategy and clear user-facing help.
- `references/hf_model_card_frontmatter.sh` — uses the `hf` CLI to download model cards, extracts YAML frontmatter, and emits NDJSON summaries (license, pipeline tag, tags, gated prompt flag) for easy filtering.

Baseline examples (ultra-simple, minimal logic, raw JSON output with the HF_TOKEN header):

- `references/baseline_hf_api.sh` — bash
- `references/baseline_hf_api.py` — python
- `references/baseline_hf_api.tsx` — typescript executable

Composable utility (stdin → NDJSON):

- `references/hf_enrich_models.sh` — reads model IDs from stdin, fetches metadata per ID, and emits one JSON object per line for streaming pipelines.

Composability through piping (shell-friendly JSON output):

```shell
references/baseline_hf_api.sh 25 | jq -r '.[].id' | references/hf_enrich_models.sh | jq -s 'sort_by(.downloads) | reverse | .[:10]'
references/baseline_hf_api.sh 50 | jq '[.[] | {id, downloads}] | sort_by(.downloads) | reverse | .[:10]'
printf '%s\n' openai/gpt-oss-120b meta-llama/Meta-Llama-3.1-8B | references/hf_model_card_frontmatter.sh | jq -s 'map({id, license, has_extra_gated_prompt})'
```

The following are the main API endpoints available at https://huggingface.co:
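A new stdin → NDJSON filter in the same style can be sketched as follows. The fetch step is stubbed out with `printf` so the shape of the filter is clear; a real version would swap in the authenticated `curl` call, as `references/hf_enrich_models.sh` does:

```shell
# enrich: read one model ID per line from stdin, emit one JSON object per line.
# A real implementation would replace the printf with something like:
#   curl -sf -H "Authorization: Bearer ${HF_TOKEN}" \
#     "https://huggingface.co/api/models/${model_id}"
enrich() {
  while IFS= read -r model_id; do
    # Skip blank lines so the filter is safe to use mid-pipeline.
    if [ -n "$model_id" ]; then
      printf '{"id":"%s"}\n' "$model_id"
    fi
  done
}

# Example: printf '%s\n' openai/gpt-oss-120b meta-llama/Meta-Llama-3.1-8B | enrich
```

One object per line (NDJSON) keeps the output streamable: downstream stages start work before the upstream finishes, and `jq -s` can still collect everything when a final sort is needed.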
/api/datasets
/api/models
/api/spaces
/api/collections
/api/daily_papers
/api/notifications
/api/settings
/api/whoami-v2
/api/trending
/oauth/userinfo
The API is documented with the OpenAPI standard at https://huggingface.co/.well-known/openapi.json.
IMPORTANT: DO NOT ATTEMPT to read https://huggingface.co/.well-known/openapi.json directly as it is too large to process.
IMPORTANT: Use `jq` to query and extract the relevant parts. For example:

To list all 160 endpoint paths:

`curl -s "https://huggingface.co/.well-known/openapi.json" | jq '.paths | keys | sort'`

To see the details of the model search endpoint:

`curl -s "https://huggingface.co/.well-known/openapi.json" | jq '.paths["/api/models"]'`
You can also query endpoints directly to see the shape of the data. When doing so, constrain results to low numbers so they stay easy to process yet representative.
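For instance, listing only the top-level keys of one result is enough to learn the response shape. The live call is shown as a comment; the same `jq` filter is run here against an illustrative captured sample (the field values are made up, not real API output):

```shell
# Inspect the shape of /api/models results without flooding the context:
#   curl -s "https://huggingface.co/api/models?limit=3" | jq '.[0] | keys'
# The same jq filter, applied to an illustrative captured sample:
sample='[{"id":"openai/gpt-oss-120b","downloads":120000,"likes":450,"tags":["text-generation"]}]'
printf '%s' "$sample" | jq '.[0] | keys'
```

`keys` returns the field names sorted alphabetically, so two small samples are easy to diff when comparing endpoints.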
The hf command line tool gives you further access to Hugging Face repository content and infrastructure.
```
❯ hf --help
Usage: hf [OPTIONS] COMMAND [ARGS]...

  Hugging Face Hub CLI

Options:
  --help  Show this message and exit.

Commands:
  auth                 Manage authentication (login, logout, etc.).
  buckets              Commands to interact with buckets.
  cache                Manage local cache directory.
  collections          Interact with collections on the Hub.
  datasets             Interact with datasets on the Hub.
  discussions          Manage discussions and pull requests on the Hub.
  download             Download files from the Hub.
  endpoints            Manage Hugging Face Inference Endpoints.
  env                  Print information about the environment.
  extensions           Manage hf CLI extensions.
  jobs                 Run and manage Jobs on the Hub.
  models               Interact with models on the Hub.
  papers               Interact with papers on the Hub.
  repos                Manage repos on the Hub.
  skills               Manage skills for AI assistants.
  spaces               Interact with spaces on the Hub.
  sync                 Sync files between local directory and a bucket.
  upload               Upload a file or a folder to the Hub.
  upload-large-folder  Upload a large folder to the Hub.
  version              Print information about the hf version.
  webhooks             Manage webhooks on the Hub.
```
The `hf` CLI has replaced the now-deprecated `huggingface-cli` command.
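The CLI and the raw API combine naturally: fetch a single file with `hf download`, then process it with the shell tooling above. A guarded sketch (the repo ID is illustrative, and the invocation assumes a current `huggingface_hub` install providing the `hf` command):

```shell
# Grab just the model card for a repo via the hf CLI and peek at it.
# Guarded with command -v so the sketch degrades cleanly when hf is absent.
if command -v hf >/dev/null 2>&1; then
  # hf download prints the local path of the downloaded file on stdout.
  card_path="$(hf download openai/gpt-oss-120b README.md)"
  head -n 5 "$card_path"
else
  echo "hf CLI not found; install with: pip install -U huggingface_hub" >&2
fi
```

Downloading a single named file rather than the whole repository keeps scripts fast and avoids filling the local cache with model weights.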