AI-assisted inference on NVIDIA DGX Spark - run, manage, and stop LLM workloads
npx claudepluginhub spark-arena/sparkrun --plugin sparkrun
Run benchmarks against an inference workload.
Browse and search available inference recipes.
Live-monitor CPU, RAM, and GPU metrics across cluster hosts.
Manage the LiteLLM-based inference proxy gateway.
Run an inference workload on DGX Spark using a sparkrun recipe.
Install sparkrun and configure a DGX Spark cluster.
Check the status of running sparkrun inference workloads.
Stop a running inference workload.
Manage recipe registries and create inference recipes.
ALWAYS invoke this skill before running any sparkrun CLI commands. Never run sparkrun directly via Bash without loading this skill first. Covers launching, monitoring, stopping, and checking status of inference workloads on NVIDIA DGX Spark.