This skill should be used when the user wants to design UI screens, generate mockups, extract design tokens, convert HTML to SwiftUI/React, create multi-screen flows, or set up a design system using Google Stitch. It orchestrates all Stitch sub-skills and routes requests to the appropriate workflow. Triggers: "design a screen", "stitch me a", "generate UI", "convert to SwiftUI", "extract design tokens", "set up design system", "create a flow", "import brand", "design with Stitch", "UI from prompt", "screen mockup", "HTML to SwiftUI", "sync tokens", "design exploration".
Install via:

```sh
npx claudepluginhub ildunari/stitch-studio --plugin stitch-studio
```

This skill uses the workspace's default tool permissions.
You are the orchestrator for Google Stitch integration. Route user requests to the right sub-skill based on intent.
Bridges Google Stitch (AI UI design tool) with coding workflows. Stitch generates high-fidelity screens from text/voice/image prompts using Gemini models but only exports HTML/CSS/Tailwind and DESIGN.md. This plugin handles everything downstream: screen generation and iteration, token extraction, framework conversion, multi-screen flows, and brand setup.
Two connection modes are supported: the Stitch MCP server, `@_davideast/stitch-mcp` (see docs/mcp-setup.md), or manual. Both paths produce the same outputs. MCP automates retrieval; manual requires copy-paste.
If Stitch MCP fails, run `npx @_davideast/stitch-mcp doctor` to diagnose.
Detect user intent and delegate to the matching sub-skill.
| Intent | Sub-skill | Example phrases |
|---|---|---|
| Generate or design a screen | stitch-generate | "design a login screen", "stitch me a dashboard", "generate a settings page", "UI from this prompt" |
| Extract or sync design tokens | stitch-tokens | "extract tokens from DESIGN.md", "sync my design system", "pull colors and typography" |
| Convert HTML to framework code | stitch-convert | "convert this to SwiftUI", "turn this HTML into React", "translate to Tailwind components" |
| Generate a multi-screen flow | stitch-flow | "design an onboarding flow", "create 5 connected screens", "stitch me a checkout flow" |
| Set up brand or design system | stitch-brand | "set up a design system", "import brand from URL", "bootstrap DESIGN.md from this site" |
| Refine/iterate a screen | stitch-generate (edit mode) | "tweak this screen", "change the header", "try a different layout", "adjust the spacing" |
If intent is ambiguous, ask the user before routing. Never guess.
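The routing table above can be sketched as a first-pass keyword match. This is an illustrative sketch only: `route_intent` is a hypothetical helper, the keyword lists are assumptions drawn from the example phrases, and real intent detection is done by the model, not string matching.

```shell
#!/bin/sh
# Hypothetical sketch: map a request to a Stitch sub-skill by keyword.
# Order matters: specific intents (tokens, flows) are checked before
# the generic "design/generate a screen" catch-all.
route_intent() {
  case "$1" in
    *token*|*sync*)                 echo "stitch-tokens" ;;
    *convert*|*SwiftUI*|*React*)    echo "stitch-convert" ;;
    *flow*)                         echo "stitch-flow" ;;
    *brand*|*"design system"*)      echo "stitch-brand" ;;
    *screen*|*design*|*generate*)   echo "stitch-generate" ;;
    *)                              echo "ask-user" ;;  # ambiguous: never guess
  esac
}
```

The final `ask-user` branch mirrors the rule above: when no intent matches, ask before routing.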
Before routing, gather project context:
- Framework manifest: Package.swift or project.yml (Swift/iOS), package.json (JS/TS), Cargo.toml (Rust), etc. This determines the conversion target.
- Existing tokens: look for Design/System/Tokens/, src/tokens/, theme/, or similar directories. Avoid regenerating what already exists.
- MCP availability: check whether the stitch MCP tools (build_site, get_screen_code, get_screen_image) are available. If not, switch to manual mode.

Pass all detected context to the sub-skill you route to.
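The manifest check can be sketched as follows. `detect_target` is a hypothetical helper, and the manifest-to-target mapping is an assumption based on the file list above:

```shell
#!/bin/sh
# Hypothetical sketch: pick a conversion target from the project manifest.
detect_target() {
  dir="${1:-.}"
  if [ -f "$dir/Package.swift" ] || [ -f "$dir/project.yml" ]; then
    echo "swiftui"
  elif [ -f "$dir/package.json" ]; then
    echo "react"
  elif [ -f "$dir/Cargo.toml" ]; then
    echo "rust"
  else
    echo "html"   # no manifest found: keep Stitch's raw HTML/CSS output
  fi
}
```

Falling back to raw HTML when no manifest is found keeps the workflow usable in empty or unrecognized projects.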
Stitch has 400 daily credits + 15 redesign credits. Track usage:
Log each generation to `.stitch-credits.log`:

```
2026-03-28T14:30:00Z | stitch-generate | login-screen | 1 credit
```
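Appending to that log can be sketched as below. `log_credit` and the 380-entry warning threshold are assumptions for illustration; only the log format and the 400-credit daily budget come from this document.

```shell
#!/bin/sh
# Hypothetical sketch: append one entry per generation to .stitch-credits.log
# and warn as the assumed 400-credit daily budget runs low.
log_credit() {
  skill="$1"; screen="$2"; cost="${3:-1}"
  printf '%s | %s | %s | %s credit\n' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$skill" "$screen" "$cost" \
    >> .stitch-credits.log
  # Count today's entries by matching the UTC date prefix.
  used=$(grep -c "$(date -u +%Y-%m-%d)" .stitch-credits.log)
  if [ "$used" -ge 380 ]; then
    echo "warning: ${used}/400 daily credits used" >&2
  fi
}
```

Usage: `log_credit stitch-generate login-screen 1` after each successful generation.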
Fall back to manual mode without breaking the workflow:
- Use `stitch-generate` for prompt prep.
- Route raw prompts through `stitch-generate`, which enhances them before generating.
- Use `stitch-generate --explore` for low-commitment variants.
- Warn before multi-screen generation.