By ildunari
Google Stitch integration for Claude Code — AI-powered UI design generation, design token extraction, and framework conversion (SwiftUI, React, CSS). Bridges Stitch's visual design output with production code.
npx claudepluginhub ildunari/stitch-studio --plugin stitch-studio

Review SwiftUI code generated from Stitch HTML/CSS conversion. Compares output against the original screenshot for visual accuracy, checks accessibility, and verifies design token usage. Use after stitch-convert produces SwiftUI output. <example> Context: stitch-convert just generated a SwiftUI view user: "Review this conversion — does it match the original?" assistant: "I'll use the conversion-reviewer agent to compare the generated SwiftUI against the Stitch screenshot and check for gaps." <commentary> Review is a different lens than generation. Separating review prevents the generator from self-validating. </commentary> </example>
Explore design variants in parallel by generating multiple Stitch screens with different aesthetic directions. Use when the user wants to see options, compare design approaches, or explore alternatives for a screen concept. <example> Context: User is designing a new feature screen user: "Show me 3 different takes on the settings screen" assistant: "I'll use the design-explorer agent to generate competing design variants and present them side by side." <commentary> User wants to compare design directions, which matches this agent's specialty of parallel variant generation. </commentary> </example> <example> Context: User is unsure about the right visual approach user: "I'm not sure if this should be minimal or information-dense — can you explore both?" assistant: "I'll launch the design-explorer to generate variants across that spectrum." <commentary> User explicitly wants to explore competing aesthetics, which is this agent's core purpose. </commentary> </example>
Focused design token extraction and mapping. Reads DESIGN.md and existing code tokens, detects drift, and produces platform-specific token files. Use when extracting or syncing design tokens between DESIGN.md and code. <example> Context: User has a new DESIGN.md from Stitch user: "Extract the tokens from this DESIGN.md into Swift files" assistant: "I'll use the token-mapper agent to parse the tokens and generate Swift constants." <commentary> Focused extraction task that benefits from isolated context — the agent only sees DESIGN.md and target token files, not the whole project. </commentary> </example>
This skill should be used when the user wants to bootstrap a design system from scratch using Google Stitch's Brand Kit import or Vibe Design. It generates DESIGN.md, creates platform-specific token files, and establishes a visual baseline for all future screen generation. Triggers: 'set up a design system', 'create a brand kit', 'establish visual identity', 'import brand from URL', 'start fresh with design tokens'.
This skill should be used when the user wants to convert Stitch HTML/CSS output to SwiftUI, React, or other framework code. It maps Tailwind classes and CSS properties to native framework equivalents using comprehensive mapping tables. Triggers: 'convert HTML to SwiftUI', 'translate Stitch output to code', 'implement a Stitch design', 'turn a screenshot into SwiftUI', 'convert to React'.
This skill should be used when the user wants to generate multi-screen connected UI flows using Google Stitch. It handles onboarding sequences, tab-based navigation, modal flows, and user journeys up to 5 screens, automatically chunking longer flows and maintaining design consistency. Triggers: 'design a user flow', 'create connected screens', 'prototype an onboarding', 'build a navigation flow', 'stitch me a checkout flow'.
This skill should be used when the user wants to generate UI screens from text descriptions using Google Stitch. It enhances prompts with project context, design tokens, and platform-specific conventions (iOS dimensions, safe areas, native controls). Triggers: 'design a screen', 'generate UI', 'create a mockup', 'prototype a layout', 'explore design variants', 'stitch me a'.
This skill should be used when the user wants to design UI screens, generate mockups, extract design tokens, convert HTML to SwiftUI/React, create multi-screen flows, or set up a design system using Google Stitch. It orchestrates all Stitch sub-skills and routes requests to the appropriate workflow. Triggers: "design a screen", "stitch me a", "generate UI", "convert to SwiftUI", "extract design tokens", "set up design system", "create a flow", "import brand", "design with Stitch", "UI from prompt", "screen mockup", "HTML to SwiftUI", "sync tokens", "design exploration".
This skill should be used when the user wants to extract design tokens from DESIGN.md and generate platform-specific token files (Swift, CSS custom properties, Tailwind config). It provides bidirectional sync between Stitch design tokens and code. Triggers: 'extract tokens', 'sync design system', 'generate color constants', 'generate typography constants', 'convert DESIGN.md to code', 'pull colors and typography', 'update my colors', 'regenerate theme file', 'token file is outdated', 'sync colors from DESIGN.md'.
A Claude Code plugin that integrates Google Stitch (AI-powered UI design tool) into coding workflows. Generate screens from text prompts, extract design tokens, and convert to production code — especially SwiftUI.
Stitch generates high-fidelity UI screens from text descriptions, but it exports only HTML/CSS. This plugin bridges the gap: it enhances generation prompts, extracts design tokens, and converts Stitch output to production code.
```sh
# Install all skills
npx skills add kostamilov/stitch-studio --global

# Install a specific skill
npx skills add kostamilov/stitch-studio --skill stitch-convert --global
```

Alternatively, install through the Claude Code plugin command:

```sh
claude plugin install kostamilov/stitch-studio
```

Or with skillsync:

```sh
skillsync add kostamilov/stitch-studio
```
The plugin works best with the Stitch MCP server for direct generation:
```sh
npm install -g @_davideast/stitch-mcp
npx @_davideast/stitch-mcp init    # guided auth setup
npx @_davideast/stitch-mcp doctor  # verify config
```
The plugin includes a `.mcp.json` that auto-configures the server when installed as a plugin.
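As a sketch of what that bundled file likely contains (the server key name and `npx` arguments here are assumptions based on the package name above; the shipped `.mcp.json` is authoritative):

```json
{
  "mcpServers": {
    "stitch": {
      "command": "npx",
      "args": ["-y", "@_davideast/stitch-mcp"]
    }
  }
}
```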
All skills work without MCP. Generate screens in your browser at stitch.withgoogle.com, then paste screenshots or HTML into Claude Code.
| Skill | Trigger | Purpose |
|---|---|---|
| `stitch-studio` | `/stitch`, "design a screen" | Orchestrator — routes to the right sub-skill |
| `stitch-generate` | "generate UI", "stitch me a..." | Prompt enhancement + screen generation |
| `stitch-tokens` | "extract tokens", "sync design system" | DESIGN.md ↔ code token synchronization |
| `stitch-convert` | "convert to SwiftUI" | HTML/CSS → SwiftUI with mapping tables |
| `stitch-flow` | "design a flow", "onboarding screens" | Multi-screen connected flows |
| `stitch-brand` | "set up design system" | Brand kit + design system bootstrap |
| Agent | Purpose |
|---|---|
| `design-explorer` | Generate competing design variants in parallel |
| `token-mapper` | Focused token extraction from DESIGN.md |
| `conversion-reviewer` | Review SwiftUI output against the original screenshot |
The bridge between Stitch and your code. A markdown file containing design
tokens (colors, typography, spacing) that Stitch generates and your code
consumes. Place it at your project root or in a Design/ directory.
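As an illustration only (these token names and values are hypothetical, not the plugin's schema — `references/design-md-schema.md` defines the real format), a DESIGN.md might look like:

```markdown
# Design Tokens

## Colors
- primary: #2563EB
- surface: #F8FAFC

## Typography
- heading: SF Pro Display, 28pt, Bold
- body: SF Pro Text, 16pt, Regular

## Spacing
- sm: 8
- md: 16
- lg: 24
```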
Stitch provides 400 daily credits + 15 daily redesign credits. The plugin tracks usage and warns at 75% consumption.
The stitch-convert skill includes comprehensive Tailwind-to-SwiftUI and
CSS-to-SwiftUI mapping tables covering layout, spacing, typography, colors,
sizing, borders, effects, and interactive elements.
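As a hypothetical example of the kind of mapping those tables encode (not copied from the actual reference files), a Tailwind-styled element from Stitch output:

```html
<button class="px-4 py-2 rounded-lg bg-blue-600 text-white">Save</button>
```

might plausibly translate to SwiftUI along these lines (modifier choices are illustrative):

```swift
// Tailwind spacing scale: one unit = 4pt, so px-4 = 16pt, py-2 = 8pt
Button("Save") {}
    .padding(.horizontal, 16)    // px-4
    .padding(.vertical, 8)       // py-2
    .background(Color.blue)      // bg-blue-600 (approximate system color)
    .foregroundColor(.white)     // text-white
    .cornerRadius(8)             // rounded-lg → 0.5rem ≈ 8pt
```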
```text
stitch-studio/
├── .claude-plugin/
│   ├── plugin.json
│   └── marketplace.json
├── agents/
│   ├── design-explorer.md
│   ├── token-mapper.md
│   └── conversion-reviewer.md
├── hooks/
│   └── hooks.json
├── scripts/
│   ├── session-init.sh
│   └── credit-logger.sh
├── skills/
│   ├── stitch-studio/SKILL.md
│   ├── stitch-generate/
│   │   ├── SKILL.md
│   │   └── references/ios-prompting.md
│   ├── stitch-tokens/
│   │   ├── SKILL.md
│   │   ├── references/design-md-schema.md
│   │   ├── examples/DESIGN.md
│   │   └── scripts/parse-design-md.py
│   ├── stitch-convert/
│   │   ├── SKILL.md
│   │   └── references/
│   │       ├── tailwind-to-swiftui.md
│   │       └── css-to-swiftui.md
│   ├── stitch-flow/SKILL.md
│   └── stitch-brand/
│       ├── SKILL.md
│       └── examples/DESIGN.md
├── docs/
│   ├── skills-index.md
│   └── mcp-setup.md
├── .mcp.json
├── AGENTS.md
├── CONTRIBUTING.md
├── LICENSE
└── README.md
```
Apache 2.0 — see LICENSE.