Upgrade a coded website to award-tier, editorially-crafted design using fal.ai. Takes a local HTML file or a dev-server URL, screenshots it, has an opus-4.7 vision model write a gpt-image-2 edit prompt, uses fal-ai/gpt-image-2/edit to produce the redesigned reference image, then opus-4.7 vision writes a Markdown build-spec with a "Hard constraints" section + a tokens.json. Also supports iterate (screenshot implemented site → delta-spec vs reference) and greenfield generate (brief → mockup → single-file HTML). Invoke when the user says "improve the design", "make it world-class", "redesign this landing page", "upgrade this site", "design pass", or points at a local HTML / dev server for a visual review.
```bash
npx claudepluginhub joshuarweaver/cascade-content-creation-misc-1 --plugin fal-ai-community-skills
```

This skill uses the workspace's default tool permissions.
`fal-redesign` turns "I coded a site — make it look amazing" into a concrete, implementable design pass.
```
your index.html      → screenshot (1920×1200)
screenshot + brand   → opus-4.7 writes a redesign prompt
screenshot + prompt  → fal-ai/gpt-image-2/edit → after.png
after.png            → opus-4.7 writes Markdown build-spec + tokens.json
                     → returned to Claude Code / Codex
```
The agent reads `after.png` + `changes.md` + `tokens.json`, applies the spec to the real HTML, refreshes, and optionally runs iterate for a residual pixel-fix pass.
Invoke when the user points at an `index.html` or a running dev server and asks to improve the design, make it world-class, redesign, polish, or run a design review (for a greenfield build with no existing site, use generate).
### scripts/upgrade.sh — redesign a coded site

```bash
bash scripts/upgrade.sh --target <path-or-url> [--context "..."] [--variants N] [--out <dir>]
```
Pass `--variants N` (2–8) to fan out into N distinct design directions in parallel. You get `after-01-<slug>.png`, `after-02-<slug>.png`, … plus a `gallery.html` to compare them side-by-side. Pick one, then run `scripts/describe.sh` on the chosen PNG to produce its build-spec.
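Once a direction is chosen, its PNG has to be located by slug before handing it to the describe step. A minimal sketch, assuming the `after-NN-<slug>.png` naming above (the `pick_variant` helper itself is illustrative, not part of the skill):

```shell
# Hypothetical helper: resolve the chosen direction's PNG by slug,
# relying on the after-NN-<slug>.png naming convention from this README.
pick_variant() {
  dir=$1; slug=$2
  ls "$dir"/after-*-"$slug".png 2>/dev/null | head -n 1
}
```

The returned path can then be passed to `scripts/describe.sh --after`.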
Outputs in `<out>/`:

- `before.png` — current-site screenshot.
- `after.png` — redesigned reference image.
- `edit-prompt.txt` — transformation prompt fed to gpt-image-2.
- `changes.md` — Markdown build-spec with a leading "Hard constraints" section (also echoed to stdout).
- `tokens.json` — design tokens (colors, typography clamps, grid, buttons).

### scripts/describe.sh — re-run the build-spec on an existing after.png

Useful if the first spec was noisy or if you want to iterate on the spec without regenerating the image.
```bash
bash scripts/describe.sh --after <path/to/after.png> [--out <dir>]
```
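Before handing the results to the agent, it can be worth confirming an upgrade run actually produced every artifact. A minimal sketch using the output filenames from this README (the `check_outputs` function is illustrative and not shipped with the skill):

```shell
# Hypothetical sanity check: verify an upgrade run left every
# artifact this README lists in the output directory.
check_outputs() {
  dir=$1; ok=0
  for f in before.png after.png edit-prompt.txt changes.md tokens.json; do
    [ -f "$dir/$f" ] || { echo "missing: $dir/$f"; ok=1; }
  done
  return $ok
}
```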
### scripts/iterate.sh — residual pixel-fix pass

After the agent has implemented the spec, screenshot the live site and emit a delta-spec vs the reference.
```bash
bash scripts/iterate.sh --target <path-or-url> --reference <path/to/after.png> [--out <dir>]
```
Outputs `current.png` + `delta.md`.
### scripts/generate.sh — greenfield (brief → site)

```bash
bash scripts/generate.sh --context "<freeform context>" [--variants 4] [--out <dir>]
```
```bash
export FAL_KEY=...  # https://fal.ai/dashboard/keys
```
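Since every script depends on this key, a wrapper can fail fast instead of dying mid-run. A minimal preflight sketch (the `require_fal_key` guard is illustrative, not part of the skill):

```shell
# Illustrative preflight: stop early with a pointer to the key page
# when FAL_KEY is missing or empty.
require_fal_key() {
  if [ -z "${FAL_KEY:-}" ]; then
    echo "FAL_KEY is not set; create one at https://fal.ai/dashboard/keys" >&2
    return 1
  fi
}
```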
Models used:
- `anthropic/claude-opus-4.7` — via openrouter/router and openrouter/router/vision (overridable with `FAL_SITE_MODEL`).
- `fal-ai/gpt-image-2` — greenfield hero + mockup renders.
- `fal-ai/gpt-image-2/edit` — screenshot-to-redesign transformation.

Agent workflow:

- Check `FAL_KEY`. If unset, ask the user to export it and stop.
- Existing site (local HTML or `http://localhost:...`) → `scripts/upgrade.sh`. No existing site → `scripts/generate.sh`.
- An upgrade pass takes 60–180s (1 screenshot + 2 vision calls + 1 image-edit). Emit a brief status to the user before calling.
- Open `after.png` (Read tool) so the user sees the new design.
- Surface `changes.md` in the chat (or its highlights), then ask whether to apply the spec to `<file>` now.
- When implementing, follow the "Hard constraints" verbatim and pull exact values from `tokens.json`. For imagery in the result grid, look at `after.png` directly and either use `<img>` placeholders at the matching aspect ratio or source stock that matches the mood.
- Run `scripts/iterate.sh --target <file> --reference <after.png>` for a residual delta-spec. Apply deltas, refresh.
- For a cheaper, faster model, set `FAL_SITE_MODEL=anthropic/claude-sonnet-4.6`.
- `gpt-image-2/edit` is the right primitive because it edits an existing screenshot while preserving legible in-image text; avoid substituting other image models here.

The skill ships a small Node 18+ runtime under `runtime/` (puppeteer, @fal-ai/client, sharp). Scripts run `npm install` on first use. Override the runtime path with `FAL_SITE_RUNTIME=/abs/path`.
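The build-spec tells the agent to pull exact values from tokens.json, and those values typically end up as CSS custom properties. The real schema is not shown in this README, so the shape below is an assumption (a flat `colors` map with illustrative field names); the sed extraction is a sketch of the conversion, not part of the skill:

```shell
# Assumed tokens.json shape -- illustrative only; the real schema may differ.
cat > /tmp/tokens.json <<'EOF'
{
  "colors": {
    "bg": "#0B0D12",
    "accent": "#E8FF5A",
    "text": "#F4F5F7"
  }
}
EOF
# Emit the color tokens as CSS custom properties for the stylesheet.
{
  echo ':root {'
  sed -n 's/^ *"\([a-z]*\)": *"\(#[0-9A-Fa-f]*\)",*$/  --color-\1: \2;/p' /tmp/tokens.json
  echo '}'
}
```

This keeps the implemented site pinned to the exact hex values from the reference rather than eyeballed approximations.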