From ai-video-producer
Two-stage pipeline that generates a still image from a text prompt (Flux/SDXL/Imagen), then animates it into a video clip (Kling/Runway/Hailuo/Wan). Use it when the user wants tight visual control over the opening frame before motion.
Install: `npx claudepluginhub danielrosehill/claude-code-plugins --plugin ai-video-producer`

This skill uses the workspace's default tool permissions.
Two-stage chain. Stage 1 produces a still you can iterate on cheaply; stage 2 animates the chosen still.
Inputs:

- scripts/storyboards/NN-*.md (visual prompt seed, duration, character refs)
- brief/tools-and-models.md (text-to-image model + image-to-video model)
- characters/<name>.md
- brief/creative-brief.md

Workflow:

1. Compose the still-image prompt from the storyboard, character refs, and brief/creative-brief.md. Show it to the user before generating.
2. Generate the still to generation/text-to-image/NN-shortname-vN.png. Save prompt + model + seed to generation/prompts/NN-shortname-vN.md. Append to logs/production-log.md.
3. Iterate by bumping vN. Don't proceed until the user approves a still.
4. Animate the approved still to generation/image-to-video/NN-shortname-vN.mp4. Save the motion prompt and parameters to generation/prompts/NN-shortname-vN-motion.md. Log it.
5. Review: accept (move the clip to clips/raw/ and suggest /promote-take), retry motion, or go back to step 2 (new still).

Aspect ratio: read it from brief/creative-brief.md. Set it explicitly in the image gen call; don't rely on the model's default.
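The naming, versioning, and logging conventions above can be sketched as a few helpers. This is a minimal illustration, not part of the plugin: the function names (`next_version_path`, `log_prompt`, `append_production_log`) are hypothetical, and the actual image/video generation calls are omitted since they depend on the model chosen in brief/tools-and-models.md.

```python
from pathlib import Path
from datetime import datetime, timezone

def next_version_path(directory: Path, nn: str, shortname: str, ext: str) -> Path:
    """Return the next unused NN-shortname-vN<ext> path in `directory`."""
    directory.mkdir(parents=True, exist_ok=True)
    v = 1
    while (directory / f"{nn}-{shortname}-v{v}{ext}").exists():
        v += 1  # bump vN until we find a free slot
    return directory / f"{nn}-{shortname}-v{v}{ext}"

def log_prompt(prompts_dir: Path, stem: str, prompt: str, model: str, seed: int) -> Path:
    """Save prompt + model + seed alongside the generated asset."""
    prompts_dir.mkdir(parents=True, exist_ok=True)
    record = prompts_dir / f"{stem}.md"
    record.write_text(
        f"# {stem}\n\n- model: {model}\n- seed: {seed}\n\n## Prompt\n\n{prompt}\n"
    )
    return record

def append_production_log(log_file: Path, entry: str) -> None:
    """Append a timestamped line to logs/production-log.md."""
    log_file.parent.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with log_file.open("a") as f:
        f.write(f"- {stamp} {entry}\n")
```

With these, step 2 becomes: compute `next_version_path(root / "generation/text-to-image", "03", "hero", ".png")`, generate the still to that path, then `log_prompt(...)` and `append_production_log(...)` before showing the result to the user.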