Generates videos from text prompts and optional reference images via OpenRouter's async API using curl, jq, and bash polling. Use for text-to-video, image animation, or clip creation requests.
```
npx claudepluginhub openrouterteam/skills --plugin openrouter
```

This skill uses the workspace's default tool permissions.
Generate videos via OpenRouter's async `POST /api/v1/videos` using `curl` + `jq`. Requires `OPENROUTER_API_KEY` (get one at https://openrouter.ai/keys). If unset, stop and ask.
Video generation is async: submit → poll → download. A single request can't return the video because generation takes anywhere from 30 seconds to a few minutes. Tell the user the job was submitted so they know the delay is expected.
The flow:

1. `POST /api/v1/videos` → `{ id, polling_url, status: "pending" }`
2. `GET <polling_url>` every ~30s until `status` is `completed` (terminal failures: `failed`, `cancelled`, `expired`; surface the `error` field verbatim)
3. `GET /api/v1/videos/{id}/content?index=0` with the auth header → MP4 bytes

`resolution`, `aspect_ratio`, `duration`, and `frame_images[].frame_type` are per-model. Before the first submit for a new model (or whenever the user asks for something specific), fetch the model's capabilities and only send values from the returned sets:
```bash
curl -sS https://openrouter.ai/api/v1/videos/models \
  | jq '.data[] | select(.id == "MODEL_ID")'
```
Fields on each model worth knowing: supported_resolutions, supported_aspect_ratios, supported_sizes, supported_durations (often discrete like [4,6,8], not a range), supported_frame_images (which frame_type values are accepted), generate_audio and seed (capability bools), pricing_skus, and allowed_passthrough_parameters. An out-of-set value returns a 400, so validate client-side.
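That client-side validation can be a small jq check. A minimal sketch, using an illustrative capabilities blob in place of the live `/api/v1/videos/models` response (the field names match the docs above; the values are made up):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in for one model object from /api/v1/videos/models (illustrative values).
caps='{"supported_durations":[4,6,8],"supported_aspect_ratios":["16:9","9:16"]}'

want_duration=6
want_ratio="16:9"

# jq -e sets the exit status from the boolean result of `any`.
if ! jq -e --argjson d "$want_duration" \
    'any(.supported_durations[]; . == $d)' <<<"$caps" >/dev/null; then
  echo "duration $want_duration not in supported_durations" >&2
  exit 1
fi
if ! jq -e --arg r "$want_ratio" \
    'any(.supported_aspect_ratios[]; . == $r)' <<<"$caps" >/dev/null; then
  echo "aspect_ratio $want_ratio not supported" >&2
  exit 1
fi
echo "ok: duration=$want_duration aspect_ratio=$want_ratio"
```

The same pattern extends to `supported_resolutions`, `supported_sizes`, and `supported_frame_images`: test the requested value against the returned set before building the payload, rather than discovering the 400 after submit.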
```bash
#!/usr/bin/env bash
set -euo pipefail

PROMPT="a golden retriever playing fetch on a sunny beach"
MODEL="google/veo-3.1"
OUTPUT="video-$(date +%Y%m%d-%H%M%S).mp4"

# Build payload — extend with resolution/aspect_ratio/duration/etc. as needed.
payload=$(jq -n --arg model "$MODEL" --arg prompt "$PROMPT" \
  '{model: $model, prompt: $prompt}')

submit=$(curl -sS -X POST https://openrouter.ai/api/v1/videos \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d "$payload")

poll_url=$(echo "$submit" | jq -r '.polling_url')
echo "Submitted $(echo "$submit" | jq -r '.id')" >&2

while :; do
  sleep 30
  resp=$(curl -sS "$poll_url" -H "Authorization: Bearer $OPENROUTER_API_KEY")
  # Avoid the name `status` — zsh treats it as read-only.
  st=$(echo "$resp" | jq -r '.status')
  echo "Status: $st" >&2
  case "$st" in
    completed) break ;;
    failed|cancelled|expired)
      echo "Generation $st: $(echo "$resp" | jq -r '.error // "unknown"')" >&2
      exit 1 ;;
  esac
done

curl -sS -L "$(echo "$resp" | jq -r '.unsigned_urls[0]')" \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  --output "$OUTPUT"

echo "$resp" | jq --arg out "$(realpath "$OUTPUT")" \
  '{job_id: .id, generation_id, video_saved: $out, usage}'
```
Required: model, prompt. Common optional fields:
- `duration` (int): must be one of the model's `supported_durations`.
- `resolution` (string) / `aspect_ratio` (string) / `size` (string, `"WxH"`): `size` is interchangeable with `resolution` + `aspect_ratio`.
- `generate_audio` (bool): only meaningful if the model's `generate_audio` capability is true.
- `seed` (int): honored only if the model's `seed` capability is true.
- `callback_url` (HTTPS): webhook instead of polling.
- `frame_images[]`: image-to-video; each entry is `{ type: "image_url", image_url: { url }, frame_type: "first_frame" | "last_frame" }`.
- `input_references[]`: reference-to-video (style guidance); same entry shape, no `frame_type`. If both arrays are present, `frame_images` wins.
- `provider.options.<slug>.parameters.<key>`: provider passthrough, see below.

The image `url` can be a public `https://` URL or a local-file data URL: `MIME=image/png; B64=$(base64 < file.png | tr -d '\n'); url="data:${MIME};base64,${B64}"`.
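Putting the data-URL recipe and `frame_images[]` together, a sketch of building an image-to-video payload from a local file (the file contents and the model/prompt here are placeholders, not real PNG data):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in file so the sketch runs anywhere; use a real PNG in practice.
IMG=first-frame.png
printf 'fake-png-bytes' > "$IMG"

MIME=image/png
B64=$(base64 < "$IMG" | tr -d '\n')

# first_frame anchors the clip's opening image; last_frame is the other option.
payload=$(jq -n \
  --arg model "google/veo-3.1" \
  --arg prompt "slow cinematic pan across the scene" \
  --arg url "data:${MIME};base64,${B64}" \
  '{model: $model, prompt: $prompt,
    frame_images: [{type: "image_url",
                    image_url: {url: $url},
                    frame_type: "first_frame"}]}')
echo "$payload" | jq .
```

The resulting `payload` drops into the submit `curl` from the script above unchanged; only the JSON body differs between text-to-video and image-to-video.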
Provider-specific params go under provider.options.<slug>.parameters. The allowed keys for a given model are listed (flat) in allowed_passthrough_parameters on the models endpoint — but the meaning, value range, and required combinations come from the upstream provider's API docs (Google Vertex, Alibaba Dashscope, Kwai, ByteDance Volc Engine, MiniMax, OpenAI, etc.). Read the upstream docs before using an unfamiliar key; casing conventions differ between providers (Google/OpenAI use camelCase, most others use snake_case).
Example:

```json
{
  "model": "google/veo-3.1",
  "prompt": "a time-lapse of a flower blooming",
  "provider": {
    "options": {
      "google-vertex": {
        "parameters": {
          "personGeneration": "allow",
          "negativePrompt": "blurry, low quality"
        }
      }
    }
  }
}
```
Pass callback_url (HTTPS) in the submit body. On terminal state, OpenRouter POSTs a video.generation.{completed,failed,cancelled,expired} event. Each delivery carries X-OpenRouter-Idempotency-Key: <job_id>-<status>. If a signing secret is configured on the workspace, verify X-OpenRouter-Signature: t=<ts>,v1=<hmac> — HMAC-SHA256 of <ts>,<raw_body> with the secret, reject timestamps older than ~5 minutes.
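The HMAC check can be done with `openssl dgst`. A minimal sketch of the scheme described above; the secret, timestamp, and body are made-up values, and for brevity the sketch computes the "sent" signature itself rather than receiving a real delivery:

```shell
#!/usr/bin/env bash
set -euo pipefail

secret="whsec_example"                                        # made-up secret
raw_body='{"type":"video.generation.completed","id":"job_123"}'
ts=1700000000

# What the sender would compute: HMAC-SHA256 over "<ts>,<raw_body>".
sig_v1=$(printf '%s,%s' "$ts" "$raw_body" \
  | openssl dgst -sha256 -hmac "$secret" -hex | awk '{print $NF}')
header="t=${ts},v1=${sig_v1}"   # simulated X-OpenRouter-Signature value

# --- verification side: parse the header, recompute, compare ---
hdr_ts=${header%%,*}; hdr_ts=${hdr_ts#t=}
hdr_v1=${header##*v1=}
expected=$(printf '%s,%s' "$hdr_ts" "$raw_body" \
  | openssl dgst -sha256 -hmac "$secret" -hex | awk '{print $NF}')

if [ "$expected" = "$hdr_v1" ]; then
  echo "signature ok"
else
  echo "signature mismatch" >&2
  exit 1
fi
# In a real handler, also reject when hdr_ts is older than ~5 minutes.
```

Note the signed input is the raw request body, byte for byte; re-serializing the JSON before hashing will produce a different digest and a spurious mismatch.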
Video generation is not ZDR-eligible because the provider must temporarily retain the output for the async download step.