Use when running a prompt through OpenAI GPT directly. This is an alternative invocation point for /model-cli:codex; both use the same backend (the codex binary, with agent --model gpt-5.4-high as a fallback) and can be used interchangeably.
From the model-cli plugin (tony/ai-workflow-plugins). Install: npx claudepluginhub tony/ai-workflow-plugins --plugin model-cli
This is an alias for /model-cli:codex. Both entry points use the same backend.
Invoke the Codex CLI skill with $ARGUMENTS. If $ARGUMENTS is empty, ask the user what they want to run.
All triggers supported by /model-cli:codex are passed through, including timeout:<seconds>, timeout:none, and mode:plan.
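As a sketch, the pass-through triggers above might be combined with a prompt like this (the prompts themselves are hypothetical; the trigger tokens are the ones listed above):

```
/model-cli:codex "review the failing unit test" timeout:300
/model-cli:codex "outline a refactor of the auth module" mode:plan timeout:none
```

With empty arguments (/model-cli:codex alone), the command asks what to run, per the behavior described above.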