# Models

List the AI models available in the user's orq.ai workspace. Optionally filter by a search term.

## Instructions

### 1. Parse arguments
`$ARGUMENTS` is optional. If provided, use it as a search term to filter models (e.g., "gpt-4", "claude", "embedding", "anthropic").
The search should be case-insensitive and match against model name, provider, or capabilities.
If the search term matches a model type (e.g., "chat", "embedding", "image", "tts", "stt", "rerank", "ocr"), use it as the `modelType` parameter directly.
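As an illustration only, the argument parsing described above can be sketched as a small helper (the function and constant names here are hypothetical, not part of the command):

```python
# Model types named in the instructions above; matching one of these
# means the term is used directly as the modelType parameter.
MODEL_TYPES = {"chat", "completion", "embedding", "image", "tts", "stt", "rerank", "ocr"}

def parse_arguments(arguments):
    """Return (model_type, search_term) from the raw $ARGUMENTS string."""
    term = (arguments or "").strip().lower()   # case-insensitive by design
    if not term:
        return None, None          # no filter: list everything
    if term in MODEL_TYPES:
        return term, None          # use directly as the modelType parameter
    return None, term              # otherwise treat as a free-text search

print(parse_arguments("Embedding"))  # ('embedding', None)
print(parse_arguments("gpt-4"))      # (None, 'gpt-4')
```

The free-text branch is then matched against model name, provider, and capabilities, as described above.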
### 2. Fetch data

Use the `list_models` MCP tool to retrieve available models. **Important:** the `modelType` parameter is always required — never call `list_models` without it.
If no model type was derived from the search term, call `list_models` three times in parallel with `modelType: "chat"`, `modelType: "completion"`, and `modelType: "embedding"`.

### 3. Format output

Present a clean summary using native markdown formatting (bold, headers, horizontal rules) — not inside a code block. This renders well in Claude Code's monospace terminal.
Output the models in this format:
# Orq.ai AI Router — Active Models
**12** chat · **3** embedding · **2** image
Models enabled in your workspace. Add providers or enable more models at **[AI Router → Models](https://my.orq.ai/)**.
---
### OpenAI (6)
- **gpt-5** — chat · 1M context
- **gpt-5-mini** — chat · 1M context
- **gpt-4.1** — chat · 1M context
- **text-embedding-3-large** — embedding · 3072 dims
- **text-embedding-3-small** — embedding · 1536 dims
- ... and 1 more
### Anthropic (3)
- **claude-sonnet-4-20250514** — chat · 200k context
- **claude-haiku-4-5-20251001** — chat · 200k context
- **claude-opus-4-1-20250805** — chat · 200k context
### Google (2)
- **gemini-2.5-pro** — chat · 1M context
- **gemini-2.5-flash** — chat · 1M context
If a search term was provided, filter and show only matching models:
# Orq.ai AI Router — Active Models [search: "embed"]
**3** embedding
Models enabled in your workspace. Add providers or enable more models at **[AI Router → Models](https://my.orq.ai/)**.
---
### OpenAI (2)
- **text-embedding-3-large** — embedding · 3072 dims
- **text-embedding-3-small** — embedding · 1536 dims
### Google (1)
- **text-embedding-004** — embedding · 768 dims
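The filtered view above amounts to a case-insensitive substring match over each model's name, provider, and type, followed by grouping per provider. A minimal sketch, assuming a flat record shape that is an illustration only (the real `list_models` response may differ):

```python
from collections import defaultdict

# Assumed record shape for illustration; not the actual list_models response.
models = [
    {"name": "text-embedding-3-large", "provider": "OpenAI", "type": "embedding"},
    {"name": "text-embedding-004", "provider": "Google", "type": "embedding"},
    {"name": "gpt-5", "provider": "OpenAI", "type": "chat"},
]

def filter_and_group(models, term):
    """Case-insensitive match on name/provider/type, grouped by provider."""
    term = term.lower()
    grouped = defaultdict(list)
    for m in models:
        haystack = " ".join([m["name"], m["provider"], m["type"]]).lower()
        if term in haystack:
            grouped[m["provider"]].append(m["name"])
    return dict(grouped)

print(filter_and_group(models, "embed"))
# {'OpenAI': ['text-embedding-3-large'], 'Google': ['text-embedding-004']}
```

Each provider group then becomes a `### Provider (N)` section in the rendered output.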
Formatting notes:

- **Summary counts** — bold totals separated by `·`, showing totals per model type.
- **`---` separator** — a horizontal rule after the summary line to visually separate the overview from the detail sections.
- **`### Provider (N)` headers** — each provider gets a level-3 heading with its model count.
- **Bold model names** — use **name** to visually anchor each list entry.
- **`·` delimiters for metadata** — use `·` (middle dot) to separate secondary attributes within a list item (e.g., type · context window). Use `—` (em dash) to separate the name from its metadata.

### Error handling

If the `list_models` call fails, confirm that `ORQ_API_KEY` is valid and that the Orq MCP server is registered, e.g. via `claude mcp add --transport http orq-workspace https://my.orq.ai/v2/mcp --header 'Authorization: Bearer ${ORQ_API_KEY}'`.