From research-workspace
Orchestrates research-hub CLI across AIs (Claude, Codex CLI, Gemini CLI) for planning, ingestion, crystal generation, CJK/English summaries, and literature workflows.
Install:

```shell
npx claudepluginhub wenyuchiou/ai-research-skills --plugin research-workspace
```

This skill uses the workspace's default tool permissions.
research-hub can be driven by any AI that can run shell commands, call MCP tools, or call REST endpoints. Use this skill to decide when the primary assistant should work directly and when to delegate long or language-specific work to Codex CLI or Gemini CLI. The underlying workspace may connect Zotero, Obsidian, and NotebookLM, or any useful two-tool subset.
The workspace bundles companion skills:
- Delegates long document analysis, multimodal processing, research, summarization, and large-file handling to Google Gemini CLI to preserve Claude context.
- Chains tools and AI models into bash pipelines for multi-step workflows such as research-to-summary, code-review-to-fix, conditional branching, and parallel multi-agent analysis.
- Operates the research-hub CLI for literature discovery, source ingestion into Zotero/Obsidian/NotebookLM, reference organization, dashboard inspection, and vault maintenance. Use it for finding papers, building knowledge bases, or generating research briefs.
Every command this skill emits goes through the research-hub CLI (`research-hub auto`, `research-hub plan`, `research-hub ask`, etc.). Before producing a delegation plan, verify the CLI exists:

```shell
research-hub --version
```
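A minimal guard along these lines keeps a delegation plan from referencing a missing binary (a sketch assuming a POSIX shell; `check_cli` is an illustrative helper, not part of the skill):

```shell
# Guard sketch: report whether the research-hub CLI is on PATH
# before emitting any delegation plan. check_cli is a hypothetical helper.
check_cli() {
  if command -v research-hub >/dev/null 2>&1; then
    research-hub --version   # prove the CLI actually runs
    echo "found"
  else
    echo "missing"
  fi
}
check_cli
```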
If that command is not found, the user installed only the Claude Code marketplace plugin. Stop and tell them:
This skill orchestrates `research-hub` CLI commands across multiple AIs, but the CLI itself isn't installed. Please run:

```shell
pip install research-hub-pipeline
research-hub setup --persona researcher  # or analyst | humanities | internal
```

Then re-run your request.
Do not produce a plan that calls `research-hub ...` if the CLI is missing; the user can't execute it.
Stage-agnostic, character-driven routing. This skill is NOT bound to a specific research stage (discover / ingest / design / write / submit). The decision to delegate is driven by task character (token-heavy code, long-context reading, CJK prose, mechanical bulk edits), not by which stage of the research pipeline you happen to be in. A 200-page systematic-review summary in Stage 1 routes to Gemini for the same reason a zh-TW cover letter in Stage 8 does: both are long-context or CJK-heavy prose that plays to Gemini's strengths.
| AI | Best role | Use when |
|---|---|---|
| Primary assistant | planning, review, domain judgment, user-facing explanation | ambiguous tasks, quality control, final synthesis |
| Codex CLI | Python/backend execution, tests, mechanical bulk work, crystal generation | long local processing or code-heavy workflows |
| Gemini CLI | long-context drafting and CJK output | Traditional Chinese summaries, bilingual briefs, long prose drafts |
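The table above can be sketched as a routing rule keyed on task character rather than stage (a hypothetical illustration; `route_task` and its keyword patterns are not part of the skill):

```shell
# Hypothetical routing sketch: pick a delegate by task character,
# mirroring the table above. Keyword patterns are illustrative only.
route_task() {
  case "$1" in
    *zh-TW*|*CJK*|*bilingual*) echo "gemini"  ;;  # long CJK / bilingual prose
    *crystal*|*bulk*|*test*)   echo "codex"   ;;  # mechanical code-heavy work
    *)                         echo "primary" ;;  # judgment calls stay local
  esac
}
route_task "zh-TW cover letter"       # gemini
route_task "bulk crystal generation"  # codex
```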
- `research-hub plan "intent"` for broad research requests.
- `research-hub auto "topic" --with-crystals --llm-cli codex` for long mechanical crystal generation when Codex is available.
- `research-hub auto "topic" --with-crystals --llm-cli gemini` when the user explicitly wants Chinese/Japanese/Korean output.
- `research-hub auto "topic" --no-nlm` for first-run validation before NotebookLM automation is trusted.
- `research-hub ask <cluster> "question"` before spending tokens on a fresh synthesis.

```shell
research-hub plan "TOPIC"
research-hub auto "TOPIC" --with-crystals
research-hub auto "TOPIC" --with-crystals --llm-cli codex
research-hub auto "TOPIC" --with-crystals --llm-cli gemini
research-hub ask <cluster> "question"
```
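These commands can be chained into a plan-then-ingest-then-ask pipeline. A dry-run sketch (topic and cluster names are placeholders; swap `run` for direct execution once the CLI is installed):

```shell
# Dry-run sketch: print the command sequence for a plan -> auto -> ask
# workflow without executing it. run() only echoes its arguments.
run() { echo "+ $*"; }
TOPIC="solid-state batteries"
run research-hub plan "$TOPIC"
run research-hub auto "$TOPIC" --with-crystals --no-nlm
run research-hub ask batteries "Which failure modes recur?"
```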
Check available CLIs:
```shell
python -c "from research_hub.auto import detect_llm_cli; print(detect_llm_cli())"
```
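A shell-level equivalent (a sketch, not part of the skill) reports which delegate CLIs are on PATH, so `--llm-cli codex` or `--llm-cli gemini` is only offered when the binary exists:

```shell
# Sketch: report which delegate CLIs are installed before offering
# --llm-cli codex / gemini in a delegation plan.
for cli in codex gemini; do
  if command -v "$cli" >/dev/null 2>&1; then
    echo "$cli: available"
  else
    echo "$cli: not found"
  fi
done
```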