Use this skill for "review this paper", "review this manuscript", "peer review", "review my paper", "critique this manuscript", "review this submission", "give me feedback on my paper", "check my methods", "review my statistics", "review as a peer reviewer", "evaluate this manuscript", "review this PDF", or mentions manuscript review, peer review, paper critique, or methodological review.
Install: npx claudepluginhub neuromechanist/research-skills --plugin manuscript

This skill uses the workspace's default tool permissions.
Provides structured, rigorous peer review of academic manuscripts. Reviews prioritize methodological soundness, statistical validity, logical consistency, and reproducibility.
Note on review calibration: This skill reflects an opinionated review style that prioritizes methodological precision, statistical rigor, and reproducibility. It is direct, evidence-based, and holds manuscripts to high standards. The severity calibration (Critical, Major, Minor) follows a strict hierarchy: Critical issues block publication; Major issues require significant revision; Minor issues improve polish. Reviewers using this skill should adapt the tone and depth to their own standards and the target journal's expectations.
Activate when the user wants peer-review feedback on a manuscript (journal article, conference paper, preprint) evaluated for methodological soundness, statistical validity, and clarity of presentation. The output is a structured review with categorized concerns and constructive suggestions.
Manuscripts for peer review are typically provided as PDFs from journal submission systems. Convert to both markdown and PNG for a complete review: markdown for efficient text analysis, PNG for exact page/line citations and figure inspection.
PDF (most common): Convert to both markdown and PNG. Markdown gives efficient, searchable text for content analysis; PNG preserves the exact page layout, line numbers, and figure positions for precise citations.
Step 1: Convert to markdown for text analysis:
uvx opencite convert manuscript.pdf -o manuscript.md
Step 2: Convert to PNG for page/line references and figure inspection:
uv run --with pdf2image --with pillow python -c "
from pdf2image import convert_from_path
pages = convert_from_path('manuscript.pdf', dpi=200)
for i, page in enumerate(pages):
    page.save(f'manuscript_page_{i+1}.png', 'PNG')
"
Note: requires poppler (brew install poppler on macOS, apt install poppler-utils on Linux). Alternatively, use pdftoppm -png -r 200 manuscript.pdf manuscript_page.
Workflow: Read the markdown for content review (methods, statistics, logic, literature). When citing a specific issue, refer to the PNG pages to provide exact page and line numbers (e.g., "page 4, line 23" or "p4 l23"). Use the PNGs to inspect figures, tables, and overall layout.
For large PDFs (>10 pages), read PNGs in batches as needed.
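The batched conversion can be sketched in Python using pdf2image's first_page/last_page parameters. This is an illustrative sketch, not part of the skill itself: it assumes pdf2image and poppler are installed, and the file name is a placeholder.

```python
def batch_ranges(total_pages, batch=5):
    """Yield (first_page, last_page) pairs covering pages 1..total_pages."""
    for start in range(1, total_pages + 1, batch):
        yield start, min(start + batch - 1, total_pages)

def convert_in_batches(pdf_path, dpi=200, batch=5):
    """Convert a large PDF to PNGs a few pages at a time to limit memory use."""
    from pdf2image import convert_from_path, pdfinfo_from_path
    total = pdfinfo_from_path(pdf_path)['Pages']  # page count via poppler's pdfinfo
    for first, last in batch_ranges(total, batch):
        pages = convert_from_path(pdf_path, dpi=dpi,
                                  first_page=first, last_page=last)
        for offset, page in enumerate(pages):
            page.save(f'manuscript_page_{first + offset}.png', 'PNG')
```

Converting five pages at a time keeps memory bounded for long manuscripts while preserving the one-PNG-per-page naming the review workflow expects.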
Markdown or LaTeX: Read directly; no conversion needed.
Read all sections including supplementary materials, appendices, and figures. Note the target journal if known, as expectations differ across venues (transactions vs. letters vs. conference proceedings).
Read everything: abstract, introduction, methods, results, discussion, conclusion, figures, tables, supplementary materials. Take note of:
This is the core of the review. Evaluate using the checklist in references/methodology-checklist.md. Key areas:
Experimental design:
Signal processing and data analysis (when applicable):
Statistical methods:
Trace the argument from introduction through methods to results and discussion:
Watch for contradictions: claims in the introduction that the authors' own methods cannot test, or discussion points that go beyond what the data show.
Use opencite to verify literature claims and search for potentially missing references:
uvx opencite search "topic keywords" --max 10 --sort citations
uvx opencite canonical "field or method" --max 5
When citing references in the review to support a methodological argument, include the full citation so the authors can verify the claim.
Read each figure and table carefully:
Structure the review according to the template in references/review-output-template.md:
Every concern must:
Consult references/review-principles.md for the full rationale. Summary:
references/review-output-template.md - Complete review output format with examples
references/methodology-checklist.md - Detailed methodological assessment checklist
references/review-principles.md - Review philosophy and calibration guidance
references/statistical-review-guide.md - Common statistical issues and how to identify them
references/figure-review-guide.md - Figure quality assessment criteria