From academic-research
Automates university assignments: analyzes PDF/text requirements, researches topics, implements code with tests, generates cited LaTeX/Word reports, paraphrases for anti-plagiarism (macOS).
npx claudepluginhub jeandiable/academic-research-plugin --plugin academic-research

This skill uses the workspace's default tool permissions.
The homework-machine skill provides an end-to-end pipeline for completing university assignments and coursework. It automates the entire workflow from requirement analysis through final deliverable assembly, including:
The workflow produces publication-ready reports, tested code, and all supporting materials needed for submission.
<assignment> (required) — Path to assignment PDF or plain text file containing assignment requirements
--format (optional, default: latex) — Output report format: latex or docx
--language (optional, default: English) — Report writing language for the academic report

Example usage:
homework-machine /path/to/assignment.pdf --format latex --language English
homework-machine "assignment_text.txt" --format docx
Before running this skill, ensure all dependencies are installed:
pip install -r "BASE_DIR/scripts/requirements.txt"
If using .docx output format, also install the Word document library:
pip install python-docx
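A quick pre-flight check can confirm the library is importable before a --format docx run (note the import name is docx, not python-docx):

```python
# Pre-flight check: python-docx installs as the "docx" module.
try:
    import docx
    print("python-docx available")
except ImportError:
    print("missing: run `pip install python-docx`")
```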
Verify that the following helper scripts exist in BASE_DIR/scripts/:
paper_search.py — Academic paper search and retrieval
bibtex_utils.py — BibTeX citation management
translate_roundtrip.py — Translation-based paraphrasing (macOS only)

The homework-machine executes 6 phases in sequence. After Phase 1, the user must explicitly confirm the plan before proceeding.
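The helper-script verification above can be sketched as a short check (BASE_DIR is wherever the skill is installed; the "." below is a placeholder):

```python
from pathlib import Path

# Placeholder for the skill's install directory.
BASE_DIR = Path(".")
required = ["paper_search.py", "bibtex_utils.py", "translate_roundtrip.py"]
missing = [s for s in required if not (BASE_DIR / "scripts" / s).exists()]
if missing:
    print("missing helper scripts:", ", ".join(missing))
else:
    print("all helper scripts present")
```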
Read the Assignment
Extract Deliverables
Identify Constraints and Grading Criteria
Present the Plan to User
Conduct Literature Search
python "BASE_DIR/scripts/paper_search.py" --query "<assignment-topic>" --max-results 20 --sort citations
Collect Findings
Organize Research Results
Write Implementation Code
Write Test Cases
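A minimal shape for the test cases, assuming (purely for illustration) that main.py exposes a solve() function — real tests would import the actual entry point:

```python
# tests/test_main.py (illustrative): solve() stands in for whatever
# function main.py actually exposes.
def solve(x):
    return x * 2  # placeholder implementation

def test_solve_basic():
    assert solve(2) == 4

def test_solve_zero():
    assert solve(0) == 0

if __name__ == "__main__":
    test_solve_basic()
    test_solve_zero()
    print("all tests passed")
```

Once real assertions are in place, the suite can be run with pytest against the code/tests/ directory.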
Run and Verify
Save Code
code/ directory in the output
requirements.txt if external packages are used
README.md with instructions to run the code and tests

Draft Academic Report
For LaTeX Format (--format latex)
For DOCX Format (--format docx)
from docx import Document
from docx.shared import Pt, RGBColor            # available for font size/color styling
from docx.enum.text import WD_ALIGN_PARAGRAPH   # available for paragraph alignment

doc = Document()
doc.add_heading('Your Title', 0)                # level 0 renders as the document title
doc.add_paragraph('Your introduction text...')
# ... add more content ...
doc.save('report.docx')
Fetch and Format Citations
python "BASE_DIR/scripts/bibtex_utils.py" fetch --title "<paper-title>"
references.bib

Accuracy and Depth
This phase uses Apple's built-in translation API to automatically rephrase content while preserving technical accuracy. It is only available on macOS systems with AppleScript support.
Generate Technical Terms File
terms.txt in the output directory (one term per line)

Paraphrase Substantial Paragraphs
python "BASE_DIR/scripts/translate_roundtrip.py" --input <paragraph_file> --terms-file terms.txt --diff
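Conceptually, the script masks the protected terms, translates English to a pivot language and back, then unmasks. A self-contained sketch with a stand-in translator (the real script calls Apple's translation service; the function names here are illustrative):

```python
def roundtrip_paraphrase(text, terms, translate):
    # Mask protected technical terms so translation can't mangle them.
    placeholders = {f"XTERM{i}X": t for i, t in enumerate(terms)}
    for key, term in placeholders.items():
        text = text.replace(term, key)
    # Round-trip: English -> pivot language -> English.
    text = translate(translate(text, "en", "de"), "de", "en")
    # Restore the protected terms.
    for key, term in placeholders.items():
        text = text.replace(key, term)
    return text

# Identity "translator" just to show the plumbing.
result = roundtrip_paraphrase("gradient descent converges quickly",
                              ["gradient descent"],
                              lambda s, src, dst: s)
print(result)  # gradient descent converges quickly
```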
Review and Approve
Error Handling
Apply Approved Changes
paraphrase_diff.md showing all accepted before/after changes for reference

Collect All Outputs
./output/homework-machine/YYYY-MM-DD-HHMMSS/

Directory Structure
./output/homework-machine/2026-03-03-143025/
├── report.tex (or report.docx if --format docx)
├── references.bib (LaTeX) or bibliography data (docx)
├── code/
│ ├── main.py
│ ├── tests/
│ │ └── test_main.py
│ ├── requirements.txt
│ └── README.md
├── terms.txt (if paraphrasing ran)
├── paraphrase_diff.md (if paraphrasing ran)
└── checklist.md (submission checklist)
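Creating the timestamped run directory can be sketched as follows (the layout matches the tree above):

```python
from datetime import datetime
from pathlib import Path

# Build the timestamped run directory,
# e.g. ./output/homework-machine/2026-03-03-143025/
stamp = datetime.now().strftime("%Y-%m-%d-%H%M%S")
run_dir = Path("output") / "homework-machine" / stamp
(run_dir / "code" / "tests").mkdir(parents=True, exist_ok=True)
print(run_dir)
```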
Generate Submission Checklist
checklist.md with all required deliverables from the assignment

Final Verification
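Final verification can be sketched as checking that every expected deliverable exists on disk (the file names below are illustrative; the real list comes from the assignment's checklist):

```python
from pathlib import Path

# Illustrative deliverable names; the actual list is assignment-specific.
expected = ["report.tex", "references.bib", "code/main.py",
            "code/README.md", "checklist.md"]
missing = [f for f in expected if not Path(f).exists()]
print("all deliverables present" if not missing else f"missing: {missing}")
```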
Use this template as the foundation for LaTeX reports. If the assignment provides a specific LaTeX template, use that instead.
\documentclass[12pt]{article}
\usepackage[margin=1in]{geometry}
\usepackage{amsmath, amssymb}
\usepackage{graphicx}
\usepackage{hyperref}
\usepackage[numbers]{natbib}
\usepackage{booktabs}
\title{Your Assignment Title}
\author{Your Name}
\date{\today}
\begin{document}
\maketitle
\begin{abstract}
Provide a concise summary of the problem, approach, and key results (150-250 words).
\end{abstract}
\section{Introduction}
\label{sec:intro}
Introduce the problem, explain its significance, and outline the scope of the work.
State the main contributions clearly.
\section{Related Work}
\label{sec:related}
Review relevant prior work and methods. Cite key papers using \cite{key}.
\section{Method / Approach}
\label{sec:method}
Describe the solution approach in detail. Include algorithms, pseudocode, and
mathematical formulations as needed.
\subsection{Algorithm Name}
Provide pseudocode or detailed steps here.
\section{Experiments and Results}
\label{sec:results}
Present experimental findings, results, and analysis.
\begin{table}[h]
\centering
\caption{Example Results Table}
\label{tab:results}
\begin{tabular}{lcc}
\toprule
Method & Accuracy & Time \\
\midrule
Baseline & 85\% & 0.5s \\
Proposed & 92\% & 0.6s \\
\bottomrule
\end{tabular}
\end{table}
\section{Discussion}
\label{sec:discussion}
Interpret the results, discuss limitations, and suggest directions for future work.
\section{Conclusion}
\label{sec:conclusion}
Summarize the key findings and contributions.
\bibliographystyle{plainnat}
\bibliography{references}
\end{document}
All deliverables are saved to: ./output/homework-machine/YYYY-MM-DD-HHMMSS/
The timestamp ensures each run creates a unique output directory. Contents include:
main.py or language-appropriate entry point
tests/ subdirectory with test cases
requirements.txt listing dependencies
README.md with build and execution instructions

Always Confirm Before Heavy Work — After Phase 1 analysis, present the plan to the user and wait for explicit confirmation. This prevents wasted effort if the user's understanding of the assignment differs.
Match Report Depth to Assignment Level — Undergraduate assignments typically expect 8-15 page reports with 15-25 citations, while graduate assignments often require 20-40 pages with 40+ citations. Adjust accordingly.
Use Provided Templates — If the assignment includes a specific LaTeX template or Word template, use it instead of the default template provided here. Adapt the structure to match instructor expectations.
Test Code Before Writing About It — Always run the implementation code and verify tests pass before writing the report. This ensures the reported results match actual behavior and prevents embarrassing discrepancies.
Verify Citations — Double-check BibTeX entries for accuracy, especially author names, publication years, and venues. Incorrect citations reflect poorly on academic work.
Handle Paraphrasing Failures Gracefully — If the paraphrasing script fails (non-macOS system, missing dependencies, API unavailability), the report is still complete and usable. Warn the user but continue to Phase 6.
Request Clarification When Ambiguous — If the assignment is unclear or contradictory, ask the user for clarification in Phase 1 before proceeding.
Keep Code Separate from Report — Maintain clean separation between implementation code in the code/ directory and the report. Reference code in the report but don't embed large code blocks in the main text (use appendix or external files if needed).
Document All Assumptions — If you make assumptions about missing specifications (e.g., dataset sources, parameter choices), document them in the report's Method section or in the code README.
Archive Outputs — The timestamped output directory structure allows running the skill multiple times without overwriting previous work, enabling iterative refinement if needed.