hamelnb
Interacts with live local Jupyter notebook kernels to discover servers and notebooks, inspect contents, execute code incrementally, and edit cells while preserving kernel state. Use it when you want to avoid expensive reruns.
npx claudepluginhub hamelsmu/hamelnb
This skill uses the workspace's default tool permissions.
Use this skill when a local notebook kernel already holds useful state and you do not want to rerun expensive setup.
Interacts with a live local Jupyter notebook kernel: a Jupyter-like in-memory REPL, notebook inspection and editing with a persistent kernel, and explicit verification passes.
When launching JupyterLab for use with this skill, disable token and password auth so the script can access the server API without browser session cookies:
jupyter lab --IdentityProvider.token='' --ServerApp.password=''
If the server is already running but returns 403 errors, restart it with these flags.
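To check whether the flags took effect without opening a browser, you can probe the server's /api/status endpoint, a standard Jupyter Server REST route. A minimal sketch in Python; the function names are illustrative, not part of the skill:

```python
import urllib.error
import urllib.request

def base_url(port: int) -> str:
    # The skill only targets local servers.
    return f"http://localhost:{port}"

def auth_disabled(port: int) -> bool:
    # With token and password auth turned off, /api/status answers 200.
    # With auth still enabled, unauthenticated API requests get 403,
    # which is the same failure mode the skill's script would hit.
    try:
        with urllib.request.urlopen(f"{base_url(port)}/api/status", timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.HTTPError, urllib.error.URLError):
        return False

print(base_url(8888))  # → http://localhost:8888
```

If `auth_disabled` returns False against a running server, restart it with the flags above.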
Run from the repo root.
SCRIPT=skills/jupyter-live-kernel/scripts/jupyter_live_kernel.py
Prefer uv for this skill. The helper script declares its runtime dependencies inline, so the default invocation is:
uv run "$SCRIPT" --help
Because the script uses inline metadata, uv run "$SCRIPT" stays self-contained even when you launch it from inside this repo.
If the script is executable and uv is on PATH, direct execution also works:
"$SCRIPT" --help
Fallback: python3 "$SCRIPT" ... only if uv is unavailable and the required packages are already installed.
To add a dependency to the script's inline metadata: uv add --script "$SCRIPT" <package>
uv run "$SCRIPT" servers --compact
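The inline metadata that uv reads follows PEP 723: dependencies live in a specially delimited comment block at the top of the script. A minimal sketch of how that block is delimited; the sample script text and dependency names are illustrative, not the actual contents of jupyter_live_kernel.py:

```python
import re

# An illustrative script with a PEP 723 metadata block.
SCRIPT_TEXT = '''\
# /// script
# requires-python = ">=3.11"
# dependencies = ["requests", "websockets"]
# ///
print("hello")
'''

def inline_metadata(text: str) -> str:
    # PEP 723: metadata sits between a "# /// script" line and a "# ///"
    # line, with every inner line prefixed by "# ". uv parses this block
    # to provision the script's environment before running it.
    match = re.search(r"^# /// script\n(.*?)^# ///$", text, re.M | re.S)
    if match is None:
        return ""
    return "".join(line[2:] + "\n" for line in match.group(1).splitlines())

print(inline_metadata(SCRIPT_TEXT), end="")
```

This is why `uv run "$SCRIPT"` needs no surrounding project: the environment spec travels inside the file.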
uv run "$SCRIPT" notebooks --port 8899 --compact
uv run "$SCRIPT" contents --port 8899 --path demo.ipynb --compact
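The contents payload is the notebook's saved JSON (nbformat), where each cell carries an id field (guaranteed by nbformat >= 4.5); those ids are what the edit subcommands take. A minimal sketch of pulling cell IDs out of such a payload; the sample dict is illustrative:

```python
# A trimmed stand-in for what the Contents API returns for a .ipynb:
# the notebook JSON sits under "content", and each cell has an "id".
payload = {
    "name": "demo.ipynb",
    "type": "notebook",
    "content": {
        "nbformat": 4,
        "nbformat_minor": 5,
        "cells": [
            {"id": "a1b2", "cell_type": "code", "source": "x = 41"},
            {"id": "c3d4", "cell_type": "markdown", "source": "# Notes"},
        ],
    },
}

def cell_ids(contents_payload: dict) -> list[str]:
    return [cell["id"] for cell in contents_payload["content"]["cells"]]

print(cell_ids(payload))  # → ['a1b2', 'c3d4']
```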
uv run "$SCRIPT" execute \
--port 8899 \
--path demo.ipynb \
--code $'x = 41\nprint("hello")\nx + 1' \
--compact
Cell IDs come from the contents output.
uv run "$SCRIPT" edit \
--port 8899 \
--path demo.ipynb \
replace-source \
--cell-id <cell-id> \
--source $'x = 42\nx' \
--compact
To insert a new cell, use insert with --at-index, --before <int>, or --after <int>.
Note: --before/--after accept integer indices, not cell IDs.
uv run "$SCRIPT" edit \
--port 8899 \
--path demo.ipynb \
insert \
--at-index 1 \
--cell-type code \
--source $'print("hello")' \
--compact
The available edit subcommands are: replace-source, insert, delete, move, clear-outputs.
Core guidance:
- Prefer --cell-id over --index for existing cells (replace-source, delete, move).
- For insert, use --at-index (integer); cell IDs are not accepted by --before/--after.
- Use execute for the normal iteration loop.
- Reserve restart, run-all, and restart-run-all for explicit verification or reset requests, not routine iteration.
- Pass --save-outputs with run-all or restart-run-all to persist cell outputs into the notebook file so they appear in the JupyterLab UI.
- With execute and --cell-id, outputs are automatically saved to the notebook file (no need to pass --save-outputs separately). Without --cell-id, outputs are not saved.
- To suppress the automatic save even when --cell-id is provided, pass --no-save-outputs. Only use this flag when the user explicitly says they don't need the notebook updated (e.g. "run this headlessly", "I don't care about the notebook output"). Default to saving outputs.
- Always make the live target explicit before edits or execution. Never guess when more than one live option exists.
Required behavior:
- Pass --port (or --server-url) on all follow-up commands.
- If the user supplies a notebook --path, use it.
- Otherwise run notebooks for the selected server and collect candidates.
- Pass --session-id on execute/restart/run-all/variables commands to pin the exact kernel.
Claude Code ambiguity flow:
- Ask one AskUserQuestion per ambiguity point (server, notebook, session) as a picker, titled Server, Notebook, or Session.
- For server options, show the port and base URL.
- For notebook options, show the path and port.
- For session options, show the session id, kernel id, and path.
- Confirm the selection back to the user (e.g. Using port 8888, notebook notebooks/tiny-demo.ipynb, session 1234...), then continue.
Codex ambiguity flow:
Once selected, keep using the same port + path and, when applicable, session_id until the user asks to switch.
Inspect live Python-kernel variables:
uv run "$SCRIPT" variables --port 8899 --path demo.ipynb list --compact
uv run "$SCRIPT" variables --port 8899 --path demo.ipynb preview --name x --compact
Verification commands:
uv run "$SCRIPT" restart --port 8899 --path demo.ipynb --compact
uv run "$SCRIPT" run-all --port 8899 --path demo.ipynb --compact
uv run "$SCRIPT" run-all --port 8899 --path demo.ipynb --save-outputs --compact
uv run "$SCRIPT" restart-run-all --port 8899 --path demo.ipynb --save-outputs --compact
Advanced guidance:
- variables is Python-only.
- run-all and restart-run-all require a notebook-backed live session.
- run-all and restart-run-all exit non-zero when a cell fails.
- run-all and restart-run-all verify a saved snapshot loaded at the start.
- Pass --save-outputs to run-all or restart-run-all to persist cell outputs and execution counts back into the notebook file. Without this flag outputs are not written back.
- execute, variables, run-all, and restart-run-all default to --transport auto.
- websocket: use Jupyter Server kernel channels at /api/kernels/<kernel_id>/channels
- zmq: fall back to the local kernel connection file with jupyter_client
- auto: try websocket first, then use the local ZMQ fallback only when the websocket request did not already reach the kernel
Prefer auto unless you are debugging transport behavior.
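The auto policy amounts to a guarded fallback: only failures that prove the request never reached the kernel trigger the ZMQ path, because retrying a request the kernel already received could execute side effects twice. A minimal sketch with stand-in transports; the names are illustrative, not the script's actual API:

```python
class TransportUnreachable(Exception):
    """Raised when the websocket request never reached the kernel."""

def run_auto(code: str, websocket_exec, zmq_exec):
    # Try the Jupyter Server websocket channels first; fall back to the
    # local ZMQ connection file only if the kernel was never reached.
    try:
        return websocket_exec(code)
    except TransportUnreachable:
        return zmq_exec(code)

# Stand-in transports simulating a server whose websocket is down:
def ws_down(code):
    raise TransportUnreachable("connection refused")

def zmq_ok(code):
    return f"ran via zmq: {code}"

print(run_auto("x + 1", ws_down, zmq_ok))  # → ran via zmq: x + 1
```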
- contents returns the saved notebook file, not unsaved browser edits.
- edit writes through the Contents API and changes the saved notebook on disk.
- variables list caps results at --limit <= 100.
- variables preview truncates values to --max-chars <= 2000 and avoids arbitrary repr(...) calls for non-scalar objects.
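The preview limits can be pictured as a truncate-and-guard step: scalars are cheap and safe to repr, while arbitrary objects can have slow or enormous __repr__ implementations. A minimal sketch, assuming names; this is not the script's actual implementation:

```python
SCALARS = (int, float, bool, complex, str, bytes, type(None))

def safe_preview(value, max_chars: int = 2000) -> str:
    # Only scalars get repr()'d directly; anything else is summarized
    # by type (and length, when it has one) instead.
    if isinstance(value, SCALARS):
        text = repr(value)
    else:
        try:
            size = f" len={len(value)}"
        except TypeError:
            size = ""
        text = f"<{type(value).__name__}{size}>"
    return text[:max_chars]

print(safe_preview(42))                   # → 42
print(safe_preview(list(range(10000))))   # → <list len=10000>
```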