From readitlater-digest
Generate a themed weekly digest from Obsidian ReadItLater bookmarks. Use this skill when the user runs /readitlater-digest:digest or asks to process bookmarks, create a reading digest, summarize saved articles, or consolidate ReadItLater files. Also trigger on cron/loop invocations targeting bookmark processing.
Install: npx claudepluginhub kjgarza/marketplace-claude --plugin readitlater-digest
Generate a themed digest from unprocessed Obsidian ReadItLater bookmarks. Designed for automated runs via cron or /loop.
Read settings from .claude/readitlater-digest.local.md. If it doesn't exist, prompt the user to create one using the template below, then stop.
Required settings (YAML frontmatter):
| Field | Default | Description |
|---|---|---|
| vault_path | /Volumes/Verbatim-Vi560-Media/Development/notes/scratchpad | Obsidian vault root |
| inbox_folder | ReadItLater | Folder where ReadItLater saves bookmarks |
| digest_folder | Digests | Folder for generated digests |
| archive_folder | <inbox_folder>/Archive | Where processed bookmarks get moved |
Derived paths:
- db_path = <vault_path>/.readitlater-digest.db
- inbox_path = <vault_path>/<inbox_folder>
- digest_path = <vault_path>/<digest_folder>
- archive_path = <vault_path>/<archive_folder>

If archive_folder is not set, derive it as <inbox_folder>/Archive.
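The derivations above can be sketched in shell. The values below are illustrative stand-ins, not real settings:

```shell
# Illustrative settings; in practice these come from .claude/readitlater-digest.local.md.
vault_path="/tmp/vault"
inbox_folder="ReadItLater"
digest_folder="Digests"
# Fall back to <inbox_folder>/Archive when archive_folder is unset.
archive_folder="${archive_folder:-$inbox_folder/Archive}"

db_path="$vault_path/.readitlater-digest.db"
inbox_path="$vault_path/$inbox_folder"
digest_path="$vault_path/$digest_folder"
archive_path="$vault_path/$archive_folder"
echo "$archive_path"
```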
Before running any script, resolve the bun executable once:
BUN=$(command -v bun 2>/dev/null || echo "$HOME/.bun/bin/bun")
Use $BUN in place of bun throughout all subsequent script invocations. This prevents silent failures in cron/non-interactive shells where .zshrc is not sourced.
Run these steps in order. If --dry-run is passed as an argument, stop after Step 2 and report what would be digested.
Before starting the pipeline, check whether a digest already exists for the current week:
sqlite3 "<db_path>" \
"SELECT id, file_path FROM digests WHERE week_start = '<week_start>' AND week_end = '<week_end>'"
If a row is returned, abort immediately and report: "A digest already exists for <week_start> to <week_end>: <file_path>. Pass --force to regenerate." Do not proceed unless the user explicitly confirms.
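One way to compute <week_start> and <week_end> for this check — a sketch assuming GNU date (as on Linux; BSD/macOS date uses -v offsets instead):

```shell
# Monday-Sunday bounds of the current week (GNU date assumed).
dow=$(date +%u)                                  # 1 = Monday ... 7 = Sunday
week_start=$(date -d "-$((dow - 1)) day" +%F)    # most recent Monday (today, if today is Monday)
week_end=$(date -d "$week_start +6 day" +%F)     # the following Sunday
echo "$week_start .. $week_end"
```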
$BUN run ${CLAUDE_PLUGIN_ROOT}/scripts/init-db.ts --db-path "<db_path>"
This is idempotent — safe to run every time.
Also ensure the archive path exists:
mkdir -p "<archive_path>"
$BUN run ${CLAUDE_PLUGIN_ROOT}/scripts/scan-bookmarks.ts \
--inbox-path "<inbox_path>" \
--db-path "<db_path>"
The script outputs JSON with new, duplicates, already_tracked, and errors fields.
If there are zero new + zero unprocessed bookmarks, report "No unprocessed bookmarks" and stop. Do not generate an empty digest.
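The zero-work check can be scripted with jq, assuming jq is available and the scan fields are JSON arrays (the exact output shape of scan-bookmarks.ts is an assumption here; a full check would also query the database for unprocessed rows):

```shell
# Example scan output; in the pipeline this comes from scan-bookmarks.ts.
scan='{"new": [], "duplicates": ["a.md"], "already_tracked": [], "errors": []}'

new_count=$(printf '%s' "$scan" | jq '.new | length')
if [ "$new_count" -eq 0 ]; then
  echo "No unprocessed bookmarks"   # stop here; do not generate an empty digest
fi
```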
If --dry-run, print the scan results and stop here.
Query the database or use the scan output to identify all unprocessed bookmarks.
Batch-read all bookmark files in one pass — run a single shell command to read all inbox markdown files at once rather than one file per invocation:
cat "<inbox_path>"/*.md "<inbox_path>"/**/*.md 2>/dev/null
Note: bash treats ** like * unless shopt -s globstar is enabled, so deeply nested files may be missed in a default shell.
Or use the Read tool on each file in a single parallel batch rather than sequentially.
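A glob-free alternative that reaches nested files regardless of shell options, demonstrated on a throwaway directory (in the pipeline, substitute <inbox_path>, and keep the Archive exclusion so processed bookmarks are not re-read):

```shell
# Build a toy inbox for demonstration; the real call targets <inbox_path>.
inbox=$(mktemp -d)
mkdir -p "$inbox/sub" "$inbox/Archive"
printf '# alpha\n' > "$inbox/a.md"
printf '# beta\n'  > "$inbox/sub/b.md"
printf '# done\n'  > "$inbox/Archive/old.md"

# find recurses without globstar and skips already-archived bookmarks.
find "$inbox" -name '*.md' -not -path '*/Archive/*' -exec cat {} +
```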
Classify bookmarks before enrichment. After reading, separate bookmarks into two groups: resolvable (the bookmark has a working source URL) and unresolvable (no URL, or the link is dead).
Content enrichment — for resolvable bookmarks that are thin (<200 words), fetch the original URL with WebFetch. Run fetches concurrently where possible. If the fetch fails or returns little content, treat as unresolvable.
Extract substantially more content per bookmark than a bare summary — capture key arguments, notable quotes, specific data points, and the author's main thesis. The goal is enough material to write informed editorial commentary, not just a blurb.
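The thinness threshold from the rule above can be checked mechanically. A sketch, using a stand-in file (real bookmarks live under <inbox_path>):

```shell
# Stand-in bookmark with a short auto-generated excerpt.
f=$(mktemp)
printf 'Saved link with only a short auto-generated excerpt.\n' > "$f"

# "Thin" per this skill's rule: body under 200 words.
words=$(wc -w < "$f")
if [ "$words" -lt 200 ]; then
  echo "thin: fetch the source URL for enrichment"
fi
```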
Determine week boundaries (default: Monday-Sunday of the current week, or use --week YYYY-MM-DD to specify the Monday).
Week label accuracy: The week frontmatter field reflects the calendar week boundaries used to select bookmarks (e.g., 2026-03-30 to 2026-04-05). Additionally, include a date_range field showing the actual span of bookmark date_saved values (e.g., 2026-03-24 to 2026-03-29). If bookmarks span multiple weeks, use the most recent Monday-Sunday as week but always set date_range to the actual bookmark dates.
Write the digest to: <digest_path>/Digest — <week_start> to <week_end>.md
Template:
---
type: digest
date_generated: <ISO date>
week: <week_start> to <week_end>
date_range: <earliest_bookmark_date> to <latest_bookmark_date>
bookmark_count: <count>
themes:
- <theme 1>
- <theme 2>
---
# <Creative Title>
<Opening paragraph: 2-4 sentences establishing the throughline — what connected this week's reading, why it matters, and the editorial lens.>
## <Theme Name — evocative, not just descriptive>
<Editorial prose weaving multiple bookmarks into a narrative. Link to sources inline using [display text](url) markdown links. Discuss, compare, and comment on the articles rather than listing them. Each paragraph should read like a newsletter essay — opinionated, specific, and useful. Mention what's interesting, what's surprising, what connects to other reads. If a bookmark's content was thin or inaccessible, acknowledge it naturally in the prose (e.g., "though the full content wasn't extractable from the bookmark").>
<Continue with more paragraphs as needed for the theme. Multiple bookmarks per paragraph when they relate.>
## <Next Theme>
...
## <Catch-all section: "Quick Saves" or similar>
<Brief mentions of bookmarks that don't warrant full commentary but are worth noting. One or two sentences each, still in prose form.>
## Unresolvable Links
<Include this section only if there were unresolvable bookmarks. List each as a bullet with title and URL. No editorial commentary — just the link.>
- [<title>](<url>)
- ...
---
<Closing reflection: 1-2 sentences reflecting on the week's reading as a whole — a parting thought, not a summary.>
*Generated by ReadItLater Digest*
Writing style rules:
- Weave [display text](url) markdown links into sentences — never structured link blocks.

When recording the digest in the database, pass each bookmark file as a separate --bookmark-files flag. Do not use comma-separated values — filenames can contain commas.
$BUN run ${CLAUDE_PLUGIN_ROOT}/scripts/update-db.ts \
--db-path "<db_path>" \
--digest-file "<relative_path_to_digest>" \
--bookmark-files "<file1.md>" \
--bookmark-files "<file2.md>" \
--bookmark-files "<file with, comma.md>" \
--week-start "<YYYY-MM-DD>" \
--week-end "<YYYY-MM-DD>" \
--themes-json '<JSON array of {"name": "...", "count": N}>'
Build the argument list dynamically in shell:
ARGS=(--db-path "<db_path>" --digest-file "<relative_path_to_digest>" \
--week-start "<YYYY-MM-DD>" --week-end "<YYYY-MM-DD>" \
--themes-json '<json>')
for f in "${BOOKMARK_FILES[@]}"; do
ARGS+=(--bookmark-files "$f")
done
$BUN run ${CLAUDE_PLUGIN_ROOT}/scripts/update-db.ts "${ARGS[@]}"
$BUN run ${CLAUDE_PLUGIN_ROOT}/scripts/cleanup.ts archive \
--db-path "<db_path>" \
--inbox-path "<inbox_path>" \
--archive-path "<archive_path>"
After completing the pipeline, print a short summary: how many bookmarks were digested, the themes, the digest file path, and how many files were archived.

Reminders:
- If some files fail to archive, report which ones were archived and continue.
- Always set date_range in frontmatter to the actual bookmark date span.
- Pass repeated --bookmark-files flags in Step 6. Filenames can contain commas.