Archives investigation findings into Obsidian vaults or directories as structured knowledge with entity notes, methodology notes, tool notes, registries, and wikilinks. Standalone or Spotlight pipeline.
Install:

npx claudepluginhub buriedsignals/skills --plugin spotlight

This skill uses the workspace's default tool permissions.
You are archiving confirmed investigation findings into a structured knowledge base.
This skill instructs. You — the host session — execute. You read investigation files, write vault notes, update registries, and maintain the knowledge graph. The user sees the result; you do the work.
Two input modes:

- Standalone — the user runs /ingest. You gather inputs interactively.
- Pipeline — the orchestrator passes the project path and vault config (from .spotlight-config.json).

Pipeline mode. All inputs are known:

- vault_path — target vault or directory
- vault_type — "obsidian" or "directory"
- project — project slug

Read these case files:
cases/{project}/findings.json
cases/{project}/fact-check.json
cases/{project}/investigation-log.json
cases/{project}/summary.json
Skip to the Ingestion Process.
Standalone mode. The user says /ingest or "ingest these findings."
Step 1 — Findings source:
"Point me to your findings file (JSON with claims, sources, and evidence)."
Read the file. Validate it contains a findings array where entries have sources. If the structure is wrong:
"This file doesn't match the expected format. I need a JSON file with a findings array where each finding has claim, sources, and evidence fields."
STOP.
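The validation step above can be sketched as follows. The function name and error wording are illustrative, not part of the skill; it checks only the structure the prompt describes (a top-level findings array whose entries carry claim, sources, and evidence).

```python
import json

def validate_findings(path: str) -> list:
    """Validate a findings file before ingesting.

    Raises ValueError when the structure does not match the expected
    format: a top-level "findings" array of objects, each with
    "claim", "sources", and "evidence" fields.
    """
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    findings = data.get("findings")
    if not isinstance(findings, list):
        raise ValueError("expected a top-level 'findings' array")
    required = {"claim", "sources", "evidence"}
    for i, finding in enumerate(findings):
        if not isinstance(finding, dict):
            raise ValueError(f"finding {i} is not an object")
        missing = required - finding.keys()
        if missing:
            raise ValueError(f"finding {i} is missing fields: {sorted(missing)}")
    return findings
```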
Step 2 — Vault target:
"Which vault or directory should I archive to?"
Check if the path contains .obsidian/:
ls -d {path}/.obsidian/ 2>/dev/null
If it does, vault_type = "obsidian" — wikilinks enabled. Otherwise, vault_type = "directory" — relative markdown links.

Step 3 — Supplementary files:
Check whether these exist alongside the findings file. Use whatever is available; do not require all of them:
- fact-check.json — verdict annotations
- investigation-log.json — methodology, tools, search queries
- summary.json — overview and conclusions

Proceed to the Ingestion Process with whatever files were found.
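A minimal sketch of the supplementary-file check, assuming the files sit in the same directory as the findings file (the helper name is hypothetical):

```python
from pathlib import Path

def find_supplements(findings_path: str) -> dict:
    """Return whichever optional supplementary files exist alongside
    the findings file. None of them are required."""
    base = Path(findings_path).parent
    names = ("fact-check.json", "investigation-log.json", "summary.json")
    return {name: base / name for name in names if (base / name).exists()}
```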
Ingestion Process — seven steps. Execute in order. Do not skip steps.
Step 1 — Load registries. Read all registry files:
{vault}/_registry.json
{vault}/investigations/_registry.json
{vault}/entities/_registry.json
{vault}/methodology/_registry.json
{vault}/tools/_registry.json
If the vault is empty (registries do not exist), initialize each with the empty schema from references/registry-spec.md and schema_version: "1.0". Create the directories:
{vault}/investigations/
{vault}/entities/
{vault}/methodology/
{vault}/tools/
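The empty-vault initialization can be sketched as below. The empty-registry shape here is a placeholder assumption; the authoritative schema lives in references/registry-spec.md.

```python
import json
from pathlib import Path

SECTIONS = ("investigations", "entities", "methodology", "tools")

def init_vault(vault: str) -> None:
    """Create the section directories and, where missing, seed each
    registry (plus the master registry) with an empty schema."""
    root = Path(vault)
    empty = {"schema_version": "1.0", "entries": {}}
    for section in SECTIONS:
        d = root / section
        d.mkdir(parents=True, exist_ok=True)
        reg = d / "_registry.json"
        if not reg.exists():
            reg.write_text(json.dumps(empty, indent=2), encoding="utf-8")
    master = root / "_registry.json"
    if not master.exists():
        master.write_text(json.dumps(empty, indent=2), encoding="utf-8")
```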
Step 2 — Investigation note. Write {vault}/investigations/{project-id}.md.
Frontmatter — per references/entity-model.md Investigation Note schema:
---
id: {project-id}
title: {from summary.json title, or derive from findings}
status: confirmed
date: {today YYYY-MM-DD}
regions: [{from findings}]
entities: [{entity IDs extracted in Step 3}]
methodology: [{technique IDs extracted in Step 4}]
tools: [{tool IDs extracted in Step 5}]
tags: [{derived from findings topics}]
verified_count: {count of high-confidence verified findings}
total_findings: {total findings count}
---
Body:
- Overview — from summary.json. If no summary.json, synthesize from findings.
- Findings — for each finding marked disputed or debunked, flag prominently: > **DISPUTED** — {reason} or > **DEBUNKED** — {reason}
- Entities Involved — [[entity-id]] for each entity involved.
- Methodology — [[technique-id]], [[tool-id]].

Step 3 — Entities. Extract entities from:

- findings.json — connections[].from and connections[].to
- findings.json — named entities in findings[].claim (apply basic NER: proper nouns, organization names, geographic names)

Infer entity type:
| Pattern | Type |
|---|---|
| Person names (first + last) | person |
| Known organization patterns (UN, EU, ministry, commission, etc.) | organization |
| Company indicators (Inc, Ltd, GmbH, AG, SA, etc.) | company |
| Geographic names (countries, cities, regions) | place |
Generate kebab-case ID from entity name.
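A heuristic sketch of the type table and ID generation above. The hint lists come straight from the table; the gazetteer parameter known_places and the fallback choice are assumptions, and real NER would do better.

```python
import re

COMPANY_HINTS = ("Inc", "Ltd", "GmbH", "AG", "SA")

def infer_entity_type(name: str, known_places=frozenset()) -> str:
    """Map an entity name to person/organization/company/place
    using the pattern table above."""
    words = [w.strip(".,") for w in name.split()]
    if name in known_places:
        return "place"
    if any(w in COMPANY_HINTS for w in words):
        return "company"
    if any(w in ("UN", "EU") or w.lower() in ("ministry", "commission")
           for w in words):
        return "organization"
    if len(words) == 2 and all(w[:1].isupper() for w in words):
        return "person"  # first + last name pattern
    return "organization"  # assumed fallback when nothing matches

def kebab_id(name: str) -> str:
    """Generate a kebab-case ID from an entity name."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
```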
If entity exists in {vault}/entities/_registry.json (match on id):

- Append a row to its Role in Investigations table: | [[{project-id}]] | {role description} | {date} |
- Add {project-id} to the frontmatter investigations array (if not already present).

If entity is new:
Create {vault}/entities/{entity-id}.md per references/entity-model.md:
---
id: {entity-id}
type: {inferred type}
subtype: {if determinable, else omit}
aliases: [{alternate names found in findings}]
country: {if determinable}
region: {if determinable}
investigations: [{project-id}]
first_seen: {today YYYY-MM-DD}
---
Body: Description, Role in Investigations table (one row for this project), Key Relationships (wikilinks to other entities from same investigation).
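The existing-entity update (adding {project-id} to a note's frontmatter investigations array, idempotently) can be sketched as a text-level edit. This assumes the inline-list form shown in the templates; a real implementation would parse the YAML properly.

```python
import re

def add_investigation(note_text: str, project_id: str) -> str:
    """Add project_id to the frontmatter 'investigations' array of an
    existing note, if not already present. Leaves the note untouched
    when the field is absent or the id is already listed."""
    m = re.search(r"^investigations: \[(.*?)\]$", note_text, re.MULTILINE)
    if not m:
        return note_text
    items = [s.strip() for s in m.group(1).split(",") if s.strip()]
    if project_id in items:
        return note_text
    items.append(project_id)
    replacement = "investigations: [" + ", ".join(items) + "]"
    return note_text[:m.start()] + replacement + note_text[m.end():]
```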
Step 4 — Methodology. Extract techniques from investigation-log.json:

- cycles[].methodology.techniques_used

If technique exists in {vault}/methodology/_registry.json:

- Append a row to its Usage History table: | [[{project-id}]] | {context} | {date} |
- Append cycles[].methodology.failed_approaches to the "Lessons Learned" section.
- Add {project-id} to the frontmatter investigations array.

If technique is new:
Create {vault}/methodology/{technique-id}.md per references/entity-model.md:
---
id: {technique-id}
type: technique
category: {infer from technique name}
tools: [{tool IDs used with this technique}]
investigations: [{project-id}]
---
Body: Description, Steps (if inferable from log), Tools (wikilinked), Usage History table, Lessons Learned.
Step 5 — Tools. Extract tools from investigation-log.json:

- cycles[].methodology.tools_used

If tool exists in {vault}/tools/_registry.json:

- Increment usage_count in frontmatter.
- Add cycles[].methodology.search_queries to Tips for Future Agents only if they are not duplicates of existing advice.
- Add {project-id} to the frontmatter investigations array.

If tool is new:
Create {vault}/tools/{tool-id}.md per references/entity-model.md:
---
id: {tool-id}
type: tool
category: {infer from tool name}
url: {if known}
access: {if known, else omit}
methodology: [{technique IDs that use this tool}]
investigations: [{project-id}]
usage_count: 1
---
Body: Capabilities, Access Notes, Usage History table (one row), Tips for Future Agents (from search queries if useful).
Step 6 — Registries. This is mandatory. Update every registry affected by the ingestion.
{vault}/investigations/_registry.json — add or update the investigation entry.
{vault}/entities/_registry.json — add new entities, update investigations arrays for existing ones.
{vault}/methodology/_registry.json — add new techniques, update investigations arrays for existing ones.
{vault}/tools/_registry.json — add new tools, update investigations and usage_count for existing ones.
{vault}/_registry.json (master) — update stats counts and last_updated to current ISO 8601 timestamp.
See references/registry-spec.md for exact schemas.
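The master-registry refresh can be sketched as below. The stats and entries field names follow the spirit of the schemas but are assumptions; defer to references/registry-spec.md for the exact shape.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def update_master_registry(vault: str) -> None:
    """Recount each section registry and stamp the master registry's
    stats and last_updated (ISO 8601, UTC)."""
    root = Path(vault)
    stats = {}
    for section in ("investigations", "entities", "methodology", "tools"):
        reg_path = root / section / "_registry.json"
        reg = json.loads(reg_path.read_text(encoding="utf-8"))
        stats[section] = len(reg.get("entries", {}))
    master_path = root / "_registry.json"
    master = json.loads(master_path.read_text(encoding="utf-8"))
    master["stats"] = stats
    master["last_updated"] = datetime.now(timezone.utc).strftime(
        "%Y-%m-%dT%H:%M:%SZ")
    master_path.write_text(json.dumps(master, indent=2), encoding="utf-8")
```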
Step 7 — Index. Write {vault}/_INDEX.md using the template from references/registry-spec.md.
For Obsidian vaults: use wikilinks in the investigations table ([[project-id]]).
For directory fallback: use relative links ([project-id](investigations/project-id.md)).
When vault_type is "directory" (no .obsidian/ detected):
- Replace [[entity-id]] with relative markdown links: [entity-id](../entities/entity-id.md).
- Replace [[project-id]] with [project-id](../investigations/project-id.md).
- Replace [[technique-id]] with [technique-id](../methodology/technique-id.md).
- Replace [[tool-id]] with [tool-id](../tools/tool-id.md).
- The _INDEX.md browse section uses relative links too.

Frontmatter and registry JSON are identical regardless of vault type.
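The wikilink-to-relative-link rewrite can be sketched as a single regex pass. The sections mapping (id to folder name) is a hypothetical structure the caller would build from the registries; links are written from a note one folder deep, hence the ../ prefix (from _INDEX.md at the vault root, drop the ../).

```python
import re

def delink(text: str, sections: dict) -> str:
    """Rewrite [[some-id]] wikilinks as relative markdown links for
    directory vaults. Unknown ids are left as-is."""
    def repl(m):
        note_id = m.group(1)
        folder = sections.get(note_id)
        if folder is None:
            return m.group(0)  # no registry entry: keep the wikilink
        return f"[{note_id}](../{folder}/{note_id}.md)"
    return re.sub(r"\[\[([^\]]+)\]\]", repl, text)
```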
Before starting the ingestion process, check for a lock file:
ls {vault}/.ingest-lock 2>/dev/null
If present:
"Another ingestion is in progress. Wait for it to complete before running again."
STOP. Do not proceed.
If absent:
Create the lock:
echo "{project-id} $(date -u +%Y-%m-%dT%H:%M:%SZ)" > {vault}/.ingest-lock
Remove the lock when ingestion completes — whether successful or failed. Always clean up:
rm {vault}/.ingest-lock
If the process errors partway through, remove the lock before reporting the error.
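The lock discipline above maps naturally onto a try/finally. A minimal sketch, assuming the caller passes the ingestion as a callable (the function name and error message are illustrative):

```python
import os
from datetime import datetime, timezone

def run_with_lock(vault: str, project_id: str, ingest) -> None:
    """Create {vault}/.ingest-lock, run ingest(), and remove the lock
    whether ingest() succeeds or raises."""
    lock = os.path.join(vault, ".ingest-lock")
    if os.path.exists(lock):
        raise RuntimeError("Another ingestion is in progress.")
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    with open(lock, "w", encoding="utf-8") as f:
        f.write(f"{project_id} {stamp}\n")
    try:
        ingest()
    finally:
        os.remove(lock)  # always clean up, even on failure
```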
Rules:

- Never create duplicate notes: match on id. If it exists, update it.
- Keep frontmatter exactly per references/entity-model.md. Agents rely on it programmatically. Never omit or rename fields.
- Use the [[entity-id]] format in Obsidian vaults for all cross-references.
- IDs are kebab-case: swiss-leaks, john-doe, reverse-image-search.
- Flag low-confidence findings as > **LOW CONFIDENCE** — {reason}, if included at all.

Reads from:
cases/{project}/findings.json
cases/{project}/fact-check.json
cases/{project}/investigation-log.json
cases/{project}/summary.json
{vault}/_registry.json
{vault}/investigations/_registry.json
{vault}/entities/_registry.json
{vault}/methodology/_registry.json
{vault}/tools/_registry.json
Writes to:
{vault}/investigations/{project-id}.md
{vault}/entities/{entity-id}.md (per entity)
{vault}/methodology/{technique-id}.md (per technique)
{vault}/tools/{tool-id}.md (per tool)
{vault}/investigations/_registry.json
{vault}/entities/_registry.json
{vault}/methodology/_registry.json
{vault}/tools/_registry.json
{vault}/_registry.json (master)
{vault}/_INDEX.md