npx claudepluginhub wunki/amplify --plugin ask-questions-if-underspecified

This skill uses the workspace's default tool permissions.
Produce a SQLite vault that Ampi can query through:

- search_vault_keyword
- search_vault_semantic
- search_vault_deep
- lookup_vault_records

The output vault includes:

- chunks and docs views
- chunks_fts keyword index
- search_schema and build_info
- amplify_search_manifest with entity-level capabilities

Prefer the documents command for non-technical users: one command (documents) runs ingest + build + check. When the result reports ok: true, the vault is Ampi-ready. Use easy only when the source table already exists in SQLite.
documents folder
|
v
documents ingest -> source_table rows
|
v
build contract objects (chunks/docs + fts + semantic + manifest)
|
v
check contract (keyword + lookup + semantic smoke checks)
|
v
Ampi-ready SQLite vault
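Once built, the contract objects are plain SQLite and can be queried directly. A minimal sketch of the keyword path, using an in-memory FTS5 table as a stand-in for a real vault's chunks_fts index (the column names here are an illustrative assumption, not the exact contract):

```python
import sqlite3

# Stand-in vault: a real chunks_fts index is built by the bootstrap script;
# this tiny in-memory table only demonstrates the query shape.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE chunks_fts USING fts5(chunk_id, text)")
con.executemany(
    "INSERT INTO chunks_fts VALUES (?, ?)",
    [("t1", "customer reports a refund delay"),
     ("t2", "login page renders blank on mobile")],
)

# Keyword search: FTS5 MATCH plus bm25() relevance ranking.
rows = con.execute(
    "SELECT chunk_id FROM chunks_fts WHERE chunks_fts MATCH ? "
    "ORDER BY bm25(chunks_fts)",
    ("refund",),
).fetchall()
print(rows)  # [('t1',)]
```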
Before running commands, ask the user:

- What formats are the source documents in (md, txt, docx, doc)?

If the user does not know entity names, propose domain nouns immediately.
Use domain terms, not technical storage words.
Good pairs:
- tickets + customers
- call_notes + accounts
- product_feedback + products
- research_findings + studies

Avoid generic pairs:

- chunks + docs

Defaults:
- --source-table (the examples below use source_rows)
- if not given, the docs entity name defaults to <chunks-entity-name>_docs

Before running any command, resolve the absolute path to the script. Use the amplify repo root (ask the user if unknown, or locate it via git rev-parse --show-toplevel from the working directory):
SCRIPT=/path/to/amplify/skills/create-vault/scripts/bootstrap_ampi_vault.py
All command examples below use $SCRIPT as a placeholder for this path.
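That resolution step can be sketched in shell, falling back to the working directory when not inside a git checkout (the skills/... subpath is taken from the example above):

```shell
# Resolve the amplify repo root; fall back to the current directory
# if git is unavailable or we are outside a checkout.
REPO_ROOT=$(git rev-parse --show-toplevel 2>/dev/null || pwd)
SCRIPT="$REPO_ROOT/skills/create-vault/scripts/bootstrap_ampi_vault.py"
echo "$SCRIPT"
```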
Document-first one-command path (preferred for new vaults from files):
python3 $SCRIPT documents \
--db /path/to/my-vault.sqlite \
--input-dir /path/to/documents \
--source-table source_rows \
--overwrite-table \
--chunks-entity-name tickets \
--docs-entity-name customers
--overwrite-table drops and recreates the source table before ingest. Include
it on every fresh build or rebuild. Omit only when intentionally appending to
an existing table.
Existing SQLite one-command path (source table must already exist):
python3 $SCRIPT easy \
--db /path/to/my-vault.sqlite \
--source-table source_rows \
--chunks-entity-name tickets \
--docs-entity-name customers
easy runs build then check. build runs build only (no check).
Inspect a source table before build. Run this when the table columns are unfamiliar or when a previous build failed with a field-mapping error:
python3 $SCRIPT inspect \
--db /path/to/my-vault.sqlite \
--source-table source_rows
Output shows detected columns, inferred field mapping, and a suggested build
command. If required fields (id_field, text_field) cannot be inferred,
supply them explicitly with --id-field and --text-field.
Validate an existing vault:
python3 $SCRIPT check \
--db /path/to/my-vault.sqlite
Keyword-only mode (skip sparse semantic index and semantic/deep capabilities):
python3 $SCRIPT documents \
--db /path/to/my-vault.sqlite \
--input-dir /path/to/documents \
--source-table source_rows \
--overwrite-table \
--no-semantic
Supported input formats:

- .md, .markdown, .txt: native parsing.
- .docx: native parsing from Office XML.
- .doc: best-effort parsing via textutil (macOS) or antiword (Linux); skipped with a warning if neither is available.

If many .doc files are skipped, install antiword (Linux: apt install antiword, macOS: brew install antiword) or convert the files to .docx first.
After ingest, check warnings_count in the output. A high count relative to total files means many documents were skipped; investigate before treating the vault as complete.
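That ratio check can be scripted; a sketch assuming the ingest summary is a dict with warnings_count (documented above) and a total-file count whose key name (files_seen) and threshold (25%) are illustrative assumptions:

```python
# Illustrative ingest summary: warnings_count comes from the real output;
# the files_seen key and the 0.25 threshold are hypothetical stand-ins.
ingest = {"files_seen": 40, "warnings_count": 12}

ratio = ingest["warnings_count"] / ingest["files_seen"]
if ratio > 0.25:
    print(f"{ingest['warnings_count']} of {ingest['files_seen']} files "
          "skipped -- investigate before treating the vault as complete")
```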
Versioning:

- contract_version uses CalVer (default example: 2026.02.13.1).
- manifest_json.version is numeric (--manifest-version, default 1).

Build fails with "Unable to resolve required field"
Auto-mapping could not find a column matching id_field or text_field. Run
inspect to see the detected columns, then re-run the build command with
explicit --id-field and --text-field flags.
Build fails with "No rows available in chunks view"
The input directory was empty, all files were skipped, or the source table has
no rows. Confirm the --input-dir path is correct and contains supported file
types. Check warnings_count to see if every file was rejected.
check output shows ok: false
Read the errors array in the output. Common causes: missing chunks_fts
table (FTS5 not compiled into SQLite), missing manifest table, or zero rows in
views. Re-run the build after addressing the specific error, or run with
--no-semantic if FTS5-related errors appear.
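The ok flag and errors array can be triaged mechanically. A sketch assuming the check output is JSON carrying those two fields (the surrounding payload shape is an assumption for illustration):

```python
import json

# Example check output: ok and errors are documented above;
# the exact payload is a hypothetical illustration.
raw = '{"ok": false, "errors": ["missing table: chunks_fts"]}'
report = json.loads(raw)

if not report["ok"]:
    for err in report["errors"]:
        # Route FTS5 problems toward the upgrade-SQLite advice below.
        if "chunks_fts" in err:
            print("FTS5-related:", err)
        else:
            print("error:", err)
```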
FTS5 not available
Some SQLite builds (notably older system SQLite on macOS) lack FTS5. Check with
python3 -c "import sqlite3; sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE t USING fts5(x)')".
If this raises OperationalError, install a newer SQLite or use --no-semantic
(which skips the sparse index but still requires FTS5 for keyword search — if
FTS5 is absent, the vault cannot be built and the user must upgrade SQLite).
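The one-liner above can be wrapped in a small helper when scripting the pre-flight check:

```python
import sqlite3

def fts5_available() -> bool:
    """Return True if this Python's SQLite was compiled with FTS5."""
    try:
        sqlite3.connect(":memory:").execute(
            "CREATE VIRTUAL TABLE fts5_probe USING fts5(x)"
        )
        return True
    except sqlite3.OperationalError:
        return False

print(fts5_available())
```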
Rebuilding an existing vault
Re-run the same documents or easy command with --overwrite-table. This
drops and recreates the source table and rebuilds all contract objects cleanly.
Checklist:

- Run validation (check) after build unless the user explicitly skips it.
- If check reports ok: false, read the errors array and diagnose before reporting results.
- Report warnings_count from document ingest so users know what was skipped.
- Summarize what the vault exposes (entities, capabilities, counts).