Diagnoses and fixes non-indexed pages using GSC URL Inspection API and Bing Webmaster data. Identifies reasons like robots.txt blocks or noindex tags and applies targeted fixes.
Install the plugin:
npx claudepluginhub houseofmvps/ultraship --plugin ultraship
Diagnose why pages aren't indexed in Google and Bing, fix the root causes, and resubmit for indexing. This skill uses real data from GSC URL Inspection API and Bing Webmaster Tools — no guessing.
Goal: 100% index coverage for pages that SHOULD be indexed. Not every URL belongs in the index — staging pages, admin panels, thin pagination, and intentionally private content should stay excluded.
Verify data sources:
ULTRASHIP_GSC_CREDENTIALS or ULTRASHIP_GSC_ACCESS_TOKEN (Google)
ULTRASHIP_BING_KEY (Bing, optional)
If GSC is not configured, show the setup guide and stop; GSC is required for URL inspection.
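A quick preflight in shell (variable names as above; the messages are illustrative):
# Stop early when GSC credentials are missing; GSC is required
if [ -z "$ULTRASHIP_GSC_CREDENTIALS" ] && [ -z "$ULTRASHIP_GSC_ACCESS_TOKEN" ]; then
  echo "GSC not configured: set ULTRASHIP_GSC_CREDENTIALS or ULTRASHIP_GSC_ACCESS_TOKEN" >&2
  exit 1
fi
# Bing is optional; note its absence and continue
[ -z "$ULTRASHIP_BING_KEY" ] && echo "ULTRASHIP_BING_KEY not set; Bing steps will be skipped"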
Get the current index state:
node ${CLAUDE_PLUGIN_ROOT}/tools/index-doctor.mjs coverage <site-url>
This shows the site's current coverage breakdown: which URLs are indexed, which are excluded, and why.
If Bing key is available:
node ${CLAUDE_PLUGIN_ROOT}/tools/index-doctor.mjs compare <site-url> <sitemap-url>
Compare Google vs Bing indexing to find pages that one engine indexes but the other doesn't; a page missing from both usually points to an on-site problem rather than engine-specific behavior.
Run the full diagnosis:
node ${CLAUDE_PLUGIN_ROOT}/tools/index-doctor.mjs diagnose <site-url> <sitemap-url>
This inspects up to 50 URLs via the GSC URL Inspection API and reports the issue types below (an API-level sketch follows the list):
Blocked by robots.txt (critical): a Disallow rule prevents the URL from being crawled.
Noindex tag (critical): <meta name="robots" content="noindex"> in the page HTML (or an X-Robots-Tag response header).
Soft 404 (high): the page returns 200 but Google treats it as not found, usually because the content is thin or empty.
Crawled but not indexed (high): Google fetched the page and chose not to index it, typically a content-quality signal.
Discovered but not crawled (medium): Google knows the URL but hasn't fetched it yet, often a crawl-budget or internal-linking issue.
Redirect (medium): the URL redirects, so the destination is what gets indexed.
Canonical mismatch (medium): Google selected a canonical different from the one the page declares.
Server error (critical): the URL returned a 5xx response when Google tried to fetch it.
404 Not Found (high): the URL returns 404.
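For reference, a single inspection at the raw API level looks like this (a sketch; example.com stands in for your verified property, and the token comes from the preflight above):
# Inspect one URL; the verdict and coverage fields drive the categories above
curl -s -X POST 'https://searchconsole.googleapis.com/v1/urlInspection/index:inspect' \
  -H "Authorization: Bearer $ULTRASHIP_GSC_ACCESS_TOKEN" \
  -H 'Content-Type: application/json' \
  -d '{"inspectionUrl":"https://example.com/page","siteUrl":"https://example.com/"}'
The response's indexStatusResult carries the coverage state (e.g. "Crawled - currently not indexed"), the robots.txt verdict, and both the declared and Google-selected canonical.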
⚠️ SAFETY FIRST: Before applying ANY fix, verify the block/exclusion wasn't intentional. Ask the user if unsure. Removing noindex from staging pages or robots.txt Disallow from admin paths can expose sensitive content.
For each diagnosed issue, apply the fix:
robots.txt fixes: remove or narrow the Disallow rule that blocks the page, once you've confirmed the block wasn't intentional (see the sketch below).
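A minimal before/after sketch (the /blog/ paths are illustrative):
# Before: this rule takes every blog post out of the crawl
User-agent: *
Disallow: /blog/
# After: narrow the rule so only genuinely private paths stay blocked
User-agent: *
Disallow: /blog/drafts/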
noindex fixes: remove the noindex meta tag (or X-Robots-Tag header) from pages that should be indexed.
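The fix itself is usually a one-line change in the page template (a sketch):
<!-- Before: the page is excluded from the index -->
<meta name="robots" content="noindex">
<!-- After: delete the tag, or allow indexing explicitly -->
<meta name="robots" content="index,follow">
If the directive comes from an X-Robots-Tag response header instead, remove it from the server or framework config.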
Content quality fixes: expand thin pages, de-duplicate near-identical ones, and make each page distinct enough to be worth indexing.
Sitemap cleanup: remove URLs that 404, redirect, or carry noindex, then regenerate the sitemap:
node ${CLAUDE_PLUGIN_ROOT}/tools/sitemap-generator.mjs <dir> <base-url>
Internal linking: add links from indexed, well-linked pages to orphaned URLs so crawlers can discover them.
Only resubmit pages where you've made SUBSTANTIAL fixes (added content, removed blocking directives, fixed server errors). Do NOT resubmit unchanged pages — this wastes API quota and can flag your account for spam.
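One way to keep the batch honest is to derive it from your actual change set. A sketch, assuming a git-tracked static site whose built pages live under public/ and map one-to-one to URLs (the paths and domain are hypothetical):
# Turn pages changed since the last deploy into submittable URLs
git diff --name-only HEAD~1 -- public/ \
  | grep '\.html$' \
  | sed -e 's|^public||' -e 's|/index.html$|/|' -e 's|^|https://example.com|'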
After fixes, submit to both search engines:
Google:
# Resubmit sitemap (only after sitemap changes)
node ${CLAUDE_PLUGIN_ROOT}/tools/gsc-client.mjs submit-sitemap <site-url> <sitemap-url>
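If you need to bypass the helper, the underlying Search Console API call is a single PUT (a sketch; both the property and the sitemap path must be URL-encoded, and example.com is illustrative):
# sitemaps.submit: PUT the sitemap path under the verified property
curl -s -X PUT \
  -H "Authorization: Bearer $ULTRASHIP_GSC_ACCESS_TOKEN" \
  "https://www.googleapis.com/webmasters/v3/sites/https%3A%2F%2Fexample.com%2F/sitemaps/https%3A%2F%2Fexample.com%2Fsitemap.xml"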
Bing:
# Submit sitemap (only after sitemap changes)
node ${CLAUDE_PLUGIN_ROOT}/tools/bing-webmaster.mjs submit-sitemap <site-url> <sitemap-url>
# Batch submit specific fixed URLs for fast indexing (max 500/day — only URLs you actually fixed)
node ${CLAUDE_PLUGIN_ROOT}/tools/bing-webmaster.mjs submit-url-batch <site-url> <url1> <url2> ...
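For reference, the tool wraps Bing's SubmitUrlBatch endpoint; a raw sketch (example.com and the URL list are illustrative):
# Batch-submit fixed URLs; the 500/day quota applies here too
curl -s -X POST "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey=$ULTRASHIP_BING_KEY" \
  -H 'Content-Type: application/json; charset=utf-8' \
  -d '{"siteUrl":"https://example.com","urlList":["https://example.com/fixed-page-1","https://example.com/fixed-page-2"]}'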
Run the auto-fix command, which diagnoses AND submits non-indexed URLs:
node ${CLAUDE_PLUGIN_ROOT}/tools/index-doctor.mjs fix <site-url> <sitemap-url>
This automatically runs the diagnosis and then submits the non-indexed URLs it finds, in one pass.
After applying fixes:
node ${CLAUDE_PLUGIN_ROOT}/tools/index-doctor.mjs coverage <site-url>
Advise on preventing future indexing issues: generate the sitemap as part of every deploy, re-run coverage after major releases, and add a build-time guard against stray noindex tags, as sketched below.
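A build-time guard is cheap insurance. A sketch, assuming built HTML lands in dist/ and drafts are the only pages allowed to carry noindex (both paths are hypothetical):
# Fail the build if a public page ships with a stray noindex
if grep -rl --include='*.html' 'name="robots" content="noindex"' dist/ | grep -v '/drafts/'; then
  echo "noindex found on public pages; failing build" >&2
  exit 1
fi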
If fixes cause unexpected issues (pages appearing in search that shouldn't, traffic drops):
Revert the offending change (restore the noindex tag or robots.txt rule), resubmit the affected URLs, then run coverage to verify the rollback took effect.