Analyzes Screaming Frog crawl data (CSV/Excel) for technical SEO audits including broken links, duplicate content, location page quality, schema validation, and internal linking.
`npx claudepluginhub garrettjsmith/localseoskills`

This skill uses the workspace's default tool permissions.
Screaming Frog is a desktop crawler with community-built MCP servers available. Even without MCP, the agent can analyze exported crawl data (CSV/Excel). This is your primary technical SEO audit tool.
| You Need | Use Screaming Frog | Use Instead |
|---|---|---|
| Full technical site crawl | ✅ Most detailed crawler | Semrush site audit (lighter) |
| Location page quality audit at scale | ✅ Best for this | — |
| Custom data extraction (NAP, schema fields) | ✅ Unique capability | — |
| Duplicate content detection | ✅ | — |
| Internal linking analysis | ✅ | — |
| Redirect chain detection | ✅ | — |
| Schema validation per page | ✅ Custom extraction | — |
| Missing titles/metas across hundreds of pages | ✅ | — |
| Keyword rankings | ❌ | Local Falcon, Semrush |
| Backlink data | ❌ | Ahrefs |
| Search traffic data | ❌ | GSC, GA4 |
| Citation data | ❌ | BrightLocal |
With an MCP server connected, the agent can trigger crawls and read results directly.

Without MCP, the user runs the crawl locally and exports the data, and the agent analyzes the exported CSV/Excel files. Tell the user what to export:

- Internal > HTML (titles, metas, H1s, word counts, status codes, canonicals, inlink counts)
- Custom extraction results (schema and NAP, configured below)
- The Redirect Chains report (for redirect audits)
- The All Inlinks bulk export (for internal linking analysis)
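A minimal sketch of how the agent might load these exports with pandas. The file names and the location-URL pattern are assumptions; Screaming Frog's export names vary by version and settings:

```python
import pandas as pd

# File names are assumptions -- match them to the user's actual exports.
internal = pd.read_csv("internal_html.csv")    # main per-URL crawl data
inlinks = pd.read_csv("all_inlinks.csv")       # source -> destination link graph
chains = pd.read_csv("redirect_chains.csv")    # redirect chains report

# Narrow the crawl to location pages; the URL pattern is a placeholder.
locations = internal[internal["Address"].str.contains("/locations/", na=False)]
print(f"{len(locations)} location pages found in crawl")
```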
When: User has a multi-location site and needs to verify all location pages are technically sound.
What to check in crawl data:
| Check | Where to Find | What's Wrong If... |
|---|---|---|
| Title tags | Title 1 column | Duplicate titles across locations, missing titles, truncated titles |
| Meta descriptions | Meta Description 1 | Duplicate metas, missing metas, boilerplate metas |
| H1 tags | H1-1 column | Missing H1, duplicate H1s, H1 doesn't include service + city |
| Word count | Word Count column | Under 300 words = thin content (Google may not index) |
| Status codes | Status Code column | 404 errors, 302 redirects (should be 301), 5xx errors |
| Canonical tags | Canonical Link Element 1 | Self-referencing canonical missing, or canonical pointing to wrong page |
| Internal links in | Inlinks column | 0 or 1 internal links = orphan page |
| Schema present | Custom extraction needed | No LocalBusiness schema on location pages |
| NAP on page | Custom extraction needed | Missing or inconsistent NAP |
| Page speed | PageSpeed tab (if PSI integration enabled) | LCP > 2.5s, CLS > 0.1 |
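A sketch of the core checks from this table, assuming the `locations` DataFrame from the loading sketch above and the standard Screaming Frog column names shown in the table:

```python
# Duplicate titles across locations (same pattern works for metas and H1s)
dup_titles = locations[locations.duplicated("Title 1", keep=False)]

# Thin content: under 300 words
thin = locations[locations["Word Count"] < 300]

# Status problems: anything that is not a clean 200
bad_status = locations[locations["Status Code"] != 200]

# Orphan pages: 0 or 1 inlinks
orphans = locations[locations["Inlinks"] <= 1]

# Canonical pointing somewhere other than the page itself
bad_canonical = locations[
    locations["Canonical Link Element 1"].notna()
    & (locations["Canonical Link Element 1"] != locations["Address"])
]

for name, df in [("Duplicate titles", dup_titles), ("Thin content", thin),
                 ("Bad status codes", bad_status), ("Orphans", orphans),
                 ("Canonical mismatches", bad_canonical)]:
    print(f"{name}: {len(df)} pages")
```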
What to tell the user to configure (in Configuration > Custom > Extraction):

LocalBusiness Schema Extraction:
- CSSPath `script[type="application/ld+json"]`, set to extract the inner HTML (the raw JSON-LD)

NAP Extraction:
- Site-specific: point CSSPath or XPath extractors at the page template's address block and phone link; the exact selectors depend on the site's markup.
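NAP comparison is stricter than it looks, since formatting differences are not real inconsistencies. A small normalization helper (illustrative only) keeps the phone comparison honest:

```python
import re

def normalize_phone(raw: str) -> str:
    """Strip formatting so (555) 123-4567 and +1 555.123.4567 compare equal."""
    digits = re.sub(r"\D", "", raw or "")
    return digits[-10:]  # compare the last 10 digits (US-style numbers)

# Hypothetical values: one from GBP, one scraped from the page
assert normalize_phone("(555) 123-4567") == normalize_phone("+1 555.123.4567")
```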
Per-page schema validation the agent should do:
- `@type` correct? (e.g., Dentist, Plumber, LocalBusiness)
- `name` matches GBP exactly?
- `address` matches GBP exactly?
- `telephone` matches GBP exactly?
- `openingHoursSpecification` present?
- `geo` coordinates present and correct?
- `areaServed` present (for SABs)?
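A sketch of that checklist in code, assuming the JSON-LD arrives as a raw string from the custom extraction above; the GBP reference record is hypothetical:

```python
import json

# Hypothetical GBP reference data for one location
gbp = {"name": "Acme Dental - Springfield", "telephone": "5551234567",
       "streetAddress": "123 Main St"}

def validate_schema(raw_jsonld: str) -> list[str]:
    """Return problems found in a page's LocalBusiness (or subtype) schema."""
    try:
        data = json.loads(raw_jsonld)
    except (json.JSONDecodeError, TypeError):
        return ["schema missing or not valid JSON"]
    # Pages often emit several JSON-LD blocks; take the first typed one
    blocks = data if isinstance(data, list) else [data]
    biz = next((b for b in blocks if isinstance(b, dict) and b.get("@type")), None)
    if biz is None:
        return ["no typed JSON-LD block found"]
    problems = []
    if biz.get("name") != gbp["name"]:
        problems.append(f"name mismatch: {biz.get('name')!r}")
    if biz.get("telephone") != gbp["telephone"]:
        problems.append(f"telephone mismatch: {biz.get('telephone')!r}")
    if biz.get("address", {}).get("streetAddress") != gbp["streetAddress"]:
        problems.append("address mismatch")
    for key in ("openingHoursSpecification", "geo"):
        if key not in biz:
            problems.append(f"{key} missing")
    if "areaServed" not in biz:
        problems.append("areaServed missing (only a problem for SABs)")
    return problems
```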
When: Multi-location sites often have boilerplate location pages with only the city name changed.

What to check:
- Near-duplicate detection (Configuration > Content > Duplicates in recent Screaming Frog versions): run Crawl Analysis, then review the Content tab's Exact Duplicates and Near Duplicates filters.
- Body copy that repeats across location pages with only the city swapped.
- Duplicate titles, metas, and H1s (already surfaced in the quality audit above).

What "unique enough" looks like: each page carries substance that only applies to that location, such as distinct intro copy, location-specific services and staff, local photos and testimonials, and directions or service-area detail, rather than a template with the city name swapped in.
When: Location pages aren't getting organic traffic and you suspect they're orphaned or poorly linked.
What to check:
- The Inlinks count per location page in the internal export (0 or 1 = orphan).
- The Crawl Depth column: location pages buried more than 3-4 clicks from the homepage get crawled less and carry less authority.
- The All Inlinks bulk export: where links actually come from (nav/footer vs. in-content) and what anchor text they use.

Healthy internal linking for location pages: every page is linked from a locations hub or index page, sits within 2-3 clicks of the homepage, and earns descriptive anchor text (service + city) rather than generic "learn more" links.
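A sketch against the All Inlinks export; the "Source"/"Destination"/"Anchor" column names are assumptions to verify against the actual header row:

```python
import pandas as pd

inlinks = pd.read_csv("all_inlinks.csv")  # Bulk Export > All Inlinks (assumed)
loc_links = inlinks[inlinks["Destination"].str.contains("/locations/", na=False)]

# Inlink counts per location page: the bottom of this list is the orphan risk
counts = loc_links.groupby("Destination").size().sort_values()
print(counts.head(10))

# Generic anchors waste the chance to signal service + city
generic = loc_links[loc_links["Anchor"].str.lower().isin(
    ["click here", "learn more", "read more", "here"])]
print(f"{len(generic)} generic anchors pointing at location pages")
```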
When: User migrated sites, changed URLs, or has old location pages that redirected.
What to check:
- The Redirect Chains report: any chain longer than one hop wastes crawl budget.
- 302s on old location URLs that should be permanent 301s.
- Redirects that land on a 404 or dump users on the homepage instead of the matching new location page.
- Redirect loops.
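A sketch over the Redirect Chains report; the column names ("Number of Redirects", "Final Status Code") are assumptions:

```python
import pandas as pd

chains = pd.read_csv("redirect_chains.csv")  # assumed export name

# Multi-hop chains: each extra hop wastes crawl budget and slows users down
multi_hop = chains[chains["Number of Redirects"] > 1]
print(f"{len(multi_hop)} chains with more than one hop")

# Chains that dead-end on an error instead of a live page
dead_ends = chains[chains["Final Status Code"] >= 400]
print(f"{len(dead_ends)} chains ending in a 4xx/5xx")
```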
Indexing blockers (critical):

| Issue | Impact | How to Find |
|---|---|---|
| Location pages returning 404 | Pages completely invisible | Status Code = 404, filter to location URLs |
| Location pages not in sitemap | Google may not discover them | Cross-reference sitemap URLs with crawled URLs |
| Location pages blocked by robots.txt | Google can't crawl them | Indexability column = "Blocked by Robots.txt" |
| Location pages with noindex | Google won't index them | Meta Robots column contains "noindex" |
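The sitemap cross-reference above might look like this; the sitemap URL and export file name are placeholders:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

import pandas as pd

# Placeholder sitemap URL -- swap in the real one
with urlopen("https://example.com/sitemap.xml") as resp:
    tree = ET.parse(resp)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

internal = pd.read_csv("internal_html.csv")  # assumed export name
crawled = set(internal["Address"])

missing_from_sitemap = crawled - sitemap_urls    # crawlable but undeclared
in_sitemap_not_crawled = sitemap_urls - crawled  # declared but unreachable
print(f"{len(missing_from_sitemap)} crawled URLs missing from sitemap")
print(f"{len(in_sitemap_not_crawled)} sitemap URLs not found in crawl")
```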
Content and structural issues (high priority):

| Issue | Impact | How to Find |
|---|---|---|
| Duplicate titles across locations | Google may suppress duplicates | Title 1 column — sort and find duplicates |
| Thin content (under 300 words) | Google may not index | Word Count column < 300 |
| Missing schema | Losing structured data signals | Custom extraction shows empty |
| Orphan pages (0-1 internal links) | Low crawl priority, low authority | Inlinks column = 0 or 1 |
Optimization opportunities (medium priority):

| Issue | Impact | How to Find |
|---|---|---|
| Missing meta descriptions | Lower CTR from search results | Meta Description 1 = empty |
| Images without alt text | Accessibility + minor SEO signal | Images tab, Alt Text column empty |
| Redirect chains | Wasted crawl budget, slow page loads | Redirect Chains report |
Where each finding leads next:

| What You Found | Next Action | Skill |
|---|---|---|
| Location pages with thin content | Rewrite with unique, substantial content per location | local-landing-pages |
| Missing schema on location pages | Implement LocalBusiness schema | local-schema |
| Duplicate titles/metas | Rewrite with unique, keyword-targeted titles per location | local-landing-pages |
| Orphan location pages | Fix internal linking structure | local-landing-pages |
| Indexing issues (noindex, robots.txt, missing from sitemap) | Fix technical issues | local-seo-audit |
| NAP inconsistencies found via custom extraction | Fix on-page NAP to match GBP exactly | gbp-optimization, local-citations |
| All technical issues documented | Package into audit report for client | client-deliverables |
Default next step: Screaming Frog crawl data is a goldmine, but it's raw. Always prioritize: indexing blockers → content issues → structural issues → optimization opportunities.