Streamline programmatic data ingestion against sites and apps that don't ship a documented API — capture network traffic, map endpoints, infer schemas, and produce a draft OpenAPI spec you can build a stable client against. White-hat use only.
npx claudepluginhub danielrosehill/claude-code-plugins --plugin browser-data-capture

analyze-har — Analyze a HAR file (HTTP Archive — exported from Chrome/Firefox DevTools → Network tab → "Save all as HAR with content") and produce a normalized inventory of the endpoints it captured, with inferred request/response schemas, auth-scheme detection, and a human-readable summary. The zero-install capture path. Use when the user supplies a .har file, says "analyze this HAR", "what endpoints does this site call", or "I exported the network tab, now what".
capture-via-proxy — Capture network traffic via a local mitmproxy instance and emit a normalized capture file the rest of the plugin can analyze. Use when the user wants to capture traffic from a desktop app, a mobile app on the same network, or a browser without using DevTools — anywhere a HAR export isn't practical. The plugin starts mitmdump in the background, walks the user through trusting the local CA, runs for a chosen duration or until stopped, then converts the flow file into the same normalized shape analyze-har produces.
create-domain-map — Produce or update a versioned "domain map" document for a single host — a curated, human-readable summary of what's been learned about a target across one or many capture runs. Aggregates endpoints, auth scheme, pagination style, rate-limit signals, ToS notes, and integration notes the user has added by hand. Writes to the user's private maps repo (if registered via init-private-repo) or to the plugin data directory otherwise. Use when the user says "make a map for example.com", "update the example.com map with this capture", "give me a clean summary of what we've learned about this site".
disclose-finding — Help the user act as a good-faith reporter when a capture run incidentally surfaces something that looks like an unauthenticated endpoint exposing data that probably shouldn't be public, a leaked token in a response, or another security-relevant finding. Drafts a courteous, vendor-friendly disclosure email, identifies the right contact (security.txt, security@, abuse@, public bug-bounty programme), and writes the report into the maps repo so the user has a record. White-hat use only. Use when the user says "we found something we should report", "looks like an open endpoint", "draft a disclosure email for this finding".
generate-openapi — Generate a draft OpenAPI 3.1 spec from an endpoints inventory produced by analyze-har, capture-via-proxy, or observe-tab. Reads endpoints.json plus inferred schemas and emits openapi.yaml and openapi.json. Use when the user wants to build a stable client against a captured surface — "make an OpenAPI spec from this", "generate a spec so I can build a client", "turn the capture into something I can codegen against".
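As a sketch of what generate-openapi's output might look like, here is a minimal fold from an endpoint inventory into an OpenAPI 3.1 skeleton. The inventory field names (`method`, `path`, `status`, `response_schema`) are assumptions for illustration, not the plugin's actual endpoints.json schema:

```python
import json

def endpoints_to_openapi(endpoints, title="Captured API"):
    # Fold each observed (method, path) pair into an OpenAPI paths entry.
    paths = {}
    for ep in endpoints:
        item = paths.setdefault(ep["path"], {})
        item[ep["method"].lower()] = {
            "summary": "Observed {} {}".format(ep["method"], ep["path"]),
            "responses": {
                str(ep.get("status", 200)): {
                    "description": "Response observed during capture",
                    "content": {
                        ep.get("content_type", "application/json"): {
                            "schema": ep.get("response_schema", {})
                        }
                    },
                }
            },
        }
    return {
        "openapi": "3.1.0",
        "info": {"title": title, "version": "0.1.0-draft"},
        "paths": paths,
    }

inventory = [{
    "method": "GET", "path": "/api/items", "status": 200,
    "response_schema": {"type": "array", "items": {"type": "object"}},
}]
spec = endpoints_to_openapi(inventory)
print(json.dumps(spec, indent=2))
```

Schemas inferred from a handful of observed responses are necessarily a draft — hence the `0.1.0-draft` version — and should be reviewed before codegen.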
init-private-repo — Set up a private GitHub repository for version-controlling the user's domain maps and capture history. Creates the repo via gh, clones it locally to a user-chosen path, scaffolds the standard layout (maps/, captures/, README, .gitignore), and registers the local path in the plugin's config so create-domain-map writes there. Use when the user says "set up a private repo for my maps", "I want to version-control my findings", or during setup when they accept the offer to track maps in git.
observe-tab — Optional convenience path: observe a live browser tab via the claude-in-chrome MCP server (read_network_requests) and build an endpoint inventory as the user interacts with the page. Useful when the user is already driving a tab and doesn't want to set up a proxy or export a HAR. Requires the claude-in-chrome MCP server. For most cases prefer analyze-har (zero install) or capture-via-proxy (works against any client). Trigger phrases: "observe this tab", "watch the network as I click around", "map this site as I use it".
setup — First-run setup for Browser-Data-Capture: provisions the user data folder under the standard $CLAUDE_USER_DATA pattern, optionally registers a path to the user's private maps repo for version-controlled storage, and verifies dependencies (mitmproxy, genson, pyyaml, optional openapi-spec-validator). Use when onboarding a new machine, when another skill reports a missing dependency, or when the user says "set up browser-data-capture" / "configure capture" / "where does this plugin store things".
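The `$CLAUDE_USER_DATA` fallback chain that setup provisions can be sketched as a small resolver — this is an illustration of the documented `${CLAUDE_USER_DATA:-${XDG_DATA_HOME:-$HOME/.local/share}/claude-plugins}` pattern, not the plugin's actual code:

```python
import os
from pathlib import Path

def plugin_data_dir(env=os.environ):
    # CLAUDE_USER_DATA wins; otherwise fall back to XDG_DATA_HOME,
    # then to ~/.local/share, appending claude-plugins/.
    base = env.get("CLAUDE_USER_DATA")
    if not base:
        xdg = env.get("XDG_DATA_HOME") or os.path.join(
            env.get("HOME", "~"), ".local", "share")
        base = os.path.join(xdg, "claude-plugins")
    return Path(base) / "browser-data-capture"

print(plugin_data_dir({"HOME": "/home/alice"}))
# → /home/alice/.local/share/claude-plugins/browser-data-capture
```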
A Claude Code plugin to streamline programmatic data ingestion against sites and apps that don't ship a documented API. Capture network traffic, map endpoints, infer schemas, and produce a draft OpenAPI spec you can build a stable client against.
White-hat use only. This plugin is for building integrations against open data sources, public-interest data, your own systems, and sites where you have the right and intent to ingest. It does not bypass authentication, evade rate limits, or enumerate hidden surfaces. If a finding turns up incidentally that looks security-relevant, the disclose-finding skill helps you act as a good-faith reporter.
Three independent ways to feed the plugin — pick whichever fits your situation:
| Path | Tool | Strength | Use when |
|---|---|---|---|
| HAR | Browser DevTools | Zero install, works in any browser | One-off captures, sharing with a teammate, no proxy setup |
| mitmproxy | capture-via-proxy | Captures any client, not just browsers | Desktop apps, mobile apps on your LAN, browsers across navigations |
| claude-in-chrome | observe-tab | Lowest friction if MCP server is installed | You're already driving a Chrome tab in this Claude session |
All three normalize to the same internal endpoints.json shape so the downstream skills (generate-openapi, create-domain-map) work transparently regardless of capture path.
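As an illustration of that normalization, here is a minimal sketch of collapsing raw HAR entries into one record per (method, host, path). The exact field names of the real endpoints.json are assumptions here:

```python
import json
from urllib.parse import urlsplit

def har_to_endpoints(har):
    # Collapse raw HAR entries into one record per (method, host, path),
    # accumulating the set of observed response statuses.
    seen = {}
    for entry in har["log"]["entries"]:
        req, resp = entry["request"], entry["response"]
        url = urlsplit(req["url"])
        key = (req["method"], url.netloc, url.path)
        ep = seen.setdefault(key, {
            "method": req["method"],
            "host": url.netloc,
            "path": url.path,
            "statuses": set(),
            # Record only the *presence* of an auth header, never its value.
            "auth": any(h["name"].lower() == "authorization"
                        for h in req.get("headers", [])),
        })
        ep["statuses"].add(resp["status"])
    return [dict(ep, statuses=sorted(ep["statuses"])) for ep in seen.values()]

har = {"log": {"entries": [
    {"request": {"method": "GET",
                 "url": "https://example.com/api/items?page=1",
                 "headers": [{"name": "Authorization", "value": "Bearer REDACTED"}]},
     "response": {"status": 200}},
    {"request": {"method": "GET",
                 "url": "https://example.com/api/items?page=2",
                 "headers": []},
     "response": {"status": 200}},
]}}
inventory = har_to_endpoints(har)
print(json.dumps(inventory, indent=2))
```

Note how the two paginated requests collapse into a single `/api/items` endpoint — deduplicating by path (ignoring query strings) is what turns raw traffic into a usable inventory.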
- setup — first-run setup; provisions the data folder, registers your private maps repo (optional), records your disclosure contact, checks dependencies.
- analyze-har — analyze a HAR exported from DevTools.
- capture-via-proxy — run mitmdump in the background and capture traffic from any client on the machine (or a phone proxied to it).
- observe-tab — observe a live Chrome tab via the claude-in-chrome MCP server (optional dependency).
- generate-openapi — turn a captured inventory into a draft OpenAPI 3.1 spec.
- create-domain-map — produce or update a versioned, human-readable map of a single target domain across multiple capture runs.
- init-private-repo — provision a private GitHub repo for version-controlling your domain maps.
- disclose-finding — draft a courteous, vendor-friendly disclosure email if a capture incidentally surfaces something that looks security-relevant.

One-off integration scoping (zero install):
1. analyze-har <path>
2. generate-openapi

Sustained mapping of a target you care about:
1. setup — answer yes to the private repo prompt; init-private-repo runs and registers the path.
2. create-domain-map example.com — folds new captures into the persistent per-domain map in your private repo. Each map accumulates: endpoints, schemas, auth scheme, pagination, rate-limit signals, your hand-written ToS notes and integration notes, and a history log.

Capture from a desktop or mobile app:
1. capture-via-proxy --host api.example.com — starts mitmproxy in the background and walks you through CA trust + proxy pointing.
2. create-domain-map and/or generate-openapi as above.

Plugin data lives at:
${CLAUDE_USER_DATA:-${XDG_DATA_HOME:-$HOME/.local/share}/claude-plugins}/browser-data-capture/
Per-run captures go under data/<run-id>/. If you've registered a private maps repo via init-private-repo, persistent domain maps live there instead — outside the plugin install directory, version-controlled, and yours.
By default the plugin redacts cookie values and bearer tokens before writing anything to disk — only the presence of an auth header is recorded, never the value. This default is enforced; turn it off only by hand-editing config.json.
Raw mitmproxy flow files and HARs may still contain unredacted auth values. The init-private-repo .gitignore ensures they never reach git by accident.
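The redaction default described above might look something like this sketch — the header list and the `[REDACTED]` placeholder are illustrative, not the plugin's actual implementation:

```python
# Headers whose values are auth-bearing and must never reach disk.
SENSITIVE = {"authorization", "cookie", "set-cookie",
             "proxy-authorization", "x-api-key"}

def redact_headers(headers):
    # Keep a record that an auth header was present, but drop its value.
    out = []
    for h in headers:
        if h["name"].lower() in SENSITIVE:
            out.append({"name": h["name"], "value": "[REDACTED]", "present": True})
        else:
            out.append(dict(h))
    return out

redacted = redact_headers([
    {"name": "Authorization", "value": "Bearer secret-token"},
    {"name": "Accept", "value": "application/json"},
])
```

Redacting at write time (rather than at display time) is the safer design: nothing sensitive ever lands in a file that could be committed or shared.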
```shell
claude plugins marketplace add danielrosehill/Claude-Code-Plugins
claude plugins install browser-data-capture@danielrosehill
```
MIT — see LICENSE.
Share bugs, ideas, or general feedback.