From brightdata-plugin
Debugs Bright Data Scraping Browser sessions via Browser Sessions API for errors, puppeteer traces, bandwidth, captchas, connection issues, and unexpected results like empty data.
npx claudepluginhub brightdata/skills --plugin brightdata-plugin
This skill uses the workspace's default tool permissions.
Diagnose Bright Data Scraping Browser sessions using the Browser Sessions API. Fetches live session data and performs smart triage: error diagnosis, bandwidth analysis, captcha reporting, and pattern detection across recent sessions.
Diagnoses and fixes Bright Data errors like 407 Proxy Authentication, 502 Bad Gateway, SSL certificate failures, connection timeouts, inactive zones, and 429 rate limits using curl tests and config tweaks.
Captures full DevTools Protocol traces including CDP firehose, screenshots, and DOM dumps from browser automation sessions, bisecting into per-page searchable buckets for debugging and auditing.
Integrates Bright Data APIs for production web scraping, SERP results, structured extraction, and browser automation with best practices, CLI setup, and auth patterns.
Set your API key:
export BRIGHTDATA_API_KEY="your-api-key"
Get a key from Bright Data Dashboard → API Tokens.
No zone configuration needed — zone is returned as a field in session data.
Invoked as /brd-browser-debug with no arguments.
API reference: GET /browser_sessions
Start with a single call using limit=100 (the maximum) sorted by most recent:
GET https://api.brightdata.com/browser_sessions?limit=100&sort=timestamp&order=desc
Authorization: Bearer $BRIGHTDATA_API_KEY
Pagination: The response includes total, has_more, and next_offset. If has_more is true and the analysis requires more data (e.g. bandwidth outlier detection needs a larger sample), fetch the next page using offset=<next_offset>. Continue until you have enough data or has_more is false.
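The pagination rule above can be sketched as a small loop. The `total` / `has_more` / `next_offset` field names come from the API description above; the `fetch_page` helper and the `"sessions"` key holding the page's session list are assumptions for illustration, not confirmed API shapes:

```python
def collect_sessions(fetch_page, needed=200):
    """Page through GET /browser_sessions until enough data is gathered.

    fetch_page is a hypothetical helper wrapping the HTTP call; the page
    payload is assumed to hold its session list under a "sessions" key.
    """
    sessions, offset = [], 0
    while True:
        page = fetch_page(limit=100, offset=offset)  # limit=100 is the API maximum
        sessions.extend(page["sessions"])
        # Stop once we have enough data or the API reports no further pages.
        if len(sessions) >= needed or not page["has_more"]:
            return sessions
        offset = page["next_offset"]
```

For example, an analysis that needs a larger sample for outlier detection would call `collect_sessions(fetch_page, needed=300)` and keep paging until `has_more` is false.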
Available filters — apply when the user specifies a scope:
- status=failed|finished|running — narrow to a specific session state
- api_name=<zone> — filter to a specific Bright Data zone
- target_url=<domain> — filter by target domain (e.g. ksp.co.il)
- start_date / end_date — ISO 8601 datetime range
- sort=timestamp|duration|bandwidth with order=asc|desc

If the user asks about a specific zone, date range, or domain — apply the relevant filter rather than fetching all sessions and filtering client-side.
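Building the filtered request URL server-side is straightforward; a minimal sketch using the endpoint and filter parameters listed above (the helper name is illustrative):

```python
from urllib.parse import urlencode

BASE = "https://api.brightdata.com/browser_sessions"

def build_sessions_url(**filters):
    """Build a filtered /browser_sessions URL from the parameters above.

    Defaults mirror the recommended first call: limit=100 sorted by
    most recent; any keyword argument overrides or extends them.
    """
    params = {"limit": 100, "sort": "timestamp", "order": "desc", **filters}
    return f"{BASE}?{urlencode(params)}"
```

For example, `build_sessions_url(status="failed", target_url="ksp.co.il")` scopes the query to failed sessions against one domain instead of filtering client-side.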
Report the total from the response and the counts of finished / failed / running sessions. Group failures by error.code — when multiple sessions share the same error.code, call it a systemic issue:
"3 sessions failed with custom_headers — you are overriding a header Bright Data forbids. Remove page.setExtraHTTPHeaders() from your code."
Group sessions by target_url domain. For each domain with 3+ sessions, calculate the median bandwidth. Flag any session whose bandwidth exceeds 2× the median for that domain as an outlier, and note if it was a failed session that burned unusually high bandwidth before dying.

Invoked as /brd-browser-debug <session_id>.
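The per-domain bandwidth-outlier rule described above (2× the median, domains with 3+ sessions) can be sketched as follows. Field names target_url, bandwidth, and status come from the session payload described in this document; the list-of-dicts shape is an assumption:

```python
from collections import defaultdict
from statistics import median
from urllib.parse import urlparse

def bandwidth_outliers(sessions, min_sessions=3, factor=2.0):
    """Flag sessions whose bandwidth exceeds factor x the per-domain median.

    Only domains with at least min_sessions sessions are considered, so a
    single sample never flags itself. Returns (domain, session, was_failed)
    tuples; was_failed marks failed sessions that burned bandwidth before dying.
    """
    by_domain = defaultdict(list)
    for s in sessions:
        by_domain[urlparse(s["target_url"]).netloc].append(s)

    outliers = []
    for domain, group in by_domain.items():
        if len(group) < min_sessions:
            continue
        med = median(s["bandwidth"] for s in group)
        for s in group:
            if s["bandwidth"] > factor * med:
                outliers.append((domain, s, s.get("status") == "failed"))
    return outliers
```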
API reference: GET /browser_sessions/{session_id}
Call:
GET https://api.brightdata.com/browser_sessions/<session_id>
Authorization: Bearer $BRIGHTDATA_API_KEY
Returns 404 if the session ID is not found — tell the user and stop.
Present a deep-dive using the response fields:
Present a deep-dive using the response fields:
- Status (status): running / finished / failed
- Zone (api_name): the Bright Data zone that handled the session
- Timestamp (timestamp): ISO 8601 — show in a local-friendly format
- Duration (duration): seconds (nullable) — flag if < 2 s on failure (session barely started)
- Bandwidth (bandwidth): convert bytes → MB
- Navigations (navigations): flag if 0 (nothing was loaded)
- Captcha (captcha): one of solved / none / detected / failed — detected means a challenge appeared but was not solved; failed means solving was attempted but unsuccessful
- URL drift: target_url → end_url — note significant drift (different domain, login wall, error page)
- Error (error.code + error.message): reason about the cause using the signals in Diagnosing Failed Sessions below

Close with a one-line verdict.
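The flag rules above can be sketched as a small helper. Field names match the session payload described in this document; the dict shape and the exact flag wording are assumptions:

```python
def deep_dive_flags(session):
    """Derive the red flags listed above from one session payload,
    plus the bandwidth converted from bytes to MB."""
    flags = []
    if session.get("status") == "failed" and (session.get("duration") or 0) < 2:
        flags.append("failed in under 2 s - session barely started")
    if session.get("navigations") == 0:
        flags.append("0 navigations - nothing was loaded")
    if session.get("captcha") == "detected":
        flags.append("captcha detected - challenge appeared but was not solved")
    elif session.get("captcha") == "failed":
        flags.append("captcha failed - solving attempted but unsuccessful")
    mb = session.get("bandwidth", 0) / 1_000_000  # bytes -> MB
    return flags, round(mb, 2)
```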
When a Bright Data browser issue appears in the conversation — including puppeteer stack traces, error codes, mention of brd.superproxy.io, the user describing a session failure, OR a scraper producing empty/unexpected results (e.g. "Found 0 categories", "Got 0 products", fewer items than expected):
Do not rely on the error code alone. Cross-reference all available session signals to reason about what went wrong:
If captcha is detected but not solved, the session was stopped by an unsolved challenge — suggest enabling captcha solving on the zone.
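Cross-referencing signals rather than trusting error.code alone might look like this in outline form. The priority order and verdict wording are illustrative assumptions, not a prescribed algorithm:

```python
def triage_verdict(session):
    """Combine captcha, navigation, and error signals into one rough verdict.

    A sketch: checks the strongest blocking signal first, then falls back
    to the raw error code, then to URL-drift inspection.
    """
    if session.get("captcha") in ("detected", "failed"):
        return "blocked by captcha: suggest enabling captcha solving on the zone"
    if session.get("navigations") == 0:
        return "no pages loaded: likely a connection or auth failure before navigation"
    err = (session.get("error") or {}).get("code")
    if session.get("status") == "failed" and err:
        return f"failed with error code {err}: inspect error.message for the cause"
    return "no single blocking signal: compare target_url vs end_url for drift"
```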