From brightdata-pack
Scrape structured data with Bright Data Scraping Browser using Playwright/Puppeteer. Use when scraping JavaScript-rendered pages, SPAs, or sites requiring browser interaction. Trigger with phrases like "brightdata scraping browser", "brightdata playwright", "brightdata puppeteer", "scrape SPA with brightdata", "browser scraping".
```shell
npx claudepluginhub flight505/skill-forge --plugin brightdata-pack
```
Use Bright Data's Scraping Browser to scrape JavaScript-rendered pages. The Scraping Browser works like a regular Playwright/Puppeteer browser but routes through Bright Data's proxy infrastructure with built-in CAPTCHA solving, fingerprint management, and automatic retries.
Run the `brightdata-install-auth` setup first, then install Playwright:

```shell
npm install playwright
npx playwright install chromium
```
```typescript
// scraping-browser.ts
import { chromium } from 'playwright';
import 'dotenv/config';

const { BRIGHTDATA_CUSTOMER_ID, BRIGHTDATA_ZONE, BRIGHTDATA_ZONE_PASSWORD } = process.env;
const AUTH = `brd-customer-${BRIGHTDATA_CUSTOMER_ID}-zone-${BRIGHTDATA_ZONE}:${BRIGHTDATA_ZONE_PASSWORD}`;
const BROWSER_WS = `wss://${AUTH}@brd.superproxy.io:9222`;

async function scrapeWithBrowser(url: string) {
  console.log('Connecting to Scraping Browser...');
  const browser = await chromium.connectOverCDP(BROWSER_WS);
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'domcontentloaded', timeout: 60000 });

    // Wait for dynamic content to load
    await page.waitForSelector('body', { timeout: 30000 });

    // Extract structured data
    const data = await page.evaluate(() => ({
      title: document.title,
      metaDescription: document.querySelector('meta[name="description"]')?.getAttribute('content') || '',
      h1: document.querySelector('h1')?.textContent?.trim() || '',
      links: Array.from(document.querySelectorAll('a[href]')).slice(0, 20).map(a => ({
        text: a.textContent?.trim(),
        href: a.getAttribute('href'),
      })),
    }));

    console.log('Scraped data:', JSON.stringify(data, null, 2));
    return data;
  } finally {
    await browser.close();
  }
}

scrapeWithBrowser('https://example.com').catch(console.error);
```
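One failure mode worth guarding against: if any credential variable is unset, the template literal above interpolates the string `undefined` and the WebSocket handshake fails with a confusing auth error. A small fail-fast guard helps; `buildBrightDataAuth` is a hypothetical helper for illustration, not part of any Bright Data SDK:

```typescript
// build-auth.ts — hypothetical helper: fail fast on missing credentials
function buildBrightDataAuth(env: Record<string, string | undefined>): string {
  const required = ['BRIGHTDATA_CUSTOMER_ID', 'BRIGHTDATA_ZONE', 'BRIGHTDATA_ZONE_PASSWORD'];
  const missing = required.filter((key) => !env[key]);
  if (missing.length > 0) {
    // Throwing here is clearer than a failed WebSocket handshake later
    throw new Error(`Missing Bright Data credentials: ${missing.join(', ')}`);
  }
  return `brd-customer-${env.BRIGHTDATA_CUSTOMER_ID}-zone-${env.BRIGHTDATA_ZONE}:${env.BRIGHTDATA_ZONE_PASSWORD}`;
}
```

Call it with `process.env` before building the `wss://` URL so misconfiguration surfaces at startup rather than mid-scrape.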
```typescript
// scrape-products.ts — real-world example
import { chromium, Page } from 'playwright';
import 'dotenv/config';

interface Product {
  name: string;
  price: string;
  rating: string;
  url: string;
}

const AUTH = `brd-customer-${process.env.BRIGHTDATA_CUSTOMER_ID}-zone-${process.env.BRIGHTDATA_ZONE}:${process.env.BRIGHTDATA_ZONE_PASSWORD}`;

async function scrapeProducts(searchUrl: string): Promise<Product[]> {
  const browser = await chromium.connectOverCDP(`wss://${AUTH}@brd.superproxy.io:9222`);
  const page = await browser.newPage();
  try {
    await page.goto(searchUrl, { waitUntil: 'networkidle', timeout: 90000 });

    // Scroll to trigger lazy-loaded content
    await autoScroll(page);

    const products = await page.evaluate(() => {
      return Array.from(document.querySelectorAll('[data-testid="product-card"]')).map(card => ({
        name: card.querySelector('.product-title')?.textContent?.trim() || '',
        price: card.querySelector('.price')?.textContent?.trim() || '',
        rating: card.querySelector('.rating')?.textContent?.trim() || '',
        url: card.querySelector('a')?.getAttribute('href') || '',
      }));
    });
    return products;
  } finally {
    await browser.close();
  }
}

async function autoScroll(page: Page): Promise<void> {
  await page.evaluate(async () => {
    await new Promise<void>((resolve) => {
      let totalHeight = 0;
      const distance = 300;
      const timer = setInterval(() => {
        window.scrollBy(0, distance);
        totalHeight += distance;
        if (totalHeight >= document.body.scrollHeight) {
          clearInterval(timer);
          resolve();
        }
      }, 200);
    });
  });
}
```
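Each `connectOverCDP` call opens an independent session, so several search URLs can be scraped in parallel, but an unbounded `Promise.all` can open more simultaneous sessions than your zone allows. A small concurrency limiter keeps that bounded; `mapWithConcurrency` is a hypothetical helper sketch, and the limit of 3 is an arbitrary choice, not a Bright Data default:

```typescript
// map-with-concurrency.ts — run an async mapper over items, at most `limit` at a time
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  // Each worker pulls the next index until the queue is drained;
  // index handout is synchronous, so workers never collide.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, worker));
  return results;
}

// Usage sketch: mapWithConcurrency(searchUrls, 3, scrapeProducts)
```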
```typescript
// scraping-browser-puppeteer.ts
import puppeteer from 'puppeteer-core';
import 'dotenv/config';

const AUTH = `brd-customer-${process.env.BRIGHTDATA_CUSTOMER_ID}-zone-${process.env.BRIGHTDATA_ZONE}:${process.env.BRIGHTDATA_ZONE_PASSWORD}`;

async function scrapeWithPuppeteer(url: string) {
  const browser = await puppeteer.connect({
    browserWSEndpoint: `wss://${AUTH}@brd.superproxy.io:9222`,
  });
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'domcontentloaded', timeout: 60000 });
    const title = await page.title();
    console.log('Page title:', title);
  } finally {
    await browser.close();
  }
}
```
| Error | Cause | Solution |
|---|---|---|
| WebSocket connection failed | Wrong zone or credentials | Verify the Scraping Browser zone is active and the credentials match |
| Timeout 60000ms exceeded | Slow page load | Increase the timeout; use `domcontentloaded` instead of `networkidle` |
| Target closed | Browser disconnected | Implement retry logic; browser sessions are ephemeral |
| Navigation failed | Site blocked request | Scraping Browser handles this; increase the timeout |
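Because sessions are ephemeral, the practical response to `Target closed` is to reconnect and rerun the whole scrape rather than try to resume the session. A generic retry wrapper is one way to sketch that; the attempt count and backoff are arbitrary choices, not Bright Data defaults:

```typescript
// with-retry.ts — rerun an async operation, reconnecting from scratch each attempt
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  delayMs = 1000,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Linear backoff before the next fresh connection
        await new Promise((resolve) => setTimeout(resolve, delayMs * (i + 1)));
      }
    }
  }
  throw lastError;
}

// Usage sketch: withRetry(() => scrapeProducts(searchUrl))
```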
For SERP API scraping, see brightdata-core-workflow-b.