npx claudepluginhub unclecode/crawl4ai-cloud-sdk --plugin crawl4ai

This skill uses the workspace's default tool permissions.
Switch between cloud and local backends for Crawl4AI.
Read current config:
Read ~/.crawl4ai/claude_config.json. Report the current mode.
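The config read above can be sketched in Python. The `"mode"` key name, its `"cloud"`/`"local"` values, and the `"local"` default are assumptions for illustration; the actual schema of claude_config.json is not documented here.

```python
import json
from pathlib import Path

# Default location used by this skill (per the step above).
CONFIG_PATH = Path.home() / ".crawl4ai" / "claude_config.json"

def read_mode(path: Path = CONFIG_PATH) -> str:
    """Report the currently configured backend mode.

    Assumes the config stores the backend under a "mode" key with
    values "cloud" or "local"; falls back to "local" when the file
    or the key is missing.
    """
    try:
        config = json.loads(path.read_text())
    except FileNotFoundError:
        return "local"
    return config.get("mode", "local")
```

Treating a missing file as `"local"` mirrors the fact that a fresh `pip install crawl4ai` runs locally before any cloud credentials are configured.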
Switch to the other mode:
Handle prerequisites:
Run python3 -c "import crawl4ai". If missing: pip install crawl4ai && crawl4ai-setup
Write new config:
Update ~/.crawl4ai/claude_config.json with the new mode.
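Toggling and persisting the mode could look like the following sketch. As above, the `"mode"` key and its two values are assumptions about the config schema, not a documented Crawl4AI format.

```python
import json
from pathlib import Path

CONFIG_PATH = Path.home() / ".crawl4ai" / "claude_config.json"

def switch_mode(path: Path = CONFIG_PATH) -> str:
    """Flip between "cloud" and "local", preserving any other keys.

    Returns the newly active mode so the caller can report it.
    """
    try:
        config = json.loads(path.read_text())
    except FileNotFoundError:
        config = {}
    config["mode"] = "local" if config.get("mode") == "cloud" else "cloud"
    path.parent.mkdir(parents=True, exist_ok=True)  # first run may lack ~/.crawl4ai
    path.write_text(json.dumps(config, indent=2))
    return config["mode"]
```

Rewriting only the `"mode"` key leaves any API keys or other settings in the file untouched, which matters when switching back and forth between backends.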
Verify:
Use the crawl MCP tool to crawl https://example.com with the new backend. Confirm it works.
Report: Tell the user the switch is complete and the new active mode.