Web crawling, extraction, screenshots, and URL discovery powered by Crawl4AI
```bash
npx claudepluginhub unclecode/crawl4ai-cloud-sdk
```

Web crawling, extraction, screenshots, and URL discovery. Free local mode + cloud API. After installing, run `/crawl4ai:setup` to configure your API key or local mode.
Share bugs, ideas, or general feedback.
This is Uncle Code.
You know Crawl4AI. We built it together — you, me, and this incredible community. It became the de facto standard for self-hosting — the most-starred web crawler on GitHub. Battle-tested by developers worldwide. I am so proud of us.
We built it because access to data must be democratized. That principle hasn't changed.
Now I'm bringing it to the cloud.
Same principles. Same policies. But now:
The API you love, zero setup:
```python
import asyncio
from crawl4ai_cloud import AsyncWebCrawler

async def main():
    async with AsyncWebCrawler(api_key="sk_live_...") as crawler:
        result = await crawler.run("https://example.com")
        print(result.markdown.raw_markdown)

asyncio.run(main())
```
That's it. Your existing Crawl4AI code works with minimal changes. See the SDK Documentation for Python, Node.js, and Go.
Here's the truth: Crawl4AI open source became so solid because you complained loudly — and I love that.
We need to do the same thing here.
This is version one. It will have issues. It will have bugs. The whole server might crash during the first week. But we love it. We're going to continue building every day.
I am a developer, and I love to build.
| Action | Link |
|---|---|
| Report a Bug | Open an Issue |
| Request a Feature | Open an Issue |
| Ask Questions | Join Discord |
| Share Feedback | Join Discord |
Every bug report, every feature request, every piece of feedback — it all makes this better.
Sign up for early access, get your API key (sk_live_...), and start crawling.
```bash
# Python
pip install crawl4ai-cloud-sdk

# Node.js
npm install crawl4ai-cloud

# Go
go get github.com/unclecode/crawl4ai-cloud-sdk/go
```
```python
import asyncio
from crawl4ai_cloud import AsyncWebCrawler

async def main():
    async with AsyncWebCrawler(api_key="sk_live_...") as crawler:
        result = await crawler.run("https://example.com")
        print(result.markdown.raw_markdown)

asyncio.run(main())
```
Full SDK documentation: SDK.md
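The quickstart above hardcodes the `sk_live_...` key for brevity. A common pattern is to read it from an environment variable instead, so it never lands in source control. Here is a minimal sketch; the `CRAWL4AI_API_KEY` variable name and the `get_api_key` helper are assumptions for illustration, not something the SDK defines:

```python
import os

def get_api_key() -> str:
    # Read the key from the environment so it is never committed to source control.
    # CRAWL4AI_API_KEY is an assumed variable name, not one the SDK requires.
    key = os.environ.get("CRAWL4AI_API_KEY", "")
    if not key.startswith("sk_live_"):
        raise RuntimeError("Set CRAWL4AI_API_KEY to your sk_live_... key")
    return key

# Demo value only; in practice you would export the variable in your shell.
os.environ["CRAWL4AI_API_KEY"] = "sk_live_example"
print(get_api_key())
```

You would then pass the result straight into the constructor, e.g. `AsyncWebCrawler(api_key=get_api_key())`, keeping the rest of the quickstart unchanged.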
Crawling is just the beginning.
I've always had an eye for context and refined data — to make intelligence truly useful. Whether you're building an app, an agent, or something to help humanity, I hope Crawl4AI Cloud can be a part of it.