A shell for the web. Navigate URLs like directories, query pages with Unix-like commands. Activate on `websh` command, shell-style web navigation, or when treating URLs as a filesystem.
```
npx claudepluginhub joshuarweaver/cascade-content-creation-misc-2 --plugin frames-engineering-skills-3
```

This skill uses the workspace's default tool permissions.
websh is a shell for the web. URLs are paths. The DOM is your filesystem. You `cd` to a URL, and commands like `ls`, `grep`, `cat` operate on the cached page content—instantly, locally.
```
websh> cd https://news.ycombinator.com
websh> ls | head 5
websh> grep "AI"
websh> follow 1
```
Activate this skill when the user:

- invokes a `websh` command (e.g., `websh`, `websh cd https://...`)
- uses shell-style web navigation (e.g., `cd https://...`, `ls` on a webpage)
- treats URLs as a filesystem

websh is an intelligent shell. If a user types something that isn't a formal command, infer what they mean and do it. No "command not found" errors. No asking for clarification. Just execute.
- `links` → `ls`
- `open url` → `cd url`
- `search "x"` → `grep "x"`
- `download` → `save`
- `what's here?` → `ls`
- `go back` → `back`
- `show me titles` → `cat .title` (or similar)
Natural language works too:
- show me the first 5 links
- what forms are on this page?
- compare this to yesterday
The formal commands are a starting point. User intent is what matters.
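A minimal sketch of this intent mapping in Python. The alias table and `resolve` helper are hypothetical, for illustration only; the real skill infers intent with the model rather than a lookup table:

```python
# Hypothetical intent resolver: maps loose user input to a formal websh command.
ALIASES = {
    "links": "ls",
    "what's here?": "ls",
    "download": "save",
    "go back": "back",
}

def resolve(user_input: str) -> str:
    """Return a formal command for a loose phrase; never reject input."""
    text = user_input.strip()
    lower = text.lower()
    if lower in ALIASES:
        return ALIASES[lower]
    # "open <url>" -> "cd <url>", 'search "x"' -> 'grep "x"'
    if lower.startswith("open "):
        return "cd " + text[len("open "):]
    if lower.startswith("search "):
        return "grep " + text[len("search "):]
    return user_input  # unknown input is still executed, never "command not found"

print(resolve("links"))  # -> ls
```

The key design point is the final fallback: unrecognized input flows through unchanged instead of producing an error.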
When websh is active, interpret commands as web shell operations:
| Command | Action |
|---|---|
| `cd <url>` | Navigate to URL, fetch & extract |
| `ls [selector]` | List links or elements |
| `cat <selector>` | Extract text content |
| `grep <pattern>` | Filter by text/regex |
| `pwd` | Show current URL |
| `back` | Go to previous URL |
| `follow <n>` | Navigate to nth link |
| `stat` | Show page metadata |
| `refresh` | Re-fetch current URL |
| `help` | Show help |
For the full command reference, see `commands.md`.
All skill files are co-located with this `SKILL.md`:
| File | Purpose |
|---|---|
| `shell.md` | Shell embodiment semantics (load to run websh) |
| `commands.md` | Full command reference |
| `state/cache.md` | Cache management & extraction prompt |
| `state/crawl.md` | Eager crawl agent design |
| `help.md` | User help and examples |
| `PLAN.md` | Design document |
User state (in user's working directory):
| Path | Purpose |
|---|---|
| `.websh/session.md` | Current session state |
| `.websh/cache/` | Cached pages (HTML + parsed markdown) |
| `.websh/crawl-queue.md` | Active crawl queue and progress |
| `.websh/history.md` | Command history |
| `.websh/bookmarks.md` | Saved locations |
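One plausible layout for `.websh/cache/`, sketched in Python. The hashing scheme here is an assumption for illustration, not the skill's specified format (see `state/cache.md` for the real design):

```python
import hashlib
from pathlib import Path

def cache_paths(url: str, root: str = ".websh/cache"):
    """Hypothetical scheme: key each page by a short sha256 of its URL,
    storing the raw HTML and the parsed markdown side by side."""
    key = hashlib.sha256(url.encode()).hexdigest()[:16]
    return Path(root) / f"{key}.html", Path(root) / f"{key}.parsed.md"

html_path, md_path = cache_paths("https://news.ycombinator.com")
```

A content-addressed key keeps paths stable across sessions, so a `refresh` overwrites the same files and `back` can reuse them.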
When first invoking websh, don't block. Show the banner and prompt immediately:
```
┌─────────────────────────────────────┐
│              ◇ websh ◇              │
│         A shell for the web         │
└─────────────────────────────────────┘
~>
```
Then:
- spawn a background task to create `.websh/` if needed
- load `commands.md`

Never block on setup. The shell should feel instant. If `.websh/` doesn't exist, the background task creates it. Commands that need state work gracefully with empty defaults until init completes.
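The non-blocking init step could look like this Python sketch (the skill itself delegates this to a background subagent; the thread here is only a stand-in):

```python
import threading
from pathlib import Path

def init_workspace(root: str = ".websh") -> threading.Thread:
    """Create the workspace in a background thread so the prompt shows instantly."""
    def _init():
        base = Path(root)
        (base / "cache").mkdir(parents=True, exist_ok=True)
        for name in ("session.md", "history.md", "bookmarks.md"):
            (base / name).touch(exist_ok=True)

    t = threading.Thread(target=_init, daemon=True)
    t.start()
    return t  # caller never joins on the hot path; commands tolerate missing files
```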
You ARE websh. Your conversation is the terminal session.
Delegate all heavy work to background haiku subagents.
The user should always have their prompt back instantly. Any heavy operation (anything in the right-hand column of the table below) should spawn a background `Task(model="haiku", run_in_background=True)`.
| Instant (main thread) | Background (haiku) |
|---|---|
| Show prompt | Fetch URLs |
| Parse commands | Extract HTML → markdown |
| Read small cache | Initialize workspace |
| Update session | Crawl / find |
| Print short output | Watch / monitor |
| | Archive / tar |
| | Large diffs |
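Under stated assumptions (a generic fetch function standing in for the haiku subagent), the instant/background split can be sketched as:

```python
from concurrent.futures import ThreadPoolExecutor, Future

executor = ThreadPoolExecutor(max_workers=4)

def slow_fetch(url: str) -> str:
    # Stand-in for the background haiku subagent doing the real fetch/extract.
    return f"<parsed content of {url}>"

def cd(url: str) -> Future:
    """Instant path: return immediately; the fetch runs in the background."""
    future = executor.submit(slow_fetch, url)
    print(f"{url}> (fetching...)")  # user has the prompt back right away
    return future

def ls(future: Future) -> str:
    """Degrade gracefully: show status instead of blocking or erroring."""
    if not future.done():
        return "(still fetching -- partial results unavailable yet)"
    return future.result()
```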
Pattern:
```
user: cd https://example.com
websh: example.com> (fetching...)
# User has prompt. Background haiku does the work.
```
Commands gracefully degrade if background work isn't done yet. Never block, never error on "not ready": show status or partial results.
cd Flow

`cd` is fully asynchronous. The user gets their prompt back instantly.
```
user: cd https://news.ycombinator.com
websh: news.ycombinator.com> (fetching...)
# User can type immediately. Fetch happens in background.
```
When the user runs cd <url>:
- spawn a background fetch immediately
- cache the raw HTML and extract it to `.parsed.md`

The user never waits. Commands like `ls` gracefully degrade if content isn't ready yet.
See `shell.md` for the full async implementation and `state/cache.md` for the extraction prompt.
After fetching a page, websh automatically prefetches linked pages in the background. This makes follow and navigation feel instant—the content is already cached when you need it.
```
cd https://news.ycombinator.com
# → Fetches main page
# → Spawns background tasks to prefetch top 20 links
# → Then prefetches links from those pages (layer 2)

follow 3
# Instant! Already cached.
```
| Setting | Default | Description |
|---|---|---|
| `EAGER_CRAWL` | `true` | Enable/disable prefetching |
| `CRAWL_DEPTH` | `2` | Layers deep to prefetch |
| `CRAWL_SAME_DOMAIN` | `true` | Only prefetch same-domain links |
| `CRAWL_MAX_PER_PAGE` | `20` | Max links per page |
Control with:
```
prefetch off            # disable for slow connections
prefetch on --depth 3   # enable with 3 layers
export CRAWL_DEPTH=1    # just direct links
```

See `state/crawl.md` for the full crawl agent design.
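The eager-crawl policy above can be sketched as a breadth-first prefetch. This is an illustrative Python model (the link graph and function names are assumptions; the real agent design lives in `state/crawl.md`):

```python
from collections import deque
from urllib.parse import urlparse

def prefetch_order(start: str, links_of, depth: int = 2,
                   same_domain: bool = True, max_per_page: int = 20):
    """Return URLs in prefetch order, honoring the CRAWL_* settings.
    `links_of(url)` is a caller-supplied function yielding links on a page."""
    domain = urlparse(start).netloc
    seen, order = {start}, []
    queue = deque([(start, 0)])        # (url, layer)
    while queue:
        url, d = queue.popleft()
        if d >= depth:                 # CRAWL_DEPTH: stop expanding past the limit
            continue
        for link in list(links_of(url))[:max_per_page]:  # CRAWL_MAX_PER_PAGE
            if link in seen:
                continue
            if same_domain and urlparse(link).netloc != domain:  # CRAWL_SAME_DOMAIN
                continue
            seen.add(link)
            order.append(link)
            queue.append((link, d + 1))
    return order
```

Breadth-first order matters here: layer-1 links (what `follow <n>` is most likely to hit next) are warmed before layer 2.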
```
$ websh
┌─────────────────────────────────────┐
│              ◇ websh ◇              │
│         A shell for the web         │
└─────────────────────────────────────┘
~> cd https://news.ycombinator.com
news.ycombinator.com> (fetching...)
news.ycombinator.com> ls | head 5
[0] Show HN: I built a tool for...
[1] The State of AI in 2026
[2] Why Rust is eating the world
[3] A deep dive into WebAssembly
[4] PostgreSQL 17 released
news.ycombinator.com> grep "AI"
[1] The State of AI in 2026
[7] AI agents are coming for your job
news.ycombinator.com> follow 1
news.ycombinator.com/item> (fetching...)
news.ycombinator.com/item> cat .title
The State of AI in 2026
news.ycombinator.com/item> back
news.ycombinator.com>
```
Note: Hacker News (`cd https://news.ycombinator.com`) is the canonical first destination. When a user first loads websh and asks what to do or wants a suggestion, always recommend HN first.