Fetches and caches Claude Code documentation for claire agents with 24-hour freshness.
Install:
/plugin marketplace add poindexter12/waypoint
/plugin install claire@waypoint-marketplace
SYNTAX: /fetch:docs-claire [--force] [doc-name]
OPTIONAL:
--force
Type: flag
Default: false
Purpose: Force refresh even if cache is fresh
[doc-name]
Type: enum[skills, plugins, hooks-guide, sub-agents, slash-commands]
Default: (all docs)
Purpose: Fetch specific doc only
Execute steps sequentially. Each step must complete successfully before proceeding.
PARSE:
FORCE_REFRESH=false
TARGET_DOC="all"
for arg in "$@"; do
  if [ "$arg" = "--force" ]; then
    FORCE_REFRESH=true
  else
    TARGET_DOC="$arg"
  fi
done
VALIDATION:
NEXT:
EXECUTE:
CACHE_DIR="claire/docs-cache"
test -d "$CACHE_DIR"
DIR_EXISTS=$?
if [ $DIR_EXISTS -ne 0 ]; then
mkdir -p "$CACHE_DIR"
MKDIR_EXIT=$?
fi
VALIDATION:
DATA:
NEXT:
EXECUTE:
INDEX_FILE="claire/docs-index.json"
test -f "$INDEX_FILE"
INDEX_EXISTS=$?
if [ $INDEX_EXISTS -eq 0 ]; then
INDEX_JSON=$(cat "$INDEX_FILE" 2>/dev/null)
CAT_EXIT=$?
else
INDEX_JSON="{}"
CAT_EXIT=0
fi
VALIDATION:
PARSE INDEX:
{
"skills": {
"url": "https://code.claude.com/docs/en/skills",
"lastFetched": "2025-11-23T12:34:56Z"
},
"sub-agents": {
"url": "https://code.claude.com/docs/en/sub-agents",
"lastFetched": "2025-11-23T10:00:00Z"
},
...
}
DOC LIST:
NEXT:
ALGORITHM:
docs_to_fetch = []
status = {}
if TARGET_DOC == "all":
check_all_docs = ["skills", "plugins", "hooks-guide", "sub-agents", "slash-commands"]
else:
check_all_docs = [TARGET_DOC]
for doc_name in check_all_docs:
cache_file = f"{CACHE_DIR}/{doc_name}.md"
if FORCE_REFRESH:
docs_to_fetch.append(doc_name)
status[doc_name] = "force refresh"
elif not file_exists(cache_file):
docs_to_fetch.append(doc_name)
status[doc_name] = "not cached"
else:
file_age = current_time - file_mtime(cache_file)
if file_age > 86400: # 24 hours in seconds
docs_to_fetch.append(doc_name)
status[doc_name] = "cache expired"
else:
status[doc_name] = "cached, fresh"
CACHE LIFETIME: 24 hours (86400 seconds)
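The freshness check in the algorithm above can be sketched as a small shell helper (a sketch, not part of the command; assumes GNU `stat -c %Y`, so it needs adjusting on BSD/macOS):

```shell
#!/bin/sh
# Freshness helper: succeeds only when the cached doc exists and is < 24h old.
# Assumes GNU stat (-c %Y); cache path matches the spec above.
CACHE_DIR="claire/docs-cache"
MAX_AGE=86400  # 24 hours in seconds

is_fresh() {
  cache_file="$CACHE_DIR/$1.md"
  [ -f "$cache_file" ] || return 1                       # not cached
  mtime=$(stat -c %Y "$cache_file" 2>/dev/null) || return 1
  now=$(date +%s)
  [ $((now - mtime)) -le "$MAX_AGE" ]                    # fresh iff within 24h
}

if is_fresh skills; then
  echo "skills: cached, fresh"
else
  echo "skills: needs fetch"
fi
```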
OUTPUT:
NEXT:
FOR EACH doc_name in docs_to_fetch:
URL=$(jq -r ".\"$doc_name\".url" "$INDEX_FILE" 2>/dev/null)
if [ -z "$URL" ] || [ "$URL" = "null" ]; then
# Construct default URL
URL="https://code.claude.com/docs/en/${doc_name}"
fi
# Fetch using WebFetch
WebFetch(
url="$URL",
prompt="Return the full documentation content as markdown. Include all sections, examples, and code blocks."
)
FETCH_EXIT=$?
VALIDATION:
SAVE TO CACHE:
if [ $FETCH_EXIT -eq 0 ] && [ -n "$CONTENT" ]; then
echo "$CONTENT" > "$CACHE_DIR/${doc_name}.md"
WRITE_EXIT=$?
if [ $WRITE_EXIT -eq 0 ]; then
fetched_count=$((fetched_count + 1))
status[doc_name]="✓ fetched"
else
status[doc_name]="✗ write failed"
fi
else
status[doc_name]="✗ fetch failed"
fi
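A slightly more robust variant of the save step writes to a temporary file and renames it, so an interrupted write never leaves a truncated cache file (a sketch; `CONTENT` stands in for the WebFetch result):

```shell
#!/bin/sh
# Sketch: atomic cache write via temp file plus rename.
# CONTENT stands in for the fetched markdown from the step above.
CACHE_DIR="claire/docs-cache"
doc_name="skills"
CONTENT="# Skills (sample content)"

mkdir -p "$CACHE_DIR"
tmp="$CACHE_DIR/.${doc_name}.md.tmp"
printf '%s\n' "$CONTENT" > "$tmp" && mv "$tmp" "$CACHE_DIR/${doc_name}.md"
echo "wrote $CACHE_DIR/${doc_name}.md"
```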
NEXT:
EXECUTE:
METADATA_FILE="$CACHE_DIR/.metadata.json"
CURRENT_TIME=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
# Update metadata for each successfully fetched doc
[ -f "$METADATA_FILE" ] || echo '{}' > "$METADATA_FILE"
for doc_name in $SUCCESSFULLY_FETCHED; do
  jq ".\"$doc_name\".lastFetched = \"$CURRENT_TIME\"" "$METADATA_FILE" > tmp.json && mv tmp.json "$METADATA_FILE"
done
CREATE/UPDATE INDEX:
{
"doc-name": {
"url": "https://...",
"lastFetched": "2025-11-24T12:00:00Z",
"size": 12345
}
}
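A minimal shell sketch of that index update, assuming `jq` is installed; file paths and field names mirror the schema above, and the `size` field is taken from `wc -c`:

```shell
#!/bin/sh
# Sketch: upsert one entry in the cache metadata index with jq.
# Starts from "{}" when the file is missing, then sets lastFetched and size.
METADATA_FILE="claire/docs-cache/.metadata.json"
doc_name="skills"
doc_file="claire/docs-cache/${doc_name}.md"

mkdir -p "$(dirname "$METADATA_FILE")"
[ -f "$METADATA_FILE" ] || echo '{}' > "$METADATA_FILE"

now=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
if [ -f "$doc_file" ]; then size=$(wc -c < "$doc_file"); else size=0; fi

jq --arg d "$doc_name" --arg t "$now" --argjson s "$size" \
   '.[$d].lastFetched = $t | .[$d].size = $s' \
   "$METADATA_FILE" > "$METADATA_FILE.tmp" && mv "$METADATA_FILE.tmp" "$METADATA_FILE"
```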
NEXT:
OUTPUT FORMAT (exact):
{IF any fetched}
Fetching Claude Code documentation...
{FOR EACH doc in check_all_docs}
{STATUS_INDICATOR} {doc_name} ({status_message})
{END FOR}
{SUMMARY_LINE}
{ELSE}
All documentation is fresh (< 24h old)
Use --force to refresh anyway.
{END IF}
STATUS INDICATORS:
SUMMARY LINE:
Fetched {N} docs, {M} cached, {K} failed
Documentation ready for claire agents
SPECIAL CASES:
NEXT:
DETECTION:
RESPONSE (exact):
Error: Invalid documentation name '{DOC_NAME}'
Valid documentation names:
- skills
- plugins
- hooks-guide
- sub-agents
- slash-commands
Usage:
/fetch:docs-claire # Fetch all docs
/fetch:docs-claire skills # Fetch specific doc
/fetch:docs-claire --force # Force refresh all
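The doc-name check behind this error can be sketched with a `case` statement (`validate_doc` is an illustrative helper name; the valid set matches the list above):

```shell
#!/bin/sh
# Sketch: reject doc names outside the documented enum.
validate_doc() {
  case "$1" in
    skills|plugins|hooks-guide|sub-agents|slash-commands) return 0 ;;
    *) echo "Error: Invalid documentation name '$1'" >&2; return 1 ;;
  esac
}

validate_doc skills && echo "accepted: skills"
validate_doc bogus 2>/dev/null || echo "rejected: bogus"
```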
TEMPLATE SUBSTITUTIONS:
CONTROL FLOW:
DETECTION:
RESPONSE (exact):
Error: Failed to create cache directory
Directory: {CACHE_DIR}
Error: {MKDIR_ERROR}
Check:
- Write permissions on claire/ directory
- Disk space available
- Path is valid
TEMPLATE SUBSTITUTIONS:
CONTROL FLOW:
DETECTION:
RESPONSE (exact):
Warning: Could not read docs-index.json
Proceeding with default documentation URLs.
If this persists, check file permissions on:
claire/docs-index.json
CONTROL FLOW:
DETECTION:
RESPONSE:
✗ {DOC_NAME} (fetch failed: {ERROR_MESSAGE})
TEMPLATE SUBSTITUTIONS:
CONTROL FLOW:
DETECTION:
RESPONSE:
✗ {DOC_NAME} (empty response from server)
CONTROL FLOW:
| Tool | Pattern | Permission | Pre-Check | Post-Check | On-Deny-Action |
|---|---|---|---|---|---|
| Read | claire/docs-index.json | ALLOW | N/A | N/A | N/A |
| Read | claire/docs-cache/*.md | ALLOW | N/A | N/A | N/A |
| Write | claire/docs-cache/*.md | ALLOW | dir_exists | file_created | N/A |
| Write | claire/docs-cache/.metadata.json | ALLOW | dir_exists | valid_json | N/A |
| WebFetch | code.claude.com/* | ALLOW | url_valid | content_received | N/A |
| Bash | mkdir claire/docs-cache | ALLOW | N/A | dir_created | N/A |
| Bash | test:* | ALLOW | N/A | N/A | N/A |
| Bash | cat:* | ALLOW | N/A | N/A | N/A |
| Bash | date:* | ALLOW | N/A | N/A | N/A |
| Bash | jq:* | ALLOW | N/A | N/A | N/A |
| Bash | mv:* | ALLOW | N/A | N/A | N/A |
| Bash | rm claire/docs-cache/* | DENY | N/A | N/A | ABORT "Use --force to refresh" |
| Bash | sudo:* | DENY | N/A | N/A | ABORT "Elevated privileges" |
| Write | */.env | DENY | N/A | N/A | ABORT "Secrets file" |
SECURITY CONSTRAINTS:
PRECONDITIONS:
INPUT:
/fetch:docs-claire
EXPECTED EXECUTION FLOW:
EXPECTED OUTPUT:
All documentation is fresh (< 24h old)
Use --force to refresh anyway.
VALIDATION:
# Verify no files were modified
find claire/docs-cache -name "*.md" -mmin -1 | wc -l # Should be 0
PRECONDITIONS:
INPUT:
/fetch:docs-claire
EXPECTED EXECUTION FLOW:
1-4. Standard flow, identify expired/missing docs
5. STEP 5 → Fetch expired and missing docs via WebFetch
6. STEP 6 → Update metadata with new timestamps
7. STEP 7 → Output summary with fetch results
EXPECTED OUTPUT:
Fetching Claude Code documentation...
✓ skills (cached, fresh)
→ Fetching plugins (cache expired)
→ Fetching sub-agents (not cached)
✓ hooks-guide (cached, fresh)
✓ slash-commands (cached, fresh)
Fetched 2 docs, 3 cached
Documentation ready for claire agents
VALIDATION:
# Verify docs were updated
test -f claire/docs-cache/plugins.md && echo "PASS" || echo "FAIL"
test -f claire/docs-cache/sub-agents.md && echo "PASS" || echo "FAIL"
find claire/docs-cache -name "plugins.md" -mmin -1 | grep -q plugins && echo "PASS" || echo "FAIL"
PRECONDITIONS:
INPUT:
/fetch:docs-claire --force
EXPECTED EXECUTION FLOW:
EXPECTED OUTPUT:
Fetching Claude Code documentation...
→ Fetching skills (force refresh)
→ Fetching plugins (force refresh)
→ Fetching hooks-guide (force refresh)
→ Fetching sub-agents (force refresh)
→ Fetching slash-commands (force refresh)
Fetched 5 docs, 0 cached
Documentation ready for claire agents
VALIDATION:
# Verify all files were updated recently
find claire/docs-cache -name "*.md" -mmin -1 | wc -l # Should be 5
PRECONDITIONS:
INPUT:
/fetch:docs-claire skills
EXPECTED EXECUTION FLOW:
EXPECTED OUTPUT:
Fetching Claude Code documentation...
→ Fetching skills (cache expired)
Fetched skills documentation
Documentation ready for claire agents
VALIDATION:
test -f claire/docs-cache/skills.md && echo "PASS" || echo "FAIL"
PRECONDITIONS:
INPUT:
/fetch:docs-claire --force
EXPECTED EXECUTION FLOW:
1-4. Standard flow
5. STEP 5 → Fetch each doc, one fails with network error
6. ERROR PATTERN "fetch-failed" for failed doc (non-fatal)
7. Continue with remaining docs
8. STEP 7 → Output summary showing failure
EXPECTED OUTPUT:
Fetching Claude Code documentation...
✓ skills (force refresh)
✗ plugins (fetch failed: network timeout)
✓ hooks-guide (force refresh)
✓ sub-agents (force refresh)
✓ slash-commands (force refresh)
Fetched 4 docs, 0 cached, 1 failed
Some documentation may be outdated. Using cached versions where available.
PRECONDITIONS:
INPUT:
/fetch:docs-claire
EXPECTED EXECUTION FLOW:
EXPECTED OUTPUT:
Created cache directory: claire/docs-cache
Fetching Claude Code documentation...
→ Fetching skills (not cached)
→ Fetching plugins (not cached)
...
Fetched 5 docs, 0 cached
Documentation ready for claire agents
VALIDATION:
test -d claire/docs-cache && echo "PASS" || echo "FAIL"
test -f claire/docs-cache/.metadata.json && echo "PASS" || echo "FAIL"
claire/
├── docs-index.json # URL mappings and config
└── docs-cache/
├── skills.md
├── plugins.md
├── hooks-guide.md
├── sub-agents.md
├── slash-commands.md
└── .metadata.json # Fetch timestamps and metadata
{
"skills": {
"url": "https://code.claude.com/docs/en/skills",
"lastFetched": "2025-11-24T12:00:00Z",
"size": 12345,
"status": "success"
},
"sub-agents": {
"url": "https://code.claude.com/docs/en/sub-agents",
"lastFetched": "2025-11-24T12:00:05Z",
"size": 23456,
"status": "success"
}
}
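A staleness query over a metadata file shaped like the example above can be sketched as follows (the sample data is illustrative, and the timestamp parsing assumes GNU `date -d`):

```shell
#!/bin/sh
# Sketch: list entries older than 24h in a metadata file shaped like the
# example above. Sample data is illustrative; assumes GNU `date -d`.
METADATA_FILE=$(mktemp)
cat > "$METADATA_FILE" <<'EOF'
{
  "skills":     { "lastFetched": "2000-01-01T00:00:00Z" },
  "sub-agents": { "lastFetched": "2099-01-01T00:00:00Z" }
}
EOF

now=$(date -u +%s)
stale=$(jq -r 'to_entries[] | "\(.key) \(.value.lastFetched)"' "$METADATA_FILE" |
  while read -r name ts; do
    fetched=$(date -u -d "$ts" +%s 2>/dev/null) || continue
    if [ $((now - fetched)) -gt 86400 ]; then echo "$name"; fi
  done)
echo "stale: $stale"
rm -f "$METADATA_FILE"
```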
Agent delegation for claire agents (author-agent, author-command): Not applicable. This is a simple fetch-and-cache operation with no complex decision-making requiring delegation.