Manage GrepAI workspaces for cross-project semantic search — create, add/remove projects, delete, list, show details, and check indexing status of a shared vector store.
Handle all workspace operations based on the operation parameter passed from the invoking command.
Workspaces enable cross-project semantic search with a shared vector store (PostgreSQL or Qdrant). Config lives in ~/.grepai/workspace.yaml.
Create a new workspace with backend selection.
Extract workspace name from user arguments. If missing, ask:
What should this workspace be called?
Suggest a name based on the parent directory (e.g., ~/projects/ → projects).
Workspaces require a shared vector store. Ask via AskUserQuestion:
Which shared backend for this workspace?
○ Qdrant — lightweight, purpose-built vector DB (Recommended)
○ PostgreSQL + pgvector — battle-tested, SQL-based
Ask via AskUserQuestion:
Which embedding provider?
○ Ollama — local, private, free, works offline (Recommended)
○ OpenAI — cloud, high quality, costs per index
Ask via AskUserQuestion based on provider.
For Ollama:
Which embedding model?
○ nomic-embed-text — 768 dims, 274MB, fast general use (Recommended)
○ mxbai-embed-large — 1024 dims, 670MB, highest accuracy
○ bge-m3 — 1024 dims, 1.2GB, multilingual
○ nomic-embed-text-v2-moe — 768 dims, 500MB, multilingual MoE
For OpenAI:
Which embedding model?
○ text-embedding-3-small — 1536 dims, $0.00002/1K tokens (Recommended)
○ text-embedding-3-large — 3072 dims, $0.00013/1K tokens
Check Docker for the selected backend by image or port, not container name:
PostgreSQL:
docker ps --filter ancestor=postgres --format "{{.Names}}\t{{.Status}}\t{{.Ports}}"
Or check the port directly (PostgreSQL does not speak HTTP, so use a TCP check rather than curl):
nc -z localhost 5432 && echo "Port 5432 open" || echo "Port 5432 not reachable"
Qdrant:
Check Qdrant REST API (port 6333, not 6334 which is gRPC):
curl -s --max-time 5 http://localhost:6333/collections
Ollama:
curl -s --max-time 5 http://localhost:11434/api/tags
If not running, warn and suggest starting:
Backend not running. Start with:
docker compose up -d
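For reference, a minimal compose file covering both backends might look like the following — the service names are arbitrary, and the image tags and credentials are assumptions (the credentials match the postgres://grepai:grepai@localhost:5432/grepai connection string used elsewhere in this document; adjust to your environment):

```yaml
# Minimal sketch — service names, image tags, and credentials are assumptions.
services:
  qdrant:
    image: qdrant/qdrant
    ports:
      - "6333:6333"   # REST API
      - "6334:6334"   # gRPC
  postgres:
    image: pgvector/pgvector:pg16   # PostgreSQL with the pgvector extension
    environment:
      POSTGRES_USER: grepai
      POSTGRES_PASSWORD: grepai     # matches postgres://grepai:grepai@localhost:5432/grepai
      POSTGRES_DB: grepai
    ports:
      - "5432:5432"
```

Only the service for the selected backend needs to be running; the other can be omitted.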
grepai workspace create is interactive — it prompts for backend, provider, and model sequentially. Use piped input for non-interactive creation:
Qdrant + Ollama (most common):
The prompt sequence is: 2 (Qdrant), URL (default http://localhost), port (default 6334), 1 (Ollama), model (default nomic-embed-text).
printf '2\n\n\n\n1\n\n{MODEL}\n' | grepai workspace create {NAME}
PostgreSQL + Ollama:
The prompt sequence is: 1 (PostgreSQL), connection string, 1 (Ollama), model.
printf '1\npostgres://grepai:grepai@localhost:5432/grepai\n1\n\n{MODEL}\n' | grepai workspace create {NAME}
Qdrant + OpenAI:
printf '2\n\n\n\n2\n{MODEL}\n' | grepai workspace create {NAME}
PostgreSQL + OpenAI:
printf '1\npostgres://grepai:grepai@localhost:5432/grepai\n2\n{MODEL}\n' | grepai workspace create {NAME}
If piped input fails or prompts change, fall back to reading/writing ~/.grepai/workspace.yaml directly.
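For that fallback, the file layout is roughly as follows — this is an illustrative sketch only, not the authoritative schema; every field name here is an assumption, so confirm the structure against a file that grepai workspace create actually generated before hand-editing:

```yaml
# Hypothetical sketch of ~/.grepai/workspace.yaml — verify field names against
# a file generated by `grepai workspace create` before hand-editing.
workspaces:
  myworkspace:
    store:
      provider: qdrant
      url: http://localhost:6334
    embedder:
      provider: ollama
      model: nomic-embed-text
    projects:
      - /home/user/projects/api-server
```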
Ask user which directories to add as projects via AskUserQuestion. Offer contextual options:
Which directories should be added to workspace {NAME}?
○ Current directory ({cwd}) (Recommended)
○ Subdirectories of {parent} — add each subfolder as a separate project
○ Custom paths — I'll specify directories
If current directory: use its absolute path:
grepai workspace add {NAME} {ABSOLUTE_CWD_PATH}
If subdirectories: list subdirectories, let user confirm which ones, then add each:
grepai workspace add {NAME} {ABSOLUTE_PATH_1}
grepai workspace add {NAME} {ABSOLUTE_PATH_2}
Important: grepai workspace add takes an absolute path and derives the project name from filepath.Base(path) (the directory basename).
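The subdirectory flow above can be sketched as a shell loop. The temp directory and echo below stand in for the real parent directory and the real grepai call — swap echo for grepai workspace add once the user has confirmed the paths:

```shell
# Demo in a throwaway directory; in practice, loop over the real parent dir.
parent=$(mktemp -d)
mkdir -p "$parent/api" "$parent/web"
for dir in "$parent"/*/; do
  # grepai derives the project name from the directory basename
  echo "would add project: $(basename "$dir")"
done
```

The glob expands in sorted order, so this prints the api project before the web project.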
============================================================================
Workspace Created: {NAME}
============================================================================
Backend: {BACKEND}
Embedder: {PROVIDER} / {MODEL}
Projects: {COUNT}
Add projects: grepai workspace add {NAME} /absolute/path/to/project
List projects: /grepai:workspace:show {NAME}
Start watcher: grepai watch --workspace {NAME} --background
Check status: /grepai:workspace:status {NAME}
============================================================================
Add a project to an existing workspace.
Extract workspace name and project path. If workspace name missing, list available and ask:
grepai workspace list
Which workspace?
If project path missing, default to current directory.
Use absolute path:
grepai workspace add {WORKSPACE} {ABSOLUTE_PATH}
Note: grepai derives the project name from filepath.Base(path) (the directory basename).
Added {BASENAME} to workspace {WORKSPACE} (path: {ABSOLUTE_PATH})
Remove a project from a workspace.
Extract workspace name and project name from arguments.
Important: grepai workspace remove takes the project name (directory basename), not the path. If the user provides a path, extract the basename.
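Extracting the name from a user-supplied path is a one-liner; the path below is a made-up example:

```shell
arg="/home/user/projects/api-server/"   # hypothetical user-supplied path
name=$(basename "$arg")                 # basename handles the trailing slash
echo "$name"
```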
Show current projects first so the user can identify the correct name:
grepai workspace show {WORKSPACE}
grepai workspace remove {WORKSPACE} {PROJECT_NAME}
Removed {PROJECT_NAME} from workspace {WORKSPACE}
Delete an entire workspace.
Extract workspace name.
Ask via AskUserQuestion:
Confirm deletion of workspace {NAME}? This removes config but not indexed data.
○ Yes, delete workspace
○ No, cancel
If cancel, stop.
grepai workspace delete {NAME}
Workspace {NAME} deleted
List all configured workspaces.
grepai workspace list
Display each workspace with backend type and project count:
============================================================================
GrepAI Workspaces
============================================================================
{NAME} backend: {TYPE} projects: {COUNT}
{NAME} backend: {TYPE} projects: {COUNT}
Manage:
/grepai:workspace:show {NAME} View details
/grepai:workspace:create {NAME} Create new
============================================================================
Show workspace details and projects.
Extract workspace name.
grepai workspace show {NAME}
Optionally also read ~/.grepai/workspace.yaml for additional detail if CLI output is sparse.
============================================================================
Workspace: {NAME}
============================================================================
Backend: {TYPE}
Embedder: {PROVIDER} / {MODEL}
Projects:
{PROJECT_1} {PATH_1}
{PROJECT_2} {PATH_2}
Commands:
/grepai:workspace:add {NAME} /path Add project
/grepai:workspace:remove {NAME} proj Remove project
/grepai:workspace:status {NAME} Check index health
============================================================================
Show workspace indexing status.
Extract optional workspace name. If omitted, check all workspaces.
grepai workspace status {NAME}
Or for all:
grepai workspace status
grepai watch --workspace {NAME} --status
============================================================================
Workspace Status: {NAME}
============================================================================
Backend: {TYPE} — {STATUS}
Projects:
{S} {PROJECT_1} {FILES} files, {CHUNKS} chunks last: {TIMESTAMP}
{S} {PROJECT_2} {FILES} files, {CHUNKS} chunks last: {TIMESTAMP}
Watcher: {RUNNING|STOPPED}
============================================================================
Where {S} is one of: OK for indexed, FAIL for failed, STALE for stale/partial.