Set up project with dev standards — rules, skills, and hooks
From `dev-standards` (install: `npx claudepluginhub standardbeagle/standardbeagle-tools --plugin dev-standards`)

Set up development standards for this project by detecting the tech stack, interviewing the user, and generating contextual rules and skills into `.claude/rules/` and `.claude/skills/`.
All template files are located at ${CLAUDE_PLUGIN_ROOT}/assets/templates/.
Scan the project automatically. Do NOT ask the user anything yet.
Use Glob to check for the presence of these files:
- `package.json` -- Node.js / JavaScript / TypeScript
- `tsconfig.json` or `**/*.ts` -- TypeScript confirmation
- `*.csproj` or `*.sln` -- C# / .NET
- `go.mod` -- Go
- `Cargo.toml` -- Rust
- `pyproject.toml`, `setup.py`, or `requirements.txt` -- Python
- `Makefile` -- build system

Check for framework-specific config files:

- `next.config.*` -- Next.js
- `angular.json` -- Angular
- `vite.config.*` -- Vite
- `nuxt.config.*` -- Nuxt
- `svelte.config.*` -- SvelteKit
- `astro.config.*` -- Astro
- `remix.config.*` -- Remix
- `gatsby-config.*` -- Gatsby
- `Program.cs` with `WebApplication` -- ASP.NET
- `serverless.yml` or `serverless.ts` -- Serverless Framework
- `sam-template.yaml` or `template.yaml` -- AWS SAM
- `cdk.json` -- AWS CDK
- `terraform` directory -- Terraform
- `Dockerfile` -- containerized
- `docker-compose*.yml` -- Docker Compose

Check for test configuration files:

- `jest.config.*` or a `"jest"` key in `package.json` -- Jest
- `vitest.config.*` -- Vitest
- `pytest.ini`, or `pyproject.toml` with `[tool.pytest]` -- pytest
- xunit or nunit references in `*.csproj` -- .NET test frameworks
- `*_test.go` files -- Go testing
- `playwright.config.*` -- Playwright
- `cypress.config.*` -- Cypress

Check for linter and formatter configuration:

- `.eslintrc*` or `eslint.config.*` -- ESLint
- `.prettierrc*` or `prettier.config.*` -- Prettier
- `biome.json` -- Biome
- `.editorconfig` -- EditorConfig
- `ruff.toml` or `[tool.ruff]` in `pyproject.toml` -- Ruff
- `.golangci.yml` -- golangci-lint
- `rustfmt.toml` -- rustfmt

Check for existing Claude configuration:

- `.claude/` directory -- existing Claude configuration
- `.claude/rules/` -- existing rules
- `.claude/skills/` -- existing skills
- `CLAUDE.md` -- existing project instructions

Check for persistence and data-layer files:

- `prisma/` or `schema.prisma` -- Prisma ORM
- `**/migrations/` directory -- database migrations
- `docker-compose*.yml` with database service names (postgres, mysql, redis, mongo)
- `*.entity.ts`, `*.model.ts`, `**/models/`, `**/entities/` -- ORM entities
- `knexfile.*` -- Knex
- `drizzle.config.*` -- Drizzle
- `sequelize` in `package.json` -- Sequelize
- `alembic/` or `alembic.ini` -- Alembic (Python)
- Entity Framework references in `*.csproj` -- EF Core

Collect all findings into a detection summary, then proceed to Phase 2.
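The detection step can also be sketched as a small shell function. This is an illustrative approximation only (the function name and the reduced marker list are assumptions); the actual check should use Glob as described above:

```shell
# Sketch only: approximate the stack detection with plain file checks.
# The real implementation uses Glob; this covers a subset of the markers.
detect_stack() {
  dir=${1:-.}
  found=""
  [ -f "$dir/package.json" ] && found="$found nodejs"
  [ -f "$dir/tsconfig.json" ] && found="$found typescript"
  [ -f "$dir/go.mod" ] && found="$found go"
  [ -f "$dir/Cargo.toml" ] && found="$found rust"
  { [ -f "$dir/pyproject.toml" ] || [ -f "$dir/requirements.txt" ]; } && found="$found python"
  echo "Detected:$found"
}
```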
Present the detection summary to the user in a clear format:
```
Detected:
  Languages: [list]
  Frameworks: [list]
  Test runners: [list]
  Linters: [list]
  Persistence: [list]
  Existing config: [list or "none"]
```
Then ask ONE question at a time. Wait for each answer before asking the next.
Ask the user to confirm the detected languages and frameworks. Ask if anything is missing or incorrect.
Ask the user to pick the project type: web app, SaaS, serverless, CLI, library, desktop, or script.
Ask the user to pick the architecture style (for example, DDD).
Ask the user to pick the deployment target (for example, containers, Kubernetes, or serverless).
Ask the user about external dependencies. Ask ONE question at a time.
Ask what databases or storage systems the project uses (if any). Examples: PostgreSQL, MySQL, MongoDB, Redis, S3, SQLite. Skip if no persistence was detected and user confirms none.
Ask what external APIs or third-party services the project integrates with (if any). Examples: Stripe, Auth0, SendGrid, OpenAI, AWS services.
If external APIs were listed, ask which ones need replay proxies for testing. Explain briefly: replay proxies record real API responses and replay them in CI so tests are fast and deterministic.
Ask the user for project context. Ask ONE question at a time.
Ask for a brief (1-3 sentence) description of what the project does.
Ask for key domain concepts or terms that are important for understanding the codebase. Examples: "tenant", "workspace", "pipeline", "widget". Skip if the user says none.
Ask if there are any team conventions or patterns that should be documented. Examples: "we use barrel exports", "all API responses use a wrapper type", "we prefix interfaces with I". Skip if the user says none.
Generate all output files. Read each template from ${CLAUDE_PLUGIN_ROOT}/assets/templates/ before copying.
Every generated file MUST start with this comment on the first line:
<!-- Generated by dev-standards plugin. Customize as needed. -->
If a template already includes this comment, do not duplicate it.
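The header rule above can be sketched as a small copy helper; `copy_with_header` is a hypothetical name for illustration, not part of the plugin:

```shell
# Sketch only: prepend the generated-by header unless the template already has it.
HEADER='<!-- Generated by dev-standards plugin. Customize as needed. -->'
copy_with_header() {
  src=$1; dest=$2
  if [ "$(head -n 1 "$src")" = "$HEADER" ]; then
    cp "$src" "$dest"                                   # header present: copy as-is
  else
    { printf '%s\n' "$HEADER"; cat "$src"; } > "$dest"  # header missing: prepend it
  fi
}
```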
Create .claude/rules/ and .claude/skills/ directories if they do not exist.
```shell
mkdir -p .claude/rules .claude/skills
```
Read and copy these templates to .claude/rules/:
- `${CLAUDE_PLUGIN_ROOT}/assets/templates/rules/version-control.md` -> `.claude/rules/version-control.md`
- `${CLAUDE_PLUGIN_ROOT}/assets/templates/rules/code-quality.md` -> `.claude/rules/code-quality.md`

Copy them as-is; they have no placeholders that need filling.
Read ${CLAUDE_PLUGIN_ROOT}/assets/templates/rules/architecture.md. Replace these placeholders with values from the interview:
- `{{project_type_description}}` -- the project type and architecture style. Example: "Web application using DDD architecture, deployed to containers."
- `{{active_decisions}}` -- "No active architecture decisions documented yet. Add decisions here as they are made." (The user will fill this in over time.)
- `{{active_migrations}}` -- "No active migrations. Document ongoing migrations here." (The user will fill this in over time.)
- `{{project_constraints}}` -- the deployment target and any team conventions mentioned. Example: "Deployed to Kubernetes. All API responses use a standard wrapper type."

Write the result to `.claude/rules/architecture.md`.
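One way to perform this substitution is with `sed`. The values below are the examples from this section, and `fill_architecture` is a hypothetical helper name:

```shell
# Sketch only: fill the architecture.md placeholders with the example values.
# Real values come from the interview; "|" is used as the sed delimiter so
# replacement text may contain "/".
fill_architecture() {
  sed -e 's|{{project_type_description}}|Web application using DDD architecture, deployed to containers.|' \
      -e 's|{{active_decisions}}|No active architecture decisions documented yet. Add decisions here as they are made.|' \
      -e 's|{{active_migrations}}|No active migrations. Document ongoing migrations here.|' \
      -e 's|{{project_constraints}}|Deployed to Kubernetes.|' \
      "$1"
}
```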
For each detected language, read the corresponding template and copy it to .claude/rules/:
| Language | Template file | Output file |
|---|---|---|
| TypeScript | typescript.md | .claude/rules/typescript.md |
| C# | csharp.md | .claude/rules/csharp.md |
| Python | python.md | .claude/rules/python.md |
| Go | go.md | .claude/rules/go.md |
| Rust | rust.md | .claude/rules/rust.md |
Copy each as-is. The templates already include correct glob paths in their frontmatter.
If any test runner was detected, read ${CLAUDE_PLUGIN_ROOT}/assets/templates/rules/testing.md and replace these placeholders:
- `{{test_runner}}` -- the detected or confirmed test runner (e.g., "vitest", "jest", "pytest", "go test")
- `{{e2e_framework}}` -- the detected e2e framework (e.g., "playwright", "cypress"); if none detected, use "not configured"
- `{{replay_proxy}}` -- the replay proxy tool if specified by the user, otherwise "not configured"
- `{{external_services}}` -- the external services that need replay proxies, or "none configured"

Write the result to `.claude/rules/testing.md`.
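A simple safety net after substitution is to scan the generated files for any placeholder that survived; `check_placeholders` is a hypothetical helper for illustration, not part of the plugin:

```shell
# Sketch only: fail if any file under the given directory still contains "{{",
# i.e. a placeholder that was never replaced.
check_placeholders() {
  if grep -rn '{{' "$1"; then
    echo "Unreplaced placeholders found" >&2
    return 1
  fi
  return 0
}
```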
If persistence was detected (databases, ORMs, migrations), read and copy ${CLAUDE_PLUGIN_ROOT}/assets/templates/rules/data-integrity.md to .claude/rules/data-integrity.md.
Update the paths frontmatter to match the project's actual directory structure for database-related code. For example, if the project uses src/db/ instead of db/, adjust accordingly.
Based on the project type, read and copy skill templates to .claude/skills/. Replace all {{placeholders}} in each skill with values from the interview.
Common placeholders across skills:
- `{{framework}}` -- the primary framework (e.g., "Next.js", "ASP.NET", "FastAPI")
- `{{test_runner}}` -- the test runner command
- `{{e2e_framework}}` -- the e2e framework
- `{{api_style}}` -- "REST" or "GraphQL" (infer from detection or ask)
- `{{validation_library}}` -- the detected validation library, or "project's validation approach"
- `{{auth_pattern}}` -- "middleware auth", "decorator auth", or similar (infer from detection)
- `{{orm}}` -- the detected ORM, or "repository pattern"

Project type to skill mapping:
| Project Type | Skills to Copy |
|---|---|
| web app | webapp/add-endpoint, webapp/add-page, webapp/add-data-model |
| SaaS | webapp/add-endpoint, webapp/add-page, webapp/add-data-model |
| serverless | serverless/add-function |
| CLI | cli/add-command |
| library | library/add-public-api |
| desktop | desktop/add-message |
| script | (no skills) |
If the architecture style is DDD, also copy ddd/add-aggregate and ddd/add-domain-event in addition to the project type skills.
For each skill, create the directory and copy the file:
.claude/skills/<skill-name>/SKILL.md
For example, webapp/add-endpoint/SKILL.md becomes .claude/skills/add-endpoint/SKILL.md.
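The prefix-dropping copy described above can be sketched as follows; the helper name and the templates-directory argument are assumptions:

```shell
# Sketch only: copy e.g. webapp/add-endpoint/SKILL.md
# to .claude/skills/add-endpoint/SKILL.md.
copy_skill() {
  templates=$1; skill=$2       # e.g. "$TEMPLATES_DIR" webapp/add-endpoint
  name=$(basename "$skill")    # drop the project-type prefix -> add-endpoint
  mkdir -p ".claude/skills/$name"
  cp "$templates/skills/$skill/SKILL.md" ".claude/skills/$name/SKILL.md"
}
```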
Generate a thin .claude/CLAUDE.md file. This file provides project context to Claude without duplicating rule content.
```markdown
<!-- Generated by dev-standards plugin. Customize as needed. -->

# Project

{{project_description}}

## Type

{{project_type}} / {{architecture_style}} / {{deployment_target}}

## Key Technologies

{{comma-separated list of languages, frameworks, and key tools}}

## Domain Concepts

{{domain_concepts or "No domain concepts documented yet."}}

## Team Conventions

{{team_conventions or "No team conventions documented yet."}}

## Rules

Project rules are loaded automatically from `.claude/rules/`. See that directory for active standards.

## Skills

Project skills are loaded automatically from `.claude/skills/`. See that directory for available workflows.
```
Replace all {{placeholders}} with actual values from the interview.
If .claude/ already existed:
- If `CLAUDE.md` already exists at the project root, do not touch it -- only generate `.claude/CLAUDE.md`.

After generating all files, present a summary:
Setup complete. Generated files:
Rules (loaded automatically by Claude):

- `.claude/rules/version-control.md`
- `.claude/rules/code-quality.md`
- `.claude/rules/architecture.md`
- `.claude/rules/<language>.md` (one per detected language)
- `.claude/rules/testing.md` (if tests detected)
- `.claude/rules/data-integrity.md` (if persistence detected)

Skills (available workflows):

- `.claude/skills/<skill-name>/SKILL.md` (one per skill)

Project context:

- `.claude/CLAUDE.md`
All files are customizable. Edit them to refine your project standards.