This skill should be used when the user asks to "scope an architecture review", "define review goals", "create a review brief", "align review goals with codebase", or needs to establish the scope and goals of a brownfield codebase review before analysis begins.
From `pm-architect-brownfield`. Install with `npx claudepluginhub nbkm8y5/claude-plugins --plugin pm-architect-brownfield`. This skill uses the workspace's default tool permissions.
Establish the scope, goals, and alignment matrix for a brownfield architecture review. This is the first step in the brownfield pipeline — it takes review goals and codebase context and produces a structured review brief plus alignment matrix mapping goals to codebase areas.
[Review Goals + Codebase] --> **REVIEW BRIEF ALIGNMENT** --> [REVIEW_BRIEF.md + ALIGNMENT_MATRIX.md] --> Codebase Discovery --> ...
Input: User-provided review goals, codebase path
Output: REVIEW_BRIEF.md and ALIGNMENT_MATRIX.md written to artifacts directory
Both output artifacts follow the standard artifact template:
# [ARTIFACT TITLE]
## Summary
## Inputs
## Outputs
## Assumptions
## Open Questions
## Main Content
## Acceptance Criteria
Collect all inputs needed to scope the review.
- Review goals: from the `--goals` flag or an interactive prompt
- Build manifests: `package.json`, `requirements.txt`, `pyproject.toml`, `Cargo.toml`, `go.mod`, `pom.xml`, `build.gradle`, `Makefile`, `CMakeLists.txt`, `*.csproj`, `*.sln`
- Version control: `.git/` to confirm version control presence
- Project description: `README.md` if present, for context
- Primary languages: source file extensions (`.py`, `.ts`, `.go`, `.rs`, `.java`, `.swift`, etc.)
- Entry points and routing (`src/app/`, `pages/`, `controllers/`, `handlers/`, etc.)
- Configuration (`.eslintrc`, `tsconfig.json`, `setup.cfg`, `Dockerfile`, etc.)
- Test layout (`tests/`, `__tests__/`, `spec/`, `test/`)
- CI/CD (`.github/workflows/`, `.gitlab-ci.yml`, `Jenkinsfile`)

Evidence requirement: Every observation about the codebase MUST cite a specific file.
> Project uses TypeScript with React [package.json:5]
> Test suite uses Jest [package.json:22]
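The manifest scan described above can be sketched in a few lines. The mapping from manifest file to stack below is illustrative, not the skill's actual detection table:

```python
from pathlib import Path

# Illustrative manifest-to-stack mapping; the skill's detection may cover more.
MANIFESTS = {
    "package.json": "Node.js / TypeScript",
    "requirements.txt": "Python",
    "pyproject.toml": "Python",
    "Cargo.toml": "Rust",
    "go.mod": "Go",
    "pom.xml": "Java (Maven)",
    "build.gradle": "Java/Kotlin (Gradle)",
}

def detect_stack(repo: Path) -> list[tuple[str, str]]:
    """Return (manifest, stack) pairs found at the repo root.

    Each returned manifest filename doubles as the evidence citation
    for the resulting stack claim, e.g. [package.json:1].
    """
    return [(name, stack) for name, stack in MANIFESTS.items() if (repo / name).is_file()]
```

Each hit yields both the claim ("project uses Rust") and the file to cite for it.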
Explicitly state what is included and excluded from the review.
Create a structured goal inventory with deterministic identifiers.
Goal format:
### GOAL-0001: [Title]
- **Priority**: P0 | P1 | P2
- **Category**: Architecture | Security | Performance | Maintainability | Reliability | Observability
- **Description**: [What this goal aims to evaluate or achieve]
- **Success Criteria**:
- [ ] [Specific, verifiable criterion]
- [ ] [Additional criterion]
- **Context**: [Why this goal matters, who requested it]
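One way to keep goal identifiers deterministic across runs is to number goals in alphabetical order of title. This is a minimal sketch of that convention, not the skill's mandated algorithm:

```python
def assign_goal_ids(titles: list[str]) -> dict[str, str]:
    """Assign GOAL-NNNN identifiers deterministically.

    Titles are sorted alphabetically before numbering, so repeated runs
    on the same inputs always produce the same IDs.
    """
    return {t: f"GOAL-{i:04d}" for i, t in enumerate(sorted(titles), start=1)}
```

With this convention, re-invoking the skill on the same goal set cannot shuffle IDs between goals.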
Document constraints that bound the review and any resulting recommendations.
Document who has interest in and authority over the review outcomes.
Create a mapping from goals to codebase areas and analysis methods.
For each GOAL-NNNN, identify:
- `static-analysis`: Code pattern matching, AST analysis
- `dependency-trace`: Following import chains and dependency graphs
- `pattern-search`: Grep-based search for anti-patterns or specific constructs
- `config-review`: Examining configuration files and environment setup
- `test-coverage`: Analyzing test presence, coverage, and quality
- `data-flow`: Tracing data from input to storage to output
- `security-scan`: Examining authentication, authorization, input validation, secrets management

Identify unmapped areas: codebase directories or modules that are not mapped to any goal. Flag these as potential blind spots.
Matrix format:
| Goal ID | Codebase Area | Analysis Method | Expected Evidence |
|---------|--------------|-----------------|-------------------|
| GOAL-0001 | src/api/, src/middleware/ | dependency-trace, pattern-search | [file:line] citations |
| GOAL-0002 | src/models/, migrations/ | data-flow, config-review | Entity relationship diagram |
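The unmapped-area check can be sketched as a set difference between top-level directories and the areas named in the matrix. The top-level granularity is an assumption for illustration; a real review may compare at deeper path levels:

```python
from pathlib import Path

def unmapped_areas(repo: Path, matrix: dict[str, list[str]]) -> list[str]:
    """List top-level directories not covered by any goal in the matrix.

    `matrix` maps GOAL-NNNN IDs to codebase areas like "src/api/".
    Hidden directories (.git, .github, ...) are skipped; survivors are
    flagged as potential review blind spots.
    """
    covered = {area.strip("/").split("/")[0] for areas in matrix.values() for area in areas}
    return sorted(
        d.name for d in repo.iterdir()
        if d.is_dir() and not d.name.startswith(".") and d.name not in covered
    )
```

Anything this returns belongs in the matrix's Unmapped Areas section.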
Generate both output artifacts using the standard templates.
Write `REVIEW_BRIEF.md` from the template at `${CLAUDE_PLUGIN_ROOT}/reference/templates/REVIEW_BRIEF.template.md`.

Write `ALIGNMENT_MATRIX.md` from the template at `${CLAUDE_PLUGIN_ROOT}/reference/templates/ALIGNMENT_MATRIX.template.md`.

Output path: `artifacts/brownfield/<project_name>/REVIEW_BRIEF.md` and `ALIGNMENT_MATRIX.md`.
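A minimal sketch of the artifact-writing step, assuming the templates use bracketed placeholders such as `[Project Name]` (the real templates and substitution rules may differ):

```python
import os
from pathlib import Path

def write_artifact(template_name: str, project: str, out_dir: Path) -> Path:
    """Render a standard artifact template into the artifacts directory.

    Reads the template from ${CLAUDE_PLUGIN_ROOT}/reference/templates/ and
    writes e.g. REVIEW_BRIEF.md (the ".template" infix is dropped).
    """
    root = Path(os.environ["CLAUDE_PLUGIN_ROOT"])
    template = (root / "reference" / "templates" / template_name).read_text()
    # Assumed placeholder convention; adjust to the actual template syntax.
    rendered = template.replace("[Project Name]", project)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / template_name.replace(".template", "")
    out_path.write_text(rendered)
    return out_path
```

Called twice, once per template, this produces both artifacts under `artifacts/brownfield/<project_name>/`.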
# Review Brief: [Project Name]
## Summary
[2-3 sentence overview of the review scope and purpose]
## Inputs
- Review goals provided by: [stakeholder]
- Codebase path: [path]
- Codebase language/framework: [detected stack]
## Outputs
- REVIEW_BRIEF.md (this document)
- ALIGNMENT_MATRIX.md
## Assumptions
- [ASM-NNNN]: [Assumption text with evidence]
## Open Questions
- [OQ-NNNN]: [Question text]
## Main Content
### Project Overview
[Evidence-backed description of the project]
### Review Scope
#### In Scope
#### Out of Scope
#### Depth
### Goals
[GOAL-NNNN entries]
### Constraints
[CON-NNNN entries]
### Stakeholders
[Stakeholder inventory]
## Acceptance Criteria
- [ ] All goals have GOAL-NNNN IDs assigned in a deterministic (alphabetical) order
- [ ] Every goal has priority and success criteria
- [ ] Scope boundaries are explicitly defined
- [ ] Stakeholders are documented
- [ ] All codebase claims cite [file:line] evidence
# Alignment Matrix: [Project Name]
## Summary
[Purpose of this matrix]
## Inputs
- REVIEW_BRIEF.md
## Outputs
- Goal-to-codebase mapping
- Analysis method assignments
- Unmapped area inventory
## Main Content
### Goal-Area Mapping
[Table mapping goals to codebase areas]
### Analysis Methods
[Detailed method descriptions per goal]
### Unmapped Areas
[Codebase areas not covered by any goal]
### Evidence Plan
[What evidence types each goal requires]
## Acceptance Criteria
- [ ] Every GOAL-NNNN maps to at least one codebase area
- [ ] Analysis methods specified for each mapping
- [ ] Unmapped areas flagged
These rules ensure reproducible output regardless of when or how many times the skill is invoked on the same inputs.
Every architectural claim, observation, or assertion about the codebase MUST include evidence citations.
Format: > Claim text [relative/path/to/file.ext:line_number]
Examples:
> The project uses Express.js as its HTTP framework [package.json:8]
> Authentication middleware validates JWT tokens [src/middleware/auth.ts:15-42]
> Database connections are pooled with a max of 10 [config/database.yml:7]
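The citation shape in these examples can be checked mechanically. The regex below is an assumption for illustration, not the skill's own validator:

```python
import re

# Assumed citation grammar: [relative/path.ext:line] or [path:start-end].
CITATION = re.compile(r"\[[\w./-]+:\d+(?:-\d+)?\]")

def has_citation(claim: str) -> bool:
    """Return True if a claim line carries at least one [file:line] citation."""
    return bool(CITATION.search(claim))
```

Claims failing this check are candidates for the `[UNVERIFIED]` marker.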
Rules:
- Single line: `[file:line]`
- Line range: `[file:start-end]`
- Multiple sources: `[file1:line] [file2:line]`
- Unverifiable claims: mark `[UNVERIFIED]` and add to Open Questions

Before finalizing artifacts, verify: