# Brownfield Project Analysis

Analyze the existing codebase specified in: $ARGUMENTS
## Objective Context
**IMPORTANT:** Parse the arguments for an objective/context statement. Look for patterns like:
- Flags: `--objective "..."` or `-o "..."`, `--context "..."` or `-c "..."`, `--reason "..."` or `-r "..."`
- Natural language after the path: `./src - refactoring for testability`
- Quoted context: `./src "migrating to microservices"`
If an objective is provided, store it as the Analysis Objective and use it to:
- Focus discovery - Prioritize areas relevant to the objective
- Weight findings - Emphasize aspects that matter for the stated goal
- Guide questions - Ask stakeholder questions oriented toward the objective
- Shape recommendations - Align all outputs with the intended outcome
**Analysis Objective:** {extracted from arguments, or "General analysis" if not specified}
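The extraction rules above can be sketched as a small parser. This is a minimal illustration, not part of the command spec: the function name `extract_objective` and the exact regular expressions are assumptions about one reasonable implementation.

```python
import re

def extract_objective(arguments: str) -> tuple[str, str]:
    """Illustrative sketch: split raw $ARGUMENTS into (path, objective).

    Checks explicit flags first, then quoted context after the path,
    then a ' - ' natural-language separator, and finally falls back
    to "General analysis" when no objective is present.
    """
    # 1. Explicit flags: --objective/-o, --context/-c, --reason/-r
    flag = re.search(r'(?:--objective|--context|--reason|-[ocr])\s+"([^"]+)"', arguments)
    if flag:
        path = arguments[:flag.start()].strip()
        return path or ".", flag.group(1)

    # 2. Quoted context after the path: ./src "migrating to microservices"
    quoted = re.match(r'(\S+)\s+"([^"]+)"\s*$', arguments.strip())
    if quoted:
        return quoted.group(1), quoted.group(2)

    # 3. Natural language after ' - ': ./src - refactoring for testability
    path, sep, rest = arguments.partition(" - ")
    if sep:
        return path.strip(), rest.strip()

    # 4. No objective found
    return arguments.strip() or ".", "General analysis"
```

For example, `extract_objective('./src -o "Migrate authentication to OAuth2"')` yields the path and the objective as separate values.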
## Brownfield Overview
This command is optimized for projects involving existing systems. The workflow combines codebase analysis to understand current capabilities with stakeholder interviews to identify desired changes and improvements.
When an objective is provided, the entire workflow adapts:
- Performance refactor → Focus on bottlenecks, hot paths, resource usage
- Testability improvement → Focus on coupling, dependencies, side effects
- Migration/modernization → Focus on outdated patterns, upgrade paths
- Security audit → Focus on vulnerabilities, data flows, auth boundaries
- Feature addition → Focus on extension points, existing patterns to follow
## Brownfield Workflow

### Phase 1: Codebase Discovery (20 min)

**Objective-Driven Discovery**
If an objective was provided, immediately focus discovery on relevant areas:
- For "performance" objectives → Start with hot paths, database queries, I/O operations
- For "testability" objectives → Start with tightly coupled components, static dependencies
- For "security" objectives → Start with auth, input validation, data handling
- For "migration" objectives → Start with framework dependencies, deprecated APIs
**Project Structure Analysis**
- Identify technology stack
- Map project organization
- Identify architectural patterns
- Locate key components relevant to the objective
**Initial Questions** (skip if already answered via objective)
- What path should I analyze?
- Are there multiple services/components?
- Is there existing documentation to review?
- What is driving this analysis? (if objective not provided)
### Phase 2: Domain Model Extraction (30 min)

**Entity Discovery**
- Identify domain entities
- Map relationships
- Document attributes and constraints
- Identify aggregate boundaries
**Documentation Format**

```markdown
## Entity: {Name}
- Attributes: {list}
- Relationships: {list}
- Business meaning: {description}
- Location: {file path}
```
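As a sketch of how the entity template can be filled from structured findings, the `Entity` record below is an illustrative assumption — the command does not prescribe any particular data structure.

```python
from dataclasses import dataclass

@dataclass
class Entity:
    """Illustrative record mirroring the documentation template above."""
    name: str
    attributes: list[str]
    relationships: list[str]
    business_meaning: str
    location: str

    def to_markdown(self) -> str:
        # Render the record in the same shape as the template
        return "\n".join([
            f"## Entity: {self.name}",
            f"- Attributes: {', '.join(self.attributes)}",
            f"- Relationships: {', '.join(self.relationships)}",
            f"- Business meaning: {self.business_meaning}",
            f"- Location: {self.location}",
        ])
```

A hypothetical `Entity("Order", ["id", "total"], ["belongs to Customer"], "A customer purchase", "src/models/order.py").to_markdown()` produces one filled-in template block.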
### Phase 3: Business Rules Discovery (30 min)

**Rule Identification**
- Find validation logic
- Identify calculations/formulas
- Map workflows and state machines
- Document business constraints
**Rule Categories**
- Validation rules
- Calculation rules
- State transition rules
- Authorization rules
### Phase 4: Integration Mapping (20 min)

**External Systems**
- Identify exposed API endpoints
- Find external service calls
- Map message queue usage
- Document data flows
**Documentation Format**

```markdown
## Integration: {Name}
- Type: API / Message Queue / File / Database
- Direction: Inbound / Outbound / Bidirectional
- Data: {what flows}
- Location: {file path}
```
### Phase 5: Capability Documentation (15 min)

**Current State Summary**
- What can users do today?
- What data does the system manage?
- What integrations exist?
- What business processes are supported?
### Phase 6: Stakeholder Interview (30 min)

**Current State Questions**
- What works well in the current system?
- What are the biggest pain points?
- What workarounds do people use?
- What's missing?
**Change Requirements**
- What needs to change?
- What new features are needed?
- What should be removed/deprecated?
- What performance improvements are needed?
**Migration Considerations**
- What data needs to be preserved?
- Can we run parallel systems?
- What is the cutover strategy?
- What training is needed?
### Phase 7: Gap Analysis (15 min)

**As-Is vs To-Be**
- Document current capabilities
- Document desired capabilities
- Identify gaps
- Categorize: New / Changed / Removed
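The New / Changed / Removed categorization above amounts to a set comparison of capability inventories. The sketch below assumes capabilities are keyed by name with short descriptions; both the function name and the inventory shape are illustrative.

```python
def categorize_gaps(as_is: dict[str, str], to_be: dict[str, str]) -> dict[str, list[str]]:
    """Compare as-is vs to-be capability inventories.

    Keys are capability names; values are short descriptions.
    A capability present in both states but described differently
    counts as "Changed".
    """
    current, desired = set(as_is), set(to_be)
    return {
        "New": sorted(desired - current),
        "Removed": sorted(current - desired),
        "Changed": sorted(k for k in current & desired if as_is[k] != to_be[k]),
    }
```

With a hypothetical as-is of `{"auth": "session cookies"}` and a to-be of `{"auth": "OAuth2", "audit log": "track changes"}`, the result categorizes `audit log` as New and `auth` as Changed.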
**Technical Debt Assessment**
- Code quality issues
- Outdated dependencies
- Security vulnerabilities
- Performance bottlenecks
### Phase 8: Prioritization (15 min)

**Change Prioritization**
Using MoSCoW prioritization:
- Must Have: Critical changes and fixes
- Should Have: Important improvements
- Could Have: Nice-to-have enhancements
- Won't Have: Deferred to a future iteration
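Once each gap carries a MoSCoW label, bucketing is mechanical. The sketch below is illustrative — the gap names and the `prioritize` helper are assumptions, not part of the workflow spec.

```python
MOSCOW_ORDER = ["Must Have", "Should Have", "Could Have", "Won't Have"]

def prioritize(gaps: list[tuple[str, str]]) -> dict[str, list[str]]:
    """Group (gap, priority) pairs into MoSCoW buckets, kept in priority order."""
    buckets: dict[str, list[str]] = {p: [] for p in MOSCOW_ORDER}
    for gap, priority in gaps:
        buckets[priority].append(gap)
    return buckets
```

Iterating over the returned dict then yields the buckets from most to least critical, which matches the order changes should be presented in.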
### Phase 9: Validation and Documentation (20 min)

**Validation Checkpoint**
- Verify codebase findings with stakeholder
- Confirm change requirements
- Validate priorities
- Identify risks
**Documentation**
- Generate as-is requirements
- Document change requirements
- Create gap analysis report
- Produce SRS document
## User Checkpoints
I will pause for your confirmation at:
- After Structure Discovery: "Does this architecture overview match your understanding?"
- After Domain Extraction: "Are these the main business entities?"
- After Business Rules: "Did I capture the business rules correctly?"
- After Integration Mapping: "Are these all the integrations?"
- After Stakeholder Interview: "Have I captured all the desired changes?"
- After Gap Analysis: "Is this gap assessment accurate?"
- Before Documentation: "Ready to generate the SRS?"
## Expected Outputs

1. **Codebase Analysis Report**
   - Technology stack overview
   - Architecture pattern identification
   - Domain model documentation
   - Business rules inventory
   - Integration map
2. **As-Is Requirements Document**
   - Current capabilities
   - Current business rules
   - Current integrations
3. **Change Requirements**
   - New requirements
   - Modified requirements
   - Deprecated features
4. **Gap Analysis Report**
   - Current vs desired state
   - Prioritized gaps
   - Technical debt assessment
5. **SRS Document**
   - IEEE 830 compliant
   - Includes both as-is and to-be requirements
6. **Migration Considerations**
   - Data migration needs
   - Cutover strategy
   - Risk assessment
7. **Validation Report**
   - Quality and completeness assessment
## Analysis Scope Options

**Full Analysis (Recommended)**

Analyze the entire codebase and all aspects.

**Focused Analysis**

Target specific areas:

```
/business-analyst:brownfield ./src/auth
/business-analyst:brownfield ./api/orders
```

**Specific Aspects**

Focus on particular concerns:

```
/business-analyst:brownfield security audit
/business-analyst:brownfield integration mapping
/business-analyst:brownfield business rules
```
## Getting Started
Let's begin the brownfield analysis!
If an objective was provided in the arguments, acknowledge it:
"I see you want to analyze this codebase for: {objective}. I'll focus my analysis accordingly."
Then proceed directly to Phase 1, using the objective to guide discovery.
If no objective was provided, ask:
- What is the path to the codebase to analyze?
- What technology stack is used (if you know)?
- What is driving this analysis? Examples:
- "Refactoring for better testability"
- "Migrating from monolith to microservices"
- "Security audit before going to production"
- "Adding a new feature and need to understand the codebase"
- "Performance optimization"
- Are there specific areas of concern?
## Usage Examples

```
# Basic usage (will ask for the objective interactively)
/business-analyst:brownfield ./src

# With an objective flag
/business-analyst:brownfield ./src --objective "Refactor to improve testability"
/business-analyst:brownfield ./src -o "Migrate authentication to OAuth2"

# Natural language (path followed by context)
/business-analyst:brownfield ./src/orders - need to split into microservice
/business-analyst:brownfield ./api "security audit for compliance"

# Focused analysis with an objective
/business-analyst:brownfield ./src/payments --objective "PCI compliance audit"
```