Master security testing, OWASP compliance, QA metrics, team leadership, and strategic quality management. Drive quality culture and business impact.
Drives strategic QA excellence through security testing, metrics analysis, team leadership, and process improvement.
/plugin marketplace add pluginagentmarketplace/custom-plugin-qa
/plugin install qa-development-assistant@pluginagentmarketplace-qa

Advanced QA is about moving beyond testing to drive business impact. This agent guides you through security testing, metrics-driven quality, team leadership, and strategic QA management.
1. BROKEN ACCESS CONTROL
- Insufficient access control
- Privilege escalation
- Testing: Verify authorization on all endpoints
2. CRYPTOGRAPHIC FAILURES
- Weak encryption
- Exposed sensitive data
- Testing: Check encryption, SSL/TLS, data protection
3. INJECTION
- SQL injection
- OS command injection
- LDAP injection
- Testing: Input validation, parameterized queries
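Parameterized queries, the primary defense named above, can be sketched with Python's standard-library sqlite3 module. The users table and schema here are illustrative only:

```python
import sqlite3

def find_user(conn, username):
    # Parameterized query: the driver binds the value, so input like
    # "' OR '1'='1" is treated as data, never as executable SQL.
    cur = conn.execute(
        "SELECT id, username FROM users WHERE username = ?", (username,)
    )
    return cur.fetchone()

# In-memory demo database (illustrative schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users (username) VALUES ('alice')")

assert find_user(conn, "alice") == (1, "alice")
# A classic injection payload matches no row instead of dumping the table:
assert find_user(conn, "' OR '1'='1") is None
```

The same binding principle applies to any driver that supports placeholders; string concatenation into SQL is what the injection tests above are designed to catch.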
4. INSECURE DESIGN
- Missing security controls
- Weak threat modeling
- Testing: Review design, threat modeling
5. SECURITY MISCONFIGURATION
- Default credentials
- Unnecessary features
- Outdated libraries
- Testing: Configuration review, security scanning
6. VULNERABLE AND OUTDATED COMPONENTS
- Known vulnerabilities
- Outdated dependencies
- Testing: Dependency scanning, version checks
7. IDENTIFICATION AND AUTHENTICATION FAILURES
- Weak password policies
- Session management issues
- Testing: Brute force, session hijacking, MFA bypass
8. SOFTWARE AND DATA INTEGRITY FAILURES
- Insecure CI/CD
- Unsigned updates
- Testing: Integrity checks, update validation
9. SECURITY LOGGING AND MONITORING FAILURES
- Missing logs
- Insufficient monitoring
- Testing: Log verification, alert testing
10. SERVER-SIDE REQUEST FORGERY (SSRF)
- Unauthorized server requests
- Testing: SSRF payloads, internal network access
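A minimal SSRF defense check can be sketched with the Python standard library. The function name and policy below are illustrative; a production defense would also need scheme allow-lists, redirect handling, and DNS-rebinding protection:

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_ssrf_risky(url):
    """Return True if the URL resolves to a private, loopback, or
    link-local address (e.g. cloud metadata endpoints)."""
    host = urlparse(url).hostname
    if host is None:
        return True  # unparseable URLs are rejected
    try:
        addr = ipaddress.ip_address(socket.gethostbyname(host))
    except (socket.gaierror, ValueError):
        return True  # unresolvable hosts are rejected
    return addr.is_private or addr.is_loopback or addr.is_link_local

# Typical SSRF test payloads from the checklist:
assert is_ssrf_risky("http://127.0.0.1/admin") is True
assert is_ssrf_risky("http://169.254.169.254/latest/meta-data/") is True
```

During testing, payloads like the two above verify that a URL-fetching feature cannot be steered at internal infrastructure.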
1. RECONNAISSANCE
- Identify endpoints
- Document APIs
- Map architecture
- Identify attack surface
2. SCANNING
- Automated vulnerability scanning
- Dependency checking
- Configuration review
- SSL/TLS analysis
3. MANUAL TESTING
- Authentication testing
- Authorization testing
- Input validation
- Session management
4. EXPLOITATION
- Attempt exploitation
- Document impact
- Test remediation
- Verify fixes
5. REPORTING
- Documented findings
- Risk assessment
- Remediation recommendations
- Executive summary
AUTHENTICATION:
- Empty username/password
- Default credentials
- Weak password validation
- Session timeout
- Session fixation
- Concurrent sessions
- Credential brute force
AUTHORIZATION:
- Escalated privileges
- Cross-user access
- Vertical privilege escalation
- Horizontal privilege escalation
- Role-based access violations
INJECTION ATTACKS:
- SQL injection
- NoSQL injection
- LDAP injection
- OS command injection
- XPath injection
- Template injection
CROSS-SITE SCRIPTING (XSS):
- Reflected XSS
- Stored XSS
- DOM-based XSS
- JavaScript injection
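The standard mitigation for reflected and stored XSS, output encoding, can be sketched with Python's html.escape; render_comment is a hypothetical view function used only for illustration:

```python
import html

def render_comment(user_input):
    # Output encoding: escape before inserting into HTML so a stored
    # payload renders as inert text instead of executing.
    return f"<p>{html.escape(user_input)}</p>"

payload = "<script>alert(1)</script>"
assert render_comment(payload) == "<p>&lt;script&gt;alert(1)&lt;/script&gt;</p>"
```

XSS test cases typically submit payloads like the one above and verify the response contains the escaped form, never the raw script tag.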
CROSS-SITE REQUEST FORGERY (CSRF):
- Form submission forgery
- Image request forgery
- AJAX request forgery
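CSRF defenses typically compare a per-session token against the one submitted with each state-changing request. A sketch using Python's secrets and hmac modules (function names are illustrative):

```python
import hmac
import secrets

def issue_csrf_token():
    # Unpredictable per-session token embedded in forms/AJAX headers.
    return secrets.token_hex(32)

def verify_csrf_token(session_token, submitted_token):
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(session_token, submitted_token)

token = issue_csrf_token()
assert verify_csrf_token(token, token) is True
assert verify_csrf_token(token, issue_csrf_token()) is False
```

CSRF tests then confirm that form, image-request, and AJAX forgeries lacking the correct token are all rejected.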
DATA PROTECTION:
- Unencrypted sensitive data
- Weak encryption
- Exposed API keys
- Hardcoded credentials
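Exposed API keys and hardcoded credentials can be caught with a simple pattern scan. The two patterns below are illustrative only; dedicated secret scanners use far larger rule sets plus entropy checks:

```python
import re

SECRET_PATTERNS = [
    # keyword = "long literal" assignments (illustrative rule)
    re.compile(r"(?i)(api[_-]?key|secret|password)\s*[:=]\s*['\"][^'\"]{8,}['\"]"),
    # AWS access key ID format
    re.compile(r"AKIA[0-9A-Z]{16}"),
]

def find_hardcoded_secrets(source):
    return [p.pattern for p in SECRET_PATTERNS if p.search(source)]

code = 'API_KEY = "sk-live-0123456789abcdef"'
assert find_hardcoded_secrets(code)             # match found
assert not find_hardcoded_secrets("x = 1 + 2")  # clean code passes
```

A scan like this runs well as a pre-commit hook or CI step, failing the build before a secret reaches the repository.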
VULNERABILITY SCANNING:
- OWASP ZAP (Free, open-source)
- Burp Suite (Professional)
- Fortify (Enterprise)
- SonarQube (Code analysis)
API SECURITY:
- Postman (API testing)
- REST Assured (API automation)
- API Fortress (API monitoring)
DEPENDENCY CHECKING:
- OWASP Dependency-Check
- Snyk (Dependency scanning)
- Black Duck (Component analysis)
SSL/TLS ANALYSIS:
- SSL Labs (Online tool)
- testssl.sh (Command-line script)
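Alongside these tools, a quick certificate-expiry check can be scripted. The helper below assumes the notAfter date format that Python's ssl.SSLSocket.getpeercert() returns:

```python
from datetime import datetime, timezone

def days_until_expiry(not_after, now=None):
    """Days left on a certificate, given its notAfter field in the
    format returned by getpeercert(), e.g. 'Jun  1 12:00:00 2030 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

ref = datetime(2030, 5, 1, 12, 0, 0, tzinfo=timezone.utc)
assert days_until_expiry("Jun  1 12:00:00 2030 GMT", now=ref) == 31
```

Paired with a socket connection and getpeercert(), this makes an easy monitoring alert for certificates nearing expiry.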
DEFECT METRICS:
- Defect Density: Defects/1000 LOC
- Escape Rate: % defects reaching production
- Severity Distribution: Critical/High/Medium/Low
- Mean Time to Repair (MTTR): Avg time to fix
- Reopened Defects: % of fixes reopened (fix quality)
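The first two defect metrics above reduce to simple formulas, sketched here with illustrative figures:

```python
def defect_density(defects, loc):
    """Defects per 1,000 lines of code."""
    return defects / (loc / 1000)

def escape_rate(production_defects, total_defects):
    """Percentage of defects that reached production."""
    return 100 * production_defects / total_defects

# Illustrative numbers: 45 defects in a 30,000-LOC codebase,
# 3 of 250 total defects escaping to production.
assert defect_density(45, 30_000) == 1.5
assert escape_rate(3, 250) == 1.2
```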
TEST METRICS:
- Test Execution Rate: % of tests run
- Test Pass Rate: % of passing tests
- Test Coverage: Code/feature/risk coverage
- Test Case Effectiveness: Defects per test case
- Automation Rate: % automated vs manual
PROCESS METRICS:
- Test Case Productivity: Cases per tester per day
- Bug Detection Rate: Bugs found per hour
- Test Efficiency: Defect removal efficiency
- Schedule Performance: Actual vs planned
- Resource Utilization: Tester productivity
QUALITY METRICS:
- Critical Issues: 0 on release
- High Issues: < 5 on release
- Medium Issues: < 20 on release
- Test Readiness: % of requirements tested
- Go/No-Go Decision: Met quality criteria
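The release thresholds above can be encoded as an automated quality gate; a minimal sketch:

```python
def release_gate(critical, high, medium):
    """Apply the example thresholds: 0 critical, < 5 high, < 20 medium."""
    return critical == 0 and high < 5 and medium < 20

# Dashboard example below passes; any critical defect blocks release.
assert release_gate(critical=0, high=2, medium=8) is True
assert release_gate(critical=1, high=0, medium=0) is False
```

Wired into CI, a gate like this turns the Go/No-Go criteria from a judgment call into a repeatable check.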
EXECUTIVE DASHBOARD:
┌─────────────────────────────────────────┐
│ PROJECT: E-commerce Platform v2.5 │
│ RELEASE DATE: 2024-12-15 │
├─────────────────────────────────────────┤
│ TEST SUMMARY │
│ Total Tests: 542 | Passed: 518 (95%) │
│ Failed: 15 (3%) | Blocked: 9 (2%) │
├─────────────────────────────────────────┤
│ COVERAGE ANALYSIS │
│ Code Coverage: 87% (Target: 80%) │
│ Feature Coverage: 95% (Target: 90%) │
│ Risk Coverage: 100% │
├─────────────────────────────────────────┤
│ DEFECT SUMMARY │
│ Critical: 0 | High: 2 | Medium: 8 │
│ Low: 5 | Trivial: 2 │
│ Status: READY FOR RELEASE ✓ │
├─────────────────────────────────────────┤
│ SCHEDULE PERFORMANCE │
│ Planned: 40 days | Actual: 35 days │
│ Performance: 112.5% (ahead of schedule) │
└─────────────────────────────────────────┘
METRIC TRENDS (Month-over-Month):
- Defect Escape Rate: 2.5% → 1.8% → 1.2% (Improving)
- Test Pass Rate: 92% → 94% → 95% (Improving)
- Code Coverage: 75% → 80% → 87% (Improving)
- Bug Detection Rate: 8 bugs/day → 12/day → 15/day (More detection)
ANALYSIS:
✓ Quality trending upward
✓ Detection improving
✓ Coverage meeting targets
✓ Team productivity increasing
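A trend like those above can be verified mechanically. The helper below is a minimal sketch that checks for strict month-over-month movement in the desired direction:

```python
def is_improving(series, lower_is_better=False):
    """True if every month-over-month step moves the right way."""
    steps = list(zip(series, series[1:]))
    if lower_is_better:
        return all(b < a for a, b in steps)
    return all(b > a for a, b in steps)

# Trend data from the report above:
assert is_improving([2.5, 1.8, 1.2], lower_is_better=True)  # escape rate
assert is_improving([92, 94, 95])                            # pass rate
assert is_improving([75, 80, 87])                            # coverage
```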
ORGANIZATIONAL CHART:
┌─ QA Manager
├─ QA Lead (Manual Testing)
│ ├─ QA Engineer 1
│ ├─ QA Engineer 2
│ └─ QA Engineer 3
│
├─ QA Lead (Automation)
│ ├─ Automation Engineer 1
│ ├─ Automation Engineer 2
│ └─ Senior Automation Engineer
│
└─ QA Lead (Performance)
├─ Performance Engineer 1
└─ Performance Engineer 2
SKILL LEVELS:
Level 1: Foundational (< 6 months)
Level 2: Intermediate (6-18 months)
Level 3: Advanced (18+ months)
Level 4: Expert (3+ years)
SKILL MAPPING:
Engineer | Manual Testing | Automation | Performance | Leadership
─────────────────────────────────────────────────────────────────────
Sarah | Level 3 | Level 2 | Level 1 | -
John | Level 2 | Level 4 | Level 2 | Level 2
Maria | Level 4 | Level 3 | Level 3 | Level 3 (Lead)
FOR JUNIOR TESTERS:
- Month 1-3: Learn testing fundamentals
- Month 3-6: Manual testing expertise
- Month 6-12: Begin automation training
- Month 12+: Full-stack tester
FOR SENIOR TESTERS:
- Quarter 1: Advanced automation techniques
- Quarter 2: Performance testing
- Quarter 3: Team leadership
- Quarter 4: Architecture & strategy
MENTORING:
- Weekly 1-on-1s (30 minutes)
- Code reviews
- Knowledge sharing sessions
- Career path discussions
TEAM MEETINGS:
- Daily Standup (15 min): Status, blockers
- Weekly Planning (1 hour): Sprint planning
- Bi-weekly Retrospective (1 hour): Lessons learned
- Monthly Review (1 hour): Metrics, goals
STAKEHOLDER COMMUNICATION:
- Weekly Status Reports: Test progress, risks
- Defect Reports: Bug severity, impact
- Quality Gates: Go/No-Go recommendations
- Executive Summaries: Quality overview
KEY MESSAGES:
✓ Be transparent about issues
✓ Provide data-driven recommendations
✓ Celebrate successes
✓ Focus on business impact
PLAN-DO-CHECK-ACT (PDCA) CYCLE:
1. PLAN
- Identify improvement opportunity
- Set measurable goals
- Design solution
2. DO
- Implement solution
- Document changes
- Train team
3. CHECK
- Measure results
- Compare to baseline
- Analyze impact
4. ACT
- Standardize if successful
- Adjust if needed
- Plan next improvement
IMPROVEMENT 1: Test Automation Expansion
Problem: Manual testing bottleneck
Goal: Increase automation from 40% to 70%
Timeline: 3 months
Results: 50% faster test execution
IMPROVEMENT 2: Defect Prevention
Problem: High defect escape rate
Goal: Reduce escapes from 3% to < 1%
Timeline: 2 months
Results: Fewer production incidents
IMPROVEMENT 3: Risk-Based Testing
Problem: Testing inefficient
Goal: Focus on high-risk areas
Timeline: 1 month
Results: 30% more defects found with same effort
SPRINT 0 (Preparation):
- Test environment setup
- Test data preparation
- Automation framework setup
SPRINT N (Development):
- QA in refinement (requirement clarity)
- Pair testing with developers
- Continuous testing
- Same-sprint defect closure
SPRINT N+1 (Stabilization):
- Regression testing
- Performance testing
- Final quality gate
RISK MATRIX:
              Low Impact   Medium Impact   High Impact
High Prob     MEDIUM       HIGH            CRITICAL
Medium Prob   LOW          MEDIUM          HIGH
Low Prob      LOW          LOW             MEDIUM
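The risk matrix above can be encoded directly as a lookup table; a minimal sketch:

```python
# Encodes the risk matrix: (probability, impact) -> risk level.
RISK_MATRIX = {
    ("high",   "low"):    "MEDIUM",
    ("high",   "medium"): "HIGH",
    ("high",   "high"):   "CRITICAL",
    ("medium", "low"):    "LOW",
    ("medium", "medium"): "MEDIUM",
    ("medium", "high"):   "HIGH",
    ("low",    "low"):    "LOW",
    ("low",    "medium"): "LOW",
    ("low",    "high"):   "MEDIUM",
}

def risk_level(probability, impact):
    return RISK_MATRIX[(probability.lower(), impact.lower())]

assert risk_level("High", "High") == "CRITICAL"
assert risk_level("Low", "Medium") == "LOW"
```

Scoring every feature this way makes the effort-allocation split below (60/30/10) reproducible rather than ad hoc.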
TESTING ALLOCATION:
Critical Risk Areas: 60% of effort
High Risk Areas: 30% of effort
Low/Medium Risk: 10% of effort
IDENTIFIED RISKS:
Risk: Database migration failure
Severity: CRITICAL
Mitigation:
- Extensive testing of data migration
- Rollback plan
- Production validation
- Monitoring setup
Risk: Third-party API unavailability
Severity: HIGH
Mitigation:
- Mock API responses in tests
- Fallback mechanisms
- Health checks
- Vendor SLA verification
Risk: Performance degradation
Severity: HIGH
Mitigation:
- Baseline establishment
- Regular performance testing
- Load testing before release
- Production monitoring
GDPR COMPLIANCE:
- Data privacy testing
- Consent management
- Data export functionality
- Right to be forgotten
- Data breach notification
SOC 2 COMPLIANCE:
- Access control testing
- Audit logging
- Data integrity
- Incident response
- Change management
PCI DSS COMPLIANCE:
- Credit card data protection
- Secure authentication
- Encryption validation
- Vulnerability scanning
- Access logging
ISTQB: Testing certification
IEEE 829: Test documentation standard
ISO 25010: Quality model
ISO 26262: Automotive functional safety
IEC 61508: Functional safety (electrical/electronic systems)
CMMI: Process maturity model
PRINCIPLES:
✓ Quality is everyone's responsibility
✓ Prevention > Detection
✓ Data-driven decisions
✓ Continuous learning
✓ Blame-free environment
PRACTICES:
✓ Shift-left testing
✓ Peer code reviews
✓ Knowledge sharing
✓ Quality metrics tracking
✓ Regular retrospectives
SMALL TEAMS (< 5 QAs):
- Focus on core competencies
- Manual + automation balance
- Knowledge sharing critical
- Tight collaboration with dev
MEDIUM TEAMS (5-20 QAs):
- Specialized roles
- Team leads
- Mentoring programs
- Process standardization
LARGE TEAMS (> 20 QAs):
- Multiple subteams
- Clear hierarchy
- Specialized centers of excellence
- Enterprise governance
TRADITIONAL (Right-Shift):
Dev Complete → Testing Starts → Production
SHIFT-LEFT:
Requirements → Design → Development
     ↓           ↓           ↓
  Testing     Testing     Testing → Production
TEST PYRAMID:
         /\
        /  \        Manual/E2E (10%)
       /----\
      / API  \      Integration Testing (20%)
     /--------\
    /   Unit   \    Unit Testing (70%)
   /____________\
Problem: Reactive testing, no planning
Solution: Define a QA strategy aligned with business goals
Impact: Focused quality efforts

Problem: Metrics don't reflect quality
Solution: Design meaningful metrics
Impact: Data-driven decisions

Problem: Overworked testers
Solution: Balanced workload, automation, skill development
Impact: Sustainable quality

Problem: QA separate from development
Solution: Integrate QA into the development cycle
Impact: Earlier defect detection
Problem: High defect escape rate
Approach: Shift-left, automation, metrics
Result: 80% reduction in escapes

Problem: Team couldn't scale quality
Approach: Process standardization, automation
Result: Doubled team size at the same defect rate

Problem: Blame culture, low morale
Approach: Metrics focus, blameless retrospectives
Result: 90% team satisfaction
🎯 Security testing is essential
🎯 Metrics drive improvement
🎯 Leadership impacts quality
🎯 Process improvement is continuous
🎯 Risk-based approach maximizes efficiency
🎯 Quality culture is key
🎯 Scaling requires strategy
🎯 Business alignment is critical