Build client-ready consulting deliverables.

Usage: /deliverable-builder <client-name> [--type final-report|status-update|board-deck|implementation-guide|training-plan|change-management|data-strategy]
Generates polished, client-ready consulting deliverables from engagement data. Supports 7 deliverable types with standardized formats, quality gates, and professional presentation.
When the user invokes /deliverable-builder <client-name> with an optional --type flag specifying the deliverable type. If no type is specified, prompt the user to choose from the available types.
Inputs:
- clients/<client-name>/profile.json
- clients/<client-name>/ subdirectories (discovery, assessment, architecture, tracking)

| Type | Purpose | Typical Audience | Length |
|---|---|---|---|
| final-report | Comprehensive engagement report | Executive sponsor, steering committee | 15-30 pages |
| status-update | Periodic progress report | Project stakeholders | 2-5 pages |
| board-deck | Executive/board presentation | C-suite, board members | 10 slides |
| implementation-guide | Technical how-to for IT teams | Engineering, IT ops | 10-20 pages |
| training-plan | Skills development program | HR, department leads, trainees | 5-10 pages |
| change-management | Organizational change plan | HR, leadership, change agents | 8-15 pages |
| data-strategy | Data governance and quality plan | CDO, data team, compliance | 10-15 pages |
### 1. Read Engagement Data

Read all available engagement data for the client:
```
clients/<client-name>/
  profile.json    # Client basics
  discovery/      # Market research, competitive landscape
  assessment/     # Current state, AI readiness, stakeholders
  architecture/   # Solution design, tech selection, costs
  tracking/       # Status updates, milestones, progress
  deliverables/   # Previously generated deliverables
```
Extract key metadata:
- company_name — for headers and references
- engagement_start_date — for timeline context
- primary_contact — for addressing deliverables
- initiative_names — for scoping content

Select the appropriate template and populate it with engagement data.
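Reading the profile and pulling the metadata fields listed above might look like the following. This is an illustrative sketch only; the JSON field names are assumed to match the metadata list:

```python
import json
from pathlib import Path

def load_engagement_metadata(client_name: str) -> dict:
    """Read profile.json and extract the fields used to populate templates."""
    profile_path = Path("clients") / client_name / "profile.json"
    profile = json.loads(profile_path.read_text())
    return {
        "company_name": profile.get("company_name", client_name),
        "engagement_start_date": profile.get("engagement_start_date"),
        "primary_contact": profile.get("primary_contact"),
        "initiative_names": profile.get("initiative_names", []),
    }
```

Missing fields fall back to safe defaults rather than failing, so a sparse profile still produces a usable deliverable skeleton.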
### 2. Select and Populate Template

#### final-report

Full Engagement Report Structure:
# AI Strategy & Transformation Report
## [Company Name]
**Prepared by:** [Consultant Name / Firm]
**Date:** [YYYY-MM-DD]
**Version:** 1.0
**Classification:** Confidential
---
## Table of Contents
1. Executive Summary
2. Engagement Overview
3. Methodology
4. Current State Assessment
5. Market & Competitive Analysis
6. Strategic Recommendations
7. Solution Architecture
8. Implementation Roadmap
9. Investment & ROI Analysis
10. Risk Management
11. Next Steps
12. Appendices
---
## 1. Executive Summary
[2-3 paragraphs: Context, key findings, top 3 recommendations, expected impact.
This section should stand alone — an executive reading only this page should understand
the engagement's value and recommended path forward.]
## 2. Engagement Overview
- **Objective:** [What was the engagement designed to achieve]
- **Scope:** [What was included/excluded]
- **Timeline:** [Start date] to [End date]
- **Stakeholders Interviewed:** [Count and roles]
- **Methodology:** [Brief description]
## 3. Methodology
[Description of assessment framework, research approach, scoring methodology]
### Assessment Framework
[Description of the maturity model or framework used]
### Research Approach
[How market research, competitive analysis, and trend analysis were conducted]
### Scoring Methodology
[How opportunities were evaluated and prioritized]
## 4. Current State Assessment
### AI Maturity Score: [X/5]
[Pull from assessment/ai-readiness.md]
### Strengths
[Bulleted list of current capabilities and advantages]
### Gaps
[Bulleted list of deficiencies and areas for improvement]
### Technology Landscape
[Summary of current tech stack relevant to AI initiatives]
## 5. Market & Competitive Analysis
[Pull from discovery/market-research.md and discovery/competitive-landscape.md]
### Industry AI Adoption
[Key statistics and benchmarks]
### Competitive Positioning
[Where the client stands vs. peers]
### Key Trends
[Most relevant trends affecting the client]
## 6. Strategic Recommendations
### Initiative 1: [Name]
- **Business Impact:** [HIGH/MEDIUM/LOW]
- **Feasibility:** [HIGH/MEDIUM/LOW]
- **Timeline:** [Months]
- **Investment:** [$Range]
- **Expected ROI:** [Percentage or multiple]
- **Description:** [2-3 sentences]
### Initiative 2: [Name]
[Same structure]
### Initiative 3: [Name]
[Same structure]
### Priority Matrix
| Initiative | Impact | Feasibility | Priority | Phase |
|-----------|--------|-------------|----------|-------|
| | | | | |
## 7. Solution Architecture
[Pull from architecture/solution-design.md]
[Include key Mermaid diagrams]
## 8. Implementation Roadmap
[Pull from architecture/implementation-plan.md]
[Include Gantt chart]
### Phase 1: [Name] (Months 1-3)
[Key activities, deliverables, success criteria]
### Phase 2: [Name] (Months 4-6)
[Key activities, deliverables, success criteria]
### Phase 3: [Name] (Months 7-12)
[Key activities, deliverables, success criteria]
## 9. Investment & ROI Analysis
[Pull from architecture/cost-model.md]
### Investment Summary
| Phase | Investment | Expected Return | ROI |
|-------|-----------|-----------------|-----|
| | | | |
### 3-Year Financial Projection
[TCO, benefits, net value]
## 10. Risk Management
[Top risks with mitigations]
| Risk | Probability | Impact | Mitigation |
|------|-------------|--------|------------|
| | | | |
## 11. Next Steps
[Numbered list of specific, time-bound actions with owners]
1. **[Action]** — Owner: [Role], Due: [Date]
2. **[Action]** — Owner: [Role], Due: [Date]
## 12. Appendices
### A. Detailed Assessment Scores
### B. Technology Evaluation Details
### C. Interview Summary
### D. Glossary of Terms
### E. About [Consulting Firm]
#### status-update

Periodic Status Report Structure:
# Status Update: [Company Name]
**Period:** [Start Date] to [End Date]
**Report Date:** [YYYY-MM-DD]
**Prepared by:** [Name]
## Overall Status: [GREEN/YELLOW/RED]
## Period Summary
[2-3 sentence overview of what happened this period]
## Accomplishments
- [x] [Completed item 1]
- [x] [Completed item 2]
- [x] [Completed item 3]
## In Progress
- [ ] [Active item 1] — [% complete, expected completion]
- [ ] [Active item 2] — [% complete, expected completion]
## RAG Status by Workstream
| Workstream | Status | Trend | Notes |
|-----------|--------|-------|-------|
| [Stream 1] | GREEN | Stable | On track |
| [Stream 2] | YELLOW | Declining | [Issue] |
| [Stream 3] | GREEN | Improving | Ahead of schedule |
## Risks & Issues
| # | Type | Description | Impact | Mitigation | Status |
|---|------|-------------|--------|------------|--------|
| 1 | Risk | | | | Open |
| 2 | Issue | | | | In Progress |
## Decisions Needed
1. [Decision needed] — By: [Date], From: [Role]
## Next Period Plan
- [ ] [Planned activity 1]
- [ ] [Planned activity 2]
- [ ] [Planned activity 3]
## Key Metrics
| Metric | Target | Actual | Status |
|--------|--------|--------|--------|
| | | | |
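The Overall Status line at the top of this report can be derived from the workstream table rather than set by hand. A minimal sketch; the worst-status-wins rollup rule shown here is an assumption, not a mandated policy:

```python
# Assumption: the overall RAG status is the worst status across workstreams.
RAG_ORDER = {"GREEN": 0, "YELLOW": 1, "RED": 2}

def overall_status(workstreams: dict[str, str]) -> str:
    """Roll per-workstream RAG statuses up to a single overall status."""
    if not workstreams:
        return "GREEN"
    return max(workstreams.values(), key=lambda s: RAG_ORDER[s])
```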
#### board-deck

10-Slide Board/Executive Presentation Outline:
# Board Presentation: AI Strategy
## [Company Name]
**Date:** [YYYY-MM-DD]
---
### Slide 1: Title
- "AI Strategy & Transformation Roadmap"
- [Company Name]
- [Date]
- [Presenter Name, Title]
### Slide 2: Why AI, Why Now
- Market forces driving AI adoption in [industry]
- Competitive pressure: [2-3 competitor moves]
- Opportunity cost of inaction: [quantified if possible]
- Key stat: "[X]% of [industry] companies have AI in production"
### Slide 3: Where We Are Today
- Current AI maturity score: [X/5]
- Strengths to build on: [2-3 bullets]
- Critical gaps: [2-3 bullets]
- One-line assessment: "[Company] is [behind/on par/ahead] of industry peers"
### Slide 4: Strategic Vision
- Target state in 12-18 months
- Alignment with corporate strategy
- AI principles and guardrails
- North star metric: [the one metric that matters]
### Slide 5: Top 3 Initiatives
| Initiative | Business Impact | Investment | Timeline | ROI |
|-----------|----------------|------------|----------|-----|
| [Init 1] | | | | |
| [Init 2] | | | | |
| [Init 3] | | | | |
### Slide 6: Initiative Deep Dive — [Top Priority]
- Problem being solved
- Proposed solution (high-level architecture diagram)
- Expected outcomes with metrics
- Success criteria
### Slide 7: Implementation Roadmap
- Phase 1 (Q1-Q2): [Headline]
- Phase 2 (Q3-Q4): [Headline]
- Phase 3 (Year 2): [Headline]
- Key milestones and decision gates
### Slide 8: Investment & Returns
- Total investment: $[X] over [Y] years
- Expected annual benefit: $[X] by Year 2
- Break-even: Month [X]
- 3-year ROI: [X]%
### Slide 9: Risks & Mitigations
| Risk | Mitigation |
|------|------------|
| [Top risk 1] | [Mitigation] |
| [Top risk 2] | [Mitigation] |
| [Top risk 3] | [Mitigation] |
### Slide 10: Ask & Next Steps
- **Decision requested:** [What you need the board to approve]
- **Budget approval:** $[X] for Phase 1
- **Timeline:** Begin [Date]
- **Immediate next steps:**
1. [Action 1]
2. [Action 2]
3. [Action 3]
#### implementation-guide

Technical Implementation Guide Structure:
# Implementation Guide: [Initiative Name]
## [Company Name]
**Version:** 1.0
**Date:** [YYYY-MM-DD]
**Audience:** Engineering, IT Operations, DevOps
**Classification:** Internal — Technical
---
## 1. Overview
- Purpose of this guide
- System being implemented
- Architecture reference (link to solution-design.md)
## 2. Prerequisites
### Infrastructure
- [ ] Cloud account provisioned ([provider])
- [ ] VPC/network configured
- [ ] SSL certificates obtained
- [ ] DNS entries configured
### Access & Permissions
- [ ] API keys obtained: [list services]
- [ ] Service accounts created
- [ ] IAM roles configured
- [ ] VPN access for team
### Development Environment
- [ ] Repository access granted
- [ ] Local environment setup documented
- [ ] CI/CD pipeline configured
- [ ] Monitoring tools provisioned
## 3. Architecture Overview
[Mermaid diagram from solution-design.md]
### Component Inventory
| Component | Technology | Version | Purpose | Owner |
|-----------|-----------|---------|---------|-------|
| | | | | |
## 4. Step-by-Step Setup
### 4.1 Infrastructure Provisioning
[Detailed steps with commands/configs]
### 4.2 Data Pipeline Setup
[ETL configuration, data source connections]
### 4.3 AI/ML Service Configuration
[Model deployment, API configuration, prompt templates]
### 4.4 Application Deployment
[Deployment procedure, environment variables, health checks]
### 4.5 Integration Configuration
[Connecting to external systems, API mappings]
### 4.6 Monitoring & Alerting
[Dashboard setup, alert rules, escalation procedures]
## 5. Configuration Reference
| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| | | | |
## 6. Testing Procedures
### Smoke Tests
[Quick validation steps]
### Integration Tests
[Cross-system validation]
### Performance Tests
[Load testing approach and benchmarks]
## 7. Runbook
### Common Operations
[Start/stop, scaling, backup/restore]
### Troubleshooting
| Symptom | Likely Cause | Resolution |
|---------|-------------|------------|
| | | |
### Escalation Path
[Who to contact for different issue types]
## 8. Rollback Procedures
[How to safely revert changes at each stage]
#### training-plan

Skills Development & Training Plan Structure:
# AI Training & Enablement Plan
## [Company Name]
**Date:** [YYYY-MM-DD]
**Version:** 1.0
---
## 1. Skills Assessment
### Current State
| Role Group | AI Literacy | Tool Proficiency | Data Literacy | Gap Level |
|-----------|-------------|-------------------|---------------|-----------|
| Executive | [1-5] | [1-5] | [1-5] | [Low/Med/High] |
| Management | [1-5] | [1-5] | [1-5] | [Low/Med/High] |
| Technical | [1-5] | [1-5] | [1-5] | [Low/Med/High] |
| Operations | [1-5] | [1-5] | [1-5] | [Low/Med/High] |
| Customer-Facing | [1-5] | [1-5] | [1-5] | [Low/Med/High] |
### Target State
[Desired proficiency levels per role group aligned to AI initiatives]
## 2. Training Curriculum
### Track A: AI Awareness (All Staff)
| Module | Duration | Format | Content |
|--------|----------|--------|---------|
| AI Fundamentals | 2 hours | Workshop | What AI is, what it can/can't do, company AI vision |
| AI Ethics & Safety | 1 hour | E-learning | Responsible AI use, bias awareness, data privacy |
| Tool Introduction | 2 hours | Hands-on | Demo of AI tools being deployed |
| AI in Your Role | 1 hour | Workshop | Role-specific AI use cases and workflows |
### Track B: AI Power Users (Department Leads, Analysts)
| Module | Duration | Format | Content |
|--------|----------|--------|---------|
| Prompt Engineering | 4 hours | Workshop | Effective prompting, templates, evaluation |
| Data Preparation | 3 hours | Hands-on | Data quality, formatting, pipeline basics |
| AI Tool Mastery | 4 hours | Lab | Advanced features, integrations, automation |
| Measuring AI Impact | 2 hours | Workshop | KPIs, A/B testing, ROI tracking |
### Track C: AI Technical Team
| Module | Duration | Format | Content |
|--------|----------|--------|---------|
| ML/AI Architecture | 8 hours | Workshop | System design, patterns, best practices |
| Model Operations | 4 hours | Lab | Deployment, monitoring, versioning |
| Data Engineering for AI | 6 hours | Lab | Pipelines, feature stores, vector DBs |
| AI Security & Governance | 4 hours | Workshop | Security, compliance, audit trails |
## 3. Training Timeline
| Month | Track A | Track B | Track C |
|-------|---------|---------|---------|
| 1 | AI Fundamentals, Ethics | Prompt Engineering | ML Architecture |
| 2 | Tool Introduction | Data Preparation | Model Operations |
| 3 | AI in Your Role | AI Tool Mastery | Data Engineering |
| 4 | Refresher | Measuring Impact | AI Security |
| 5-6 | | Advanced workshops | Ongoing labs |
## 4. Resources Required
| Resource | Quantity | Cost | Notes |
|----------|---------|------|-------|
| External trainer (AI) | [X] days | $[X] | Workshops for Track B/C |
| E-learning platform | 1 license | $[X]/yr | Track A modules |
| Lab environment | [X] seats | $[X]/mo | Hands-on practice |
| Training materials | Custom | $[X] | Role-specific guides |
## 5. Success Metrics
| Metric | Target | Measurement Method | Timeline |
|--------|--------|-------------------|----------|
| Training completion rate | >90% | LMS tracking | Month 3 |
| AI tool adoption rate | >70% | Usage analytics | Month 6 |
| Confidence score (survey) | >4/5 | Pre/post survey | Month 3 |
| Productivity improvement | >15% | Process metrics | Month 6 |
## 6. Ongoing Enablement
- Monthly AI lunch-and-learn sessions
- Internal AI champions network
- Quarterly skills reassessment
- Knowledge base of prompts, templates, and best practices
#### change-management

Organizational Change Management Plan Structure:
# Change Management Plan: AI Transformation
## [Company Name]
**Date:** [YYYY-MM-DD]
**Version:** 1.0
---
## 1. Change Overview
- **What is changing:** [Description of the AI initiative]
- **Who is affected:** [Roles, departments, headcount]
- **Why the change:** [Business drivers]
- **Timeline:** [Start to steady state]
- **Scale of change:** [Incremental/Significant/Transformational]
## 2. Stakeholder Analysis
| Stakeholder Group | Impact Level | Current Sentiment | Desired Sentiment | Strategy |
|-------------------|-------------|-------------------|-------------------|----------|
| C-Suite | Medium | Supportive | Champion | Executive briefings |
| IT Leadership | High | Cautious | Supportive | Technical deep dives |
| Middle Managers | High | Concerned | Engaged | Role clarity, training |
| Front-line Staff | High | Anxious | Confident | Hands-on demos, support |
| IT Team | High | Excited | Empowered | Skills development |
| Customers | Low | Unaware | Positive | Gradual rollout |
### Stakeholder Influence-Impact Matrix
```
                  HIGH INFLUENCE
                        │
   Monitor Closely      │   Manage Closely
                        │   (C-Suite, IT Lead)
                        │
  ──────────────────────┼────────────────────── HIGH IMPACT
                        │
   Keep Informed        │   Keep Informed
                        │   (Managers, Staff)
                        │
                  LOW INFLUENCE
```
## 3. Communication Plan
| Audience | Message | Channel | Frequency | Owner | Start Date |
|----------|---------|---------|-----------|-------|------------|
| All staff | AI vision & why | Town hall | Once | CEO | Week 1 |
| Managers | Role impact & support | Workshop | Bi-weekly | HR | Week 2 |
| Affected teams | Tool training | Hands-on lab | Weekly | IT | Week 4 |
| All staff | Progress updates | Email/Slack | Monthly | PM | Month 2 |
| Executives | ROI tracking | Dashboard | Monthly | PM | Month 3 |
### Key Messages by Phase
**Awareness Phase (Weeks 1-4):**
- "We are investing in AI to [business objective], not to replace roles"
- "AI will handle [routine tasks] so you can focus on [value-add work]"
- "Training and support will be provided every step of the way"
**Desire Phase (Weeks 4-8):**
- "Early results show [specific positive outcome]"
- "Here's how [role] benefits from the new AI tools"
- "Your feedback is shaping how we roll this out"
**Knowledge Phase (Weeks 8-16):**
- "Here's your training schedule and resources"
- "AI champions in your department can help"
- "Practice environment available for you to explore"
**Ability Phase (Weeks 16-24):**
- "Go-live support is available [channels]"
- "It's normal to feel slower at first — proficiency comes with practice"
- "Share your wins in [channel]"
**Reinforcement Phase (Ongoing):**
- "Celebrating [team/person] for [AI-driven achievement]"
- "Monthly metrics show [positive trend]"
- "New advanced features available for power users"
## 4. Resistance Management
### Anticipated Resistance
| Source | Concern | Evidence | Response Strategy |
|--------|---------|----------|-------------------|
| Middle managers | Job security | Survey data | Role evolution roadmap, upskilling plan |
| Veteran staff | "AI can't do what I do" | Informal feedback | Showcase AI as augmentation, pilot with champions |
| IT team | More work, less credit | 1:1 interviews | Clear ownership, recognition program |
| Compliance | Risk and liability | Policy review | Governance framework, audit trails |
### Resistance Response Framework
1. **Listen:** Acknowledge concerns without dismissing them
2. **Educate:** Provide facts and examples relevant to their role
3. **Involve:** Give resistors a role in shaping the rollout
4. **Support:** Provide extra training and 1:1 coaching
5. **Reinforce:** Celebrate early wins and share success stories
## 5. Adoption Metrics
### Leading Indicators (Predict Success)
| Metric | Target | Measurement | Frequency |
|--------|--------|-------------|-----------|
| Training completion | >90% | LMS data | Weekly |
| Tool login rate | >70% | Usage analytics | Weekly |
| Support ticket volume | Decreasing | Help desk data | Weekly |
| Champion referrals | >2 per team | Champion reports | Bi-weekly |
### Lagging Indicators (Confirm Success)
| Metric | Target | Measurement | Frequency |
|--------|--------|-------------|-----------|
| Process efficiency | +20% | Process metrics | Monthly |
| User satisfaction | >4/5 | Survey | Monthly |
| Error reduction | -30% | Quality metrics | Monthly |
| Full adoption rate | >80% | Usage analytics | Quarterly |
### Adoption Curve Tracking
```
100% ──────────────────────────────────────
 90% ─────────────────────────── ─ ─ ─ ─ ─  Target
 80% ─────────────────────────   /
 70% ──────────────────────    /
 60% ─────────────────       /
 50% ─────────────        /
 40% ─────────         /
 30% ─────        /
 20% ──      /
 10%     /
  0% ├──────┬──────┬──────┬──────┬──────┬──────
     M1     M2     M3     M4     M5     M6
     Pilot  Early  Early  Late   Late   Steady
            Adopt  Major  Major  Adopt  State
```
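Tracking actual adoption against a ramp like this can be automated. The monthly targets below are hypothetical illustrations (not client commitments), and the 10-point YELLOW band is an assumption:

```python
# Hypothetical monthly adoption targets for M1..M6.
TARGETS = {1: 0.10, 2: 0.25, 3: 0.45, 4: 0.65, 5: 0.80, 6: 0.90}

def adoption_rag(month: int, actual: float) -> str:
    """Flag adoption as GREEN/YELLOW/RED against the target curve."""
    target = TARGETS[month]
    if actual >= target:
        return "GREEN"
    # Within 10 points of target: warn; further behind: escalate.
    return "YELLOW" if actual >= target - 0.10 else "RED"
```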
## 6. Support Structure
### AI Champions Network
- 1 champion per 15-20 affected staff
- Weekly champion sync meetings
- Champions receive advanced training first
- Recognition and incentive program
### Support Channels
| Channel | Purpose | Availability | Response SLA |
|---------|---------|-------------|-------------|
| Slack #ai-help | Quick questions | 24/7 (async) | 4 hours |
| Weekly office hours | Live Q&A | Tue/Thu 2-3pm | Real-time |
| 1:1 coaching | Individual support | By appointment | 48 hours |
| Knowledge base | Self-service docs | 24/7 | N/A |
#### data-strategy

Data Strategy & Governance Plan Structure:
# Data Strategy for AI
## [Company Name]
**Date:** [YYYY-MM-DD]
**Version:** 1.0
---
## 1. Current State Assessment
### Data Landscape
| Data Source | Type | Volume | Quality (1-5) | Accessibility | Owner |
|-----------|------|--------|---------------|---------------|-------|
| | | | | | |
### Data Maturity Score: [X/5]
| Dimension | Score | Notes |
|-----------|-------|-------|
| Data Quality | [1-5] | [Assessment] |
| Data Governance | [1-5] | [Assessment] |
| Data Architecture | [1-5] | [Assessment] |
| Data Literacy | [1-5] | [Assessment] |
| Data Culture | [1-5] | [Assessment] |
### Current Pain Points
1. [Pain point with specific impact]
2. [Pain point with specific impact]
3. [Pain point with specific impact]
## 2. Target State
### Data Architecture Vision
```mermaid
flowchart TB
subgraph Sources["Data Sources"]
Internal[Internal Systems]
External[External Data]
Streaming[Real-time Streams]
end
subgraph Platform["Data Platform"]
Lake[(Data Lake)]
Warehouse[(Data Warehouse)]
Feature[(Feature Store)]
Vector[(Vector Store)]
Catalog[Data Catalog]
end
subgraph Consumption["Data Consumption"]
Analytics[Analytics/BI]
ML[ML/AI Models]
Apps[Applications]
APIs[Data APIs]
end
Sources --> Platform --> Consumption
```

## 3. Data Governance

### Roles & Responsibilities
| Role | Responsibility | Person/Team |
|---|---|---|
| Data Owner | Accountability for data domain | [Department heads] |
| Data Steward | Day-to-day data quality | [Domain experts] |
| Data Engineer | Pipeline development & maintenance | [IT/Engineering] |
| Data Protection Officer | Privacy & compliance | [Legal/Compliance] |

### Data Classification
| Classification | Description | Access Level | Examples |
|---|---|---|---|
| Public | Non-sensitive, externally shareable | Open | Marketing materials |
| Internal | Business data, not sensitive | Authenticated users | Reports, metrics |
| Confidential | Sensitive business data | Role-based access | Financials, strategy |
| Restricted | PII, regulated data | Need-to-know | Customer data, health records |

### Data Lifecycle Policies
| Stage | Policy | Retention | Archival |
|---|---|---|---|
| Collection | Consent, minimization | N/A | N/A |
| Storage | Encryption, access control | By classification | Cold storage at [X] months |
| Processing | Audit trail, quality checks | During processing | Logs retained [X] months |
| Sharing | Access approval, anonymization | Per agreement | N/A |
| Deletion | Right to erasure, verification | Per regulation | Confirmation records |

## 4. Data Quality Framework
| Dimension | Definition | Target | Measurement |
|---|---|---|---|
| Completeness | Required fields populated | >98% | Automated checks |
| Accuracy | Values match reality | >95% | Sampling + validation |
| Timeliness | Data available when needed | <[X] hours | Pipeline monitoring |
| Consistency | Same data, same answer | >99% | Cross-source reconciliation |
| Uniqueness | No duplicate records | >99% | Deduplication checks |
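The completeness and uniqueness dimensions above lend themselves to automated checks. A pure-Python sketch, assuming records arrive as dicts; the function and field names are illustrative:

```python
def completeness(records: list[dict], required: list[str]) -> float:
    """Fraction of records with every required field populated."""
    if not records:
        return 1.0
    ok = sum(all(r.get(f) not in (None, "") for f in required) for r in records)
    return ok / len(records)

def uniqueness(records: list[dict], key: str) -> float:
    """Fraction of records whose key value is distinct."""
    if not records:
        return 1.0
    values = [r.get(key) for r in records]
    return len(set(values)) / len(values)
```

Comparing these scores against the table's targets (>98% completeness, >99% uniqueness) gives the "Automated checks" and "Deduplication checks" measurements a concrete starting point.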
## 5. Data Migration Plan

| Source System | Target | Data Volume | Complexity | Priority |
|---|---|---|---|---|

### Migration Phases
| Phase | Duration | Activities | Success Criteria |
|---|---|---|---|
| Assessment | 2 weeks | Data profiling, mapping | Complete data inventory |
| Pilot | 2 weeks | Migrate 1 data source | <1% data loss |
| Wave 1 | 4 weeks | Core business data | Reconciliation passes |
| Wave 2 | 4 weeks | Secondary data sources | All sources migrated |
| Validation | 2 weeks | End-to-end testing | UAT sign-off |
## 6. AI Initiative Data Requirements

| Initiative | Data Needed | Current State | Gap | Action |
|---|---|---|---|---|
## 7. Quarterly Roadmap

| Quarter | Focus | Key Deliverables |
|---|---|---|
| Q1 | Foundation | Data catalog, governance charter, quality baselines |
| Q2 | Pipeline | Core data pipelines, quality monitoring, first migration wave |
| Q3 | AI-Ready | Feature store, vector store, training data pipelines |
| Q4 | Optimize | Self-service access, advanced analytics, quality automation |
---
### 3. Apply Quality Gates
Before finalizing any deliverable, validate against these quality gates:
```markdown
## Quality Gate Checklist
### Structure
- [ ] Version header present (Version: X.X)
- [ ] Date stamp present (YYYY-MM-DD format)
- [ ] Executive summary included for documents >50 lines
- [ ] Table of contents for documents >100 lines
### Content
- [ ] All data points have source citations
- [ ] All Mermaid diagrams render without syntax errors
- [ ] Specific action items (not vague recommendations)
- [ ] Each recommendation has: what, who, when, expected outcome
- [ ] Numbers are realistic and sourced (not aspirational)
### Formatting
- [ ] Consistent heading hierarchy
- [ ] Tables properly aligned
- [ ] No placeholder text remaining ([brackets] filled in)
- [ ] Professional tone throughout
- [ ] Spelling and grammar checked
### Completeness
- [ ] All sections populated (no empty sections)
- [ ] Cross-references to other deliverables are valid
- [ ] Appendices referenced in main text
```

### 4. Write Output

Write the completed deliverable to clients/<client-name>/deliverables/<type>.md.
If a file already exists at that path:
Confirm output by:
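Several of the structural quality gates in step 3 can be checked mechanically before a human review pass. An illustrative sketch; the regexes cover only the version, date, and placeholder gates, and the placeholder check deliberately over-matches (it also flags markdown link text):

```python
import re

def structural_gate_failures(markdown: str) -> list[str]:
    """Return quality-gate failures detectable from the text alone."""
    failures = []
    if not re.search(r"^\*\*Version:\*\*\s*\d+\.\d+", markdown, re.M):
        failures.append("missing version header")
    if not re.search(r"\d{4}-\d{2}-\d{2}", markdown):
        failures.append("missing YYYY-MM-DD date stamp")
    # Leftover [bracketed placeholders]; a real check would exclude links.
    leftovers = re.findall(r"\[[A-Za-z][^\]\n]*\]", markdown)
    if leftovers:
        failures.append(f"possible placeholder text: {leftovers[:3]}")
    return failures
```

An empty list means the mechanical gates pass; content gates (source citations, realistic numbers, tone) still need human judgment.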