Quality Audit & Compliance Verification
TL;DR: Conducts quality audits to verify compliance with quality plan, organizational standards, and regulatory requirements. Reviews processes and deliverables against established criteria, identifies non-conformances, and recommends corrective and preventive actions.
Principio Rector (Guiding Principle)
A quality audit does not look for culprits; it looks for improvements. Its purpose is to verify that processes are being followed and that they produce conforming results. Non-conformances are improvement opportunities, not failures. Auditor independence is fundamental to the credibility of the result.
Assumptions & Limits
- Assumes a quality management plan with defined standards exists as audit baseline [PLAN]
- Assumes audit independence — the auditor should not be the same person who produced the deliverables [SUPUESTO]
- Breaks when no quality criteria exist — cannot audit against undefined standards
- Does not replace regulatory audits by certified bodies; provides internal quality verification
- Assumes access to process documentation and deliverables for evidence collection [PLAN]
- Limited to project-level audits; for organizational quality maturity use external frameworks (ISO 9001)
Usage
```shell
# Full quality audit of project deliverables
/pm:quality-audit $ARGUMENTS="--project proyecto-alfa --scope deliverables"

# Process-only audit
/pm:quality-audit --type process --standards "ISO-27001,PMBOK"

# Follow-up audit on previous findings
/pm:quality-audit --type follow-up --baseline audit-Q1.md
```
Parameters:
| Parameter | Required | Description |
|---|---|---|
| $ARGUMENTS | Yes | Project identifier or path to deliverables |
| --type | No | full (default), process, deliverable, follow-up |
| --scope | No | deliverables, processes, both (default) |
| --standards | No | Comma-separated standards to audit against |
| --baseline | No | Previous audit report for follow-up |
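The defaults and constraints in the table above can be sketched as an option-normalization step. This is a hypothetical helper, not part of the command's actual implementation; the function and field names are assumptions.

```python
# Hypothetical normalization of /pm:quality-audit options, mirroring the
# documented defaults: --type full, --scope both, optional --standards list.
VALID_TYPES = {"full", "process", "deliverable", "follow-up"}
VALID_SCOPES = {"deliverables", "processes", "both"}

def normalize_options(arguments, audit_type="full", scope="both",
                      standards=None, baseline=None):
    """Apply documented defaults and validate option values."""
    if not arguments:
        raise ValueError("$ARGUMENTS (project identifier or path) is required")
    if audit_type not in VALID_TYPES:
        raise ValueError(f"unknown --type: {audit_type}")
    if scope not in VALID_SCOPES:
        raise ValueError(f"unknown --scope: {scope}")
    if audit_type == "follow-up" and baseline is None:
        raise ValueError("--type follow-up requires --baseline")
    return {
        "arguments": arguments,
        "type": audit_type,
        "scope": scope,
        "standards": standards.split(",") if standards else [],
        "baseline": baseline,
    }
```

Note that a follow-up audit without a `--baseline` is rejected early, matching the parameter table's intent.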
Service Type Routing
{TIPO_PROYECTO}: Regulated projects require formal audit trails; Agile uses sprint retrospectives as lightweight audits; Waterfall uses phase-gate quality reviews; All types benefit from periodic process audits.
Before Auditing
- Read the quality management plan to identify audit criteria and standards [PLAN]
- Glob `**/deliverables/**` to identify all deliverables in scope for the audit [PLAN]
- Read previous audit reports to check for recurring non-conformances [METRIC]
- Grep for compliance requirements in project charter and regulatory documents [PLAN]
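The glob and grep steps above can be sketched in a few lines. The directory layout and keyword list are assumptions for illustration, not a prescribed convention.

```python
# Hypothetical pre-audit discovery pass: glob deliverables, then scan
# charter text for compliance keywords. Paths and keywords are assumed.
import re
from pathlib import Path

def discover_deliverables(project_root):
    """Collect every file under a deliverables/ directory (**/deliverables/**)."""
    root = Path(project_root)
    return sorted(p for p in root.glob("**/deliverables/**/*") if p.is_file())

def find_compliance_mentions(text, keywords=("ISO", "GDPR", "SOC 2", "compliance")):
    """Return the lines of a charter that mention any compliance keyword."""
    pattern = re.compile("|".join(re.escape(k) for k in keywords), re.IGNORECASE)
    return [line for line in text.splitlines() if pattern.search(line)]
```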
Entrada (Input Requirements)
- Quality management plan with standards
- Deliverables to audit
- Process documentation
- Previous audit findings
- Compliance requirements checklist
Proceso (Protocol)
- Audit scope — Define what will be audited (processes, deliverables, or both)
- Criteria selection — Identify audit criteria from quality plan and standards
- Evidence collection — Gather objective evidence through document review and interviews
- Assessment — Compare evidence against criteria, identify conformances and non-conformances
- Root cause analysis — For non-conformances, identify root causes
- Findings documentation — Document findings with evidence, severity, and recommendations
- Corrective actions — Define corrective actions for non-conformances
- Preventive actions — Recommend preventive actions to avoid recurrence
- Report generation — Compile audit report for quality governance
- Follow-up plan — Schedule verification of corrective action implementation
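Steps 3 to 5 of the protocol (evidence collection, assessment, findings) can be sketched as a simple comparison loop. The field names below are assumptions, not a required schema.

```python
# Minimal sketch of the assessment step: compare collected evidence against
# each criterion and record a conformance or non-conformance per criterion.
from dataclasses import dataclass

@dataclass
class Criterion:
    id: str
    requirement: str

@dataclass
class Finding:
    criterion_id: str
    conforms: bool
    evidence: str

def assess(criteria, evidence_by_criterion):
    """One finding per criterion; missing evidence counts as a non-conformance."""
    findings = []
    for c in criteria:
        evidence = evidence_by_criterion.get(c.id)
        findings.append(Finding(
            c.id,
            conforms=evidence is not None,
            evidence=evidence or "no evidence collected",
        ))
    return findings
```

Treating missing evidence as a non-conformance keeps the audit objective: a conformance must be demonstrable, not assumed.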
Edge Cases
- No quality plan exists — Cannot conduct a formal audit. Recommend creating a quality plan first using the quality-plan skill. Document findings as observations, not non-conformances [SUPUESTO].
- Auditee disputes findings — Document the dispute with both perspectives. Escalate to quality governance for resolution. Maintain finding until objectively resolved [STAKEHOLDER].
- Critical non-conformance found mid-project — Trigger immediate remediation. Do not wait for audit report completion. Notify project sponsor and quality governance within 24 hours [PLAN].
- Recurring non-conformance from prior audit — Escalate from corrective to systemic action. Investigate whether the root cause analysis was inadequate or the corrective action was not implemented [METRIC].
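The recurring-non-conformance check in the last edge case reduces to a set intersection between the baseline audit's findings and the current one. This is an illustrative sketch; the ID format is a hypothetical convention.

```python
# Sketch of the follow-up comparison: any non-conformance ID present in both
# the baseline audit and the current audit is flagged for systemic escalation.
def recurring_nonconformances(baseline_nc_ids, current_nc_ids):
    """Return the non-conformance IDs that appear in both audits, sorted."""
    return sorted(set(baseline_nc_ids) & set(current_nc_ids))
```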
Example: Good vs Bad
Good example — Structured audit with actionable findings:
| Attribute | Value |
|---|---|
| Scope | 12 deliverables + 5 processes audited |
| Criteria | 30 checkpoints from quality plan and ISO standard |
| Findings | 4 non-conformances, 2 observations, 26 conformances |
| Severity | 1 Critical, 2 Major, 1 Minor — each with evidence |
| Corrective actions | 4 actions with owners, deadlines, and verification dates |
| Evidence | 95% [PLAN]/[METRIC], 5% [INFERENCIA] |
Bad example — Audit theater:
"Everything looks good" with no specific criteria checked, no evidence documented, and no findings register. An audit without criteria is an opinion tour. Without documented evidence, findings cannot be verified or tracked for closure.
Salida (Deliverables)
- Quality audit report with findings
- Non-conformance register with severity
- Corrective action plan with deadlines
- Preventive action recommendations
- Audit trail documentation
Validation Gate
Escalation Triggers
- Critical non-conformance affecting deliverable acceptance
- Systemic non-conformance indicating process failure
- Auditee non-cooperation or evidence unavailability
- Regulatory non-compliance requiring immediate remediation
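The triggers above can be expressed as a small gate over classified findings. Severity labels and dictionary keys here are assumptions for illustration.

```python
# Hypothetical escalation gate: map each finding to the triggers it fires,
# mirroring the critical / systemic / regulatory triggers listed above.
def escalation_reasons(findings):
    """Return (finding_id, trigger) pairs for findings requiring escalation."""
    reasons = []
    for f in findings:
        if f.get("severity") == "critical":
            reasons.append((f["id"], "critical non-conformance"))
        if f.get("recurring"):
            reasons.append((f["id"], "systemic non-conformance"))
        if f.get("regulatory"):
            reasons.append((f["id"], "regulatory non-compliance"))
    return reasons
```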
Additional Resources
| Resource | When to read | Location |
|---|---|---|
| Body of Knowledge | Before starting to understand standards and frameworks | references/body-of-knowledge.md |
| State of the Art | When benchmarking against industry trends | references/state-of-the-art.md |
| Knowledge Graph | To understand skill dependencies and data flow | references/knowledge-graph.mmd |
| Use Case Prompts | For specific scenarios and prompt templates | prompts/use-case-prompts.md |
| Metaprompts | To enhance output quality and reduce bias | prompts/metaprompts.md |
| Sample Output | Reference for deliverable format and structure | examples/sample-output.md |
Output Configuration
- Language: Spanish (Latin American, business register)
- Evidence: [PLAN], [SCHEDULE], [METRIC], [INFERENCIA], [SUPUESTO], [STAKEHOLDER]
- Branding: #2563EB royal blue, #F59E0B amber (NEVER green), #0F172A dark
Sub-Agents
Audit Planner
Audit Planner Agent
Core Responsibility
Plans quality audits: scope, schedule, criteria, auditor assignments, and notification procedures. This agent operates autonomously within the quality audit domain, applying systematic analysis and producing structured outputs that integrate with the broader project management framework.
Process
- Gather Inputs. Collect all relevant data, documents, and stakeholder inputs needed for analysis. Validate data quality and completeness before proceeding.
- Analyze Context. Assess the project context, methodology, phase, and constraints that influence the analysis approach and output requirements.
- Apply Framework. Apply the appropriate analytical framework, methodology, or model specific to this domain area with calibrated rigor.
- Generate Findings. Produce detailed findings with evidence tags, quantified impacts where possible, and clear categorization by severity or priority.
- Validate Results. Cross-check findings against related project artifacts for consistency and flag any contradictions or gaps discovered.
- Formulate Recommendations. Transform findings into actionable recommendations with owners, timelines, and success criteria.
- Deliver Output. Produce the final structured output in the standard format with executive summary, detailed analysis, and action items.
Output Format
- Analysis Report — Structured findings with evidence tags, severity ratings, and cross-references.
- Recommendation Register — Actionable items with owners, deadlines, and success criteria.
- Executive Summary — 3-5 bullet point summary for stakeholder communication.
Corrective Action Tracker
Corrective Action Tracker Agent
Core Responsibility
Tracks corrective actions from audit findings: root cause analysis, action plans, verification, and closure. This agent operates autonomously within the quality audit domain, applying systematic analysis and producing structured outputs that integrate with the broader project management framework.
Process
- Gather Inputs. Collect all relevant data, documents, and stakeholder inputs needed for analysis. Validate data quality and completeness before proceeding.
- Analyze Context. Assess the project context, methodology, phase, and constraints that influence the analysis approach and output requirements.
- Apply Framework. Apply the appropriate analytical framework, methodology, or model specific to this domain area with calibrated rigor.
- Generate Findings. Produce detailed findings with evidence tags, quantified impacts where possible, and clear categorization by severity or priority.
- Validate Results. Cross-check findings against related project artifacts for consistency and flag any contradictions or gaps discovered.
- Formulate Recommendations. Transform findings into actionable recommendations with owners, timelines, and success criteria.
- Deliver Output. Produce the final structured output in the standard format with executive summary, detailed analysis, and action items.
Output Format
- Analysis Report — Structured findings with evidence tags, severity ratings, and cross-references.
- Recommendation Register — Actionable items with owners, deadlines, and success criteria.
- Executive Summary — 3-5 bullet point summary for stakeholder communication.
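The tracker's lifecycle can be sketched as a small state machine. The states and allowed transitions below are assumptions loosely based on common corrective-action workflows, not a mandated model.

```python
# Hypothetical corrective-action lifecycle: open -> in-progress -> implemented
# -> verified -> closed, with failed verification sending the action back to
# in-progress for rework.
ALLOWED = {
    "open": {"in-progress"},
    "in-progress": {"implemented"},
    "implemented": {"verified", "in-progress"},
    "verified": {"closed"},
    "closed": set(),
}

def transition(state, target):
    """Move a corrective action to a new state, rejecting invalid jumps."""
    if target not in ALLOWED[state]:
        raise ValueError(f"cannot move corrective action from {state} to {target}")
    return target
```

Forbidding transitions out of `closed` forces a recurring issue to be raised as a new finding, which preserves the audit trail.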
Evidence Collector
Evidence Collector Agent
Core Responsibility
Collects and organizes audit evidence: documents, records, interviews, and observations. This agent operates autonomously within the quality audit domain, applying systematic analysis and producing structured outputs that integrate with the broader project management framework.
Process
- Gather Inputs. Collect all relevant data, documents, and stakeholder inputs needed for analysis. Validate data quality and completeness before proceeding.
- Analyze Context. Assess the project context, methodology, phase, and constraints that influence the analysis approach and output requirements.
- Apply Framework. Apply the appropriate analytical framework, methodology, or model specific to this domain area with calibrated rigor.
- Generate Findings. Produce detailed findings with evidence tags, quantified impacts where possible, and clear categorization by severity or priority.
- Validate Results. Cross-check findings against related project artifacts for consistency and flag any contradictions or gaps discovered.
- Formulate Recommendations. Transform findings into actionable recommendations with owners, timelines, and success criteria.
- Deliver Output. Produce the final structured output in the standard format with executive summary, detailed analysis, and action items.
Output Format
- Analysis Report — Structured findings with evidence tags, severity ratings, and cross-references.
- Recommendation Register — Actionable items with owners, deadlines, and success criteria.
- Executive Summary — 3-5 bullet point summary for stakeholder communication.
Finding Classifier
Finding Classifier Agent
Core Responsibility
Classifies audit findings: major nonconformity, minor nonconformity, observation, or opportunity for improvement. This agent operates autonomously within the quality audit domain, applying systematic analysis and producing structured outputs that integrate with the broader project management framework.
Process
- Gather Inputs. Collect all relevant data, documents, and stakeholder inputs needed for analysis. Validate data quality and completeness before proceeding.
- Analyze Context. Assess the project context, methodology, phase, and constraints that influence the analysis approach and output requirements.
- Apply Framework. Apply the appropriate analytical framework, methodology, or model specific to this domain area with calibrated rigor.
- Generate Findings. Produce detailed findings with evidence tags, quantified impacts where possible, and clear categorization by severity or priority.
- Validate Results. Cross-check findings against related project artifacts for consistency and flag any contradictions or gaps discovered.
- Formulate Recommendations. Transform findings into actionable recommendations with owners, timelines, and success criteria.
- Deliver Output. Produce the final structured output in the standard format with executive summary, detailed analysis, and action items.
Output Format
- Analysis Report — Structured findings with evidence tags, severity ratings, and cross-references.
- Recommendation Register — Actionable items with owners, deadlines, and success criteria.
- Executive Summary — 3-5 bullet point summary for stakeholder communication.
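The classifier's decision rule can be sketched as follows. The criteria are assumptions loosely modeled on ISO 9001 audit practice: a systemic breach of a requirement is major, an isolated lapse is minor, and findings with no breached requirement are observations or improvement opportunities.

```python
# Hypothetical classification rule for audit findings, producing the four
# categories named above. Thresholds are illustrative, not normative.
def classify(requirement_breached, systemic, improvement_idea=False):
    """Return the finding category for a given evidence profile."""
    if requirement_breached and systemic:
        return "major nonconformity"
    if requirement_breached:
        return "minor nonconformity"
    if improvement_idea:
        return "opportunity for improvement"
    return "observation"
```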