# harness-claude
Scans codebases for SOC2, HIPAA, GDPR, PCI-DSS compliance patterns, classifies data sensitivity, audits controls, traces data flows, and generates gap analysis reports with remediation plans. Use before releases or audits.
```shell
npx claudepluginhub intense-visions/harness-engineering --plugin harness-claude
```

This skill uses the workspace's default tool permissions.
Identify applicable compliance frameworks. Scan for indicators:
- docs/compliance/soc2/ directory
- audit logging implementation
- access control patterns

Inventory data stores. Map all locations where user data is persisted:
Trace data flows. Map how user data moves through the system:
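The data-flow tracing step can be sketched as a simple hop map. Everything below is illustrative: the field names, stores, and processors are hypothetical examples, not taken from any real codebase.

```python
# Hypothetical sketch: record each hop a user-data field takes through the
# system, so third-party and cross-border exposure can be derived later.
FLOWS = {
    "users.email": [
        ("api", "ingress"),           # collected at signup
        ("postgresql", "at-rest"),    # primary store
        ("sendgrid", "third-party"),  # transactional mail
    ],
    "access_logs.ip_address": [
        ("api", "ingress"),
        ("postgresql", "at-rest"),
        ("datadog", "third-party"),   # shipped with request logs
    ],
}

def third_party_exposure(flows):
    """Return {field: [processors]} for every field that leaves the system."""
    return {
        field: [dest for dest, kind in hops if kind == "third-party"]
        for field, hops in flows.items()
        if any(kind == "third-party" for _, kind in hops)
    }
```

A map like this makes the later GDPR sub-processor questions (SCCs, deletion cascades) mechanical to answer instead of archaeological.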
Check for existing compliance artifacts. Look for:
- PRIVACY.md, privacy-policy.md, or a privacy policy served via web route
- SECURITY.md, security disclosure process
- docs/compliance/dpa/
- src/**/audit/**, event sourcing patterns

Detect sensitive data patterns. Grep for fields and patterns that indicate regulated data:
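The sensitive-pattern grep step above boils down to tiered regexes over source lines. A minimal sketch, with an illustrative (not exhaustive) pattern set:

```python
import re

# Illustrative field-name patterns for regulated data; a real scan would
# extend these per framework (PCI-DSS, HIPAA, GDPR) and per codebase.
SENSITIVE_PATTERNS = {
    "critical": re.compile(r"\b(ssn|social_security|card_number|cvv|diagnosis)\b", re.I),
    "sensitive": re.compile(r"\b(email|phone|address|ip_address|date_of_birth|dob)\b", re.I),
}

def scan_line(line):
    """Return the highest-sensitivity tier a source line matches, or None."""
    for tier in ("critical", "sensitive"):  # check the highest tier first
        if SENSITIVE_PATTERNS[tier].search(line):
            return tier
    return None
```

Field-name matching is only a first pass; as the rationalization table at the end notes, classification must ultimately rest on re-identification risk, not naming.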
Classify data fields by sensitivity. Apply a tiered classification:
Map regulatory scope per data class. Determine which frameworks apply to each data class:
Identify cross-border data flows. For GDPR compliance:
Document data retention policies. For each data class:
Produce the data classification matrix. Output a structured inventory:
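The classification matrix can be emitted as a flat table with one row per field. A sketch of the shape, with hypothetical rows rather than real audit data:

```python
import csv
import io

# Hypothetical rows: (field, store, tier, frameworks, retention).
ROWS = [
    ("users.email", "postgresql", "sensitive", "GDPR;SOC2", "account lifetime"),
    ("access_logs.ip_address", "postgresql", "sensitive", "GDPR", "90 days"),
    ("users.display_name", "postgresql", "public", "-", "account lifetime"),
]

def matrix_csv(rows):
    """Render the classification matrix as CSV with a header row."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["field", "store", "tier", "frameworks", "retention"])
    writer.writerows(rows)
    return buf.getvalue()
```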
SOC2 Trust Services Criteria audit. Check implementation against key controls:
HIPAA Security Rule audit. If PHI is present:
GDPR compliance audit. If EU data is processed:
PCI-DSS audit. If payment data is present:
Audit trail verification. For all applicable frameworks:
Score compliance posture per framework. For each applicable framework:
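The per-framework scores in the sample report below (e.g. 78% for 18/23 met) are consistent with counting only fully met controls against the total. A minimal sketch of that scoring rule; whether partial controls earn half credit is a policy choice this sketch does not make:

```python
def score(met, partial, not_met):
    """Percent of fully met controls out of all assessed controls.

    Partial controls count in the denominator but earn no credit,
    matching the sample report's arithmetic (18/23 -> 78%).
    """
    total = met + partial + not_met
    return round(100 * met / total)
```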
Produce the gap analysis. For each control not fully met:
Generate audit-ready checklists. Produce framework-specific checklists:
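A framework checklist renders naturally as a markdown task list. A sketch, assuming a simple (control id, description, met) tuple per control; the control IDs shown in the test are real SOC2 criteria names, but the statuses are illustrative:

```python
def render_checklist(framework, controls):
    """Render controls as a markdown task list.

    controls: list of (control_id, description, met: bool) tuples.
    """
    lines = [f"## {framework} checklist", ""]
    for cid, desc, met in controls:
        box = "x" if met else " "
        lines.append(f"- [{box}] {cid} -- {desc}")
    return "\n".join(lines)
```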
Create remediation plan. Organize gaps into actionable work:
Output the compliance report. Generate docs/compliance/audit-report-YYYY-MM-DD.md:
Compliance Audit Report — YYYY-MM-DD
Frameworks Assessed: SOC2, GDPR
Data Classifications: 12 critical, 28 sensitive, 45 internal, 15 public
SOC2 Status: 78% (18/23 controls met, 3 partial, 2 not met)
NOT MET:
CC7.2 — No security event alerting configured
CC8.1 — No deployment audit trail
PARTIAL:
CC6.1 — RBAC exists but 4 endpoints lack authorization checks
CC6.3 — TLS in transit, but database encryption at rest not configured
CC6.2 — Passwords hashed, but no MFA available
GDPR Status: 65% (11/17 controls met, 4 partial, 2 not met)
NOT MET:
Article 17 — No data deletion endpoint implemented
Article 30 — No processing activities register
PARTIAL:
Article 15 — Data export exists but incomplete (missing analytics data)
...
Remediation Plan: 7 items (2 critical, 3 important, 2 improvement)
Estimated total effort: 45 engineering-hours
- `harness skill run harness-compliance` -- Primary CLI entry point. Runs all four phases.
- `harness validate` -- Run after generating compliance artifacts to verify project structure.
- `harness check-deps` -- Verify that compliance-related dependencies (audit logging libraries, encryption modules) are declared.
- `emit_interaction` -- Used at framework selection (checkpoint:decision) when multiple frameworks apply and the team wants to prioritize, and at remediation plan review (checkpoint:human-verify).
- `Glob` -- Discover compliance documentation, audit trail implementations, privacy policies, and data models.
- `Grep` -- Search for PII field patterns, encryption configurations, consent collection, logging patterns, and sensitive data handling.
- `Write` -- Generate compliance reports, audit checklists, and remediation plans.
- `Edit` -- Update existing compliance documentation with current audit status.

Phase 1: SCAN
Frameworks detected:
- SOC2: docs/compliance/soc2/ directory exists, audit logging in src/audit/
- GDPR: EU customers present (detected from i18n locales and privacy policy)
- PCI-DSS: Not applicable (payments via Stripe, card data never touches servers)
Data stores: PostgreSQL (primary), Redis (cache/sessions), S3 (file uploads)
Third-party processors: Stripe, SendGrid, Segment, Datadog
Phase 2: CLASSIFY
Critical: None (no SSN, card data handled by Stripe)
Sensitive: email, phone, address (users table), IP address (access_logs)
Internal: order_history, preferences, usage_metrics
Public: username, display_name, avatar_url
Cross-border: Primary DB in us-east-1, CDN globally, Segment data to US
GDPR gap: No SCCs documented for US-based sub-processors
Phase 3: AUDIT
SOC2: 78% compliant (18/23)
CC6.3 — PostgreSQL not using column-level encryption for sensitive fields
CC7.2 — Datadog alerts exist but no security-specific monitors
GDPR: 65% compliant (11/17)
Article 17 — DELETE /api/users/:id exists but does not cascade to S3 files or Segment
Article 30 — No Records of Processing Activities document
Phase 4: REPORT
Generated: docs/compliance/audit-report-2026-03-27.md
Remediation plan:
Critical (week 1-2):
1. Implement cascading deletion across PostgreSQL, S3, Segment, SendGrid
2. Create Records of Processing Activities document
Important (week 3-6):
3. Add column-level encryption for email, phone, address fields
4. Create security-specific Datadog monitors for auth failures
5. Document SCCs for all US-based sub-processors
Improvement (week 7-12):
6. Implement data export endpoint including Segment analytics data
7. Add automated retention enforcement with TTL-based cleanup jobs
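Remediation item 1 above (cascading deletion for Article 17) can be sketched as a fan-out over per-store handlers. Each handler here is a hypothetical wrapper around the real store or vendor API (row deletion in PostgreSQL, object deletion in S3, deletion/suppression requests to Segment and SendGrid); none of these calls are taken from the codebase:

```python
def delete_user_everywhere(user_id, handlers, audit_log):
    """Run every deletion handler and record outcomes, so partial
    failures are visible instead of silently leaving data behind.

    handlers: {store_name: callable(user_id)} -- hypothetical wrappers
    around the real per-store deletion APIs.
    """
    results = {}
    for store, handler in handlers.items():
        try:
            handler(user_id)
            results[store] = "deleted"
        except Exception as exc:  # record the failure, don't abort the rest
            results[store] = f"failed: {exc}"
    audit_log.append({"user_id": user_id, "results": results})
    # Article 17 is only satisfied when every store reports success.
    return all(status == "deleted" for status in results.values())
```

Recording per-store outcomes matters: the gap found in this example was precisely a `DELETE` endpoint that succeeded in PostgreSQL while leaving S3 and Segment untouched.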
Phase 1: SCAN
Frameworks detected:
- HIPAA: patient, diagnosis, prescription models in src/models/
- SOC2: Required by enterprise customers, docs/compliance/soc2/ present
Data stores: PostgreSQL (primary), Redis (session cache), AWS S3 (medical records)
Third-party processors: Twilio (patient notifications), AWS (infrastructure)
BAA status: AWS BAA signed, Twilio BAA signed
Phase 2: CLASSIFY
Critical (PHI):
- patient_records: name, DOB, SSN, diagnosis_code, treatment_plan
- prescriptions: medication, dosage, prescribing_physician
- medical_images: stored in S3 bucket 'patient-records-prod'
Sensitive: provider email, staff credentials, appointment schedules
PHI field count: 23 fields across 8 tables
Phase 3: AUDIT
HIPAA Security Rule: 72% compliant
164.312(a)(1) — Access control exists but no automatic session logoff
164.312(b) — Audit log captures reads but not all PHI access events
164.312(c)(1) — No integrity checksums on medical records in S3
164.312(e)(1) — TLS 1.2 in transit, AES-256 at rest in PostgreSQL and S3
SOC2: 81% compliant
All findings overlap with HIPAA gaps
Phase 4: REPORT
Generated: docs/compliance/hipaa-audit-2026-03-27.md
Remediation plan:
Critical (week 1-2):
1. Add automatic session timeout (15 min idle) for clinical users
2. Extend audit logging to capture all PHI read events with user context
3. Add SHA-256 integrity checksums to S3 medical record objects
Important (week 3-6):
4. Implement minimum necessary access — restrict PHI queries to treating providers
5. Add PHI access review report for compliance officer (monthly)
Improvement (week 7-12):
6. Implement emergency access ("break the glass") with post-access audit
7. Add automated HIPAA compliance regression tests to CI pipeline
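Remediation item 3 in the plan above (integrity checksums for S3 medical records) reduces to computing a SHA-256 digest at write time, storing it alongside the object (for example as S3 object metadata), and re-verifying on every read. The S3 wiring is assumed, not shown; this sketch covers only the digest logic:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of a record's bytes, computed at write time."""
    return hashlib.sha256(data).hexdigest()

def verify_record(data: bytes, stored_digest: str) -> bool:
    """True only if the record bytes still match the write-time digest."""
    return sha256_hex(data) == stored_digest
```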
| Rationalization | Reality |
|---|---|
| "We're not in the EU so GDPR doesn't apply to us" | GDPR applies to any organization that processes data of EU residents, regardless of where the organization is based. If a single EU user can sign up, GDPR scope must be assessed. |
| "Our lawyers will handle the compliance questions — just document what we have" | Legal review and technical implementation are distinct. Lawyers cannot attest that Article 17 deletion cascades to S3 and Segment. The technical implementation must be audited separately. |
| "We already did a SOC2 audit last year — this codebase is the same" | SOC2 Type II assesses controls over time. Adding a new data store, third-party processor, or API endpoint can invalidate previous control attestations. Audits are point-in-time snapshots, not permanent certificates. |
| "The audit isn't for three months — we can fix the gaps before then" | Gaps found now require implementation, testing, and evidence collection time. Auditors expect evidence of sustained control operation, not freshly deployed fixes. A gap fixed the week before an audit is still a finding. |
| "That field is technically a username, not PII" | Data classification cannot be done by naming convention. A username combined with any other identifying field (email, IP, phone) is PII under GDPR. Classification must be based on the realistic re-identification risk, not the field name. |