From data-classification-skills
Identifies and classifies GDPR Art. 9 special category data including racial origin, political opinions, biometric, genetic, health data. Maps to Art. 9(2) processing conditions for privacy compliance.
```shell
npx claudepluginhub mukul975/privacy-data-protection-skills --plugin data-classification-skills
```
Article 9(1) of the GDPR establishes a general prohibition on processing special categories of personal data. These categories were identified by the European legislature as carrying heightened risk to fundamental rights and freedoms due to their potential for discrimination, social stigma, or irreversible harm. Processing is permitted only when one of the ten conditions in Art. 9(2)(a)-(j) is satisfied, in addition to a valid lawful basis under Art. 6. This skill provides a systematic framework for identifying special category data across enterprise systems and mapping each instance to an appropriate processing condition.
Definition: Data revealing racial or ethnic origin, or from which it can be inferred, including direct declarations, photographs, names characteristic of particular ethnic groups, or nationality data when used as a proxy for ethnic origin.
Examples at Vanguard Financial Services:
Boundary Cases:
Definition: Data revealing political views, party membership, voting behaviour, political donations, or participation in political activities.
Examples at Vanguard Financial Services:
Key Precedent: Austrian Post (Österreichische Post AG) — Austrian DPA and subsequently CJEU Case C-300/21 (2023) — fined EUR 18 million for processing political affinity scores derived from statistical models applied to demographic data. The CJEU confirmed that data revealing political opinions includes inferred data, not only data directly provided by the data subject.
Definition: Data revealing religious faith, atheism, agnosticism, philosophical convictions, or related practices. Includes dietary preferences when they indicate religious observance (halal, kosher), religious holiday requests, and membership of religious organisations.
Examples at Vanguard Financial Services:
Definition: Data revealing whether an individual is or was a member of a trade union. Includes union dues deductions from payroll, attendance at union meetings, and communications with union representatives.
Examples at Vanguard Financial Services:
Definition: Personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or health of that natural person, resulting in particular from an analysis of a biological sample from the natural person in question.
Examples at Vanguard Financial Services:
Regulatory Note: The Genetic Information Nondiscrimination Act (GINA) in the US prohibits use of genetic information in health insurance and employment. In the EU, genetic data receives dual protection under both Art. 9 and specific Member State genetic data legislation.
Definition: Personal data resulting from specific technical processing relating to the physical, physiological, or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data.
Critical Distinction: Biometric data is special category ONLY when processed "for the purpose of uniquely identifying a natural person" (Art. 9(1)). A photograph stored in an HR file is personal data but not special category. The same photograph processed through facial recognition software for access control is special category biometric data.
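The purpose-based distinction above lends itself to a simple check. The sketch below is illustrative only: the function name, purpose labels, and classification strings are hypothetical, not part of any real classification library.

```python
# Hypothetical sketch: a facial image is special category biometric data
# under Art. 9(1) only when processed for unique identification.
UNIQUE_ID_PURPOSES = {"facial_recognition", "biometric_access_control"}

def classify_facial_image(processing_purpose: str) -> str:
    """Return the GDPR classification of a stored facial image."""
    if processing_purpose in UNIQUE_ID_PURPOSES:
        # Art. 9(2) processing condition required in addition to Art. 6
        return "special_category_biometric"
    # Ordinary personal data: an Art. 6 lawful basis alone suffices
    return "personal_data"
```

For example, `classify_facial_image("hr_file_photo")` yields `"personal_data"`, while the same image routed through `"facial_recognition"` yields `"special_category_biometric"`.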
Examples at Vanguard Financial Services:
Key Precedent: Clearview AI — CNIL Decision SAN-2022-019 (20 October 2022) — EUR 20 million fine for processing biometric data (facial recognition) without lawful basis and without conducting DPIA.
Definition: Personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about their health status. Recital 35 specifies this includes data pertaining to the health status of a data subject which reveals information relating to the past, current, or future physical or mental health of the data subject, including: registration for health care services, number/symbol/identifier assigned for health purposes, information derived from testing or examination of a body part or bodily substance, and any information on a disease, disability, disease risk, medical history, clinical treatment, or physiological or biomedical condition.
Examples at Vanguard Financial Services:
Breadth of Health Data: The CJEU has interpreted health data broadly. In Case C-184/20 (Vyriausioji tarnybinės etikos komisija, 2022), the Court held that data which indirectly reveals health information (such as a spouse's name in a declaration of interests that could reveal sexual orientation or health status) may constitute special category data.
Definition: Data concerning sexual behaviour, sexual preferences, or sexual orientation. Includes data from which sexual orientation can be inferred.
Examples at Vanguard Financial Services:
Processing of special category data is lawful ONLY when one of these conditions is met (in addition to Art. 6 lawful basis):
| Condition | Art. 9(2) | Requirements | Vanguard Application |
|---|---|---|---|
| Explicit consent | (a) | Must be freely given, specific, informed, unambiguous, and EXPLICIT (higher standard than Art. 6(1)(a) consent). Must be a clear affirmative statement, not implied. | Employee diversity monitoring with opt-in explicit consent |
| Employment and social security law | (b) | Processing necessary for obligations under employment, social security, or social protection law. Must be authorised by EU or Member State law or collective agreement with appropriate safeguards. | Payroll processing of trade union dues, occupational health assessments required by law |
| Vital interests | (c) | Processing necessary to protect vital interests where data subject is physically or legally incapable of giving consent. | Emergency medical situations where employee is incapacitated |
| Legitimate activities of non-profit | (d) | Processing by foundation, association, or not-for-profit body with political, philosophical, religious, or trade union aims, relating to members or regular contacts, with no disclosure outside the body without consent. | Not applicable to Vanguard (commercial entity) |
| Data manifestly made public | (e) | Data subject has manifestly made the data public (e.g., publicly declared political views on social media, public disclosure of health condition). | Customer data voluntarily posted on public forums |
| Legal claims | (f) | Processing necessary for establishment, exercise, or defence of legal claims, or whenever courts are acting in their judicial capacity. | Litigation holds involving health data in employment disputes |
| Substantial public interest | (g) | Processing necessary for reasons of substantial public interest, on basis of EU or Member State law, proportionate to the aim, with appropriate safeguards. | Regulatory reporting obligations (e.g., AML suspicious activity involving special category data) |
| Health care and occupational medicine | (h) | Processing necessary for preventive or occupational medicine, assessment of working capacity, medical diagnosis, health/social care provision, or management of health systems. Must be processed by or under responsibility of a professional with secrecy obligation. | Occupational health surveillance mandated by workplace health regulations |
| Public health | (i) | Processing necessary for public health reasons such as protection against serious cross-border threats to health, ensuring high standards of quality and safety for medicines/medical devices. | COVID-19 workplace safety measures (now largely wound down) |
| Archiving, research, statistics | (j) | Processing necessary for archiving in the public interest, scientific or historical research, or statistical purposes under Art. 89(1), with appropriate safeguards including data minimisation. | Internal workforce diversity statistical analysis |
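In a records-of-processing tool, the table above can be operationalised as a lookup from processing activity to candidate Art. 9(2) conditions. The activity names and mappings below are illustrative assumptions drawn from the Vanguard examples, not an exhaustive legal determination; anything unmapped should be routed to legal review rather than guessed.

```python
# Hypothetical mapping sketch: processing activity -> candidate
# Art. 9(2) conditions, based on the table above.
ART9_CONDITIONS = {
    "trade_union_dues_payroll": ["9(2)(b)"],          # employment law
    "occupational_health_check": ["9(2)(b)", "9(2)(h)"],
    "diversity_monitoring_survey": ["9(2)(a)"],        # explicit consent
    "litigation_hold_health_data": ["9(2)(f)"],        # legal claims
}

def candidate_conditions(activity: str) -> list[str]:
    """Return candidate Art. 9(2) conditions, or flag for legal review."""
    return ART9_CONDITIONS.get(activity, ["needs_legal_review"])
```

A condition returned here is a starting point for assessment, not a conclusion: each still requires the corresponding Art. 6 lawful basis and the safeguards listed in the table.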
| Indicator Type | Detection Approach | Special Category Flag |
|---|---|---|
| Field names containing health terminology | Regex pattern matching: `diagnosis`, `symptom` | Health data |
| ICD-10/ICD-11 codes | Code format detection: `[A-Z][0-9]{2}(\.[0-9]{1,4})?` | Health data |
| Biometric template formats | Binary header detection for ISO/IEC 19794, ANSI/INCITS 378 fingerprint templates | Biometric data |
| Diversity form fields | Field labels matching: `ethnicity`, `race` | Racial or ethnic origin data |
| Genetic marker identifiers | SNP identifiers (rs-numbers), gene names (HUGO nomenclature) | Genetic data |
Processing activities that warrant manual special category review:
Under Art. 35(3)(b), processing special category data on a large scale automatically triggers a mandatory DPIA. For Vanguard Financial Services:
| Processing Activity | Scale Assessment | DPIA Required? |
|---|---|---|
| Employee health records | 12,000 employees — large scale for employer | YES |
| Fingerprint access control | All office buildings, 8,500 users | YES |
| Customer KYC photographs | 2.4 million customers | YES (if facial recognition applied) |
| Diversity monitoring survey | Voluntary, ~3,000 respondents | YES (special category + employment context) |
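The trigger logic implied by the table can be sketched as a small decision function. This is an assumption-laden simplification: the "large scale" and "employment context" inputs are judgment calls made upstream (the diversity survey row shows that Vanguard treats special category processing in an employment context as DPIA-triggering even below large-scale thresholds), and the function name is hypothetical.

```python
def dpia_required(special_category: bool, large_scale: bool,
                  employment_context: bool = False) -> bool:
    """Art. 35(3)(b): large-scale special category processing mandates
    a DPIA; per the table, Vanguard also treats special category data
    in an employment context as DPIA-triggering."""
    return special_category and (large_scale or employment_context)
```

Applied to the table: employee health records (`True, True`) and the diversity survey (`True, False, employment_context=True`) both require a DPIA, while large-scale processing of ordinary personal data does not trigger Art. 35(3)(b) on its own.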