
Top 15 GDPR Compliance Pitfalls

1. Treating GDPR as a one-time project instead of an ongoing program

High Risk

❌ Problem: Organizations invest heavily in initial compliance then stop. No ongoing monitoring, no ROPA updates, no policy reviews. Within 12-18 months, compliance deteriorates as new systems, vendors, and processes are introduced without privacy assessment.

✓ Solution: Establish an ongoing compliance program with regular audits, ROPA reviews, training refreshers, and a governance committee that meets quarterly. Budget for ongoing compliance, not just the initial project.

📜 Real Case: British Airways was fined 20 million GBP (reduced from an initially proposed 183 million) partly because security measures had not been maintained and updated over time, allowing attackers to exploit vulnerabilities.

2. Invalid consent mechanisms (pre-ticked boxes, bundled consent, no opt-out)

High Risk

❌ Problem: Using consent mechanisms that do not meet GDPR standards: pre-ticked checkboxes, consent bundled with terms of service, 'by continuing to use this site you agree' banners, or making it difficult to withdraw consent.

✓ Solution: Implement GDPR-compliant consent: separate unchecked checkboxes for each purpose, clear and specific language, easy one-click withdrawal, and a consent ledger that records when and how consent was obtained.

📜 Real Case: Google was fined 50 million EUR by CNIL (France) in 2019 for lack of transparency and valid consent for ad personalization. Consent was not sufficiently informed, specific, or unambiguous.
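The consent ledger mentioned above can be as simple as an append-only log in which the most recent entry per person and purpose wins. A minimal sketch in Python (class names, field names, and the example identifiers are illustrative, not from any standard or library):

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    subject_id: str
    purpose: str          # one record per purpose -- no bundling
    granted: bool         # True = given, False = withdrawn
    mechanism: str        # how consent was captured, e.g. "signup-form-v3"
    timestamp: datetime

class ConsentLedger:
    """Append-only log; the latest record per (subject, purpose) wins."""

    def __init__(self):
        self._records: list[ConsentRecord] = []

    def record(self, subject_id, purpose, granted, mechanism):
        self._records.append(ConsentRecord(
            subject_id, purpose, granted, mechanism,
            datetime.now(timezone.utc)))

    def has_consent(self, subject_id, purpose):
        # Scan newest-first so a later withdrawal overrides earlier consent
        for rec in reversed(self._records):
            if rec.subject_id == subject_id and rec.purpose == purpose:
                return rec.granted
        return False  # no record at all means no consent

ledger = ConsentLedger()
ledger.record("user-42", "marketing-email", True, "signup-form-v3")
ledger.record("user-42", "marketing-email", False, "one-click-unsubscribe")
```

Because the log is append-only, it doubles as evidence of when and how consent was obtained and withdrawn, which is exactly what a supervisory authority will ask for.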

3. Not knowing where all personal data is stored

High Risk

❌ Problem: Organizations often have personal data in places they do not know about: shadow IT tools, personal devices, email attachments, shared drives, legacy databases, test environments, and third-party systems. Without a complete data map, compliance is impossible.

✓ Solution: Conduct thorough data discovery across all systems, including unstructured data, email, cloud services, and paper records. Use automated discovery tools where possible. Repeat data discovery annually and whenever major systems change.

📜 Real Case: Multiple organizations have been caught out when personal data in forgotten systems was exposed in breaches. If you do not know data exists, you cannot protect it, include it in DSARs, or delete it on request.
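As a toy illustration of automated discovery, the sketch below walks a directory tree and flags strings shaped like email addresses. Real discovery tools detect far more (names, national IDs, document classifiers) across databases and SaaS APIs; the regex and function here are purely illustrative.

```python
import re
from pathlib import Path

# Loose pattern for email-like strings; good enough for a demo scan.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_email_addresses(root: str) -> dict[str, list[str]]:
    """Map each .txt file under `root` to the email-like strings it contains."""
    hits: dict[str, list[str]] = {}
    for path in Path(root).rglob("*.txt"):
        matches = EMAIL_RE.findall(path.read_text(errors="ignore"))
        if matches:
            hits[str(path)] = matches
    return hits
```

Even a crude scan like this tends to surface personal data in shared drives and export folders that nobody remembered to put on the data map.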

4. Missing or inadequate Data Processing Agreements with vendors

High Risk

❌ Problem: Failing to have Article 28-compliant DPAs with all processors, or having DPAs that are missing required clauses (breach notification, sub-processor controls, audit rights, deletion obligations).

✓ Solution: Audit all vendor relationships for DPA coverage. Use a standard DPA template that covers all Article 28 requirements. Integrate DPA execution into the vendor onboarding process so no vendor can process data without one.

📜 Real Case: Multiple supervisory authorities have fined organizations for using processors without adequate DPAs. The German Federal Commissioner has been particularly active in enforcing Article 28 compliance.

5. Relying on consent when legitimate interests would be more appropriate

Medium Risk

❌ Problem: Over-using consent as a lawful basis when legitimate interests, contract, or legal obligation would be a better fit. This creates operational problems because consent can be withdrawn at any time, potentially disrupting processing that is genuinely necessary for the business.

✓ Solution: Before defaulting to consent, evaluate whether legitimate interests (with a documented LIA), contract, or legal obligation is a better fit. Reserve consent for situations where you genuinely need active agreement, especially marketing and optional data sharing.

📜 Real Case: Organizations are rarely fined for choosing consent over legitimate interests, but they create operational chaos for themselves when customers withdraw consent for processing that was actually necessary for the business relationship.

6. Not responding to data subject requests within the one-month deadline

High Risk

❌ Problem: Missing the one-month deadline for responding to DSARs, erasure requests, or other rights requests. This is one of the most common complaints individuals make to supervisory authorities.

✓ Solution: Implement a rights management system with deadline tracking, automated reminders (at day 14 and day 25), and escalation procedures. Pre-build data extraction queries for common systems. Staff adequately for expected DSAR volumes.

📜 Real Case: The Swedish Data Protection Authority fined a company 75,000 EUR for not responding to a data subject access request in time. Individual complaint-driven enforcement actions for delayed DSARs are very common across all EU supervisory authorities.
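Deadline tracking can start from a small helper that computes the reminder and response dates for each request. A sketch, assuming the day-14/day-25 reminder convention described above (an internal practice, not a legal requirement); the one-month response window comes from GDPR Article 12(3), clamped to month end:

```python
from datetime import date, timedelta

def add_one_month(d: date) -> date:
    """Same day next month, clamped to month end (e.g. 31 Jan -> 28/29 Feb)."""
    year, month = (d.year + 1, 1) if d.month == 12 else (d.year, d.month + 1)
    for day in (d.day, 30, 29, 28):   # fall back until the date is valid
        try:
            return date(year, month, day)
        except ValueError:
            continue

def dsar_milestones(received: date) -> dict[str, date]:
    """Reminder and deadline dates for a request received on `received`."""
    return {
        "first_reminder": received + timedelta(days=14),
        "final_reminder": received + timedelta(days=25),
        "deadline": add_one_month(received),
    }
```

Feeding these dates into a ticketing system's automated reminders is usually enough to stop requests silently drifting past the deadline.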

7. Inadequate breach detection and delayed notification

High Risk

❌ Problem: Not having systems to detect breaches quickly, leading to late discovery and missed 72-hour notification deadlines. Or discovering a breach but taking too long to assess it and decide on notification.

✓ Solution: Implement security monitoring, intrusion detection, and anomaly alerts. Create a clear breach assessment workflow with pre-defined severity criteria. Have pre-drafted notification templates ready. Practice with tabletop exercises.

📜 Real Case: Marriott International was fined 18.4 million GBP by the ICO after a breach affecting 339 million guest records went undetected for four years. Early detection and prompt notification are critical.

8. Privacy notices that are vague, incomplete, or written in legal jargon

High Risk

❌ Problem: Privacy notices that are too vague ('we may share your data with third parties for business purposes'), incomplete (missing required Article 13/14 information), or written in impenetrable legal language that no ordinary person can understand.

✓ Solution: Rewrite privacy notices in plain language at an 8th-grade reading level. Use layered notices (short summary + full version). Include ALL required information per Articles 13 and 14. Test with non-lawyers for comprehension.

📜 Real Case: WhatsApp was fined 225 million EUR by the Irish DPC in 2021, partly for lack of transparency: the privacy notice did not adequately explain how data was shared with other companies in the Facebook (now Meta) group.

9. Not conducting DPIAs for high-risk processing

High Risk

❌ Problem: Launching new systems, products, or processes that involve high-risk processing (profiling, large-scale monitoring, new technology, special category data) without conducting a Data Protection Impact Assessment first.

✓ Solution: Integrate DPIA screening into the project initiation process. Create a screening questionnaire. Train project managers to trigger DPIAs. Ensure the DPO is consulted on all DPIAs.

📜 Real Case: The Belgian Data Protection Authority fined a company for deploying a fingerprint-based time tracking system for employees without conducting a DPIA. The system processed biometric data (special category) on a large scale.

10. Ignoring international data transfer requirements

High Risk

❌ Problem: Transferring personal data outside the EU/EEA without implementing appropriate safeguards (SCCs, adequacy decisions, BCRs). This is especially common with cloud services and remote teams in non-EU countries.

✓ Solution: Map all cross-border data flows. For each transfer, implement the appropriate mechanism: check adequacy decisions first, then SCCs with Transfer Impact Assessments. Consider data localization for the most sensitive data.

📜 Real Case: The Austrian and French Data Protection Authorities found that use of Google Analytics violated GDPR because personal data was transferred to the US without adequate safeguards post-Schrems II.
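The "adequacy first, then SCCs" order of checks above can be captured as a simple decision helper. This is a simplified sketch only: the adequacy set below is a small illustrative subset, not the official European Commission list, and any real transfer decision needs legal review.

```python
# Illustrative subset of destinations with adequacy decisions (not the
# official EU Commission list -- always check the current list).
ADEQUATE_COUNTRIES = {"UK", "CH", "JP", "CA", "NZ", "KR", "IL"}

def transfer_mechanism(destination: str, has_bcrs: bool = False) -> str:
    """Pick a lawful transfer mechanism for an export from the EU/EEA."""
    if destination in {"EU", "EEA"}:
        return "no transfer mechanism needed (intra-EEA)"
    if destination in ADEQUATE_COUNTRIES:
        return "adequacy decision"
    if has_bcrs:
        return "binding corporate rules (BCRs)"
    return "SCCs + transfer impact assessment"
```

Encoding the decision this way makes it easy to run over a spreadsheet of mapped data flows and flag every transfer that still lacks a mechanism.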

11. No data retention policy or retaining data indefinitely

Medium Risk

❌ Problem: Keeping personal data forever 'just in case' without defined retention periods. Or having a retention policy on paper but not enforcing it technically. Legacy databases with years of accumulated personal data.

✓ Solution: Create a data retention schedule with justified retention periods for every data category. Implement automated deletion or archival. Conduct a legacy data cleanup to dispose of data past its retention period.

📜 Real Case: Deutsche Wohnen was fined 14.5 million EUR by the Berlin DPA for storing tenant personal data in an archiving system that had no capability to delete data that was no longer needed.
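Enforcing a retention schedule technically usually comes down to a periodic job that flags records past their period. A minimal sketch (the categories and periods below are placeholders, not recommendations; real periods must be justified per category):

```python
from datetime import date, timedelta

# Placeholder schedule: data category -> retention period in days.
RETENTION_DAYS = {
    "order_history": 6 * 365,      # e.g. tax-law driven
    "support_tickets": 2 * 365,
    "marketing_profile": 365,
}

def is_past_retention(category: str, collected_on: date, today: date) -> bool:
    """True if a record of `category` collected on `collected_on`
    should be deleted or archived as of `today`."""
    period = RETENTION_DAYS.get(category)
    if period is None:
        # Fail loudly: a category with no defined period is itself a gap
        raise KeyError(f"no retention period defined for {category!r}")
    return today > collected_on + timedelta(days=period)
```

Running a check like this nightly, and treating an undefined category as an error rather than "keep forever", is what turns a paper policy into an enforced one.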

12. Insufficient staff training and awareness

Medium Risk

❌ Problem: Not training staff on GDPR, or providing generic one-time training that does not relate to people's actual roles. Staff do not know how to recognize a DSAR, what constitutes a breach, or how to handle personal data properly.

✓ Solution: Implement role-specific training at onboarding with annual refreshers. Use real scenarios relevant to each department. Track completion and test comprehension. Supplement formal training with ongoing communications.

📜 Real Case: Many breaches are caused by untrained staff: sending personal data to wrong recipients, falling for phishing, or mishandling customer data. Training is explicitly mentioned in GDPR Article 39 as a DPO responsibility.

13. Not having a DPO when one is required, or having a conflicted DPO

Medium Risk

❌ Problem: Not appointing a DPO when required by Article 37, or appointing someone with a conflict of interest (e.g., the Head of IT or Marketing Director who makes data processing decisions). The DPO must be independent.

✓ Solution: Assess whether a DPO is mandatory (public authority, large-scale monitoring, large-scale special category data). If required, appoint someone independent without conflicts. Consider an external DPO for smaller organizations.

📜 Real Case: The Belgian DPA fined a company 50,000 EUR for having a DPO who also served as the head of compliance, audit, and risk management — creating an inherent conflict of interest.

14. Using personal data for purposes beyond the original collection purpose

High Risk

❌ Problem: Collecting personal data for one purpose and then using it for something completely different without a new lawful basis or informing the data subject. Classic example: collecting emails for order confirmations then using them for marketing without consent.

✓ Solution: Strictly enforce purpose limitation. Before using data for a new purpose, assess compatibility with the original purpose using the Article 6(4) compatibility test. If incompatible, obtain a new lawful basis and update the privacy notice.

📜 Real Case: H&M was fined 35.3 million EUR by the Hamburg DPA for extensive surveillance of employees — collecting personal data about their health, religious beliefs, and family situations during casual conversations and storing it for managerial purposes far beyond any legitimate HR function.

15. Treating anonymization and pseudonymization as the same thing

Medium Risk

❌ Problem: Believing that replacing names with IDs (pseudonymization) removes GDPR obligations. Pseudonymized data is still personal data under GDPR. Only truly anonymized data (which can NEVER be re-identified) falls outside GDPR scope, and true anonymization is very difficult to achieve.

✓ Solution: Understand the critical difference: pseudonymized data = still personal data, still subject to GDPR but with reduced risk. Anonymized data = NOT personal data, GDPR does not apply. Be extremely cautious about claiming data is anonymized — re-identification is often possible with small datasets or with additional context.

📜 Real Case: Research has repeatedly shown that supposedly anonymized datasets can be re-identified. A study demonstrated that 99.98% of Americans could be re-identified from just 15 demographic attributes in an anonymized dataset.
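The difference is easy to see in code. The sketch below pseudonymizes identifiers with a keyed hash (HMAC-SHA256): direct identifiers disappear from the dataset, but anyone holding the key can regenerate the same tokens and link records back to individuals, which is exactly why the output is still personal data. The key value is a placeholder.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-and-store-separately"   # placeholder key

def pseudonymize(identifier: str) -> str:
    """Replace an identifier with a stable keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The same input always maps to the same token, so records stay linkable:
# useful for analytics, but re-identifiable by design for the key holder.
token = pseudonymize("alice@example.com")
assert token == pseudonymize("alice@example.com")   # stable per person
assert token != pseudonymize("bob@example.com")     # distinct per person
```

Because each person keeps a stable token, the data still "singles out" individuals; deleting the key (and any mapping tables) is a precondition for arguing anonymization, and even then auxiliary attributes may allow re-identification.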