GDPR Article 32

Article 32 of the General Data Protection Regulation (GDPR) requires Data Controllers and Data Processors to implement technical and organizational measures that ensure a level of data security appropriate to the risk presented by processing personal data.

In addition, Article 32 specifies that the Data Controller or Data Processor must take steps to ensure that any natural person with access to personal data does not process the data except on instructions from the controller or processor, unless required to do so by European Union or member state law.

Compliance with Article 32 requirements can be demonstrated by adherence to an approved code of conduct as specified in Article 40 or an approved certification as specified in Article 42.[1]

Compliance Description

Data security measures should, at a minimum, include:

  • Pseudonymizing or encrypting personal data.
  • Maintaining ongoing confidentiality, integrity, availability, access, and resilience of processing systems and services.
  • Restoring the availability of and access to personal data in a timely manner in the event of a physical or technical security breach.
  • Testing and evaluating the effectiveness of technical and organizational measures.
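
As an illustration of the first measure, pseudonymization can be sketched as keyed hashing: the same identifier always maps to the same pseudonym (so records stay linkable), but reversing the mapping requires a secret held by the controller. This is a minimal sketch, not a complete pseudonymization scheme; the key and field names are hypothetical.

```python
import hmac
import hashlib

# Hypothetical pseudonymization key, held by the Data Controller and
# stored separately from the pseudonymized records.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable, keyed pseudonym.

    HMAC-SHA256 yields the same pseudonym for the same input, but the
    pseudonym cannot be reversed without the key.
    """
    return hmac.new(PSEUDONYM_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "purchase_total": 42.50}
safe_record = {**record, "email": pseudonymize(record["email"])}
```

Because the pseudonym is deterministic, pseudonymized records can still be joined and analyzed; rotating or destroying the key severs the link back to the data subject.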

Although pseudonymization and encryption are required technical measures, Article 32 gives Data Controllers flexibility in determining which additional technical measures best ensure data security. However, when selecting a measure, the Data Controller must document an evaluation of the measure along four criteria:

  • State of the Art: An evaluation of the latest and most advanced data security and privacy-enhancement tools available. Newer technologies include behavior analytics, which profiles normal behavior patterns and triggers alerts when activity diverges from them; privileged user monitoring, which checks user activities and blocks access to data if necessary; and Format Preserving Encryption (FPE), which encrypts data while retaining the existing database format.
  • Processing Profile: An evaluation of the nature, scope, context, and purposes of the data processing.
  • Risk Profile: An evaluation of the likelihood and severity of risks to the rights and freedoms of natural persons when processing personal data. Risks include “accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data transmitted, stored, or otherwise processed.” Conducting a risk assessment is best done with a Privacy Impact Assessment (PIA), as specified in Article 35 of the GDPR.
  • Cost: An evaluation of the cost of implementation relative to the risk profile.
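
The behavior analytics mentioned under State of the Art can be illustrated with a toy baseline-and-alert check: compare today's record-access count against a user's historical pattern and flag large divergences. This is a sketch with made-up numbers and a simple z-score threshold; real products model many more signals.

```python
from statistics import mean, stdev

def is_anomalous(daily_access_counts: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's access count if it diverges from the user's baseline.

    Uses a z-score against historical daily counts: how many standard
    deviations today's count sits from the historical mean.
    """
    mu = mean(daily_access_counts)
    sigma = stdev(daily_access_counts)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

# Hypothetical baseline: a user's typical daily record accesses.
history = [12, 15, 11, 14, 13, 12, 16]
```

Here `is_anomalous(history, 14)` stays within the baseline, while `is_anomalous(history, 500)` would trigger an alert for investigation or automatic blocking.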

Compliance Methods

Complying with Article 32 requires both organizational and technical strategies. Organizational strategies are similar to those specified for Article 25 compliance. Technical strategies include:

  • Change management: Monitors, logs, and reports on data structure changes. Shows compliance auditors that changes to the database can be traced to accepted change tickets.
  • Data discovery and classification: Discovers and provides visibility into the location, volume, and context of data on premises, in the cloud, and in legacy databases. Classifies the discovered data according to its personal information data type (credit card number, email address, medical records, etc.) and its security risk level.
  • Data loss prevention: Monitors and protects data in motion on networks, at rest in data storage, or in use on endpoint devices. Blocks attacks, privilege abuse, unauthorized access, malicious web requests, and unusual activity to prevent data theft.
  • Data masking: Anonymizes data via encryption/hashing, generalization, perturbation, etc. Pseudonymizes data by replacing sensitive data with realistic fictional data that maintains operational and statistical accuracy.
  • Data protection: Ensures data integrity and confidentiality through change control reconciliation, data-across-borders controls, query whitelisting, etc.
  • Ethical walls: Maintains strict separation between business groups to comply with M&A requirements, government clearance, etc.
  • Privileged user monitoring: Monitors privileged user database access and activities. Blocks access or activity, if necessary.
  • Secure audit trail archiving: Secures the audit trail from tampering, modification, or deletion, and provides forensic visibility.
  • Sensitive data access auditing: Monitors access to and changes of data protected by law, compliance regulations, and contractual agreements. Triggers alarms for unauthorized access or changes. Creates an audit trail for forensics.
  • User rights management: Identifies excessive, inappropriate, and unused privileges.
  • User tracking: Maps the web application end user to the shared application/database user to the final data accessed.
  • VIP data privacy: Maintains strict access control on highly sensitive data, including data stored in multi-tier enterprise applications such as SAP and PeopleSoft.
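
The data discovery and classification step above can be sketched as pattern matching over stored values, tagging each hit with a personal-data type and a risk level. This is a toy illustration; the patterns and risk levels are assumptions, and real discovery tools use far richer detection than two regular expressions.

```python
import re

# Hypothetical mapping of personal-data patterns to a risk level.
CLASSIFIERS = [
    ("email_address", re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "medium"),
    ("credit_card", re.compile(r"\b(?:\d[ -]?){13,16}\b"), "high"),
]

def classify(text: str) -> list[tuple[str, str, str]]:
    """Return (data_type, matched_value, risk_level) for each match found."""
    findings = []
    for data_type, pattern, risk in CLASSIFIERS:
        for match in pattern.finditer(text):
            findings.append((data_type, match.group(), risk))
    return findings

row = "Contact alice@example.com, card 4111 1111 1111 1111"
```

Running `classify(row)` surfaces both the email address (medium risk) and the card number (high risk), which downstream controls such as masking or access auditing could then act on.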

Learn how Imperva solutions can help meet Article 32 compliance requirements.

[1] Codes of Conduct and Certifications that comply with GDPR requirements are still being developed.
