GDPR Encryption Requirements: A Comprehensive Guide

The General Data Protection Regulation (GDPR) requires organizations that process personal data to implement appropriate measures to protect that data from unauthorized access, loss, or misuse. Article 32 requires security of processing, which means that personal data must be handled in a way that reduces the risk of harm to the rights and freedoms of data subjects.

While the law avoids mandating specific security technologies, it adopts a risk-based approach, requiring data controllers to use measures that are appropriate to the level of risk involved.

Encryption is frequently highlighted as a key safeguard for personal data. In recent enforcement decisions, regulators have made their position clearer: where data is sensitive or the risk is high, encryption is often treated as a baseline expectation. Failing to use it, or relying on outdated encryption standards, is increasingly difficult to justify under the duty to ensure the security of processing.

GDPR does not clearly define when encryption is required or what level is sufficient. Controllers are expected to assess risk, follow “state of the art” security, and justify their choices — often without clear technical benchmarks. These decisions are usually judged after a data breach has already occurred. This guide explains GDPR encryption requirements in practice, focusing on how regulators interpret them and how organizations are expected to apply encryption to meet their legal duties.

What Encryption Means in the Context of GDPR

Under the GDPR, encryption isn’t a standalone tool but a technical measure integrated into broader organizational safeguards for protecting personal data. The regulation emphasizes outcomes, such as minimizing risks to data subjects in case of breaches, over the mere presence of security tools.

At its core, encryption is about making personal data unreadable to anyone who is not authorized to see it. When data is encrypted properly, a person who gains unauthorized access to it sees only unintelligible information, making it effectively useless.

GDPR assesses encryption by examining whether this unreadability actually reduces risk in practice. Article 34 treats encryption as effective only where it renders personal data unintelligible to unauthorized persons, thereby reducing the harm that would otherwise arise from exposure. So, if the data can still be read, linked to a person, or exploited, then encryption has not achieved its purpose.

However, the GDPR does not specify exactly how encryption must be implemented. No specific algorithms, tools, or key lengths are mandated. This is intentional. Security technology changes quickly, and fixed technical rules would become outdated or unsafe. Instead, Article 32 adopts a principle-based approach, requiring security measures appropriate to the type of data, the level of risk, and the state of technology at the time. Encryption suitable for basic internal records may be insufficient for medical data, biometric data, or large customer databases.

Encryption is therefore evaluated in context. Regulators look at whether encryption choices were reasonable when decisions were made. Strong encryption that is poorly managed, for example, where decryption keys are widely accessible, may fail GDPR expectations. By contrast, encryption supports compliance when it is properly chosen, correctly implemented, and supported by internal policies such as access controls, key management procedures, and staff responsibilities.

Is Encryption Mandatory Under GDPR?

GDPR does not impose a blanket rule that encryption must always be used. What the law does require is appropriate security measures, and in many situations, encryption becomes the only reasonable way to meet that obligation.

GDPR follows a risk-based approach rather than fixed technical rules. Article 32 requires controllers to implement technical and organizational measures that are appropriate to the level of risks posed by the processing. This means the law does not treat encryption as a universal expectation, but it does require measures that effectively reduce the likelihood and impact of unauthorized access, loss, or misuse of personal data.

In practice, this means encryption is expected in a wide range of cases. Where personal data is sensitive, processed at scale, stored on portable devices, transmitted over networks, or exposed to higher security risks, encryption is one of the few measures capable of making data unintelligible if something goes wrong. In such contexts, failing to encrypt may be difficult to justify.

This is why “not mandatory” does not mean optional. GDPR does not allow organizations to dismiss encryption simply because it is not expressly required. Instead, controllers must be able to explain why their chosen measures are sufficient given the risks involved. If personal data is compromised and no effective safeguards were in place to prevent unauthorized parties from exploiting the data, regulators may conclude that Article 32 was not satisfied.

A prime example is the enforcement action by the Polish supervisory authority (UODO) against Res-Gastro M. Gaweł Sp. k., a catering company that was fined after an employee lost a USB flash drive containing unencrypted personal data. The data included an employee’s name, address, date of birth, PESEL number, passport details, contact information, and salary data. The authority found that the company had failed to properly assess the foreseeable risk associated with portable storage media and had not implemented appropriate technical safeguards, despite the sensitivity of the data. As a result, UODO imposed an administrative fine of approximately PLN 240,000 (around €55,000) for breaches of the GDPR’s security obligations, including Article 32.

Encryption, therefore, functions as a practical requirement in many real-world scenarios, even without being a formal legal mandate. The obligation is to protect personal data in a way that is proportionate, effective, and defensible in light of the risks.

When Encryption Is Expected in Practice

GDPR does not name specific situations where encryption is always required. Instead, regulators look at risk: how likely a security incident is, and how serious the impact would be for data subjects if personal data were exposed. In some situations the risk is high enough that encryption is routinely expected, because unencrypted data would leave individuals exposed if something goes wrong. These include the following:

a) High-risk processing activities

Encryption is commonly expected where processing activities create a high risk to individuals’ rights and freedoms. This includes large-scale processing, systematic monitoring, or processing that could significantly affect people if data were misused.

Under Article 35 of the GDPR, data controllers must conduct a Data Protection Impact Assessment (DPIA) for these high-risk activities to identify potential dangers. If the risk to individuals is high, Article 32 requires security measures strong enough to match that risk. Encryption is increasingly viewed as the most appropriate safeguard in these scenarios because it changes the outcome of a breach: a hacker might succeed in stealing a file, but encryption ensures they cannot read or exploit the information inside.

To put it simply, the more damage a data breach could cause, the harder it should be for anyone to read the data in the first place.

b) Sensitive and special category data

The GDPR requires extreme caution when handling special category data. This includes highly personal information such as health records, genetic data, biometric identifiers, and details about a person’s race or ethnic origin. Unlike general information, this data is protected under Article 9 because it is uniquely sensitive. If this information is exposed, it can cause permanent harm to an individual.

Because the potential for damage is so high, regulators now treat encryption as the baseline or minimum requirement for this type of data. Even if an organization only handles a small amount of sensitive information, the legal expectation for protection remains at the highest level. In modern enforcement cases, if sensitive data is leaked in a readable format, authorities almost automatically rule that the organization failed its legal duty. It is nearly impossible to prove that security was “appropriate” under Article 32 if these high-stakes files were left unencrypted.

c) Cloud and third-party processing

Encryption is frequently expected when personal data is processed by cloud providers or other third parties. In these arrangements, the controller no longer has full physical or technical control over the environment where the data is stored or processed.

GDPR addresses this risk through Articles 28 and 32, which require controllers to ensure that processors provide appropriate security guarantees. Encryption helps bridge the trust gap created by outsourcing. Even if a third party suffers a breach, encrypted data may remain unintelligible, reducing the impact on individuals and the controller’s exposure.

Regulators also often look closely at whether encryption was used to protect data against risks introduced by shared infrastructure, multi-tenant systems, or access by external staff.

d) Portable devices and remote access

Portable devices such as laptops, USB drives, and mobile phones are a recurring source of GDPR breaches. Because these items are small and move between locations, they are frequently lost or stolen. When personal data is stored on these devices, regulators expect encryption to be active by default.

Supervisory authorities have repeatedly fined organisations after unencrypted devices were lost, even where there was no evidence of malicious intent. A familiar example is the Polish catering company that was fined around €55,000 by the Polish supervisory authority after losing a USB drive with unencrypted personal data. The reasoning was straightforward: the risk of loss is well known, and encryption was a readily available safeguard. Failing to use it was therefore a sign of negligence.

Where remote access is involved, encryption also protects data from interception and unauthorised access outside the controller’s controlled environment.

The Legal Basis for Encryption Requirements

a) Article 32 and security appropriate to the risk

Article 32 is the main legal anchor for encryption under GDPR. It requires controllers and processors to implement appropriate technical and organisational measures to ensure a level of security that matches the risk. Encryption is explicitly mentioned as an example of such a measure, but only in relation to risk, not as a blanket rule.

What matters legally is not whether encryption is used, but whether the security measures chosen actually reduce the likelihood and impact of unauthorized access, loss, or disclosure. If the risk to individuals is high and encryption could reasonably mitigate that risk, regulators expect it to be considered and, in many cases, applied. Failure to do so shifts the legal burden onto data controllers to explain why encryption was not appropriate in that context.

In practice, Article 32 turns encryption into a default expectation in high-risk scenarios, even though it is not formally mandatory.

b) Recital 83 and protection against unauthorised access

Recital 83 explains why Article 32 takes this approach. It states that personal data should be processed in a way that prevents unauthorized access and use, particularly where processing could lead to physical, material, or non-material damage. Encryption is highlighted as a way to prevent unauthorized persons from understanding personal data, even if they gain access to it.

Essentially, the recital clarifies GDPR’s logic. The regulation is less concerned with whether a breach occurs and more concerned with what happens after a breach. If encryption ensures that stolen or leaked data cannot be understood, the risk to data subjects is significantly reduced.

Recital 83, therefore, reinforces the idea that encryption is about damage control, not just breach prevention.

c) Article 25 and data protection by design

Article 25 adds another layer by requiring data protection by design and by default. This means security measures must be built into processing systems from the outset, not added after an incident occurs. Encryption fits naturally into this obligation because it can be integrated at the design stage to limit exposure throughout the data lifecycle.

Where personal data is processed in environments that increase risk, such as cloud platforms, remote access systems, or portable devices, regulators expect encryption to be considered early. Choosing not to encrypt in such settings often signals that data protection was treated as an afterthought, which undermines compliance with Article 25.

In enforcement actions, regulators frequently cite Article 25 alongside Article 32 to show that encryption was not only missing, but missing by design. A clear example is the Meta plaintext password case, where the Irish Data Protection Commission fined Meta €91 million after finding that millions of user passwords had been stored in plaintext on internal systems for several years. Although there was no evidence of external misuse, the DPC held that passwords are inherently high-risk data and should have been protected by appropriate cryptographic measures, such as encryption or hashing, from the outset. The regulator found breaches of Article 5(1)(f) and Article 32, and treated the issue as a structural design failure.
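The safeguard the DPC pointed to for stored passwords is salted hashing rather than reversible storage. As a rough illustration only (this is not a description of Meta’s systems, and the scrypt parameters shown are common starting values, not a recommendation), Python’s standard library can do this directly:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted scrypt hash; store only (salt, digest), never the password."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Because only the salt and digest are stored, a leak of the credential table does not expose readable passwords, which is exactly the design-stage thinking Article 25 asks for.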

What Counts as Adequate Encryption Under GDPR

Encryption at rest and in transit

Encryption at rest and encryption in transit address two distinct and well-documented risk moments in the data lifecycle, and GDPR treats both as security-relevant in different ways.

Encryption in transit protects personal data while it is being transmitted between systems, for example, when data is sent from a user’s browser to a server, transferred between internal systems, accessed remotely by staff, or shared with a cloud provider. Article 32(1) requires controllers to protect personal data against unauthorised disclosure during processing, and transmission is one of the most exposed stages. This is why regulators treat unencrypted data sent over public or shared networks as a clear security failure. After all, interception risks are well known, foreseeable, and technically preventable. The absence of transport-level encryption (such as TLS) is therefore difficult to justify under the “appropriate technical measures” standard in Article 32.
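In application code, transport-level protection usually amounts to refusing unverified or legacy TLS connections. A minimal sketch with Python’s standard library (the exact context settings a deployment needs will vary):

```python
import ssl

# Build a client context that verifies certificates and hostnames and
# refuses the deprecated TLS 1.0/1.1 protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# This context can then be passed to http.client, urllib, smtplib, etc.,
# so every connection the application opens is encrypted in transit.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
```

`ssl.create_default_context()` already enables certificate and hostname verification; pinning the minimum protocol version closes the known-weak-protocol gap the regulators cite.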

Encryption at rest, by contrast, protects personal data once it is stored, whether in databases, servers, backups, laptops, mobile devices, or removable media. This becomes critical in scenarios involving device theft, lost hardware, insider misuse, or system compromise. GDPR does not assume that access controls alone are sufficient in these cases. If an attacker bypasses authentication or physically removes a storage medium, unencrypted data is immediately readable. In enforcement, this is treated as a failure to mitigate a known and recurring risk, particularly where portable devices or large datasets are involved.
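What “unintelligible to unauthorised persons” means at rest can be sketched with the third-party Python `cryptography` package’s Fernet recipe (authenticated symmetric encryption). This is a minimal illustration, not a production storage design:

```python
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # in production, stored separately from the data
f = Fernet(key)

record = b"name=Jane Doe; salary=54000"
ciphertext = f.encrypt(record)

# The stored ciphertext reveals nothing about the record...
assert record not in ciphertext

# ...and without the right key it cannot be decrypted at all.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("unreadable without the key")

# Only the holder of the original key recovers the plaintext.
assert f.decrypt(ciphertext) == record
```

A stolen disk or backup containing only `ciphertext` is, in Article 34 terms, unintelligible, which is precisely the risk reduction regulators look for.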

It’s important to note that encrypting data in only one state does not compensate for weakness in the other. Encrypting stored data does not protect it if it is transmitted in clear text, and encrypting network traffic does not protect data left readable on a lost laptop or compromised server. From a regulatory perspective, this creates an incomplete security posture. This distinction usually comes up during enforcement.

Where personal data is lost, intercepted, or accessed without authorisation, regulators examine whether encryption was applied at the precise moment the exposure occurred. If it was not, the controller must explain why that risk was foreseeable yet left unaddressed. In most cases, that explanation is difficult to sustain under Articles 32 and 5(1)(f), which require integrity and confidentiality throughout processing. A clear illustration is the British Airways breach, where attackers injected malicious code that intercepted customer details in transit because the necessary protections were not implemented at the point of processing, leading the UK Information Commissioner’s Office to find breaches of Articles 32 and 5(1)(f) and impose a £20 million penalty.

State of the art and proportionality

GDPR does not require “maximum” security. It requires reasonable security, judged against what is technically available and realistically achievable at the time. This balance is captured in Article 32(1), which obliges controllers to take account of the state of the art, the costs of implementation, and the nature, scope, context and purposes of processing when choosing security measures, including encryption.

“State of the art” does not mean experimental or cutting-edge technology. Regulators interpret it as methods that are widely accepted, well-tested, and commonly used to address known risks. For encryption, this includes modern, industry-standard algorithms and protocols that are considered secure by current cryptographic consensus. Using outdated or broken encryption is treated almost the same as using no encryption at all, because the risk of compromise is already known.

Privacy watchdogs have criticised or fined controllers for relying on legacy systems, deprecated protocols, or weak cryptographic methods long after their weaknesses were publicly documented. For example, the French data protection authority fined Dedalus Biologie €1.5 million after a breach exposed the health data of nearly 500,000 people. It was discovered that the company failed to encrypt sensitive data and address known security gaps previously reported by an employee, which was a clear violation of Article 32’s risk-based security requirement.

When it comes to proportionality, GDPR recognises that security must be scaled to risk, not applied uniformly. Encrypting a small internal contact list may not require the same level of protection as encrypting a large database containing financial, health, or identity data. However, proportionality works in both directions. As risk increases, so does the expectation of stronger encryption. Large-scale processing, sensitive data, remote access, or exposure to public networks all push encryption from a “reasonable option” toward an expected baseline safeguard.

Cost is sometimes raised as a counterargument, but regulators treat this carefully. Article 32 allows cost to be considered, but it does not allow cost to override foreseeable harm. Where encryption is inexpensive, widely available, and easy to deploy — as is the case for most modern systems — failing to implement it is unlikely to be justified.

Key Management and Effective Implementation

Encryption protects data only if the keys that unlock that protection are themselves carefully controlled. In GDPR terms, weak or mismanaged keys can nullify encryption, exposing data just as clearly as if it had never been encrypted.

Separation of keys and data

The most basic principle of key management is that encryption keys must be stored separately from the encrypted personal data they protect. If an attacker gains access to both the encrypted data and the keys that unlock it, the encryption provides no practical protection at all. Industry guidance emphasises that keys should be stored in secure, dedicated environments, such as hardware security modules (HSMs) or specialist key vaults, not in clear-text files alongside the data or in application configuration. This separation ensures that a system compromise does not automatically expose both the ciphertext and the decryption keys.
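A widely used pattern for this separation is envelope encryption: a key-encryption key (KEK) held in a vault or HSM wraps the data-encryption key (DEK), so only the wrapped DEK is ever stored near the data. A minimal sketch using the third-party `cryptography` package, with an in-memory variable standing in for a real vault:

```python
from cryptography.fernet import Fernet

# Key-encryption key: held in a vault/HSM, never stored with the data.
kek = Fernet.generate_key()

# Data-encryption key: generated per dataset and used to encrypt records...
dek = Fernet.generate_key()
ciphertext = Fernet(dek).encrypt(b"personal data record")

# ...then wrapped with the KEK. Only the *wrapped* DEK sits next to the data.
wrapped_dek = Fernet(kek).encrypt(dek)

# An attacker who steals the storage gets ciphertext + wrapped_dek, but
# cannot unwrap the DEK without access to the vault holding the KEK.
recovered_dek = Fernet(kek).decrypt(wrapped_dek)
assert Fernet(recovered_dek).decrypt(ciphertext) == b"personal data record"
```

The point of the pattern is that compromising the data store alone yields nothing readable; the attacker would also need the separately guarded KEK.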

Access controls and role limitations

Strong key management also depends on restricting who can create, view, use, rotate, or revoke encryption keys. GDPR’s security and accountability principles assume that only authorized personnel have access to sensitive operations; keys are a prime example of this. Regulatory guidance from the UK Information Commissioner’s Office (ICO) highlights role-based access control and the principle of least privilege as essential components of secure key management. This requires granting the minimum level of access necessary for a specific task, such as allowing systems to use keys without exposing them to staff, or permitting security administrators to rotate keys without broader data access, and regularly reviewing those permissions as roles change or individuals leave the organization.
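As a toy illustration of least privilege for key operations (the roles and permission sets below are hypothetical; real deployments enforce this inside the key-management service itself):

```python
# Each role gets only the key operations its job requires: applications may
# *use* keys but never read or manage them; key administrators may rotate or
# revoke keys but cannot decrypt data with them.
KEY_PERMISSIONS = {
    "application":       {"use"},
    "key_administrator": {"create", "rotate", "revoke"},
    "security_auditor":  {"list"},
}

def authorize(role: str, operation: str) -> bool:
    """Allow a key operation only if the role's minimal permission set includes it."""
    return operation in KEY_PERMISSIONS.get(role, set())

assert authorize("application", "use")
assert not authorize("application", "rotate")      # least privilege
assert not authorize("key_administrator", "use")   # admins cannot decrypt data
```

Reviewing this mapping whenever roles change, and removing entries when people leave, is the operational half of the ICO’s guidance.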

Encryption and Personal Data Breaches

How encryption affects breach risk assessments

Under GDPR, a personal data breach is any security incident that leads to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data. Controllers must first decide whether such an event has occurred and then assess whether it is likely to result in a risk to the rights and freedoms of natural persons. If the risk is present, the breach must be notified to the supervisory authority under Article 33. A key part of this risk assessment is whether the personal data was rendered unreadable to unauthorised parties — that is, whether it was protected by effective encryption at the time of the breach. If the data would remain unintelligible even after compromise, for example, because it was encrypted with strong standards and the encryption keys were not exposed, the practical impact of the breach on data subjects is reduced. This reduction of potential harm directly affects the risk assessment required under Article 33.

Article 33: notifying authorities

Article 33 GDPR sets out controllers’ obligation to notify a personal data breach to the relevant supervisory authority without undue delay and, where feasible, no later than 72 hours after becoming aware of it if the breach is likely to result in a risk to individuals’ rights and freedoms. The notification must include specific details such as the nature of the breach, the categories and approximate number of affected persons and records, and the measures taken or proposed to mitigate the breach’s effects. Controllers must document breaches even when no notification is required, so authorities can verify compliance. Because encryption affects the underlying risk level, it also influences whether a breach must be reported to the data subjects under Article 34.
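The 72-hour window is simple arithmetic, but incident-response tooling often computes the deadline explicitly from the moment of awareness so it cannot be miscounted across time zones. A small sketch:

```python
from datetime import datetime, timedelta, timezone

def notification_deadline(became_aware: datetime) -> datetime:
    """Article 33: notify the supervisory authority within 72 hours of awareness."""
    return became_aware + timedelta(hours=72)

# Example: awareness at 09:30 UTC on 3 May gives a deadline of 09:30 UTC on 6 May.
aware = datetime(2024, 5, 3, 9, 30, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2024-05-06 09:30:00+00:00
```

Working in UTC (or any single timezone-aware reference) avoids off-by-hours errors when the breach team and the supervisory authority sit in different zones.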

Article 34: Avoiding data subject notification

Article 34 GDPR deals with notification to individuals whose personal data has been compromised. Controllers must communicate a breach to affected individuals without undue delay if the breach is likely to result in a high risk to their rights and freedoms. This is a separate obligation from notifying the supervisory authority, and it is triggered by a higher degree of risk. Article 34(3) provides exceptions: if the controller has implemented appropriate technical and organizational protection measures, and those measures were applied to the personal data affected by the breach, then direct notification to data subjects may not be required. One of the clearest examples provided in the text is where personal data is encrypted such that it is unintelligible to persons who are not authorised to access it. In that situation, the controller can demonstrate that the breach, although reported under Article 33, does not necessitate individual notifications because the encrypted data could not be understood or exploited.

How Regulators Assess Encryption Decisions

a) Risk-based analysis

Under Article 32 GDPR, encryption is not required in every case. Instead, controllers must implement technical and organizational measures that are appropriate to the level of risk posed by processing. In doing so, they must take into account the state of the art, the costs of implementation, and the nature, scope, context and purposes of processing, as well as the likelihood and severity of risks to individuals’ rights and freedoms. Encryption is specifically mentioned as an example of such measures, alongside pseudonymisation.

Regulatory guidance from the UK Information Commissioner’s Office (ICO) states that encryption should be considered according to the nature of the processing and the risks personal data faces. It also states that controllers should assess the scope and context of their processing when deciding whether encryption is appropriate.

In enforcement, national authorities have cited Article 32’s risk-based language when concluding that inadequate encryption contributed to a violation. The French data protection authority (CNIL) fined Nexpublica France €1.7 million for cybersecurity failures, including insecure processing arrangements. It highlighted that failing to address known risks to personal data constituted a breach of the obligation to implement appropriate security measures.

b) Documentation and DPIAs

Documentation is central to how regulators judge encryption decisions. GDPR’s accountability principle requires controllers to demonstrate compliance, and not merely state it. Documentation includes records of processing activities, risk assessments, and the reasoning behind selecting (or not selecting) specific security measures.

Where processing is likely to result in high risk, GDPR requires a Data Protection Impact Assessment (DPIA) under Article 35 GDPR. DPIAs are formal risk assessments that identify and evaluate privacy risks before processing begins, including whether encryption or alternative safeguards are necessary. These assessments must be completed and documented for processing that involves systematic monitoring, sensitive data, large-scale activities, or new technologies. Regulators explicitly expect DPIAs to be performed early and updated throughout a project’s lifecycle.

A well-conducted DPIA typically includes:

  • Identification of risks to the rights and freedoms of individuals,
  • An assessment of the likelihood and severity of those risks,
  • A list of measures (such as encryption) to mitigate risk,
  • Documentation explaining why specific choices were made or rejected.

When encryption is not used, authorities expect documentation showing why it was not appropriate, given the assessed risks. Lack of such documentation has figured in enforcement decisions: authorities have penalised organisations that failed to demonstrate that they properly considered available technical safeguards during planning and did not record that analysis. For example, in a late 2024 decision, the Polish supervisory authority fined a commercial controller approximately €81,000 after a ransomware attack exposed sensitive personal data, because the company had not conducted an adequate risk analysis, could not demonstrate why particular safeguards had been chosen or rejected, and could not show that it had considered reasonably available protections. The controller had also failed to verify that its processor offered sufficient guarantees.

c) State of the art and evolving expectations

Encryption choices must reflect the current state of the art. GDPR does not set fixed technical standards, recognizing that technology evolves. However, regulators treat industry-accepted cryptographic standards as the baseline for “state of the art”.

The ICO’s guidance explicitly links GDPR compliance to using solutions that meet current standards, such as the Advanced Encryption Standard (AES) and accredited cryptographic modules (e.g., FIPS 197 and FIPS 140-3). Controllers are reminded to assess whether chosen encryption algorithms, key lengths, and implementations align with widely recognised frameworks.

A failure to keep security measures aligned with emerging threats can itself constitute a breach of Article 32. A classic example is a Danish Data Protection Agency case, in which the agency found that the use of outdated versions of the Transport Layer Security protocol (TLS 1.0/1.1) on a platform processing names, addresses, and social security numbers likely violated GDPR’s security requirements, because these protocols contain well-known vulnerabilities and do not ensure confidentiality or integrity. Similarly, the Spanish AEPD fined a controller for transmitting personal data over unencrypted HTTP rather than HTTPS, noting that HTTPS is a widely accepted standard for encryption in transit and its absence exposed personal data to foreseeable risk.

d) Effectiveness over checkbox compliance

Regulators assess how effective encryption actually is in practice, not merely whether the organisation says its data was encrypted.

This principle appears clearly in guidance and enforcement reasoning. The ICO states that encryption solutions should be evaluated in context, considering residual risks and whether encryption achieves the intended security outcomes. Controllers must understand the risks that remain after encryption and take steps to mitigate these as part of a comprehensive security approach.

Common GDPR Encryption Mistakes

a) Treating encryption as optional by default

A common misunderstanding among controllers is the belief that encryption is always optional because GDPR does not mandate it universally in black-and-white language. That interpretation does not survive GDPR’s risk-based obligations: Article 32 requires technical and organisational measures appropriate to the risk, and encryption is explicitly listed as an example of such a measure.

b) Encrypting Data but Exposing the Keys

Encrypting personal data is meaningless if the encryption keys are poorly protected. This is one of the most misunderstood failures under GDPR. Organizations often store encryption keys in the same environment as the encrypted data, hard-code them into application code, or share them widely across teams for convenience.

From a regulatory perspective, this collapses the entire security model. If an attacker gains access to both the encrypted data and the key through a compromised server, leaked repository, or misconfigured cloud permission, the data is effectively exposed. GDPR does not assess encryption in isolation; it evaluates whether the measure actually reduces risk in practice.

Modern breaches frequently involve credential theft rather than brute-force attacks. If encryption keys are accessible through standard admin access or stored in plain configuration files, encryption offers little real protection. Proper key management, such as separation of duties, restricted access, rotation, and secure storage, is what makes encryption a meaningful safeguard under GDPR, not the algorithm by itself.
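Key rotation, for instance, can be sketched with the third-party `cryptography` package’s MultiFernet helper, which decrypts with any listed key but re-encrypts with the first; this is an illustrative pattern, not a full rotation procedure:

```python
from cryptography.fernet import Fernet, MultiFernet

old_key, new_key = Fernet.generate_key(), Fernet.generate_key()

# Data encrypted before the rotation, under the old key.
token = Fernet(old_key).encrypt(b"personal data record")

# Listing the new key first means rotate() re-encrypts old ciphertexts
# under the new key while still being able to read the old ones.
rotator = MultiFernet([Fernet(new_key), Fernet(old_key)])
rotated = rotator.rotate(token)

# After rotation the old key can be retired: the new key alone suffices.
assert Fernet(new_key).decrypt(rotated) == b"personal data record"
```

Scheduling this kind of re-encryption, then actually destroying retired keys, is what turns “rotation” from a policy statement into a working control.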

c) Over-Reliance on Vendors

Another common mistake is assuming that using a cloud provider or third-party service automatically satisfies GDPR encryption requirements. Many organizations believe that if a vendor advertises “encrypted by default,” their responsibility ends there. Under GDPR, this assumption is incorrect.

While vendors may provide encryption tools, the organization remains responsible for deciding how personal data is encrypted, where it is encrypted, and who can access it.

In today’s ecosystem, data often flows between multiple vendors—CRMs, analytics tools, email platforms, and support systems. Each transfer point introduces risk. GDPR expects organizations to understand these flows and ensure encryption is consistently applied, not just rely on vendor marketing claims. Delegating responsibility does not remove accountability.

d) Failure to Review Encryption Decisions Over Time

Encryption decisions are often made once, during system setup, and then forgotten. This is a serious mistake. GDPR treats security as an ongoing obligation, not a one-time compliance task. What was appropriate encryption five years ago may no longer be adequate today.

For example, an organization might continue using outdated encryption standards, legacy key lengths, or old assumptions about data sensitivity. At the same time, the volume of personal data may have increased, new categories of data may be processed, or access may have expanded to new teams or regions.

Therefore, in the current environment, where threats evolve quickly and regulatory scrutiny is increasing, failing to review encryption choices creates hidden compliance gaps. Controllers are required to reassess technical measures in light of current risks, technologies, and processing activities. Encryption that is never revisited slowly loses its protective value, even if it remains technically “in place.”

Final thought

Article 32 makes clear that organizations must continuously assess whether their security measures, including encryption, remain appropriate in light of evolving risks, technologies, and the nature of the personal data they process.

Enforcement decisions show that failures often arise not just from the absence of encryption, but also from poor implementation, weak key management, over-delegation to vendors, or outdated configurations left unreviewed over time. For controllers, the lesson is straightforward: encryption decisions must be deliberate, documented, and regularly reassessed.

When encryption is treated as part of an ongoing risk-management obligation rather than an optional safeguard, controllers are far better positioned to meet GDPR requirements and withstand regulatory scrutiny.
