Why PETs (privacy-enhancing technologies) may not always be our friends

How privacy-enhancing technologies can exacerbate rather than ameliorate technology and data governance concerns

Elizabeth Renieris

29 April 2021

Reading time: 8 minutes

Safety concept: locks on digital background

The pandemic has accelerated the digitalisation of daily life, as well as the roll-out of an array of digital tools and technologies. From contact tracing or exposure notification apps for COVID-19 to so-called ‘vaccine passports’ or digital apps used to present someone’s vaccination or health status, much of the public debate and conversation over these technologies has focused on concerns about individual privacy and security.

Proponents of these technologies emphasise their privacy and security features, highlighting things like ‘local’ or on-device data storage and processing, various degrees of decentralisation, and the use of data minimisation and pseudonymisation techniques. Often, they propose the use of privacy-enhancing technologies or ‘PETs’ as a way to reassure the public and dampen critique, thereby enabling the roll-out and adoption of technologies that should also raise serious concerns about legitimacy, efficacy and power.

Though newly popular in light of the pandemic, PETs are not new. In fact, the term can be traced back to as early as 1995, when the Information and Privacy Commissioner of Ontario and the Dutch Data Protection Authority co-authored a report on technologies that enable anonymous online transactions, as well as privacy and anonymity more broadly.

But in the more than 25 years that have passed since the term was first coined, it is unclear that PETs have been effective in safeguarding privacy or data protection principles. In many ways, the situation regarding personal data has never been more precarious, with one high-profile data breach after another making headlines.

After outlining what PETs are, this article argues that they suffer from a number of challenges and shortcomings that may actually contribute to the insecurity and precarity of our digital experience, and draws some lessons for the future of data protection and privacy in a rapidly digitising world.

What are PETs?

There is no single definition or standard for what constitutes a PET, though the term is typically used to refer to technologies or approaches that can help mitigate privacy and security risks. Some popular examples of PETs include forms of encryption such as format-preserving and homomorphic encryption, cryptographic protocols like secure multi-party computation and secret sharing, differential privacy and obfuscation techniques, and various means of anonymisation or pseudonymisation.
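
To make one of these techniques concrete, the sketch below is a minimal, illustrative take on pseudonymisation: a direct identifier is replaced with a keyed hash, so records can still be linked within a dataset without exposing the underlying identity. The field names and key handling here are assumptions made purely for the example, not a description of any particular product.

```python
import hmac
import hashlib
import secrets

# Secret key held by the data controller; anyone holding it can re-link
# pseudonyms to identities, which is why pseudonymised data is generally
# still treated as personal data under laws like the GDPR.
PSEUDONYMISATION_KEY = secrets.token_bytes(32)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable
    keyed hash, so the same person always maps to the same pseudonym."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

records = [
    {"email": "alice@example.com", "vaccinated": True},
    {"email": "bob@example.com", "vaccinated": False},
]

# The shared copy keeps linkability (the same pseudonym per person across
# records) but no longer contains the raw identifier.
pseudonymised = [
    {"user": pseudonymise(r["email"]), "vaccinated": r["vaccinated"]}
    for r in records
]
print(pseudonymised)
```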

Leading academic researchers define PETs as a ‘wide array of technical means for protecting users’ privacy’, while industry stakeholders use the term to refer to various technical means for protecting privacy by providing anonymity, pseudonymity, unlinkability and unobservability of data subjects.

Policymakers typically use the term ‘PETs’ to refer to technological tools or methods that help to achieve compliance with privacy or data protection legislation or requirements, often in combination with organisational measures, including information security-related policies and procedures, personnel management and access controls, recordkeeping, and audits, among others. The European Union Agency for Cybersecurity (ENISA) identifies PETs as a broad range of technologies designed to support data minimisation, anonymisation and pseudonymisation, and other core privacy and data protection principles.

These principles are increasingly embedded in laws that require entities that process personal data to leverage best-in-class methods to secure and protect it. For example, Europe’s General Data Protection Regulation (GDPR) mandates that data controllers implement data protection by design and by default, including through the use of ‘state-of-the-art’ technological processes. What counts as ‘state-of-the-art’ or best-in-class is ever-evolving and requires the constant evaluation of available tools and methods, including PETs.

Boosted in part by the GDPR, the California Consumer Privacy Act (CCPA) and successor California Privacy Rights Act (CPRA), and other emerging data protection and privacy laws and regulations around the world, PETs are a rapidly growing market attracting considerable investment. Because there is no consensus around the definition of a PET – the term is often conflated with the broader concept of ‘privacy tech’, digital rights management techniques, and the even broader notion of ‘RegTech’ technologies designed to help with regulatory compliance – it is hard to price the market accurately. However, the market for homomorphic encryption tools in Europe alone, which is expected to grow from US $31.99 million in 2019 to US $66.50 million by 2027, demonstrates that the commercial interest is substantial.

What are the downsides of PETs?

Despite their promise and growing demand, there are significant downsides to PETs, some less obvious than others. One of the most common critiques of PETs is that they can be complex and difficult to use, resulting in user errors that can actually undermine individual privacy and security. Another common complaint is that they are expensive and require vast computational capacity not available to many market participants (leaving aside the environmental impact of those computational resources).

Given this complexity and these resource constraints, PETs can also be difficult for lawmakers and policymakers to audit or govern. The combination of usability and accessibility challenges and a lack of accountability means the use of these technologies can create a false sense of safety or security that is often not borne out in practice. Under the cover of such false assurances, the use of PETs could actually incentivise more data collection and sharing, further undermining core data protection principles like data minimisation.

Moreover, given the lack of a common definition and standards for PETs, it can be difficult to evaluate the efficacy of any given PET in a specific context. Take the example of differential privacy, a technique developed in the early 2000s that injects mathematical ‘noise’ into a data set to obscure individual identities, with the aim of leveraging the power of ‘big data’ while reducing the likelihood of harmful data disclosures or malicious use of that data.
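
As a rough sketch of that idea (not the implementation used by any particular system), the example below adds Laplace-distributed noise to a simple counting query; the dataset, the query and the privacy parameter epsilon are assumptions chosen purely for illustration.

```python
import numpy as np

def noisy_count(values, predicate, epsilon=1.0):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the result by at most 1), so Laplace noise with scale
    1/epsilon gives epsilon-differential privacy for this single query.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 62, 57, 33, 48]
# Smaller epsilon means stronger privacy but noisier (less accurate) answers.
print(noisy_count(ages, lambda a: a >= 40, epsilon=0.5))
```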

While differential privacy aims to preserve the privacy of individuals and groups while keeping datasets accessible for research and data analysis (as well as for commercial product and service delivery and enhancement), there are also significant trade-offs, especially in terms of accuracy, fairness, explainability, and even security and robustness. There is also evidence that techniques like differential privacy have merely shifted risks from external to internal threats to privacy and security, for example by incentivising bad actors to rely on insider attacks rather than more traditional cyber crimes such as hacking.
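
The accuracy side of that trade-off can be made concrete by continuing the hypothetical counting example above: as the privacy parameter epsilon is tightened, the average error introduced by the Laplace noise grows roughly as 1/epsilon. The figures below are properties of the mechanism itself, not measurements of any real deployment.

```python
import numpy as np

true_count = 100  # hypothetical exact answer to a counting query

# Stronger privacy (smaller epsilon) means noisier, less accurate answers:
# the mean absolute error of the Laplace mechanism is 1/epsilon.
for epsilon in (1.0, 0.5, 0.1):
    noisy = true_count + np.random.laplace(0.0, 1.0 / epsilon, size=10_000)
    error = np.mean(np.abs(noisy - true_count))
    print(f"epsilon={epsilon}: mean absolute error = {error:.2f}")
```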

Even where PETs succeed with data minimisation and do not result in sharing more personal data, they can legitimate activities that many would otherwise find objectionable or concerning. For example, some of the newest and more advanced PETs are often available only to entities who already wield disproportionate power over the digital environment through vast troves of data and computational resources.

Large technology companies like Apple and Google, who control vast swaths of market share over both the hardware and software that mediate our digital experiences, are often the first to embrace and implement these tools and techniques to defend and expand their control and dominance, while assuaging the concerns and critiques of privacy and security advocates and regulators.

The COVID-19 contact tracing app debate, and the negotiation between Apple and Google on one side and sovereign nation states on the other, was a clear example of this. As one expert scholar explained, ‘Data is just a means to an end, and new, cryptographic tools are emerging that let those firms’ same potentially problematic ends be reached without privacy-invasive means. These tools give those controlling and coordinating millions or even billions of computers the monopolistic power to analyse or shape communities or countries, or even to change individual behaviour, such as to privately target ads based on their most sensitive data – without any single individual’s data leaving their phone.’

What does this mean for the future of data protection and privacy?

PETs have long been celebrated for their potential to mitigate privacy and security risks, support data protection principles, and, in some circumstances, achieve compliance with certain aspects of data protection and privacy-related rules and regulations. And, as those regulations proliferate around the world, the market for and interest in PETs grow in proportion. Yet, despite the growing interest, use, and commercial value of PETs, our data governance crisis only seems to deepen. In some respects, PETs may be partly to blame.

In practice, PETs are complex, expensive and resource intensive, making them hard to implement and prone to user error. And despite their benefits, the use of PETs can further consolidate power and control in the hands of those who already have too much of both, typically those with the resources to exploit them. PETs can also create a false sense of safety and security, and thereby incentivise and legitimate activities or practices that we might otherwise find objectionable. As a result, they can help perpetuate the status quo and prevent more sweeping reforms or changes to business-as-usual by reducing the urgency to act.

In the final analysis, PETs are just technological tools and methods, and rarely is the answer to the challenges posed by technology more technology. There is no question that PETs can be useful in reducing some of the risks associated with legitimate uses of data but, if we are not careful, an overreliance on PETs risks making privacy the handmaiden of surveillance and technological overreach.


Image credit: maxkabakov
