For the last two years I’ve been lamenting the lack of standardization around the term “privacy enhancing technologies.” In fact, as I see it, the term has been bastardized to mean whatever the speaker wants it to mean. Shortened to the moniker PETs, the term is used both in the privacy professionals’ community and in the academic realm of cryptographic research. Newer incarnations, “privacy enabling technologies” and “privacy enhancing techniques,” do not even make the cut on Google’s Ngram service, which tracks the occurrence of terms in books (see chart below).
In 2008, the British Information Commissioner’s Office recognized the definitional problem in a paper on Privacy by Design and PETs:
There is no widely accepted definition for the term Privacy Enhancing Technologies (PETs) although most encapsulate similar principles; a PET is something that:
1. reduces or eliminates the risk of contravening privacy principles and legislation.
2. minimises the amount of data held about individuals.
3. empowers individuals to retain control of information about themselves at all times.
To illustrate this, the UK Information Commissioner’s Office defines PETs as:
“… any technology that exists to protect or enhance an individual’s privacy, including facilitating individuals’ access to their rights under the Data Protection Act 1998”.
The definition given by the European Commission is similar but also includes the concept of using PETs at the design stage of new systems:
“The use of PETs can help to design information and communication systems and services in a way that minimises the collection and use of personal data and facilitates compliance with data protection rules. The use of PETs should result in making breaches of certain data protection rules more difficult and / or helping to detect them.”
The problem with such definitions is that they are broadly written and thus broadly interpreted, and they can be used to claim adherence to privacy protection when, in fact, there is none. This also leads to the perverse reduction of privacy protection to data protection, which it is not. Privacy is as much about the risks of aggregation, intrusion, interference with freedom of association, and other invasions of the personal space that we designate and distinguish from the social space in which we exist within a larger society.
I see the Privacy by Design (PbD) camp dancing around this. Ann Cavoukian, the Ontario Information and Privacy Commissioner and chief PbD champion, has promoted PETs for years, and this evangelism is evident in the PbD foundational principle of full functionality. However, even she has allowed the term to be applied loosely to make it more palatable to her audience. PbD and PETs thus become buzzwords attached to an effort as a marketing ploy to give the appearance of doing the right thing, while often resulting in minimal enhancement of privacy.
I thus suggest the following definition, and one I use in my own vernacular: a privacy enhancing technology is “a technology whose sole purpose is to enhance privacy.” Firewalls, which I too often see laypersons refer to as PETs, can enhance privacy, but that is not necessarily their purpose. A firewall is a security technology, protecting confidentiality and also securing the integrity and availability of the systems it protects. Data loss prevention, similarly, can actually be quite privacy invasive, though it may enhance the privacy of data on occasion. Its primary purpose, however, is to protect against the loss of corporate intellectual property (be it personal information of customers or not), not to enhance privacy.
Technologies that would qualify include Mixmaster remailer networks (whose sole purpose is to obscure the sender and recipient identities of email) and zero-knowledge proofs and related secure multi-party computation (which allow parties to compute a public function over private data without revealing anything other than the function’s result).
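To make the secure multi-party computation idea concrete, here is a minimal sketch of additive secret sharing in Python, in which three parties learn only the sum of their private salaries. This is an illustrative toy under my own assumptions, not any particular product or production protocol; the modulus, party count, and salary figures are arbitrary choices for the example.

```python
import secrets

# Toy additive secret sharing: parties learn the sum of their private
# inputs (the "public function") without revealing individual values.
# Illustrative only; real secure multi-party computation is far more involved.

PRIME = 2**61 - 1  # modulus for the shares (arbitrary large prime)

def share(value, n_parties=3):
    """Split a private value into n random shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(private_inputs):
    """Each party shares its input; each party sums the shares it holds
    locally, and only those partial sums are combined to reveal the total."""
    n = len(private_inputs)
    all_shares = [share(v, n) for v in private_inputs]
    # Party i receives the i-th share of every input and sums them locally.
    partial_sums = [sum(all_shares[j][i] for j in range(n)) % PRIME
                    for i in range(n)]
    return sum(partial_sums) % PRIME

salaries = [52_000, 61_500, 48_250]          # each party's private input
print(secure_sum(salaries), sum(salaries))   # both print 161750
```

The point of the sketch is that no single share reveals anything about an individual salary; only the agreed public output (the total) is disclosed, which is the property that makes such techniques PETs under the narrow definition above.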
Some technologies may be privacy enhancing in application even though they were not created for the purpose of enhancing privacy. My purpose here is not to split hairs over the definition, per se; it is to expose the dilution of the term to the point where it becomes doublespeak.