Privacy engineering and transparency

One of the principal tenets of privacy engineering is transparency. Data subjects should be able to see exactly how their data is being used. However, a complex, time-intensive tool à la Google Dashboard is not as good as an intuitive, obvious design at the forefront.

Consider the following scenarios (which I stole from my own comment on Kashmir Hill’s Forbes blog).

(1) A patron enters a club where an employee sits with two clickers. When a female customer enters, the employee clicks the clicker in his right hand; a male, the left. The clickers transmit data to the internet, where a mobile app publicizes the club’s male/female ratio.

(2) A patron enters a club where a camera takes a picture of the customer’s face. A computer hooked to the camera analyzes the picture and makes a call as to whether the person is male or female. The computer then transmits the data to the internet, where a mobile app publicizes the club’s male/female ratio.

What’s the difference? Even in my description, it’s unclear what the data is in the second example. The male and female count? The customer’s photo? To a customer, the mere fact that a picture was taken raises a concern as to what is being done with that image. The company (and the club) may say the data is aggregated, but how can the customer trust them? In the first example, the method of collection makes it obvious to the customer that nothing beyond the male/female count was collected. In other words, the design of the system in the first example provides obvious cues to the customer. The customer doesn’t need to audit the app company’s computer systems to know that information isn’t being inadvertently used, purposefully repurposed, or susceptible to hacks, leaks, or legal subpoenas. The design of the system is privacy protective by default.
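The contrast between the two designs can be sketched in code. This is a minimal, hypothetical illustration (the class and function names are mine, not from any real system): in the clicker design, only aggregate counts ever exist, so there is nothing sensitive to leak; in the camera design, identifiable photos exist upstream of the aggregate, and the customer must simply trust that they are discarded.

```python
class ClickerCollector:
    """Scenario 1: only aggregate counts ever exist in the system."""

    def __init__(self):
        self.male = 0
        self.female = 0

    def click(self, hand):
        # Right hand = female, left hand = male, as in the scenario.
        if hand == "right":
            self.female += 1
        else:
            self.male += 1

    def publish(self):
        # The only data that could ever be hacked, leaked, or subpoenaed.
        return {"male": self.male, "female": self.female}


def classify_gender(photo):
    # Stand-in for the real image analysis; hypothetical.
    return photo["label"]


class CameraCollector:
    """Scenario 2: raw photos exist before aggregation happens."""

    def __init__(self):
        self.photos = []  # identifiable data the customer must trust is deleted
        self.male = 0
        self.female = 0

    def capture(self, photo):
        self.photos.append(photo)  # retained? deleted? the customer can't tell
        if classify_gender(photo) == "male":
            self.male += 1
        else:
            self.female += 1

    def publish(self):
        # Publishes the same aggregate, but the photos above still exist.
        return {"male": self.male, "female": self.female}
```

Both collectors publish an identical aggregate, which is exactly the point: the published output can't tell the customer which design is behind it, but the first design makes overcollection structurally impossible rather than merely promised.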