And the winners are…

The project I was assisting with, InformaCam, wasn’t quite ready for prime time at the end of the day, but several great projects were. Some of the awards given out include:

Best scanning project: TOSback – it scans privacy policies from thousands of sites and uploads them to GitHub to monitor changes over time.

Best education project: Privacy Bucket – gives a detailed, ongoing analysis of the information the sites you visit think they know about you.

Best control project: an easy-to-use encrypted chat system that supports up to 21 users and includes an Android application and web interface.

The judges also gave out some impromptu awards:

sitescoper – a website that previews a site for you and assigns a risk score based on how many trackers it has, suspicious keywords, etc.

Most ready for prime time: Mobile Scope – a proxy service that essentially performs a man-in-the-middle attack for YOU, so that you know what mobile applications are transmitting, even encrypted, from your phone. Perfect for detecting those pesky PATH apps stealing your contact data. The proxy could also corrupt the data before sending it on to the app owner.

The Zuckerberg opportunity of the day award went to Pestagram, a combination of Instagram and Pinterest. It changes the context so that users come to realize that images most people assume are private (in Instagram) are actually publicly available.
Best listener award – Privacy Bucket take-off with $$$
Cans and strings – ostel

Calyx Institute

I met Nick Merrill yesterday, who is launching the Calyx Institute. The goal of the Institute is to provide a privacy- and security-centric ISP, one that embraces limited data collection and strong encryption as a bulwark against threats to privacy and security. Nick has the distinction of being the only person to fight a National Security Letter and its onerous gag-order provisions and win, after a prolonged 8-year battle.


I’ve spent part of the day here at the WSJ Data Transparency hack-a-thon helping InformaCam set up an AWS EC2 instance and FTP server. They are going to use it to let their Android app upload images with encrypted, blurred faces. The app uses facial recognition to create a blurred version (see ObscuraCam), but the clear picture is stored in the image’s metadata, encrypted with the public keys of various friends, allowing them to decrypt it and display the full image online. In addition, a host of other metadata will be encrypted and signed with the user’s digital key, creating an evidentiary trail should it be necessary in a court of law. More information is available at the InformaCam link above.
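The sign-then-verify part of that flow can be sketched in a few lines. This is a loose illustration only: HMAC stands in for the public-key signature InformaCam actually uses, and every field name and key below is invented for the example.

```python
import hashlib
import hmac
import json

# Stand-in for the user's private signing key (real system: public-key signature).
USER_KEY = b"users-signing-key"

def sign_metadata(metadata: dict) -> dict:
    """Attach a signature over a canonical serialization of the metadata."""
    payload = json.dumps(metadata, sort_keys=True).encode()
    signature = hmac.new(USER_KEY, payload, hashlib.sha256).hexdigest()
    return {"metadata": metadata, "signature": signature}

def verify_metadata(bundle: dict) -> bool:
    """Recompute the signature and compare in constant time."""
    payload = json.dumps(bundle["metadata"], sort_keys=True).encode()
    expected = hmac.new(USER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, bundle["signature"])

# Hypothetical metadata bundle riding along with the blurred image.
bundle = sign_metadata({
    "location": "redacted",
    "timestamp": 1333209600,
    "encrypted_faces": "base64-ciphertext-here",
})
assert verify_metadata(bundle)
```

Because the signature covers a canonical serialization, any tampering with the metadata after the fact breaks verification, which is what gives the bundle its evidentiary value.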

My usefulness at this hack-a-thon is limited, for sure. Most of the people here are far better and more competent developers than I am. I did attempt to help figure out a problem with Cryptocat’s encrypted chat on an Android device, but lacking the correct version of Android, we couldn’t make much headway.

For the remainder of the event I’m going to work on expanding SalaryBallPark to illustrate the three tiers of the privacy pyramid: promising, protecting, and preserving. I’m also using the time to catch up on my Crypto Class.

P.S. I’ve also broken down and joined Twitter @PrivacyMaverick. Don’t expect me to tweet much.

New York Privacy events

I just wrapped up NYU/Princeton’s Mobile and Location Privacy: A Technology and Policy Dialog. There was some very interesting discussion from some of the leaders in the privacy world, including Kashmir Hill, Julian Sanchez, Helen Nissenbaum, Sid Stamm, Ashkan Soltani, and many more.

The event was broken into three round tables, with a few speakers in between: Models of Self Regulating and Regulating Privacy; Phones, Drones & Social Networks: New Technologies and the Fourth Amendment after Jones; and Privacy and the Many Layers of Mobile Platforms.

Some of the other things I learned at the conference include:

That the NYU Privacy Research Group has a blog, which I’ll be adding to my list of sites to read.

The idea that a smartphone with an application that reminds people to take their medicine becomes a medical device.

That Grindr uses the device UDID to essentially provide anonymity to users (by not requiring any other uniquely identifying information). However, now that Apple is deprecating the UDID, Grindr may be forced to ask for additional information from users to authenticate them.

Julian Sanchez talked about how some companies, such as Google, are arguing that they are not communications providers in some instances, such as with Google Maps, but rather recipients of information in the context of ECPA. An interesting concept I hadn’t heard before and one I want to research further.

Ashkan Soltani, when asked how he was reading encrypted traffic over SSL to check what information phones were transmitting, replied that he created his own root CA and used it to forge the certificates apps were relying on for SSL, essentially mounting a man-in-the-middle attack. He said some corporations do the same thing in order to do deep packet inspection on “encrypted” traffic on their networks. They can do this because they control the root certificates on the devices distributed to their employees, something they can’t control under BYOD (Bring Your Own Device). This is a good reason you should always validate your SSL certs against publicly known signatures.

Finally, I made the comment that the Girls Around Me application is actually a good thing because it brings visibility to the information people are exposing. More specifically, under Ryan Calo’s privacy harm regime, it transfers the risk of objective harm (being stalked) into a more subjective harm (that some creep might know my location).

The second point I made: why can’t my phone lie for me? We are concerned when an alarm clock application asks for location information, but why can’t I just tell my phone to respond with a lie? Whenever a retailer asks for my zip code, I tell them 90210. They usually give me a funny look, but they accept it.