Yesterday I met Nick Merrill, who is launching the Calyx Institute. The goal of the Institute is to provide a privacy- and security-centric ISP, one that embraces limited data collection and strong encryption as a bulwark against threats to privacy and security. Nick has the distinction of being the only person to fight a National Security Letter and its onerous gag-order provisions and win, after a prolonged eight-year battle.
#WSJdata
I’ve spent part of the day here at the WSJ Data Transparency hack-a-thon helping InformaCam set up an AWS EC2 instance and FTP server. They are going to use it to let their Android app upload images with encrypted blurred faces. The app uses facial recognition to create a blurred version (see ObscuraCam), but the clear picture is stored in the image’s metadata, encrypted with the public keys of various friends, allowing them to decrypt and display the full image. In addition, a host of other metadata will be encrypted and signed with the user’s digital key, creating an evidentiary trail should it be necessary in a court of law. More information is available at the InformaCam link above.
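To make that scheme concrete, here is a minimal sketch of this kind of hybrid encrypt-and-sign flow in Python, using the cryptography package. This is my own illustration, not InformaCam’s actual implementation; the function and field names are hypothetical.

    # Minimal sketch of the hybrid encrypt-and-sign flow described above.
    # NOT InformaCam's actual code; names and structure are hypothetical.
    import json
    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def seal_metadata(metadata: dict, recipient_pubkeys, sender_privkey) -> dict:
        # Encrypt the metadata once under a fresh symmetric key.
        aes_key = AESGCM.generate_key(bit_length=256)
        nonce = os.urandom(12)
        ciphertext = AESGCM(aes_key).encrypt(
            nonce, json.dumps(metadata).encode(), None)
        # Wrap the symmetric key for each friend's public key, so any
        # one of them can unwrap it and read the metadata.
        wrapped_keys = [
            pub.encrypt(aes_key, padding.OAEP(
                mgf=padding.MGF1(algorithm=hashes.SHA256()),
                algorithm=hashes.SHA256(), label=None))
            for pub in recipient_pubkeys
        ]
        # Sign the ciphertext with the sender's key for the evidentiary trail.
        signature = sender_privkey.sign(
            ciphertext,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256())
        return {"nonce": nonce, "ciphertext": ciphertext,
                "wrapped_keys": wrapped_keys, "signature": signature}

    # Example with throwaway keys:
    # sender = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    # friend = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    # bundle = seal_metadata({"gps": [40.7, -74.0]}, [friend.public_key()], sender)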
My usefulness at this hack-a-thon is limited, for sure. Most of the people here are far more capable developers than I am. I did attempt to help figure out a problem with Cryptocat’s encrypted chat on an Android device, but lacking the correct version of Android, we couldn’t make much headway.
For the remainder of the event I’m going to work on expanding SalaryBallPark to illustrate the three tiers of the privacy pyramid: promising, protecting, and preserving. I’m also using the time to catch up on my Crypto Class.
P.S. I’ve also broken down and joined Twitter @PrivacyMaverick. Don’t expect me to tweet much.
New York Privacy events
I just wrapped up the NYU/Princeton Mobile and Location Privacy: A Technology and Policy Dialog. Some very interesting discussion from some of the leaders in the privacy world, including Kashmir Hill, Julian Sanchez, Helen Nissenbaum, Sid Stamm, Ashkan Soltani, and many more.
The event was broken into three round tables with a few speakers in between: Models of Self Regulating and Regulating Privacy; Phones, Drones & Social Networks: New Technologies and the Fourth Amendment after Jones; Privacy and the Many Layers of Mobile Platforms.
Some of the other things I learned at the conference include:
That the NYU Privacy Research Group has a blog, which I’ll be adding to my list of sites to read.
The idea that a smartphone running an application that reminds people to take their medicine may become a regulated medical device.
That Grindr uses the device identifier (UDID) to essentially provide anonymity to users (by not requiring any other uniquely identifying information). However, now that Apple is deprecating the UDID, Grindr may be forced to ask users for additional information to authenticate them (a common workaround is sketched after this list).
Julian Sanchez talked about how some companies, such as Google, are arguing that they are not communications providers in some instances, such as with Google Maps, but rather recipients of information in the context of ECPA. An interesting concept I hadn’t heard before and one I want to research further.
Ashkan Soltani, when asked how he was reading encrypted traffic over SSL to check what information phones were transmitting, replied that he created his own root CA and thus falsified the certificates apps were using for SSL, essentially mounting a man-in-the-middle attack. He said some corporations do the same thing in order to perform deep packet inspection on “encrypted” traffic on their networks. They can do this because they control the root certificates on the devices distributed to their employees, something they can’t control under BYOD (Bring Your Own Device). This is a good reason you should always validate your SSL certificates against publicly known signatures, as in the pinning sketch below.
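Here is a minimal sketch of that kind of certificate pinning in Python. This is my own illustration, not Soltani’s tooling, and the pinned fingerprint is a placeholder: the point is that a corporate or attacker-run root CA can mint a certificate your OS trusts, but it cannot reproduce the fingerprint of the real certificate you recorded out of band.

    # Certificate pinning sketch: compare the SHA-256 fingerprint of the
    # certificate the server actually presents against a known-good
    # fingerprint obtained out of band. PINNED_SHA256 is a placeholder.
    import hashlib
    import socket
    import ssl

    PINNED_SHA256 = "replace-with-known-good-hex-fingerprint"

    def fingerprint_matches(host: str, port: int = 443) -> bool:
        ctx = ssl.create_default_context()  # an MITM root may pass this check...
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                der_cert = tls.getpeercert(binary_form=True)
        # ...but it cannot forge the pinned fingerprint.
        return hashlib.sha256(der_cert).hexdigest() == PINNED_SHA256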
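And, going back to the Grindr point above, the common replacement for the deprecated UDID is to generate a random identifier on first launch and persist it: a stable per-install pseudonym that still requires no personal information. A rough sketch (my illustration, not Grindr’s code; the file name is hypothetical):

    # Per-install pseudonymous ID sketch: a random UUID generated once
    # and persisted, standing in for the deprecated hardware UDID.
    import json
    import os
    import uuid

    ID_FILE = "install_id.json"  # hypothetical storage location

    def get_install_id() -> str:
        if os.path.exists(ID_FILE):
            with open(ID_FILE) as f:
                return json.load(f)["id"]
        new_id = str(uuid.uuid4())
        with open(ID_FILE, "w") as f:
            json.dump({"id": new_id}, f)
        return new_id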
Finally, I made the comment that the Girls Around Me application is actually a good thing, because it brings visibility to the information that people are exposing. More specifically, under Ryan Calo’s privacy-harm framework, it transfers the risk of objective harm (being stalked) into a more subjective harm (that some creep might know my location).
The second point I made is: why can’t my phone lie for me? We are concerned when an alarm clock application asks for location information, but why can’t I just tell my phone to respond with a lie? Whenever a retailer asks for my zip code, I tell them 90210. They usually give me a funny look, but they accept it.
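In the same spirit, a phone-side shim might look something like the toy sketch below. Everything here is hypothetical (the app identifiers, the coordinates, the trust list); it just illustrates handing untrusted apps a plausible decoy instead of the real fix.

    # Toy sketch of a "lying" location provider: trusted apps get the
    # real fix, everything else gets a fixed decoy. All names are
    # hypothetical.
    REAL_LOCATION = (28.5383, -81.3792)    # the device's actual fix
    DECOY_LOCATION = (34.0901, -118.4065)  # Beverly Hills, 90210, naturally

    TRUSTED_APPS = {"com.example.maps", "com.example.navigation"}

    def get_location(app_id: str) -> tuple:
        """Return the real location only to apps the user trusts."""
        return REAL_LOCATION if app_id in TRUSTED_APPS else DECOY_LOCATION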
Notice
Whether you call it ‘visceral’ notice or privacy in context, the goal should be a better understanding of the tacit agreement between parties about the use of information.
PbD workshop
I’ll be putting on a three-hour privacy-by-design workshop in June. As far as I know, the workshop is free (I’m not making the arrangements, so I’m not sure). I will post an update in a month or so with more details.
The Stanford Experiment
I don’t think the Stanford experiment is quite as amazing as some of the things I’ve seen out of Khan Academy. However, I am taking the Cryptography class and learning quite a lot. But I would say it’s more like a traditional learning experience: watch video lectures, tackle problems. Slightly more interactive, but not revolutionary. I guess the sheer size of the audience is what’s revolutionary.
Orlando Startup Weekend
This past weekend I participated in StartUp Weekend in Orlando. From their website: “Startup Weekend is a global network of passionate leaders and entrepreneurs on a mission to inspire, educate, and empower individuals, teams and communities. Come share ideas, form teams, and launch startups.” “Startup Weekend is an intense 54 hour event which focuses on building a web or mobile application which could form the basis of a credible business over the course of a weekend. The weekend brings together people with different skillsets – primarily software developers, graphics designers and business people – to build applications and develop a commercial case around them.”
I’m happy to announce that my team came in 4th place out of 16 teams in the competition. While we didn’t win any direct prizes, we still have an opportunity if one of the top 3 teams fails to incorporate and move forward. Andrew Miller and I, with assistance from Jeremy Holstein, came up with an idea to shift the uncertainty that merchants face in achieving sales onto the uncertainty that some consumers are willing to accept in a reduced choice of product. If we incorporate and follow through with our business idea, we have a chance of getting $10k in seed money.
You can review our deck, though it might not give you the entire picture without the talk we gave along with it. If anybody wants to learn more or invest millions of dollars, get in touch (Disclaimer: said tongue in cheek; not a solicitation for money or an offer to sell :-)).
It was so much fun that I wish I could go to Tampa Startup Weekend, but I’ll be at the Wall Street Journal Data Transparency Weekend.
I also just found out I’ll have the pleasure of attending the NYU/Princeton conference on Mobile and Location Privacy: A Technology and Policy Dialog.
The PATH to Privacy redux
Matt Gemmell provided a much more detailed investigation into a solution for PATH’s privacy problem on his blog. I only touched upon the need to think about privacy. Kudos to Matt for his solution. Clearly, I need to start writing more in-depth solutions to today’s privacy problems.
*Hat tip to Jon Udell’s Wired article for pointing out the blog post.
Great article on the “real” danger of big data
I’ve always had this thought experiment going on in my head: What if subliminal advertising were real? What if, by inserting something in my ad, I could guarantee that you bought the product? What if it were effective on you 90% of the time? 75%?
Do Not Track versus Do Not Target
A recent post on the InfoLawGroup blog and my recent reading of Ryan Calo’s paper on privacy harm had me thinking.
The disconnect between the government and industry is that industry wants to prevent objective harm, the actual use of the information, while the government seems to want to protect against subjective harm, the knowledge that you are being observed. It would seem, at first blush, that industry doesn’t really gain much by making this argument. However, even if they don’t target someone with specific ads, they can still gather information about the demographics of the audience for broader-based targeting.
The government’s argument should be about the potential chilling effect on people who are aware they are being tracked (though not targeted). It would be interesting for someone to do an empirical study on this, if one has not already been done.