Visualizing Privacy: Using (Usable) Short Form Privacy Policies
Morgan Eisler, @mogasaur, works at Lookout, a mobile security company. This year over 2 billion people worldwide use the internet, with more added every minute, and many (most?) of them are on mobile devices. Many companies have privacy policies, but only 12% of internet users read privacy policies all the time, and only 20% of those who read them (even occasionally) understand them. They are simply too long!
If nobody reads them, can we really say that customers are making a choice? Certainly not an informed one. Users trust their providers, but expectations rarely match reality, and that can cause negative surprises that lead to loss of trust and loss of revenue.
Consider «Yo», an app that you share all of your contacts with, and it sends them push notifications: a literal «Yo». The app became very popular and was hacked overnight; suddenly people's phone numbers were no longer private. This wasn't even an app created by a company, just a few friends having fun.
At Lookout, they made a really short form policy that you could view on just one page on a mobile device — but was it helpful?
Lookout created a new short form policy that was quite simple: greyed-out icons for categories like «Government» to show that they were not sharing their data with the government. For parties they did share with, like «Carriers», you could click on the icon and get more information.
Lookout ran usability studies and found that customers liked it, but did they understand it? People, for example, weren't sure what the «user files» icon meant; it looked like pictures. Did that mean it only applied to pictures? The usability studies were used to clear up some of the icons.
The Flattening of the Security Landscape and Implications for Privacy
Customers are like sheep (which is not at all how sheep are portrayed in movies). Sheep are stubborn, and if you try to push them too hard, they scatter (picture a sheep dog working hard :). Even though Shelly Bird isn't «in» security, when a security breach happens, customers come to her. She has to pay attention to everything before deployment, like making sure the BIOS is up to date.
Shelly thinks of security as a bowl — a container to store and protect your data/applications/etc. Also, like a castle — defense in depth.
Ten years ago, during a deployment, a customer said she had to remove IPsec from all of the machines. Huh? The router/switch engineers said: that's our job! But what about the «last mile»? The same customer didn't want IPv6, convinced their firewall would be confused and unable to process it.
Once Shelly got through all of this, the Intrusion Detection folks were unhappy: they could no longer read the packets.
Essentially, fear of change.
Shelly could see that the more she could push the work down the stack, the faster things worked. For example, a high-level app encrypting a disk took four hours, but letting the OS do it took two!
There are other, bigger problems here: credentials! The government likes authorized users to have something physical to prove their affiliations, and Shelly ended up with a dozen of these cards. Ugh. Now they are moving credentials onto the mobile device, using TPMs as a trust anchor. This is claims-based authentication, and it allows business to move faster.
This is still very complicated, though, as the US Government doesn’t even have trust across branches.
People want to have multiple identities, people travel/move around and have different reasons for doing different transactions — lots of work to get this right.
The Data Brokers: Collecting, Analyzing and Selling Your Personal Information
Runa Sandvik works for Freedom of the Press; they protect the press and help inform journalists of their rights, like those who were arrested in Ferguson for not moving fast enough.
While she often talks about the NSA, today she's talking more about consumer privacy.
It's surprising how much companies know about you just by watching your patterns. You are volunteering this information in exchange for a discount. Take the father who found out from Target that his daughter was pregnant: she wasn't even buying diapers or anything that obvious, but she changed the products she was using in a way that indicated pregnancy to Target.
But this stuff happens online, too, and we don’t even know about it.
And this information isn't just kept by the one company you are shopping at; it's being collected by data brokers. For example, OfficeMax addressed a letter to a man with the title «Daughter Died in Car Crash». Where did they get that data? Why did they have it?
Data brokers sell lists of rape victims, alcoholics and erectile dysfunction sufferers. Where are they getting this? Why are they collecting it?
When asked directly, data brokers say they care about privacy, but they don't want to share things like: how can you see what information they have about you? How can you remove or correct information? How can you decline to share?
You can use Tor to protect yourself from these data brokers. The only way a site will know it's you is if you log in; otherwise, there's no way for them to know who you are, so they won't have anything to track against. Runa only uses Chrome for cat photos. 🙂
Wow, this really goes beyond the annoying targeted banner ads!