RSA: Time for some cryptographic dogfood

One of the most effective ways to improve your software is to use it early and often.  This used to be called eating your own dogfood, which is far more evocative than the alternatives. The key is that you use the software you’re building. If it doesn’t taste good to you, it’s probably not customer-ready.  And so this week at RSA, I think more people should be eating the security community’s cryptographic dogfood.

As I evangelize the use of crypto for arranging meetings at RSA, I’ve encountered many problems: choice of tool, availability of a tool across a set of mobile platforms, cost of entry, and so on. Each of these is predictable, but with dogfooding — forcing myself to ask everyone why they want to use an easily wiretapped protocol — the issues stand out, and the companies that will be successful will start thinking about ways to overcome them.

So this week, as you prep for RSA, spend a few minutes setting up an encrypted communications tool. The worst that can happen is that you’re no more secure than you were before you read this post.

What Price Privacy, Paying For Apps edition

There’s a new study on what people would pay for privacy in apps. As reported by Techflash:

A study by two University of Colorado Boulder economists, Scott Savage and Donald Waldman, found the average user would pay varying amounts for different kinds of privacy: $4.05 to conceal contact lists, $2.28 to keep their browser history private, $2.12 to eliminate advertising on apps, $1.19 to conceal personal locations, $1.75 to conceal the phone’s ID number and $3.58 to conceal the contents of text messages.

Those numbers seem small, but they’re in the context of app pricing, which is generally a few bucks. If those numbers combined linearly, a willingness to pay up to $10 more for a private version would be a very high valuation. (Of course, the numbers will combine in ways that are not strictly rational. Consumers satisfice.)
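
For a rough sense of scale, here’s a back-of-the-envelope sum of the study’s per-category figures. This is my illustration, not the study’s, and the strictly-linear assumption is exactly what the parenthetical above warns against:

    # Hypothetical back-of-the-envelope sum of the per-category figures quoted above.
    # Assumes the categories are independent and strictly additive, which the post
    # itself notes is unlikely (consumers satisfice).
    willingness_to_pay = {
        "conceal contact lists": 4.05,
        "keep browser history private": 2.28,
        "eliminate advertising": 2.12,
        "conceal location": 1.19,
        "conceal phone ID number": 1.75,
        "conceal text message contents": 3.58,
    }
    print(f"Linear sum: ${sum(willingness_to_pay.values()):.2f}")  # roughly $15

The raw sum lands near $15; discount it for overlap and satisficing, and something on the order of $10 still towers over typical app prices.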

A quick skim of the article leads me to think that they didn’t estimate app maker benefit from these privacy changes. How much does a consumer contact list go for? (And how does that compare to the fines for improperly revealing it?) How much does an app maker make per person whose eyeballs they sell to show ads?

A Quintet of Facebook Privacy Stories

It’s common to hear that Facebook use means that privacy is over, or no longer matters. I think that perception is deeply wrong. It’s based on the superficial notion that people making different or perhaps surprising privacy tradeoffs are never aware of what they’re doing, or that they have no regrets.

Some recent stories that I think come together to tell a meta-story of privacy:

  • Steven Levy tweeted: “What surprised me most in my Zuck interview: he says the thing most on rise is ‘sharing with smaller groups.'” (Tweet edited from 140-speak). I think that sharing with smaller groups is a pretty clear expression that privacy matters to Facebook users, and that as Facebook becomes more a part of people’s lives, the way they use it will continue to mature. For example, it turns out:
  • “71% of Facebook Users Engage in ‘Self-Censorship’” describes a study of people typing into the Facebook status box and then not hitting post. In part this may be because people are ‘internalizing the policeman’ that Facebook imposes:
  • “Facebook’s Online Speech Rules Keep Users On A Tight Leash.” This isn’t directly a privacy story, but one important facet of privacy is our ability to explore unpopular ideas. If our ability to do so in the forum in which people talk to each other is inhibited by private contract and opaque rules, then our ability to explore and grow in the privacy which Facebook affords to conversations is inhibited.
  • Om Malik: “Why Facebook Home bothers me: It destroys any notion of privacy.” An interesting perspective, but I think Facebook users still care about privacy; they will just have trouble articulating how, or taking action to preserve the values of privacy they care about.

On Cookie Blocking

It would not be surprising if an article like “Firefox Cookie-Block Is The First Step Toward A Better Tomorrow” was written by a privacy advocate. And it may well have been. But this privacy advocate is also a former chairman of the Internet Advertising Bureau. (For their current position, see “Randall Rothenberg’s Statement Opposing Mozilla’s Intention to Block Third-Party Cookies.”)

But quoting from “the first step:”

First, the current promise of ultra-targeted audiences delivered in massively efficient ways is proving to be one of the more empty statements in recent memory. Every day more data shows that what is supposed to be happening is far from reality. Ad impressions are not actually in view, targeting data is, on average, 50% inaccurate by some measures (even for characteristics like gender) and, all too often, the use of inferred targeting while solving for low-cost clicks produces cancerous placements for the marketer. At the end of the day, the three most important players in the ecosystem – the visitor, the content creator and the marketer – are all severely impaired, or even negatively impacted, by these practices.

It’s a quick read, and fascinating when you consider the source.

Privacy, Facebook and Fatigue

Facebook’s new Graph search is a fascinating product, and I want to use it. (In fact, I wanted to use it way back when I wrote about “Single Serving Friend” in 2005.)

Facebook’s Graph Search will incent Facebook users to “dress” themselves in better meta-data, so as to be properly represented in all those new structured results. People will start to update their profiles with more dates, photo tags, relationship statuses, and, and, and…you get the picture. No one wants to be left out of a consideration set, after all. (“Facebook is no longer flat“, John Battelle)

But privacy rears its predictable head, not just in the advocacy world:

Independent studies suggest that Facebook users are becoming more careful about how much they reveal online, especially since educators and employers typically scour Facebook profiles.

A Northwestern University survey of 500 young adults in the summer of 2012 found that the majority avoided posting status updates because they were concerned about who would see them. The study also found that many had deleted or blocked contacts from seeing their profiles and nearly two-thirds had untagged themselves from a photo, post or check-in. (“Search Option From Facebook Is a Privacy Test“, NYTimes)

Perhaps a small set of people will, as Battelle suggests, slow down their use of ironic, silly, or outraged likes, but the fundamental problem is that such uses are situated in a context, and when those contexts overlap, their meanings are harder to tease out with algorithms. People engage with systems like Yelp or LinkedIn in a much more constrained way, and in that constraint, make a much simpler set of meanings. But even in those simple meanings, ‘the street finds its own uses for things.’ For example, I get the idea that this 5-star review may be about something more than the design on a shirt.

There’s another study on “Facebook Fatigue:”

Bored or annoyed by Facebook? You’re not alone. A majority of people surveyed by the Pew Internet and American Life Project said they had taken sabbaticals from the social network at some point, to escape the drama, or the tedium. (“Study: Facebook fatigue — it’s real“, Jennifer Van Grove, CNet)

When our nuanced and evolved social systems are overlaid with technology, it’s intensely challenging to get the balance of technology and social right. I think the Pew research shows that Facebook has its work cut out for it.

Happy Data Privacy Day! Go check out PrivacyFix

It’s Data Privacy Day, and there may be a profusion of platitudes. But I think what we need on data privacy day are more tools to let people take control of their privacy. One way to do that is to check your privacy settings. Of course, the way settings are arranged changes over time, and checking your settings regularly is a drain.

Enter PrivacyFix.

PrivacyFix is a Firefox & Chrome plugin that you might want to check out. It looks at your Facebook and G+ settings, and helps you fix things. It also helps you send opt-out email to web site privacy addresses, which is awesome.

Not having a Facebook or G+ account, I can’t really test it. I do find the model of a plugin that works when you’re on their site (versus local UI) to be confusing. But maybe I’m not their target audience. Anyway, I did want to refer back to my Lessons from Facebook’s Stock Slide, in which I talked about intent versus identity.

(Screenshots: PrivacyFix’s “Facebook tracks” and “Google tracks” revenue estimates.)

I don’t know if PrivacyFix’s estimates of revenue are accurate. But unless they’re off by 2 orders of magnitude for each of Facebook (under-estimating) and Google (over-estimating), then wow.

Privacy and Health Care

In my post on gun control and schools, I asserted that “I worry that reducing privacy around mental health care is going to deter people who need health care from getting it.”

However, I didn’t offer up any evidence for that claim. So I’d like to follow up with some details from a report that talks about this in great detail, “The Case for Informed Consent” by Patient Privacy Rights.

So let me quote two related numbers from that report.

First, between 13 and 17% of Americans admit in surveys to hiding health information in the current system. That’s probably a lower-bound, as we can expect some of the privacy sensitive population will decline to be surveyed, and some fraction of those who are surveyed may hide their information hiding. (It’s information-hiding all the way down.)

Secondly, 1 in 8 Americans (12.5%) put their health at risk because of privacy concerns, including avoiding their regular doctor, asking their doctor to record a different diagnosis, or avoiding tests.

I’ll also note that these numbers relate to general health care, and the numbers may be higher for often-stigmatized mental health issues.

Proof of Age in UK Pilot

There’s a really interesting article by Toby Stevens at Computer Weekly, “Proof of age comes of age:”

It’s therefore been fascinating to be part of a new initiative that seeks to address proof of age using a Privacy by Design approach to biometric technologies. Touch2id is an anonymous proof of age system that uses fingerprint biometrics and NFC to allow young people to prove that they are 18 years or over at licensed premises (e.g. bars, clubs).

The principle is simple: a young person brings their proof of age document (Home Office rules stipulate this must be a passport or driving licence) to a participating Post Office branch. The Post Office staff member checks document using a scanner, and confirms that the young person is the bearer. They then capture a fingerprint from the customer, which is converted into a hash and used to encrypt the customer’s date of birth on a small NFC sticker, which can be affixed to the back of a phone or wallet. No personal record of the customer’s details, document or fingerprint is retained either on the touch2id enrolment system or in the NFC sticker – the service is completely anonymous.
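
To make that flow concrete, here is a minimal sketch of the enrolment and age-check steps, in Python. This is my own illustration of the description above, not Touch2id’s actual design: I’m assuming the reader can produce a stable fingerprint template (real biometric captures are noisy and would need something like a fuzzy extractor), and AES-GCM is an arbitrary stand-in for whatever cipher the real sticker uses.

    # Minimal sketch, not Touch2id's real implementation. Requires the third-party
    # "cryptography" package; assumes a stable fingerprint template.
    import os
    import hashlib
    from datetime import date
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive_key(fingerprint_template: bytes) -> bytes:
        # Hash the template into a 256-bit key; nothing about the person is stored.
        return hashlib.sha256(fingerprint_template).digest()

    def enroll(fingerprint_template: bytes, birth_date: date) -> bytes:
        """At the Post Office: return the blob written to the NFC sticker (nonce || ciphertext)."""
        key = derive_key(fingerprint_template)
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, birth_date.isoformat().encode(), None)
        return nonce + ciphertext

    def is_over_18(fingerprint_template: bytes, sticker_blob: bytes, today: date) -> bool:
        """At the bar: re-derive the key from a fresh capture and decrypt the date of birth."""
        key = derive_key(fingerprint_template)
        nonce, ciphertext = sticker_blob[:12], sticker_blob[12:]
        birth_date = date.fromisoformat(AESGCM(key).decrypt(nonce, ciphertext, None).decode())
        age = today.year - birth_date.year - ((today.month, today.day) < (birth_date.month, birth_date.day))
        return age >= 18

The pleasant property is visible in the sketch: the sticker holds only a nonce and ciphertext, so on its own it reveals nothing, and the date of birth is only recoverable with the bearer’s finger on the reader.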

So first, I’m excited to see this. I think single-purpose credentials are important.

Second, I have a couple of technical questions.

  • Why a fingerprint versus a photo? People are good at recognizing photos, and a photo is a less intrusive mechanism than a fingerprint. Is the security gain sufficient to justify that? What’s the quantified improvement in accuracy?
  • Is NFC actually anonymous? It seems to me that NFC tags likely have a chip ID or something similar, meaning that the system is pseudonymous rather than anonymous.

I don’t mean to let the best be the enemy of the good. Not requiring ID for drinking is an excellent way to secure the ID system. See, for example, my BlackHat 2003 talk. But I think that support can include both rah-rah enthusiasm and a careful critique of what we’re building.

Lessons from Facebook’s Stock Slide

So as Facebook continues to trade at a little over half of their market capitalization of 3 months ago, I think we can learn a few very interesting things. My goal here is not to pick on Facebook, but rather to see what we can take away and perhaps apply elsewhere. I think there are three key lessons that we can take away:

  • The Privacy Invasion Gnomes are Wrong
  • Intent Beats Identity
  • Maximizing your IPO returns may be a short term strategy

Let me start with the “Privacy Invasion Gnomes.” The short form of their strategy is:

  1. Gather lots of data on people
  2. ???
  3. Profit

This is, of course, a refinement of the original Gnome Strategy. But what Facebook shows us is:

The Privacy Invasion Gnomes are Wrong

Gathering lots of data on people is a popular business strategy. It underlies a lot of the advertising that powers breathless reporting on the latest philosophical treatise by Kim Kardashian or Paris Hilton.

But what Facebook shows us is that just gathering data on people is actually insufficient as a business strategy, because knowing that someone is a Democrat or Republican just isn’t that valuable. It’s hard to capitalize on knowing that a user is Catholic or Mormon or Sikh. There’s a limit to how much money you can make by being able to identify gays who are still in the closet.

All of which means that the security industry’s love affair with “identity” is overblown. In fact, I’m going to argue that intent beats identity every time you can get it, and you can get it if you…keep your eye on the ball.

Intent beats Identity

The idea that if you know someone, you can sell them what they need is a powerful and intuitive one. We all love the place where everyone knows your name. The hope that you can translate it into an algorithm to make it scale is an easy hope to develop.

But many of the businesses that are raking in money hand over fist on the internet aren’t doing that. Rather, they’re focused on what you want right now. Google is all about that search box. And they turn your intent, as revealed by your search, into ads that are relevant.

Sure, there’s some history now, but fundamentally, there’s a set of searches (like “asbestos” and “car insurance”) that are like kittens thrown to rabid wolves. And each of those wolves will pay to get you an ad. Similarly, Amazon may or may not care who you are when they get you to buy things. Your search is about as direct a statement of intent as it gets.

Let me put it another way:
Internet company revenue per user

The graph is from Seeking Alpha’s post, “Facebook: This Is The Bet You Are Making.”

So let me point out that two of these companies, Facebook and LinkedIn, have great, self-reinforcing identity models. Both use social pressure to drive self-representation on the site to match self-representation in various social situations. That’s pretty close to the definition of identity. (In any event, it’s a lot closer than anyone who talks about “identity issuance” can get.) And both make about 1/9th of what Google does on intent.

Generally in security, we use identification because it’s easier than intent, but what counts is intent. If a fraudster is logging into Alice’s account, and not moving money, security doesn’t notice or care (leaving privacy aside). If Alice’s husband Bob logs in as Alice, that’s a failure of identity. Security may or may not care. If things are all lovey-dovey, it may be fine, but if Bob is planning a divorce, or paying off his mistress, then it’s a problem. Intent beats identity.
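
To put that argument in toy code: the signals worth acting on are about what the session does, not whose credentials it presented. This is entirely my own illustration, not any real fraud system:

    # Toy illustration (mine, not any real system): score what the session is
    # trying to do, rather than only whose credentials it presented.
    from dataclasses import dataclass

    @dataclass
    class SessionActivity:
        credentials_valid: bool   # identity signal: right password, right cookie
        new_payee_added: bool     # intent signals: what is actually being done
        amount_moved: float
        usual_monthly_spend: float

    def risk_score(s: SessionActivity) -> float:
        """Higher means more worth a second look."""
        score = 0.0
        if not s.credentials_valid:
            score += 1.0          # identity failure alone is a weak signal
        if s.new_payee_added:
            score += 2.0          # intent: money heading somewhere new
        if s.amount_moved > 3 * s.usual_monthly_spend:
            score += 3.0          # intent: moving far more than usual
        return score

    # A fraudster with Alice's password who moves no money scores low;
    # anyone (Alice, Bob, or a thief) draining the account scores high.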

Maximizing your IPO returns may be a short term strategy

The final lesson is from Don Dodge, “How Facebook maximized the IPO proceeds, but botched the process.” His argument is a lot stronger than the finger-pointing in “The Man Behind Facebook’s I.P.O. Debacle“. I don’t have a lot to add to Don’s point, which he makes in detail, so you should go read his piece. The very short form is that by pricing as high as they did, they made money (oodles of it) on the IPO, and that was a pretty short-term strategy.

Now, if Facebook found a good way to get intent-centered, and started making money on that, botching the IPO process would matter a lot less. But that’s not what they’re doing. The latest silliness is using your mobile number and email to help merchants stalk (er, find) you on the site. That program represents a triumph of identity thinking over intent thinking. People give their mobile numbers to Facebook to help secure their accounts. Facebook then violates that intent by using the data for marketing.

So, I think that’s what we can learn from the Facebook stock slide. There may well be other lessons in an event this big, and I’d love to hear your thoughts on what they might be.