The Evolution of Apple’s Differential Privacy

Bruce Schneier comments on “Apple’s Differential Privacy:”

So while I applaud Apple for trying to improve privacy within its business models, I would like some more transparency and some more public scrutiny.

Do we know enough about what’s being done? No, and my bet is that Apple doesn’t know precisely what they’ll ship, and aren’t answering deep technical questions so that they don’t misspeak. I know that when I was at Microsoft, details like that got adjusted as a bigger pile of real data from real customer use informed things. I saw some really interesting shifts surprisingly late in the dev cycle of various products.

I also want to challenge the way Matthew Green closes: “If Apple is going to collect significant amounts of new data from the devices that we depend on so much, we should really make sure they’re doing it right — rather than cheering them for Using Such Cool Ideas.”

But that is a false dichotomy, and it would be silly even if it were not. It’s silly because we can’t be sure they’re doing it right until after they ship and we can see the details. (And perhaps not even then.)

But even more important, the dichotomy is not “are they going to collect substantial data or not?” They are. The value organizations get from being able to observe their users is enormous. Product managers have seen what A/B testing on their web properties does for the speed of product improvement, and they want to bring that same ability to other platforms. Those that learn fastest will win, for the same reasons that first to market used to win.
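For readers who want a concrete sense of what’s under discussion, here is a minimal sketch of randomized response, the classic local differential privacy mechanism. To be clear, this is illustrative only: Apple’s deployed mechanisms are more elaborate, and their details are exactly what’s in question.

```python
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true bit with probability p_truth; otherwise flip a coin.
    Any single report is deniable, because it might just be noise."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the noise for an unbiased population estimate:
    E[report] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# The aggregator learns the population rate (~0.3 here) without being
# able to tell whether any individual's report was truth or noise.
truths = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truths]
print(round(estimate_rate(reports), 3))
```

The point of the technique is that the aggregator can estimate population statistics while any individual report remains deniable noise, which is why the undisclosed details, like the noise parameters, matter so much.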

Next, are they going to get it right on the first try? No. Almost guaranteed. Software, as we learned a long time ago, has bugs. As I discussed in “The Evolution of Secure Things:”

It’s a matter of the pressures brought to bear on the designs of even what we now see as the very simplest technologies. It’s about the constant imperfection of products, and how engineering is a response to perceived imperfections. It’s about the chaotic real world from which progress emerges. In a sense, products are never perfected, but express tradeoffs between many pressures, like manufacturing techniques, available materials, and fashion, in both superficial and deep ways.

Green (and Schneier) are right to be skeptical, and may even be right to be cynical. We should not lose sight of the fact that Apple is spending rare privacy engineering resources to do better than Microsoft. Near as I can tell, this is an impressive delivery on the commitment to be the company that respects your privacy, and I say that believing that there will be both bugs and design flaws in the implementation. Green has an impressive record of finding and calling Apple (and others) on such, and I’m optimistic he’ll have happy hunting.

In the meantime, we can, and should, cheer Apple for trying.

RSA: Time for some cryptographic dogfood

One of the most effective ways to improve your software is to use it early and often.  This used to be called eating your own dogfood, which is far more evocative than the alternatives. The key is that you use the software you’re building. If it doesn’t taste good to you, it’s probably not customer-ready.  And so this week at RSA, I think more people should be eating the security community’s cryptographic dogfood.

As I evangelize the use of crypto to arrange meetings at RSA, I’ve encountered many problems, such as choice of tool, availability of tools across a set of mobile platforms, cost of entry, etc. Each of these is predictable, but with dogfooding — forcing myself to ask everyone why they want to use an easily wiretapped protocol — the issues stand out, and the companies that will be successful will start thinking about ways to overcome them.

So this week, as you prep for RSA, spend a few minutes to get some encrypted communications tool. The worst that can happen is you’re no more secure than you were before you read this post.

What Price Privacy, Paying For Apps edition

There’s a new study on what people would pay for privacy in apps. As reported by Techflash:

A study by two University of Colorado Boulder economists, Scott Savage and Donald Waldman, found the average user would pay varying amounts for different kinds of privacy: $4.05 to conceal contact lists, $2.28 to keep their browser history private, $2.12 to eliminate advertising on apps, $1.19 to conceal personal locations, $1.75 to conceal the phone’s ID number and $3.58 to conceal the contents of text messages.

Those numbers seem small, but they’re in the context of app pricing, which is generally a few bucks. If those numbers combined linearly, people would be willing to pay nearly $15 more for a fully private version, which is a very high valuation. (Of course, the numbers will combine in ways that are not strictly rational. Consumers satisfice.)
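For concreteness, here’s the back-of-the-envelope sum (the feature labels are mine, paraphrasing the study):

```python
# Willingness to pay per privacy feature, from the Savage & Waldman study.
valuations = {
    "conceal contact list": 4.05,
    "keep browser history private": 2.28,
    "eliminate in-app advertising": 2.12,
    "conceal location": 1.19,
    "conceal phone ID number": 1.75,
    "conceal text message contents": 3.58,
}
print(f"${sum(valuations.values()):.2f}")  # $14.97, if they combined linearly
```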

A quick skim of the article leads me to think that they didn’t estimate app maker benefit from these privacy changes. How much does a consumer contact list go for? (And how does that compare to the fines for improperly revealing it?) How much does an app maker make per person whose eyeballs they sell to show ads?

A Quintet of Facebook Privacy Stories

It’s common to hear that Facebook use means that privacy is over, or no longer matters. I think that perception is deeply wrong. It’s based on the superficial notion that people making different or perhaps surprising privacy tradeoffs are never aware of what they’re doing, or that they have no regrets.

Some recent stories that I think come together to tell a meta-story of privacy:

  • Steven Levy tweeted: “What surprised me most in my Zuck interview: he says the thing most on rise is ‘sharing with smaller groups.'” (Tweet edited from 140-speak). I think that sharing with smaller groups is a pretty clear expression that privacy matters to Facebook users, and that as Facebook becomes more a part of people’s lives, the way they use it will continue to mature. For example, it turns out:
  • “71% of Facebook Users Engage in ‘Self-Censorship’” reports on a study of people typing into the Facebook status box and not hitting post. In part this may be because people are ‘internalizing the policeman’ that Facebook imposes:
  • “Facebook’s Online Speech Rules Keep Users On A Tight Leash.” This isn’t directly a privacy story, but one important facet of privacy is our ability to explore unpopular ideas. If our ability to do so in the forum in which people talk to each other is inhibited by private contract and opaque rules, then our ability to explore and grow in the privacy which Facebook affords to conversations is inhibited.
  • Om Malik: “Why Facebook Home bothers me: It destroys any notion of privacy.” An interesting perspective. Facebook users still care about privacy, but they will have trouble articulating that concern or taking action to preserve the values of privacy they care about.

On Cookie Blocking

It would not be surprising if an article like “Firefox Cookie-Block Is The First Step Toward A Better Tomorrow” was written by a privacy advocate. And it may well have been. But this privacy advocate is also a former chairman of the Internet Advertising Bureau. (For their current position, see “Randall Rothenberg’s Statement Opposing Mozilla’s Intention to Block Third-Party Cookies.”)

But quoting from “the first step:”

First, the current promise of ultra-targeted audiences delivered in massively efficient ways is proving to be one of the more empty statements in recent memory. Every day more data shows that what is supposed to be happening is far from reality. Ad impressions are not actually in view, targeting data is, on average, 50% inaccurate by some measures (even for characteristics like gender) and, all too often, the use of inferred targeting while solving for low-cost clicks produces cancerous placements for the marketer. At the end of the day, the three most important players in the ecosystem – the visitor, the content creator and the marketer – are all severely impaired, or even negatively impacted, by these practices.

It’s a quick read, and fascinating when you consider the source.

Privacy, Facebook and Fatigue

Facebook’s new Graph search is a fascinating product, and I want to use it. (In fact, I wanted to use it way back when I wrote about “Single Serving Friend” in 2005.)

Facebook’s Graph Search will incent Facebook users to “dress” themselves in better meta-data, so as to be properly represented in all those new structured results. People will start to update their profiles with more dates, photo tags, relationship statuses, and, and, and…you get the picture. No one wants to be left out of a consideration set, after all. (“Facebook is no longer flat”, John Battelle)

But privacy rears its predictable head, not just in the advocacy world:

Independent studies suggest that Facebook users are becoming more careful about how much they reveal online, especially since educators and employers typically scour Facebook profiles.

A Northwestern University survey of 500 young adults in the summer of 2012 found that the majority avoided posting status updates because they were concerned about who would see them. The study also found that many had deleted or blocked contacts from seeing their profiles and nearly two-thirds had untagged themselves from a photo, post or check-in. (“Search Option From Facebook Is a Privacy Test”, NYTimes)

Perhaps a small set of people will, as Battelle suggests, slow down their use of ironic, silly, or outraged likes, but the fundamental problem is that such uses are situated in a context, and when those contexts overlap, their meanings are harder to tease out with algorithms. People engage with systems like Yelp or LinkedIn in a much more constrained way, and in that constraint, make a much simpler set of meanings. But even in those simple meanings, ‘the street finds its own uses for things.’ For example, I get the idea that this 5-star review may be about something more than the design on a shirt.

There’s another study on “Facebook Fatigue:”

Bored or annoyed by Facebook? You’re not alone. A majority of people surveyed by the Pew Internet and American Life Project said they had taken sabbaticals from the social network at some point, to escape the drama, or the tedium. (“Study: Facebook fatigue — it’s real”, Jennifer Van Grove, CNet)

When our nuanced and evolved social systems are overlaid with technology, it’s intensely challenging to get the balance of technology and social right. I think the Pew research shows that Facebook has its work cut out for it.

Happy Data Privacy Day! Go check out PrivacyFix

It’s Data Privacy Day, and there may be a profusion of platitudes. But I think what we need on data privacy day are more tools to let people take control of their privacy. One way to do that is to check your privacy settings. Of course, the way settings are arranged changes over time, and checking your settings regularly is a drain.

Enter PrivacyFix.

PrivacyFix is a Firefox & Chrome plugin that you might want to check out. It looks at your Facebook and G+ settings, and helps you fix things. It also helps you send opt-out email to web site privacy addresses, which is awesome.

Not having a Facebook or G+ account, I can’t really test it. I do find the model of a plugin that works when you’re on their site (versus local UI) to be confusing. But maybe I’m not their target audience. Anyway, I did want to refer back to my “Lessons from Facebook’s Stock Slide,” in which I talked about intent versus identity.

[Screenshots: PrivacyFix’s estimates of what Facebook and Google earn from tracking me]

I don’t know if PrivacyFix’s estimates of revenue are accurate. But unless they’re off by 2 orders of magnitude for each of Facebook (under-estimating) and Google (over-estimating), then wow.

Privacy and Health Care

In my post on gun control and schools, I asserted that “I worry that reducing privacy around mental health care is going to deter people who need health care from getting it.”

However, I didn’t offer up any evidence for that claim. So I’d like to follow up with some details from a report that talks about this in great detail, “The Case for Informed Consent” by Patient Privacy Rights.

So let me quote two related numbers from that report.

First, between 13 and 17% of Americans admit in surveys to hiding health information in the current system. That’s probably a lower bound, as we can expect some of the privacy-sensitive population will decline to be surveyed, and some fraction of those who are surveyed may hide their information hiding. (It’s information-hiding all the way down.)

Second, 1 in 8 Americans (12.5%) put their health at risk because of privacy concerns, including avoiding their regular doctor, asking their doctor to record a different diagnosis, or avoiding tests.

I’ll also note that these numbers relate to general health care, and the numbers may be higher for often-stigmatized mental health issues.

Proof of Age in UK Pilot

There’s a really interesting article by Toby Stevens at Computer Weekly, “Proof of age comes of age:”

It’s therefore been fascinating to be part of a new initiative that seeks to address proof of age using a Privacy by Design approach to biometric technologies. Touch2id is an anonymous proof of age system that uses fingerprint biometrics and NFC to allow young people to prove that they are 18 years or over at licensed premises (e.g. bars, clubs).

The principle is simple: a young person brings their proof of age document (Home Office rules stipulate this must be a passport or driving licence) to a participating Post Office branch. The Post Office staff member checks the document using a scanner, and confirms that the young person is the bearer. They then capture a fingerprint from the customer, which is converted into a hash and used to encrypt the customer’s date of birth on a small NFC sticker, which can be affixed to the back of a phone or wallet. No personal record of the customer’s details, document or fingerprint is retained either on the touch2id enrolment system or in the NFC sticker – the service is completely anonymous.
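As I read that description, the pattern is key derivation from a biometric, followed by symmetric encryption of a single attribute. Here is a minimal sketch of that pattern in Python, my reconstruction from the paragraph above rather than Touch2id’s actual design. Note the big assumption in the middle: real fingerprint scans are noisy, so a deployed system needs a fuzzy extractor or a stable template where this sketch assumes a repeatable byte string.

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def key_from_template(fingerprint_template: bytes) -> bytes:
    # ASSUMPTION: the reader yields a stable template. Real biometric
    # systems need a fuzzy extractor to derive a repeatable key.
    return hashlib.sha256(fingerprint_template).digest()

def enroll(fingerprint_template: bytes, date_of_birth: str) -> bytes:
    """Post Office step: encrypt the DOB under a fingerprint-derived key.
    Only this ciphertext goes on the NFC sticker; no central record."""
    key = key_from_template(fingerprint_template)
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, date_of_birth.encode(), None)

def verify_at_venue(fingerprint_template: bytes, sticker: bytes) -> str:
    """Bar step: a fresh scan re-derives the key and decrypts the DOB."""
    key = key_from_template(fingerprint_template)
    nonce, ciphertext = sticker[:12], sticker[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None).decode()

template = b"stable-template-from-reader"  # hypothetical reader output
sticker = enroll(template, "1995-06-01")
print(verify_at_venue(template, sticker))  # 1995-06-01
```

The attractive property is visible in the sketch: the sticker holds only ciphertext, and the key is never stored anywhere, because it is re-derived from the customer’s finger at the point of use.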

So first, I’m excited to see this. I think single-purpose credentials are important.

Second, I have a couple of technical questions.

  • Why a fingerprint versus a photo? People are good at recognizing photos, and a photo is a less intrusive mechanism than a fingerprint. Is the security gain sufficient to justify that? What’s the quantified improvement in accuracy?
  • Is NFC actually anonymous? It seems to me that NFC likely has a chip ID or something similar, meaning that the system is pseudonymous.

I don’t mean to let the best be the enemy of the good. Not requiring ID for drinking is an excellent way to secure the ID system. See, for example, my BlackHat 2003 talk. But I think that support can be both rah-rah and a careful critique of what we’re building.