Security Breaches Are Good for You: My Shmoocon talk

At Shmoocon, I talked about how “Security Breaches are Good for You.” The talk deviated a little from the proposed outline. I blame emergent chaos.

Since California’s SB 1386 came into effect, we have recorded public notice of over 500 security breaches. There is a new legal and moral norm emerging: breaches should be disclosed. This is the most significant event in information security since Aleph1 published “Smashing the Stack for Fun and Profit” and brought stack-smashing to the masses.

The reason that breaches are so important is that they provide us with an objective, hard-to-manipulate data set which we can use to look at the world. It’s a basis for evidence in computer security. Breaches offer a unique and new opportunity to study what really goes wrong. They allow us to move beyond purely qualitative arguments about how bad things are, or why they are bad, and add quantification. The public awareness of the data lost on laptops is one example of this. There’s no doubt that the data we get from these laws is imperfect, but look at the alternative: the FBI/CSI survey.

The talk covered why breaches are an important opportunity, some threats to the emergent data, and what we can do to improve the quality and quantity of the data that can drive security science.

Rather than posting the slides alone, I’ve posted slides with a running commentary, because I didn’t think the slides were particularly self-explanatory.

[Update: fixed spelling.]

7 thoughts on “Security Breaches Are Good for You: My Shmoocon talk”

  1. I’ll have a read in a bit.
    threts – sp
    And of course there are security events other than customer data disclosure – any thoughts on how those can be subjected to evidence-based assessment?

  2. I’m mad at you for doing this better than I did.
    What about other kinds of breaches? The apparent moral standard only applies to personal information. Seems like there are lots of other kinds of breaches. Do we need *gasp* a law?

  3. This is great. Thanks. I’m a newly minted prof at the University of Colorado, a former federal prosecutor, and a former CS major and sysadmin. I recently wrote a paper called the Myth of the Superuser about how people obsess too much about powerful, malevolent computer users. My principal conclusion is that there isn’t enough data and scientific analysis in computer security — it’s all just hype and paranoia. If it gets picked up for publication, I’ll be sure to cite your talk on this point.

  4. I believe that the moose graph needs to clarify a few points; is the moose-watching being done in a (reasonably close to) inertial frame? I’m afraid that the effects of temporal/spatial distortions at observed moose-like velocities may affect the gathered data.

  5. Ant, Bliv, your points raise an important issue that I’ll talk about tomorrow in a new post.
    Paul, thanks for the pointer–I’ll take a look.
    Brullig, those moose observations were taken by surplus soviet observation satellites previously used to obtain blackmail information on senior US officials. They’re reputedly good enough to read the serial numbers off the bills as they’re passed, but without direct observation it’s hard to know if we should trust that claim.

  6. How very Shmoo’y. Did you notice that the end of today’s SF Gate story on the massive TJX catastrophe has this comment?
    http://www.sfgate.com/cgi-bin/article.cgi?f=/n/a/2007/03/29/financial/f085202D95.DTL
    “TJX shares rose 35 cents, or about 1 percent, to close at $26.85 on the New York Stock Exchange.”
    A breach report is certainly new and useful data but, like most isolated data sets, there is no simple and direct relationship to share price. I sometimes have to laugh when execs talk about breaches only in terms of direct and immediate impact rather than the stream of “fraudulent” purchases they have to eat or the long tail of costs related to fixing identity theft.
    BTW, who is the “you”?

  7. You wrote:

    So breach disclosure is good for you. It allows us to overcome fears. It allows us to discuss some of our problems in a forthright manner. We can use the data to start investigating what happens and why. The data isn’t great, but I expect it will get better.

    Schechter and Smith may be a good reference for you. Snippet from my Silver Bullets paper:

Schechter & Smith model risks and rewards from the attacker’s point of view, an approach which further supports the utility of victims sharing information:

Sharing of information is also key to keeping marginal risk high. If the body of knowledge of each member of the defense grows with the number of targets attacked, so will the marginal risk of attack. If organizations do not share information, the body of knowledge of each one will be constant and will not affect marginal risk. (Stuart E. Schechter and Michael D. Smith, “How Much Security is Enough to Stop a Thief?”, Financial Cryptography 2003, LNCS, Springer-Verlag.)

Yet sharing raises costs for the sharer, and the benefits do not accrue to the sharer. This is a prisoner’s dilemma for security: there may well be a higher payoff if all victims share their experiences, yet those who keep mum still reap the benefits of others’ sharing without bearing its costs. As all potential sharers are joined in an equilibrium of secrecy, little sharing of security information is seen, and this is rational. We return to this equilibrium later.
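That sharing dilemma can be sketched as a toy payoff matrix. The payoff values below are purely hypothetical illustrations (not from the Schechter & Smith paper or any data set); they just encode the structure described above: mutual sharing beats mutual secrecy, but keeping mum is each firm’s individually best move.

```python
# Toy prisoner's dilemma for breach-information sharing.
# All payoff numbers are hypothetical; only their ordering matters.
# Each entry maps (my_move, their_move) -> (my_payoff, their_payoff).
payoffs = {
    ("share", "share"):       (3, 3),  # both gain from pooled knowledge
    ("share", "keep_mum"):    (0, 4),  # sharer bears cost; free-rider gains
    ("keep_mum", "share"):    (4, 0),
    ("keep_mum", "keep_mum"): (1, 1),  # the equilibrium of secrecy
}

def best_response(their_move):
    """Return the move maximizing my payoff, given the other firm's move."""
    return max(("share", "keep_mum"),
               key=lambda my_move: payoffs[(my_move, their_move)][0])

# Whatever the other firm does, secrecy pays better individually,
# even though mutual sharing (3, 3) dominates mutual secrecy (1, 1).
print(best_response("share"))     # keep_mum
print(best_response("keep_mum"))  # keep_mum
```

With these payoffs, “keep mum” is a dominant strategy for each player, which is exactly why the equilibrium of secrecy is rational despite being collectively worse.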

Comments are closed.