Quarter of a million Welsh profiles added to DNA database since 2000.
[I forget who linked to this one.]
CCTV in the spotlight: one crime solved for every 1,000 cameras
[Via the security metrics mailing list.]
Which I’m not — but if I were, now would be the time.
‘Unbreakable’ quantum cryptography hacked without detection using lasers
It’s opening in New York this weekend, and the New York Times has a review.
So I’m having a conversation with a friend about caller ID blocking. And it occurs to me that my old phone with AT&T, before Cingular bought them, had this nifty feature, “show my caller-ID to people in my phone book.”
Unfortunately, my current phone doesn’t have that, because Steve Jobs has declared that “Apple’s goal is to provide our customers with the best possible user experience. We have been able to do this by designing the hardware and software in our products to work together seamlessly.” Setting aside Michael Arrington’s excellent deconstruction, there’s a little feature, easy to implement if you have access to the call setup function (dial with a prepended *67). But we don’t have the ability to do that. Because that would be better than the best possible experience, and obviously, that’s not possible.
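The missing feature is trivial logic if the OS exposes a dial hook. A minimal sketch (hypothetical function names; assumes the North American *67 per-call caller-ID-blocking prefix):

```python
# Hypothetical sketch of "show my caller-ID only to people in my phone book."
# *67 is the North American per-call prefix that suppresses caller-ID.
CALLER_ID_BLOCK_PREFIX = "*67"

def number_to_dial(number: str, phone_book: set) -> str:
    """Prepend *67 when the callee isn't in the phone book."""
    if number in phone_book:
        return number  # a contact: let them see our caller-ID
    return CALLER_ID_BLOCK_PREFIX + number  # a stranger: block it

contacts = {"5551234"}
print(number_to_dial("5551234", contacts))  # 5551234
print(number_to_dial("5559999", contacts))  # *675559999
```

A few lines in the call-setup path, which is exactly the part of the stack we're not allowed to touch.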
So I was thinking about the question of the value of privacy, and it occurred to me that there may be an interesting natural experiment we can observe, and that is national security clearances in the US. For this post, I’ll assume that security clearances work for their primary purpose, which is to keep foreign intelligence agents out of sensitive jobs. But articles like this indicate that it’s worth a $5-15,000 salary premium.
Part of the premium is that getting a clearance for an employee is slow and expensive, as this Govcentral article says: “…it can take noncleared employees between six months and two years to receive a new clearance — an unacceptable time frame for many organizations that have significant contracts to deliver in the near term. In addition, the clearance process often is very expensive.”
But even with that issue, has the number of jobs requiring a clearance gone up so quickly as to create that degree of salary imbalance? At some point, the number of cleared people should catch up with the surge in government employment. At that point, the difference between a cleared and uncleared employee comes down to (1) the cost of getting a clearance and (2) the market impact of having your life examined and judged by strangers.
Is that $1,000 a year for being unable to select the strangers?
Quick follow-up to Adam’s Monday post New on SSRN. Rob Westervelt over at SearchSecurity.com tells us about a social network privacy study that finds an identity link to cookies. Turns out that passing unique identifiers in referring URLs isn’t such a smart idea after all. Color me shocked. The full paper is linked from Rob’s article.
I remember when Derek Atkins was sending mail to the cypherpunks list, looking for hosts to dedicate to cracking RSA-129. I remember when they announced that “The Magic Words are Squeamish Ossifrage,” and how it took 600 people with 1,600 machines months of work, and then a Bell Labs supercomputer, to work through the data. I had a fun little stroll down memory lane reading about average machines not having more than 16 MB of RAM, and how they borrowed a server with two, and later three, 900 MB disks. 129 decimal digits fit in 430 bits. The RSA-129 paper concludes:
We conclude that commonly-used 512-bit RSA moduli are vulnerable to any organization prepared to spend a few million dollars and to wait a few months.
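The digit-to-bit arithmetic above is easy to check directly; a quick sketch:

```python
import math

# The largest 129-digit number is 10**129 - 1; its bit length is the
# number of bits needed to hold any 129-digit modulus such as RSA-129.
bits_needed = (10**129 - 1).bit_length()
print(bits_needed)  # 429, comfortably within the 430 bits mentioned above

# Equivalently, 129 * log2(10) ≈ 428.5, rounded up:
assert math.ceil(129 * math.log2(10)) == bits_needed
```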
Fast-forwarding to this week, David Molnar mentions that “We’re living in the future now:”
The 512-bit RSA key used for signing applications and firmware updates for the TI-83 has been factored. By some person working on his or her own. With one computer.
David links to “Calculator hackers crack OS signing key, opening a closed platform,” and following links, we get to “fun number theory facts:”
A mathematical morsel for your entertainment and edification.
is the product of
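The specific modulus and its factors are in the linked post. As a generic, hedged sketch: verifying a claimed factoring of a signing key takes only a multiplication and two primality tests. Here is a self-contained version using Miller-Rabin, with a toy modulus standing in for the real 512-bit key:

```python
import random

def is_probable_prime(n: int, rounds: int = 20) -> bool:
    """Miller-Rabin probabilistic primality test (stdlib only)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

def check_factorization(n: int, p: int, q: int) -> bool:
    """A claimed break is convincing exactly when p*q == n and both factors are prime."""
    return p * q == n and is_probable_prime(p) and is_probable_prime(q)

# Toy stand-in for the 512-bit TI-83 modulus:
assert check_factorization(10007 * 10009, 10007, 10009)
assert not check_factorization(10007 * 10009, 10007, 10011)
```

That asymmetry is the whole story: factoring the key took one person and one computer months, but anyone can confirm the result in microseconds.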
A little more seriously, the identity of a blog is constructed between the authors, commenters and readers, and I’m continually amazed by what emerges here.
At the same time, what’s emerging is currently not very chaotic, and I’m wondering if it’s time to mix things up. Suggestions welcome.
In 2007, artist Kristin Sue Lucas went before a judge to get a name change to…Kristin Sue Lucas. She’s put together a show called “Refresh” and one called “Before and After.” My favorite part is where the judge wrestles with the question “what happens when you change a thing to itself:”
JR: And I don’t mind the time. I just don’t know that I have the legal authority to change your name when it’s not a change. The code sections talk about changing. Can I give you an order that doesn’t change your name at all? That keeps your name the same? Is that the same as granting a name change? And I think not. And I’m going to do this, I’m going to continue this matter for two weeks… and try to think about these issues in this time…
There are new papers by two law professors whose work I enjoy. I haven’t finished the first or started the second, but I figured I’d post pointers, so you’ll have something to read as we here at the Combo improvise around Cage’s 4′33″.
Paul Ohm has written “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,”
Computer scientists have recently undermined our faith in the privacy-protecting power of anonymization, the name for techniques for protecting the privacy of individuals in large databases by deleting information like names and social security numbers. These scientists have demonstrated they can often ‘reidentify’ or ‘deanonymize’ individuals hidden in anonymized data with astonishing ease. By understanding this research, we will realize we have made a mistake, labored beneath a fundamental misunderstanding, which has assured us much less privacy than we have assumed. This mistake pervades nearly every information privacy law, regulation, and debate, yet regulators and legal scholars have paid it scant attention. We must respond to the surprising failure of anonymization, and this Article provides the tools to do so.
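The mechanics behind Ohm’s point are simple enough to sketch. A toy illustration (entirely made-up data and field names): deleting names does not delete identity when quasi-identifiers like ZIP code, birth date, and sex survive in the release and can be joined against a public roster.

```python
# Hypothetical "anonymized" release: names deleted, quasi-identifiers kept.
anonymized_records = [
    {"zip": "12345", "dob": "1970-01-02", "sex": "F", "condition": "flu"},
    {"zip": "12345", "dob": "1980-03-04", "sex": "M", "condition": "asthma"},
]
# Hypothetical public roster (e.g. a voter list) that includes names.
public_roster = [
    {"name": "Alice", "zip": "12345", "dob": "1970-01-02", "sex": "F"},
]

def reidentify(records, roster):
    """Join the two tables on the (zip, dob, sex) quasi-identifier triple."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(r): r["name"] for r in roster}
    return [(names[key(r)], r["condition"]) for r in records if key(r) in names]

print(reidentify(anonymized_records, public_roster))  # [('Alice', 'flu')]
```

No cryptanalysis, no cleverness: just a dictionary lookup, which is why “we deleted the names” turns out to be such a weak promise.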
Michael Froomkin has posted a draft of “Government Data Breaches.”
This paper addresses the legal response to data breaches in the US public sector. Private data held by the government is often the result of legally required disclosures or of participation in formally optional licensing or benefit schemes where the government is as a practical matter the only game in town. These coercive or unbargained-for disclosures impute a heightened moral duty on the part of the government to exercise careful stewardship over private data. But the moral duty to safeguard the data and to deal fully and honestly with the consequences of failing to safeguard them is at best only partly reflected in current state and federal statute law and regulations. The paper begins with an illustrative survey of federal data holdings, known breach cases, and the extent to which the government’s moral duty to safeguard our data is currently instantiated in statute law and, increasingly, in regulation.