The costs of liability

It’s become common for people thinking about security economics to call for liability around security failures. The idea is that software creators who ship insecure products could be held liable, because they’re well positioned to address the problems.

I don’t think this is a trouble-free idea. There are lots of complexities. As one example, are open source vendors going to be liable? Fyodor, who writes and gives away nmap? RedHat.com? What about Apple, when it includes a third-party package, say bind or bzip, both of which were included in its latest security update? Including such third-party software allows Apple to provide basic functionality at lower cost.


Now, the UK Information Commissioner has proposed that doctors who lose laptops with patient data could be subject to a £5,000 fine.

Mr Thomas said: “If a doctor, or hospital [employee] leaves a laptop containing patients’ records in his car and it is stolen, it is hard to see that is anything but gross negligence.”

The commissioner can currently issue enforcement notices, but these “do not impose any element of punishment for wrongdoing”. But Lord Lyell of Markyate, a former Attorney-General, said it would be disproportionate to criminalise doctors for losing a laptop.

Mr Thomas said the intention was not to prosecute for a single incident, but that for gross negligence there was “a need to have some deterrent in place”. He said anyone holding personal data should know the basics of “encryption” to protect that material. (“Doctors may be prosecuted if their laptops are stolen,” Times Online, UK)

I’m with Lord Lyell here, and think that there’s a great deal of specific thinking to be done before we impose more liability for software flaws. Software creators, including Mozilla, know that it’s hard to make bug-free software; my employer probably thinks similar things.


Possibly related, “Government ignores Personal Medical Security.”


Via PogoWasRight.

7 thoughts on “The costs of liability”

  1. The first thing that came to mind when hearing about this was “well, that’s one way to discourage breach reporting”. If individuals are to be held criminally liable for losing data, then any small incentive that they might have once had for reporting their screw-ups to their bosses goes right out the door.

  2. Advocating the devil of course, but the reason we don’t know much about how to do this is simply that we’ve never done it. Once a few cases shake out, liability on software will pass from strange and scary to routine and expensive.
    Probably open source will get a pass, because there is no strong contract (e.g., contract for consideration) involved, so contract law does not say so much. It is where there is a paid contract that it gets more interesting. At a superficial reading, Microsoft would thus be much more liable than Mozilla.
    Another thing to consider is the meaning of gross negligence or criminal negligence, which for some reason is more a matter of custom than most legal concepts. So to bounce it back to the computer industry, we are now at the point of making loss of data a crime as a matter of negligence, simply because the damage done is disproportionate. Doctors will have to start treating laptops like their supplies of desirable drugs.

  3. http://worsethanfailure.com/Articles/Finally,-a-Software-Guarantee.aspx
    There is scope for arguing over whether something was installed and used right – and for what it was intended for.
    And then there’s patching – after a patch is released, is the vendor no longer liable for flaws in the outdated version, while the unpatched user is? Might a court have to decide whether a patch is really usable, as opposed to a rushed release as a cop-out? Would the vendor be allowed to charge for patches? Would the vendor be obliged to release patches separately from other features that users might not want?

  4. Ian,
    I explicitly called out redhat because they have an open source OS which they sell. Liable or not? Arguing that we should just do it, and endure the (if you’ll permit me) emergent chaos seems less clever than planning.

  5. Adam,
    yes, Redhat would be liable if the only discussion was a strong contract, because they sold a copy. Lunix (whoever that is…) would not.
    As to arguing that we should “just do it”, I wouldn’t argue that. Any law that argues for this liability has a nearly perfect chance of doing more harm than good.
    I personally would see that there is sufficient cause for a class action process. Having said that, I’m not a lawyer, they are complex things and it hasn’t happened yet, so I’m wrong somewhere…

  6. I think that you can safely separate software from doctors in this case due to the clause of negligence. Doctors, after accepting personal information from patients, are supposed to be aware that they are responsible for keeping it safe – both in the EU and in the US. This is not new. What is unique is that England is acknowledging that doctors have not done an exceptional job of securing that information – despite knowing the possible consequences.
    On the other hand, a security vulnerability is an unexpected event in theory. It would be hard to regulate. I mean, how can you regulate due diligence in the testing process? You don’t even know what possible attack vectors are around next month. :)
