From the Heresy Desk

Theatre Security

Before Bruce Schneier popularized it, “Security Theatre” was a term I heard from what I call Real Security People. I was designing a security-oriented NOC, and I interviewed people who had built secure sites for a couple of governments, banks, and others. They said that what The Adversary thinks you can do is more important than what you can do. I was told that perception is the majority of security: “Maybe not two-thirds, but definitely more than half.” As the team built the system, we took this to heart, which made it more fun, at the very least. I also heard from someone I know who nmapped our system, received an nmap scan in return, and decided it wasn’t a good idea to go further. In that case, at least, the security theatre worked.

We also used a bit of security through obscurity. We tweaked some of our network protocols so that they were merely incompatible with the off-the-shelf stuff. Our protocol banners lied. We particularly enjoyed having them declare that they were known to be vulnerable in odd ways. It was at least informative that the random attacks that came by were not tailored. No one ever tried SPARC vulnerabilities on the server claiming to be SunOS 4 with BIND 3. They hit it with Windows buffer overflows anyway. That was disappointing, but we also learned an important lesson: the only people who care what your banners say are the good guys. The bad guys find it more economical to just spray you with whatever exploits they have in their bag of tricks. Or at least most of the bad guys do.
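A lying banner is easy to sketch. The fragment below is a minimal, hypothetical illustration (the hostname, banner text, and function names are all invented, not the system described above): on connect it advertises an ancient Sendmail it does not actually run, then logs whatever the client tries next, which is where the log-analysis payoff comes from.

```python
import socket
import threading

# Hypothetical decoy banner: claims to be an old, "vulnerable" mail stack.
# Any exploit tailored to this banner immediately stands out in the logs.
DECOY_BANNER = b"220 mail.example.com SMTP Sendmail 5.65; ready\r\n"

def serve_decoy(host="127.0.0.1", port=0):
    """Accept one connection, send the lying banner, log the client's probe."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def handler():
        conn, addr = srv.accept()
        conn.sendall(DECOY_BANNER)
        probe = conn.recv(1024)  # whatever the scanner tries next
        print(f"probe from {addr}: {probe!r}")  # feed this to log analysis
        conn.close()
        srv.close()

    threading.Thread(target=handler, daemon=True).start()
    return srv.getsockname()[1]  # the actual port bound
```

The interesting part is not the lie itself but the asymmetry: a human attacker who believes the banner wastes effort on the wrong exploits, while the log line records exactly what they believed.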

Security through obscurity has gotten a bad reputation, partly because some people think that merely being obscure is being secure, and partly because others think a mediocre security system can be made secure by being obscure. If, however, you start with good security and then put a bit of obscurity on top, it’s a bonus. Think of security as armor and obscurity as camouflage. Camouflage is not armor; obscurity is not security. People who tell you it is are trying to sell you something. But an attacker faced with armored things that are also camouflaged has a harder job. If you back up the camouflage with good log analysis, you can take the element of surprise away from the attacker. The total effect is good security theatre, a theatre that might result in deterrence. Just be honest about it, especially with yourself. If the attacker discovers there is no armor behind your camouflage, you have handed the advantage to a well-prepared opponent.

There are other reasons to eschew obscurity. It isn’t scalable, and it doesn’t lead to market solutions. You can’t shop around for the best obscurity. The notion of a global secret is somewhere between ironic and silly. This is why DRM systems don’t work against determined attackers. However, not everything needs to be open, scalable, and market-driven. If you are building a system that is closed, proprietary, and local (such as the secure NOC I was working on), obscurity can be a valuable spice in the dish that makes a tasty meal tastier.

We are also seeing changes in the threat model that justify a revision of our defense model. A few years ago, attackers used broadcast attacks. They didn’t look at the lies we told them because they were unskilled attackers throwing every handy exploit they had. They wouldn’t see the things that didn’t fit their model. I have a story about that which I’ll post soon.

The trend is toward attacks that are slow and targeted, with a clear goal: money. The attackers want not only to succeed, but to succeed undetected. Any measure that increases the attacker’s uncertainty therefore increases the attacker’s risk of being caught.

Here’s an informal example. Suppose I divide my system into an external “red” network and an internal “black” network. All connections use TLS with AES-256, but on the black network we are not using standard AES; we’re using a modified cipher that real cryptographers agree is just as secure, merely incompatible with AES. Call it AEN, for Advanced Encryption Non-standard. (Cryptographers have a formal notion of this, which they call “family keys.”) AEN is my spice. On the black network, you’re expected to use AEN; we simply compiled it into OpenSSL in place of AES. The resulting system is just as secure as one that uses AES everywhere, but it has this extra little twist, which makes the attacker’s job harder and makes our job of detecting an attack easier. It has costs, of course, which you can enumerate as well as I can. But in my system, which is not only closed but which I want to be closed, they are not bad costs to pay. Even better, if I publicize that I’ve done this, I might convince an attacker to target someone else.
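The family-key idea can be sketched without touching OpenSSL. The toy below is emphatically not real cryptography and not the modified AES described above: a SHA-256 counter-mode stream cipher stands in for the block cipher, and `FAMILY_CONST` is an invented site secret. It only illustrates the shape of the trick — mixing a site-wide constant into key setup yields a variant that is incompatible with the standard one, so standard traffic on the black network stands out.

```python
import hashlib
from itertools import count

# Hypothetical site-wide "family" constant: the obscure tweak that makes
# our variant incompatible with the standard cipher. Toy illustration only.
FAMILY_CONST = b"black-network family constant"

def _keystream(key: bytes, nonce: bytes, n: int, family: bytes = b"") -> bytes:
    """Toy SHA-256 counter-mode keystream. An empty `family` gives the
    'standard' cipher; a family constant gives the incompatible variant."""
    out = b""
    for i in count():
        if len(out) >= n:
            break
        out += hashlib.sha256(family + key + nonce + i.to_bytes(8, "big")).digest()
    return out[:n]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes, family: bytes = b"") -> bytes:
    """XOR stream cipher: the same call decrypts, since XOR is its own inverse."""
    ks = _keystream(key, nonce, len(plaintext), family)
    return bytes(a ^ b for a, b in zip(plaintext, ks))
```

The point of the demonstration is interoperability, not strength: peers sharing both the session key and the family constant talk normally, while a peer with only the key — an attacker who assumes standard AES — produces garbage, and its malformed traffic is exactly what the detection side watches for.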

If you remember that obscurity is not security, that it is camouflage rather than armor, that it is not scalable, that it is only as good as the obscurity itself is, there might be places you can use it effectively. Also, not all security theatre is bad. What is bad is only having theatre and not backing up obscurity with real security.
Photo of theatre security courtesy of Luigi Rosa.

6 thoughts on “From the Heresy Desk”

  1. I’m glad someone wrote this article. Deception and obfuscation are both well known defensive tactics in every other discipline.

  2. Interesting – I visited Counterpane’s (Schneier’s company) NOC in Virginia some years ago. The door was in the back of an office complex with all the windows tinted, and no identifying information anywhere. Even if you got into the reception area, the only place to go was the small bathroom there – all the rest of the doors had biometrics and no signs. You had to go through a “man trap” to get to the conference room, and then they de-polarized the glass in the room to let you see the NOC.
    Security Theatre and Obscurity, indeed.

  3. I agree, excellent article. Obscurity isn’t armor, but then again, there are things that you don’t necessarily NEED to publish, either — or make it too easy for an attacker to know. The creative use of nonstandard infrastructure, as long as it doesn’t overly complicate support, is a good tactic to add to the bag of tricks.

  4. I like to see the orthodoxy challenged in a thoughtful way. The take home message is also good: Obscurity only works if you don’t need it to be secure in the first place.

  5. I’m laughing over here. Not because you are insane, but because you are reading my mind. Not more than a few months ago, I proposed similar content for a magazine article. The result — they laughed at me… “everyone knows that security thru obscurity is a joke”.
    Working at nCircle means you’ve got plenty of remote detection tools at your disposal. I first started a side project a few years ago where I was intentionally trying to fool our systems. That idea spawned a contest in our VERT organization. The goal was to obfuscate some app so much that we couldn’t detect it, but it still had to completely function. I don’t recall who won, but the most interesting entry was a developer who altered LambdaMOO code to respond to FTP commands.

  6. My favorite way to look at this debate is the idea of the “defensible secret”. Tuesday’s Enigma key is a defensible secret: it’s feasible to keep it away from the enemy and if it’s compromised you can change it easily. The inner workings of your Enigma machine are not a defensible secret: the Polish resistance will eventually capture one and hand it over to the British.
    A secret becomes less defensible if there are a lot of people looking for it, if it’s a long-duration secret, or if it’s so interesting that as soon as one person finds it out that person will spread it all over the Internet.
    Trying to hide the fact that physical locks are vulnerable to bump keys is security by obscurity. The secret can’t be defended for long enough to replace the vulnerable locks.
    Trying to hide what brand of lock I have on my front door is a grayer area. Not many people will care and they won’t feel much urge to publicize it. By itself, the information won’t hurt me much anyway, except that it might encourage burglars to break a window instead.
    The Internet has really changed the calculations, and has made security through obscurity far less useful than it was a thousand years ago. There was a time when only locksmiths knew about bump keys and you could assume the information would spread slowly. Not today.
