More stuff, more problems: my first Die Hard post

As I’ve built up this site, I’ve decided to throw more and more stuff into it. I mentioned in a previous post how its original iteration was just an HTML document slapped onto a nonfunctional security gateway. Now, it’s a more complex animal: a full blog website, which might even have some variety of commenting and modding privileges. Heck, if I wanted to, I could probably include my own forums and chatrooms. But that’s the work of a much more dedicated engineer than myself.

One of the things we have to consider when building out technical architecture is how much risk we are willing to expose ourselves to. Risk can mean different things under different models, so I’m going to use a very mathematical proposition here: risk is likelihood times impact. In equation form, we could view it like this:

r = l * i

I should note this is not my original idea; I think I lifted it from OWASP.
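
If it helps to see that in code, here’s a minimal sketch of the formula in Python. The function name, the zero-to-one scale for likelihood, and the dollar figure in the example are my own choices for illustration, not anything prescribed by OWASP.

def risk(likelihood, impact):
    # Risk as likelihood times impact (r = l * i).
    # likelihood: probability that the bad thing happens, between 0.0 and 1.0
    # impact: what it costs you if it does, in whatever unit you care about
    if not 0.0 <= likelihood <= 1.0:
        raise ValueError("likelihood should be a probability between 0 and 1")
    return likelihood * impact

# A 50% chance of losing something worth a million dollars:
print(risk(0.5, 1_000_000))  # 500000.0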

To describe this more tangibly, I will use the characters Hans Gruber and John McClane from the Willis/Sinatra masterpiece known as Die Hard. McClane’s estranged wife, Holly, is attending an annual Christmas party for the Nakatomi Corporation in Century City, Los Angeles. A group of Hessian (or Bavarian, or Alsatian, or perhaps even Austrian) mercenaries, some garishly dressed as Sinterklaas, break up the party, take hostages, and make their way to the Nakatomi server room, where they attempt to extract some useful information from one of the primary databases.

“Herr Gruber,” states one of the goons, his ears pricking up, “I am beginning to hear strange echoes through the HVAC system, and I believe that some kind of bald, shoeless menace may be coming for us.”

“A wise concern, mein Schnecke,” Gruber responds. “With that in mind, what is the likelihood that this unshod beast will be upon us before we are able to extract this expensive data?”

“There is a field of possibility, you see. I would indicate a 50% likelihood that we are able to escape before this fellow arrives, and a 50% chance that we will be caught, detained and/or murdered by him.”

“Very well.” Gruber sums it up to his men in this fashion: the noise coming down the air duct (McClane) is a significant threat to their current enterprise (the theft of the Nakatomi data). Since there is a 50% likelihood that he will interrupt this transaction in a way that will make it irrecoverable (possibly with a machine gun), he sets the l value for the prior equation (r = l * i) at 0.5. This risk is very significant, and I doubt that I would personally have been able to show Mr. Gruber’s bravado at a time like this. Likewise, given the value of the data obtained in said transaction (likely worth millions), Gruber & Co. stand to risk a great deal. They are also risking the complete loss of their investment in the heist if they choose to abandon it prematurely.
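
Run Gruber’s estimate through the same formula and you can see why his bravado is impressive. The dollar value below is a number I’ve made up purely for illustration; I’m not actually pricing the Nakatomi data.

likelihood_of_mcclane = 0.5       # Gruber's own estimate of being interrupted
value_of_the_data = 30_000_000    # hypothetical worth of the Nakatomi data, in dollars
print(likelihood_of_mcclane * value_of_the_data)  # 15000000.0 -- expected loss from the barefoot problem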

We can also consider the likelihood of risk from the perspective of the Nakatomi workers, investors, board, and their families. The likelihood of being invaded or taken hostage by German mercenaries has shrunk since the 1990s, but during the Cold War this may have been likelier. Let us assume that, in the current climate of detente, there is a 10% chance that the events described above will occur this Christmas. For that possibility, the Nakatomi board should already have failsafes in place: German-fluent hostage negotiators, a more pronounced security presence, perhaps Israeli or Sikh guards who do not take Christmas off, et cetera. Assuming the confidentiality of this data is vital to Nakatomi corporate interests, a successful escape by Gruber & Co. could be a company-ending debacle. We don’t want that to happen. But how much does one need to invest in preventing it?
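
As a crude first pass, you can run the board’s side of things through the same arithmetic. Every number below is invented for illustration, but the shape of the answer is the point: the expected loss puts a rough ceiling on what it makes sense to spend each year on negotiators, guards, and the rest.

likelihood_of_incident = 0.10         # assumed chance of a Gruber-style event this Christmas
company_ending_impact = 200_000_000   # hypothetical cost of losing the data, in dollars
expected_loss = likelihood_of_incident * company_ending_impact
print(expected_loss)  # 20000000.0 -- a rough annual ceiling on prevention spending

Of course, that ceiling is only as good as the likelihood estimate you feed into it, and where does that number come from?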

That’s a question that becomes harder to answer.

From this point, it becomes difficult for individual stakeholders in any business to determine what the likelihood of these risks actually is. We can point at dozens of high-profile security incidents over the last few decades and ask ourselves why the military, the FBI, or any number of other supposedly watertight organizations failed to take the right measures to prevent dangerous leakage. It’s my opinion that this is where we need to shut up and start listening and learning: talk to other people. Get to know who manages ingress and egress. Find out when people sign in and when they piggyback off another person’s security card.

Security starts with you, but it shouldn’t end there. Ask around. Find out what’s useful, and what’s not. But most importantly, don’t assume that everything is going to be alright.