The Wolf Approaches

In the government, the use of the term “grave” means something very specific. That meaning should be obvious to you, but on the off chance that you haven’t had your first cup of coffee yet, it means that whatever the issue is, messing it up could cost someone (or more than one person) their life.

The attack against the water system in Oldsmar, Florida was a potentially grave situation. A water system is not a trivial technology enterprise and as such it has numerous checks – including a human in the loop – to make sure malicious activity or honest mistakes don’t end lives. But the fact that an outsider was able to get such access in the first place makes it clear that there exists a disconnect between what such systems are supposed to be, and what they are.

We give the Sheriff of Pinellas County a pass on the use of the term “wake-up call” because he has not spent a large portion of his life in the belly of the cybersecurity beast. A wake-up call only happens once; how we respond indicates how serious we are about taking action:

  • In 2015 DHS Secretary Jeh Johnson called the OPM breach a “wake-up call.”
  • In 2012 General Alexander, Director of the National Security Agency, called the hacker attack on Saudi Aramco a “wake-up call.”
  • In 2010 Michael M. DuBose, chief of the Justice Department’s Computer Crime and Intellectual Property Section, called successful breaches such as Aurora “a wake-up call.”
  • In 2008 Deputy Secretary of Defense William Lynn called the BUCKSHOT YANKEE incident “an important wake-up call.”
  • In 2003 Mike Rothery, Director of Critical Infrastructure Policy in the Attorney-General’s Department of the (Australian) Federal Government, called a hack into a wastewater treatment plant “a wake-up call.”
  • In 2000 Attorney General Janet Reno called a series of denial-of-service attacks against various companies a “wake-up call.”
  • In 1998 Deputy Secretary of Defense John Hamre called the SOLAR SUNRISE incident “a wake-up call.”
  • In 1989 IT executive Thomas Nolle wrote in Computer Week that poor LAN security was a “wake-up call.”

The details of this particular case will probably never see sufficient sunlight, which only adds to the ignorance on these matters at all levels and sustains the fragility we so desperately need to address. This is particularly important when you consider that our relationship with technology is only getting more intimate.

These are issues that are decades old, yet if you want to have some idea of what it will take to spur action, keep in mind that we intentionally poisoned 100,000 people via a water system for five years and no one is in jail (yet). The idea that the people rooting around in such systems have the ability to cause such effects but don’t because they appreciate the moral, ethical, and legal implications of the matter is increasingly wishful thinking.

We in security have long been accused of “crying ‘wolf,’” and for a long time those critics were right. We knew bad things could happen because Sturgeon’s Law has been in full effect in IT for ages, but it has taken this long for matters to go from merely serious to grave. Like everyone who puts off addressing a potentially fatal issue until the symptoms can no longer be ignored, we now face an open question: can we survive what comes next?

C.R.E.A.M. IoT Edition

I didn’t get to see the discussion between Justine Bone and Chris Wysopal about the former’s approach to monetizing vulnerabilities. If you’re not familiar with the approach, or the “Muddy Waters” episode, take a minute to brush up; I’ll wait…

OK, so if you’re in one computer security sub-community, the first words out of your mouth are probably something along the lines of: “what a bunch of money-grubbing parasites.” If you knew anyone associated with this event, you’d probably stop talking to them. You’d certainly start talking shit about them. This is supposed to be about security, not profiteering.

If you’re in a different sub-community you’re probably thinking something along the lines of, “what a bunch of money-grubbing parasites,” only for different reasons. You’re not naive enough to think that a giant company will drop everything to fix the buffer overflow you discovered last week. Even if they did, because it’s a couple of lines in a couple of million lines of code, a fix isn’t necessarily imminent. Publicity linked to responsible disclosure is a more passive way of telling the world “we are open for business”: it’s about security, but it’s also about paying the mortgage.

If you’re in yet another sub-community you’re probably wondering why you didn’t think of it yourself, and are fingering your Rolodex to find a firm to team up with. Not because mortgages or yachts don’t pay for themselves, but because you realize that the only way to get some companies to give a shit is to hit them where it hurts: in the wallet.

The impact that vulnerability disclosure, in any of its flavors, is having on computer security is not zero, but it’s not registering on a scale that matters. Bug bounty programs are all the rage, and they have great utility, but it will take time before the global pwns-per-minute rate changes in any meaningful fashion.

Arguing about the utility of your preferred disclosure policy misses the most significant point about vulnerabilities: the people who created them don’t care unless it costs them money. For publicly traded companies, pwnage does impact the stock price: for maybe a fiscal quarter. Just about every company that has suffered an epic breach sees its stock price at or above pre-breach levels just a year later. Shorting a company’s stock before dropping the mic on one vulnerability is a novelty; it’s a material event if you can do it fiscal quarter after fiscal quarter.

We can go round and round about what’s going to drive improvements in computer security writ large, but when you boil it down it’s really only about one or both of two things: money and bodies. This particular approach to monetizing vulnerabilities tackles both.

We will begin to see significant improvements in computer security when a sufficient number of people die in a sufficiently short period of time due to computer security issues. At a minimum we’ll see legislative action, which will be designed to drive improvements. Do you know how many people had to die before seatbelts in cars became mandatory? You don’t want to know.

When the cost of making insecure devices exceeds the profits they generate, we’ll see improvements. At a minimum we’ll see bug bounty programs, which are one piece of the puzzle of making actually secure, or at least reasonably secure, devices. Do you know how hard it is to write secure code? You don’t want to know.

If you have a vulnerable medical device implanted in you, you’re probably thinking something along the lines of, “who the **** do you think you are, telling people how to kill me?” Yeah, there is that. But as has been pointed out in numerous interviews, who is more wrong: the person who points out the vulnerability (without a PoC), or the company that knowingly lets people walk around with potentially fatally flawed devices in their bodies? Maybe two wrongs don’t make a right, but as is so often the case in security, you have to choose the least terrible option.

The Wolf is Here

For decades we’ve heard that iCalamity is right around the corner. For decades we’ve largely ignored pleas to address computer security issues while they were relatively cheap and easy to fix, before they grew too large and complicated to tackle at all. We have been living a fairy tale life, and absent bold action and an emphasis on resiliency, it only gets grim(m)er going forward.

Reasonably affordable personal computers became a thing when I was in high school. I fiddled around a bit, but I didn’t know that computer security was a thing until I was on active duty and the Morris Worm was all over the news. Between the last time Snap! charted and today, we have covered a lot of ground from a general-purpose IT perspective. We’ve gone from HTML and CGI to the cloud. From a security perspective, however, we’re still largely relying on firewalls, anti-virus, and SSL.

Why the disparate pace of progress? People demand that their technology be functional, not secure. Like so many areas of our lives, we worry about the here and now, not the what-might-be. We don’t worry about risks until a sufficiently horrific scenario occurs or, if one is not enough, until enough of them occur in a sufficiently short period of time.

Of course today we don’t just have to worry about securing PCs. By now it is fairly common knowledge that your car is full of computers, as, increasingly, is your house. Some people wear computers, and some of us are walking around with computers inside of us. Critical infrastructure is lousy with computers, and this week we learned that those shepherd boys crying ‘wolf’ all those years weren’t playing us for fools; they were just too early.

The fragility of our standard of living is no longer the musings of Cassandras. The proof of concept was thankfully demonstrated far, far away, but the reality is we’re not really any safer just because ‘merica. Keeping the lights on, hearts beating, and the water flowing is a far more complex endeavor than anything you find in the commodity IT world. It is entirely possible that for some problems there is no ‘fix,’ which means that, given various interdependencies, we will always find ourselves with a Damoclean sword over our heads.

Mixed mythologies notwithstanding, the key to success writ large is insight and resiliency. How aware you are of what you have, how it works, and how to get along without it will be critical to surviving both accidents and attacks. I would like to think that the market will demand both functional and secure technology, and that manufacturers will respond accordingly, but 50 years of playing kick-the-can tells me that’s not likely. The analog to security in industrial environments is safety, and that’s one area power plants, hospitals, and the like have down far better than their peers in the general-purpose computing world. We might not be able to secure the future, but with luck we should be able to survive it.