SCADA vulnerability disclosures are unconscionable

By John Viega
January 20, 2012

Today was a shameful day for the Internet security industry. Researchers disclosed information about numerous vulnerabilities in critical US infrastructure systems produced by five different vendors, demonstrating that some researchers are happy to make the world a riskier place in order to market themselves.

This incident should certainly serve as a wake-up call to industrial control system (SCADA) vendors. If they don't tighten their controls, we could soon see a terrorist mucking around with a nuclear power plant over the Internet. Such scenarios sound like Hollywood fantasy to many, but those of us in the security industry have been warning people that this is a real worry. Hopefully, this incident will effect meaningful change among SCADA vendors.

However, there's an even uglier story here. This irresponsible stunt underscores that many in the industry care far more about marketing themselves and their companies than they do about security.

First, according to IDG, the researchers gave the vendors no notice. It's irresponsible to deny vendors sufficient time to investigate the problems, let alone fix them and make sure that every affected customer has deployed the fixes.

Frankly, some of these systems are so old that the original developers are probably long gone. They were created for an age before the Internet, so they may actually need to be entirely redesigned; these problems likely aren't minor. And it can be costly and time-consuming to get new code into production--think of all the testing everybody will want to do before trusting an upgrade with, say, a nuclear power plant. In this situation, I would expect it to take a couple of years to adequately address the problems, even for vendors working diligently.

Second, the researchers didn't just find problems in SCADA systems and report them to the vendors--they told the world about the problems. It seems to me that this violates an unspoken code of ethics in the security world, especially for people who actually care about security and have common sense. But maybe we need to spell out our ethics more explicitly. And the first rule of any written code should be: never release vulnerabilities that immediately threaten national security and your fellow human beings.

Making matters worse, the researchers (along with security firm Rapid7) gave everybody tools that can be used to exploit these vulnerabilities with relative ease. This raises everybody's risk tremendously. Suddenly we need a Jack Bauer, because a terrorist scenario like the ones we see on 24 has become far more plausible than it was just yesterday.

Why did these idiots pull such a stunt? I can't see any reason other than to market themselves. They might argue that they're making the world safer by forcing the vendors to fix their products. But we're all demonstrably less safe today than we were yesterday, because many thousands of people around the world are now in a position to attack critical infrastructure.

Disclosing vulnerabilities can be a good thing if it is actually used to bring about better security, be it for your Facebook data, your personally identifiable information, or your nation's nuclear reactors. For instance, before the disclosure movement started, Microsoft was well known for sitting on security problems and never fixing them. But as a result of the disclosure movement, it has, in the past decade, invested well over one billion dollars in improving the security of its products. The end result is that Microsoft most likely has more secure products than it otherwise would have had, and people are better off for it.

However, the way these SCADA vulnerabilities were handled leaves nobody better off. If it had been me, I would have notified the vendors and let them know up front that it was important for them to be transparent with me about their progress. I'd also have let them know that, if they didn't make reasonable progress, there would be consequences. Disclosure might be one option, but I'd personally have tried the FBI first, on the assumption that knowingly ignoring security problems in critical infrastructure might just be criminally negligent.

I don't think I ever would have disclosed the details of these vulnerabilities--not even years after they were fixed. There's too much of a chance that someone out there hasn't upgraded. But I might be able to tolerate researchers who want to benefit from their own work disclosing a couple of years after fixes become generally available, provided they give everybody ample notice of their plans at the time the fix is released.

In my opinion, if any critical infrastructure attacks result from this disclosure, the researchers should be held personally and criminally liable. This is unconscionable behavior. And I continue to be ashamed to be part of an industry that tolerates increasing the very risks it's trying to defend against, all for the sake of marketing.

