Monday, 3 October 2011

On disclosure

In my twenty-plus years in information security I have seen many things. I started as many in the security world do: with the view that we need to maintain secrecy, that obscurity has some (even if minor) effect. However, over those years, and in completing more than 1,000 reviews and audits, I have documented and watched the effects of darkness on security.

The results have changed my views from one of covering the truth in order to minimise damage to that of the sunshine principle. Only when we bring exploits into the light of day can we hope to counter them. This is not simply a matter of personal opinion, but of the one science that can help us measure these effects: economics. Everything has a cost, and only by treating security as relative rather than absolute can we hope to effectively protect our systems and, through them, society.

In writing on SCADA issues, I have been told that I will never see another audit of such a system. That is clearly false, as some of my best supporters in this are clients. There are those who will not hire me for saying that I want things open, and these are also the types of people I never want to deal with again. There is a reason for this: those who run SCADA systems responsibly bear greater costs and are less efficient in the short term than those who manage their systems poorly. The difference shows in the long term, but that in itself can be problematic.

In my years testing systems and responding to incidents, I have seen more failures than I care to remember. Some have been spectacularly complex; others have been dismal, and it amazed me how poorly the controls had been configured.

Right now, we do not have to report system breaches in most places. The result is that for every public breach (and there are many), we see but the tip of the iceberg. From my experience, and this is simply anecdotal, I would estimate that 100 or more breaches occur for each one that becomes publicly known.

It has been demonstrated that security disclosure has a negative impact on a company’s share price, but this depends strongly on the nature of the organisation. Some organisations suffer no economic impact from a breach; for others, the effect is catastrophic.

Toys [1], vendors and the existing compliance regime must take some of the blame for the failures we see. I worked with one critical system using SCADA equipment that managed to pass audits for five years running. An independent analysis concluded that if this system failed, the expected risk was upwards of 30,000 lives lost. The system was at one stage infected with a scanning worm and a Trojan. We contained the worm, but the system was never rebuilt.

The answer was to implement firewalls and monitoring controls that stopped the Trojan from connecting to its C&C server.

We talk of the difficulty in creating a Stuxnet-type code sample, but we forget that the aims of Stuxnet involved fine control. The creators wanted a specific outcome, and that is difficult to achieve. Chaos, on the other hand, is simple. Maintaining control of systems is difficult and requires more knowledge and sophistication than most groups have. Crashing critical monitoring systems, on the other hand, is easy.

I like aircraft systems for the most part. Even these have issues from time to time, but the people I worked with in the past always took matters seriously. I note them because they are among the better-run systems, and when we see even the best systems have issues, we can bring into the light those systems that do far less to ensure they remain secure.

Even the best-run and best-maintained systems suffer from problems. In Gatelink and engine management, Boeing once had only WEP implemented over the 802.11b connections. This has since improved, and EAP-TLS with AES using mutually authenticated endpoints is now the norm, but it was only testing and discovery that made this change occur.

More, it was necessary to actually demonstrate that a line-of-sight connection using a focused antenna could be made from a fishing boat 2 km away and still send and receive data. The initial belief was that directionally focused antennas would restrict the range at which wireless signals could be intercepted. Yes, this is only available on the ground, and specialist knowledge of the systems is necessary. That too is a problem: even for controlled systems, we see the designs and schematics leaked onto the Internet and P2P networks.

Security through obscurity is simply false security and leaves us vulnerable with no way to measure the true risk.

So, I have become a proponent of mandatory disclosure.

The reporting of news and events is itself a problem. We rarely report on the wins and simply sensationalise the losses. Instead of commending companies for what they do right, we look at failures and respond with nothing but knee-jerk reactions and firefighting.

We are about to enter a world where everything is online. My DVD player is online, and I have ordered Internet-connected wireless light globes for testing. With IPv6 finally just around the corner, we will see many of the controls we have relied on fail us. NAT (network address translation) and firewalls break down under the end-to-end addressing model of IPv6. These have offered some control and protection for a while, but that time has long passed.

With a little knowledge, one can run a Google search for a SCADA system or PLC and find multiple examples of systems that are supposed to be air-gapped and never connected to the world, yet sit online and vulnerable to attack. We have to start treating the risks associated with these systems seriously and stop putting our heads in the sand. We are already seeing attacks against SCADA systems. Why are these not all being taken down now, causing loss and damage? One reason is that they are being Trojanised and remotely controlled: not for immediate effect, but held in reserve.

Carl von Clausewitz wrote how “the backbone of surprise is fusing speed with secrecy.” We provide that secrecy when we allow compromised systems to remain online, when we hide the fact that our critical systems have been compromised. He further reminded us that “War is not an independent phenomenon, but the continuation of politics by different means.”

We are seeing the Chinese and North Koreans compromise more and more critical systems each week. Some of these appear in the press, but those are either the failures or the systems already mined for all they are worth. For the most part, these breaches remain secret and out of public view. To the North Koreans, the entire purpose of a computer network seems to be as a platform for politics through any means.

To paraphrase Clausewitz, everything in security is very simple. But the simplest thing is difficult.

Judge Learned Hand formulated a legal rule. He said that everything had to be measured in terms of ensuring that “B < P·L”. That is, the burden of untaken precautions (B) has to be less than the product of the probability of an adverse outcome (P) and the severity of that outcome (L).
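Hand’s test is simple enough to compute directly. As a minimal sketch, the figures below are purely illustrative assumptions (not data from any audit), and the function name is mine:

```python
def precaution_justified(burden, probability, loss):
    """Judge Hand's test: the precaution is warranted when the burden
    of taking it (B) is less than the expected loss (P * L)."""
    return burden < probability * loss

# Hypothetical example: a $50,000 control against a 1% annual chance
# of an incident costing $10 million.
B = 50_000        # cost of the untaken precaution
P = 0.01          # probability of the adverse outcome
L = 10_000_000    # severity (loss) if the outcome occurs

# Expected loss is P * L = $100,000, so spending $50,000 is justified.
print(precaution_justified(B, P, L))  # True
```

The point of the calculation is that it cannot be performed at all without honest values for P and L, and those only come from disclosure.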

This means looking beyond short-term costs and effects to the probability of more dire events and consequences as well. To apply Judge Hand’s rule, we cannot hide in the dark; we need to expose the disease that infects many of our security controls to the antiseptic of sunlight.

Now, my focus is on teaching. I still do some work and consulting, but only because I do not believe anyone teaching security can do so without exposure to the “real world”. If I never again work for sites that want an audit merely to fulfil a compliance obligation without actually taking the risks seriously, I could not be happier.

Frankly, it is about time we actually started to expose risks and then allow society to place a value on these risks.

[1] I define “Toys” as security products deployed for the sake of having a product. A prime example is an IDS without monitoring and/or tuning.
