A real-world case study of SCADA security

In this second of two articles, friend and colleague Professor Michael Miora, CISSP-ISSMP, FBCI concludes his case study on the security of supervisory control and data acquisition (SCADA) systems. All of the following is entirely Miora's own work with minimal editing. I have added a few comments about the psychology of risk perception at the end of his contribution.

The meaning

Having established the current security posture of the SCADA systems, we made a series of recommendations to close the gaps. Some were quick wins, delivering substantial improvement rapidly and at low cost; others were more complex and required time and effort to implement. The report's clear message was that the water and power distribution networks owned and operated by this organization were vulnerable to serious service disruption or degradation by moderately trained external personnel without access to internal networks or information.

The report also highlighted some major physical security issues. Even before 9/11, it was well recognized that public access to reservoirs and filtration systems was a major problem for water distribution. Our water distribution systems were built over the last century and a half, mostly without regard to the threat of intentional contamination or other tampering, and so remained physically open to contamination.

Some concluding thoughts

Since 9/11, the focus on physical security has increased significantly. There are a variety of products available to help prevent intrusions onto active reservoirs and to monitor activity via video surveillance. Local authorities now routinely patrol reservoirs as well. The Environmental Protection Agency (EPA) has a Water and Wastewater Security Product Guide to help authorities find products that match security needs.

To this day, many water distribution systems still struggle with physical security. One example is the city of Boulder, Colo.: "Boulder's supply of drinking water faces lingering vulnerabilities to terrorism and other acts of intentional contamination, seven years after a consultant recommended dozens of security upgrades, a recent city assessment concludes." Note that Boulder is not the entity under discussion in this paper.

Where are they now?

In the decade since the initial assessment, the organization we assessed has not conducted a re-assessment. There was at least one attempt, but the solicitation process bogged down in needless bureaucracy and no contract was ever awarded. This writer wonders whether the threat continues to be mitigated as it was in the months following the assessment, or whether piecemeal system and operational modifications have eroded the good work the organization did in response to it.

[Kabay comments: One wonders if management succumbed to the sense that absence of evidence of tampering equates to absence of vulnerabilities. In my experience as a consultant, I have regrettably run up against upper management who seem to believe that wishful thinking is a reasonable substitute for a global perspective on industry experience. If they, personally, have not (yet) been involved in a security debacle, they seem to believe that they and their organizations are immune to risks that have been documented in similarly placed organizations.

In the classic paper "Perception of Risk" [SCIENCE 236(4799):280-285 (17 April 1987)], which is available free to members of the American Association for the Advancement of Science (AAAS) or for a modest fee, Paul Slovic (professor of psychology at the University of Oregon) wrote that "In many cases, risk perceptions may form afterwards, as part of the ex post facto rationale for one's own behavior. …[P]eople, acting within social groups, downplay certain risks and emphasize others as a means of maintaining and controlling the group." Slovic continued, "…[L]aboratory research on basic perceptions and cognitions has shown that difficulties in understanding probabilistic processes, biased media coverage, misleading personal experiences, and the anxieties generated by life's gambles cause uncertainty to be denied, risks to be misjudged (sometimes overestimated and sometimes underestimated), and judgments of fact to be held with unwarranted confidence. Experts' judgments appear to be prone to many of the same biases as those of the general public, particularly when experts are forced to go beyond the limits of available data and rely on intuition…."

In other words, we humans are not very good at judging risks in complex systems.]

* * *

Michael Miora has designed and assessed secure, survivable, highly robust systems for industry and government over the past 30 years. One of the original professionals granted the CISSP in the 1990s and the ISSMP in 2004, Miora was accepted as a Fellow of the Business Continuity Institute (BCI) in 2005. He founded and currently serves as president of ContingenZ Corporation. He was educated at UCLA and UC Berkeley, earning bachelor's and master's degrees in mathematics. His published works include contributions to the definitive Computer Security Handbook, 4th and 5th Editions (Wiley). Miora is an adjunct professor in the MSIA program at Norwich University and a member of the editorial board of the Business Continuity Journal.

Learn more about this topic

A laundry list of power industry incidents to learn from

Increasing security of SCADA systems in power industry

Attack on power systems: Industry/government consensus
