Cyber security risk management: How secure is secure?
Risk management best practices can be applied to cyber threats just as they are used against physical threats. So, how does one know when a network is secure enough?
Cyber threats are here to stay, but just like physical hazards and threats, they can be countered with risk management best practices. But how does one know when a network is secure enough? After all, hackers have to succeed only once, but defenders must succeed every time to maintain business continuity. The deck seems stacked against the defenders, right?
FBI Director James Comey warned, "There are two kinds of big companies…those who’ve been hacked…and those who don’t know they’ve been hacked." The same is true for any component of critical infrastructure. Within oil refineries, power plants, and water treatment facilities, curious and malicious actors are active and quite successful at gaining access to control systems. They exploit old, unprotected weaknesses and new, unknown vulnerabilities to compromise the operation, integrity, or availability of our equipment. Sometimes the attack doesn’t even originate from a person per se, but from malicious code transmitted through communication with a system beyond the borders of company control. Alarmingly, according to recent reporting by Verizon, many attacks continue for months before they are detected, and then take months more to remove from the system.
To compound the issue, the incorporation of big data analytics and Internet of Things (IoT) considerations into workflows continues to expand the attack surface. These attacks, whether from rival enterprises or foreign nation-states, have enormous monetary and safety implications. Despite these growing risks, research on cybersecurity in the oil and gas industry published earlier this year by EY indicates that security budgets are not being increased to combat the rising threats. The study points out that the oil and gas industry's IT funding issues "are often further compounded by a separation of roles and responsibilities for operational technology (OT) and cyber…," potentially leading to redundant spending and confusion over priorities and resource allocation.
Even more alarming is what this study uncovered about the industry’s self-reported level of preparedness for the growing cyber threat. Fully 61% of oil and gas industry companies "believe it’s unlikely or highly unlikely they would be able to detect a sophisticated attack." In addition, 29% "have no real-time insight on cyber threats," and only 13% are willing to state their IT structure and related security is "fully meeting the organizational needs."
Perhaps the most written-about and analyzed incident of an industrial cyber attack is Stuxnet, familiar to many controls engineers because of its targeting of industrial PLCs. Stuxnet was a computer worm discovered in June 2010, designed to sabotage Iran’s nuclear program. As recently as a few months ago, it was reported that a variant of the Stuxnet worm was employed in a failed attempt to compromise North Korea’s nuclear weapons program.
Whether as a result of cyber espionage, cyber attack, or general human or system error, the implications of a failed cyber security program in the oil and gas industry can be profound. The nature of the industry means the ramifications of a breach could extend to safety, environmental, and geopolitical issues. As Stamatis Karnouskos described in his research, "Stuxnet Worm Impact on Industrial Cyber-Physical System Security," "Much of our critical infrastructure is controlled by cyber-physical systems responsible for monitoring and controlling various processes." These supervisory control and data acquisition (SCADA) systems "are industrial control systems responsible for a wide range of industrial processes," including those used in oil and gas pipelines and refineries, power generation, infrastructure, and general manufacturing applications.
Compounding the problem, how many engineers in an organization are likely to log in remotely to explore system faults, and how many of them and their leaders gain remote access to systems for reporting purposes? Often, these activities are conducted from minimally protected personal computers or tablets. The products and sites themselves are at high risk as well. The authors spoke with an unnamed industry veteran who currently consults with the largest oil and gas companies and whose previous experience includes leading inspections and compliance for some of those same companies. He said, "With all my years in the industry (inspecting products and systems), frankly, I see people conducting themselves the same way they did 20 years ago. We are constantly asked to evaluate wireless industrial devices, and yet cyber security is [almost] never mentioned by the client or indicated in the specs until we bring it up. So I wonder, how many products on the market have addressed it?"
One of the most prevalent attack vectors is to compromise a wireless communication network, thereby gaining access to the control systems that keep oil and gas operations safe and profitable. Equipment can be hacked to malfunction, denying service or creating unsafe conditions. Furthermore, there have been threats of data theft targeting intellectual property, research data, and financial information.
But regardless of where the attack originates or how it works, managers at all levels are held accountable for all negative outcomes—not to mention restoring the system to trustworthy functionality.
Manage the risk
Cyber threats have now been added to the list of the many components of risk that leaders, managers, and engineers must address as part of a robust risk management program. But now that the threat has been identified, at least vaguely, what can be done? The good news is that cyber hazards can be approached as any other hazard: With a disciplined risk management posture to maintain continued, reliable operation.
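As a concrete, minimal illustration of treating a cyber hazard like any other hazard in a risk register, the sketch below scores hazards with a simple likelihood-times-impact scheme. The 1-5 scales, the classification thresholds, and the hazard names are illustrative assumptions, not something prescribed here; any real program would calibrate these with the team described above.

```python
# Minimal qualitative risk register, assuming a likelihood x impact
# scoring convention (a common practice, shown here only as a sketch).
# Hazard names and scores are hypothetical.

def risk_score(likelihood, impact):
    """Score a hazard on 1-5 likelihood and 1-5 impact scales."""
    return likelihood * impact

def classify(score):
    """Bucket a raw score into a coarse risk level."""
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

hazards = [
    ("unpatched legacy PLC firmware", 4, 5),
    ("compromised wireless sensor link", 3, 4),
    ("remote login from personal tablet", 4, 3),
]

# Rank hazards so mitigation effort goes to the worst first.
for name, likelihood, impact in sorted(
        hazards, key=lambda h: risk_score(h[1], h[2]), reverse=True):
    score = risk_score(likelihood, impact)
    print(f"{name}: score={score} ({classify(score)})")
```

Because cyber hazards are moving targets, the scores in such a register are inputs to be revisited regularly, not a one-time output.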
A proper risk management program must take into account the highly complex, interconnected nature of the systems we control and their interactions with the outside world, which we do not control. Additionally, the reality is that legacy systems have been in operation for decades and mix with state-of-the-art upgrades, which can leave back doors and holes for attackers to exploit. This is perhaps the most dangerous case of all: Thinking the doors are locked when they aren’t.
These new technologies are often tremendous multipliers of efficiency, usability, availability, and profitability. But to reap these benefits, their darker side must be accounted for as well. An app that allows a control engineer to optimize processes from home also creates a path for others to monitor, and possibly control, those processes. Similarly, the implementation of smart components in our systems, with the potential to use wireless or cloud technology, presents the same opportunities for disastrous system compromise.
This is where cyber hazards differ from conventional hazards. While the laws of the physical universe may not change, the laws of the cyber universe do. Imagine waking up each morning and legitimately having to ask the question, "Do Newton’s Laws of Gravity still apply today?" or "Will Ohm’s Law still be relevant tomorrow?" A cyber hazard is a moving target, and it must be observed regularly to manage associated risks.
Good offense is best defense
However, this type of risk management posture is not new; only its application is. In the Air Force, for example, each new advance in defense technology that promised to bring victory was quickly met by a corresponding advance that prophesied defeat. Military strategist and fighter pilot Colonel John Boyd knew this and turned it to his advantage. He developed a system to observe threats, orient resources to counter strengths and exploit weaknesses, decide on the most advantageous response, and then act before his adversary could. Whoever has the shorter observe, orient, decide, and act (OODA) loop wins the dogfight, or, in the engineering case, ownership of the control systems.
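Boyd's cycle can be sketched as a plain software loop. The event names, detection rule, and responses below are hypothetical stand-ins; the point is that whoever closes the observe-orient-decide-act cycle faster keeps control.

```python
# A schematic OODA loop for control-system defense. The observation
# source and the response actions are hypothetical stand-ins; a real
# deployment would observe live telemetry and act on real controls.

def observe(events):
    """Pull the oldest unprocessed event (here, just a list)."""
    return events.pop(0) if events else None

def orient(event, known_bad):
    """Interpret the event against current threat knowledge."""
    return "threat" if event in known_bad else "benign"

def decide(assessment):
    """Choose the most advantageous response."""
    return "isolate_segment" if assessment == "threat" else "log_only"

def act(action, log):
    """Carry out the decision before the adversary can adapt."""
    log.append(action)

def ooda_cycle(events, known_bad):
    """Run the loop until the event stream is drained."""
    log = []
    while events:
        assessment = orient(observe(events), known_bad)
        act(decide(assessment), log)
    return log

# One pass over a hypothetical event stream:
print(ooda_cycle(
    ["heartbeat", "unsigned_firmware_push", "heartbeat"],
    {"unsigned_firmware_push"},
))
# ['log_only', 'isolate_segment', 'log_only']
```

Note that the orient step depends on up-to-date threat knowledge, which is exactly why a cyber hazard, as a moving target, must be observed regularly.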
Engineers now must convince their managers and leaders that to reap the financial benefits of new technologies, they must manage the risks as well. This means having a clear framework through which to assess risks—cyber or otherwise—and having a robust program to stay continually ahead of the game. National Institute of Standards and Technology (NIST) Special Publication 800-39 uses a similar concept to the OODA loop for information security risk management. It provides many helpful guidelines as well.
Setting up for safety
However, planning alone doesn’t guarantee an execution that properly addresses risks. A safety instrumented system can be designed perfectly, but if it is improperly installed or the prescribed proof testing isn’t carried out appropriately, residual risk remains. In fact, the system is then in a more dangerous state, because the security team believes it has managed the risk when in reality it has not.
A robust cyber security risk management program must gain buy-in and involvement at all levels of the organization—from executives through management and on to engineers and technicians. It must pool the collected knowledge and experience of representative people from strategic planning, oversight, management, and day-to-day operations. Only then can cyber risks be managed properly.
How secure is secure enough? There is no one-size-fits-all answer. Just like every other hazard, cyber risks must be managed by a team of people who connect the practical day-to-day activities of your operation to the latest threats in the world today. Nor is it a one-and-done problem; it requires ongoing vigilance and diligence to prevent the compromise of systems and to mitigate the effects when compromises do occur.
About the authors
Erik Reynolds is a consultant at Intertek, and has 14 years of experience in design, development, and deployment of advanced aerospace systems, supporting systems engineering life cycle activities of leading global companies. Suku Nair is a university distinguished professor and the chair of the computer science and engineering department at Southern Methodist University in Dallas. His research interests include cyber security, fault tolerance, software-defined networks, and virtualization technologies.
Previously an Air Force flight test engineer and design team member for NASA payloads, Reynolds has expertise in product design for high reliability performance in harsh environments. Reynolds is a certified functional safety expert and certified project management professional. He is working toward a PhD in systems and engineering management at Texas Tech University.
Nair is the founder of the cyber security program at SMU, which currently holds the NSA/DHS Center of Excellence in Information Assurance Education designation and $10 million in endowment support. He has published more than 140 journal and conference papers in the area of high-assurance computing and networking. His research has been supported through funds from the National Science Foundation (NSF), National Security Agency (NSA), National Institute of Standards and Technology (NIST), Office of Naval Research (ONR), and various industry partners, including Lockheed Martin, Alcatel, Raytheon, IBM, and AT&T. He has been a consultant to various IT, telecom, and cyber security companies. Some of his recent awards include the University Distinguished Professorship, an IBM faculty award, the Distinguished University Citizen award, and the SMU Ford Research Fellowship. He received his MS and PhD in electrical and computer engineering from the University of Illinois at Urbana in 1988 and 1990, respectively.
Original content can be found at Control Engineering.