Limiting the Risk of Human Error in Cybersecurity

By Admiral James "Sandy" Winnefeld, US Navy, Retired, and Former Vice Chairman of the Joint Chiefs of Staff 11.26.2019

The vast majority of companies are more exposed to cyberattacks than they have to be. To close the gaps in their security, CEOs can take a cue from the US military. Once a vulnerable IT colossus, it is becoming an adroit operator of well-defended networks. One key lesson of the military’s experience is that while technical upgrades are important, minimizing human error is even more crucial. Mistakes by network administrators and users — failures to patch vulnerabilities in legacy systems, misconfigured settings, violations of standard procedures — open the door to the overwhelming majority of successful attacks.

People matter as much as, if not more than, technology. (Technology, in fact, can create a false sense of security.) Cyber defenders need to create “high-reliability organizations” (HROs) by building an exceptional culture of high performance that consistently minimizes risk. At the heart of that culture are six interconnected principles, which have helped the military and can help businesses weed out human error and contain its impact.

1. Integrity.

By this, I mean a deeply internalized ideal that leads people, without exception, to eliminate “sins of commission” (deliberate departures from protocol) and own up to mistakes immediately. The nuclear navy instills this in its people from day one, making it clear there are no second chances for lapses. Workers are thus not only unlikely to take shortcuts but also highly likely to notify supervisors of any errors right away, so the errors can be corrected quickly rather than requiring lengthy investigations later. Operators of propulsion plants faithfully report every anomaly that rises above a low threshold of seriousness to the program’s central technical headquarters. Commanding officers of vessels are held fully accountable for the health of their programs, including honesty in reporting.

2. Depth of knowledge.

If people thoroughly understand all aspects of a system — including the way it’s engineered, its vulnerabilities, and the procedures required to operate it — they’ll more readily recognize when something is wrong and handle any anomaly more effectively. In the nuclear navy, operators are rigorously trained before they ever put their hands on a real propulsion plant and are closely supervised until they’re proficient. Thereafter, they undergo periodic monitoring, hundreds of hours of additional training, and drills and testing. Ship captains are expected to regularly monitor the training and report on crew proficiency quarterly.

3. Procedural compliance.

On nuclear vessels, workers are required to know — or know where to find — proper operational procedures and to follow them to the letter. They’re also expected to recognize when a situation has gone beyond existing written procedures and new ones are called for. One of the ways the nuclear navy maximizes compliance is through its extensive system of inspections. For instance, every warship periodically undergoes tough Operational Reactor Safeguards Examinations, which involve written tests, interviews, and observations of day-to-day operations and of responses to simulated emergencies. In addition, an inspector from the Naval Reactors regional office may walk aboard anytime a ship is in port, without advance notice, to observe ongoing power-plant operations and maintenance. The ship’s commanding officer is responsible for any discrepancies the inspector may find.
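In a corporate network, the analogue of an unannounced inspection might be an automated audit that compares live settings against a written baseline. The sketch below is purely illustrative: the baseline keys, values, and function name are hypothetical, not drawn from any particular product or from this article.

```python
# Illustrative only: compare a system's live settings against a written
# baseline, the way an unannounced inspector compares plant operations
# against procedure. Baseline keys and values here are hypothetical.
BASELINE = {
    "password_min_length": 14,
    "mfa_required": True,
    "tls_min_version": "1.2",
}

def audit_config(live_settings: dict) -> list[str]:
    """Return every discrepancy between live settings and the baseline."""
    discrepancies = []
    for key, expected in BASELINE.items():
        actual = live_settings.get(key)
        if actual != expected:
            discrepancies.append(f"{key}: expected {expected!r}, found {actual!r}")
    return discrepancies

# An "inspector" can run this at any time, without advance notice.
for finding in audit_config({"password_min_length": 8, "mfa_required": True}):
    print("DISCREPANCY:", finding)
```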

4. Forceful backup.

When a nuclear-propulsion plant is operating, the sailors who actually control it — even those who are highly experienced — are always closely monitored by senior personnel. Any action that presents a high risk to the system has to be performed by two people, not just one. And every member of the crew — even the most junior person — is empowered to stop a process when a problem arises.
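Translated to cybersecurity operations, forceful backup might take the shape of a two-person rule enforced in software: a high-risk command simply will not run with fewer than two distinct approvers. This is a minimal sketch under that assumption; the action and approver names are invented for illustration.

```python
# Illustrative only: a high-risk action refuses to run unless two distinct
# people have signed off, mirroring the two-person rule on nuclear vessels.
def execute_high_risk_action(action: str, approvers: set[str]) -> None:
    # A set guarantees the approvers are distinct individuals.
    if len(approvers) < 2:
        raise PermissionError(
            f"Two-person rule: {action!r} needs two distinct approvers, "
            f"got {sorted(approvers)}"
        )
    print(f"Executing {action!r}, approved by {sorted(approvers)}")

execute_high_risk_action("purge-firewall-rules", {"ops_lead", "security_officer"})
# execute_high_risk_action("purge-firewall-rules", {"ops_lead"})  # would raise
```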

5. A questioning attitude.

This is not easy to cultivate in any organization, especially one with a formal rank structure in which immediate compliance with orders is the norm. However, such a mindset is invaluable: If people are trained to listen to their internal alarm bells, search for the causes, and then take corrective action, the chances that they’ll forestall problems rise dramatically. Operators with questioning attitudes double- and triple-check work, remain alert for anomalies, and are never satisfied with a less-than-thorough answer. Simply asking why the hourly readings on one obscure instrument out of a hundred are changing in an abnormal way or why a network is exhibiting a certain behavior can prevent costly damage to the entire system.
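A questioning attitude can also be wired into monitoring itself. As a purely illustrative sketch (the readings and threshold are invented), a simple statistical check can flag the one instrument among a hundred whose latest hourly reading departs sharply from its recent history, prompting an operator to ask why:

```python
import statistics

# Illustrative only: flag a reading that departs sharply from recent
# history so an operator is prompted to investigate the cause.
def is_anomalous(readings: list[float], latest: float, threshold: float = 3.0) -> bool:
    mean = statistics.mean(readings)
    stdev = statistics.stdev(readings)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > threshold

hourly = [101.2, 100.8, 101.0, 100.9, 101.1]
print(is_anomalous(hourly, 108.4))  # True: worth asking why
```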

6. Formality in communication.

To minimize the possibility that instructions are given or received incorrectly at critical moments, operators on nuclear vessels communicate in a prescribed manner. Those giving orders or instructions must state them clearly, and the recipients must repeat them verbatim. Formality also means establishing an atmosphere of appropriate gravity by eliminating the small talk and personal familiarity that can lead to inattention, faulty assumptions, skipped steps, or other errors.
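A software analogue of this read-back discipline, offered only as a sketch with invented commands, would act on an order only after the recipient has repeated it word for word:

```python
# Illustrative only: an order is acted on only if the recipient's
# read-back matches it verbatim, as on a nuclear vessel.
def confirm_order(order: str, read_back: str) -> bool:
    """Return True only when the read-back repeats the order exactly."""
    return order.strip() == read_back.strip()

order = "isolate host 10.0.4.17 from the network"
print(confirm_order(order, "isolate host 10.0.4.17 from the network"))  # True
print(confirm_order(order, "isolate host 10.0.4.7 from the network"))   # False
```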

Even though my examples come from the military, the principles they illustrate apply to any organization and can be readily adapted. Cybersecurity breaches caused by human mistakes nearly always involve the violation of one or more of these six principles. CEOs should ask themselves and their leadership teams tough questions about whether they’re doing everything possible to build and sustain an HRO culture. Doing so will require the personal attention of CEOs and their boards, as well as substantial investments in training and oversight. Cybersecurity won’t come cheap. But these investments must be made. The security and viability of companies — as well as the economies of the nations in which they do business — depend on it.

Expanse recently announced the addition of retired US Navy Admiral James “Sandy” Winnefeld as an advisor to the company. Winnefeld joins the Expanse advisory board after serving as the ninth Vice Chairman of the Joint Chiefs of Staff, the United States’ second-highest-ranking military officer. This article is part three of a three-part series exploring Winnefeld’s thoughts as he joins the Expanse team. This installment draws from Sandy’s 2015 article in Harvard Business Review but remains relevant today.