Many would point to the developers of the 737 MAX's nosedive-prevention software as the primary reason for the catastrophic malfunction that led to the deaths of hundreds of people.
That answer would be much too simple and mostly inaccurate.
Numerous factors went into the failure of the aircraft's safety system, including time-to-market competition and easing of regulations, according to a comprehensive article in Vox.
In the article, the author traces the series of business, regulatory and political events that ultimately resulted in the development of the MAX's sensor-driven software as a workaround for an engineering flaw involving the pitch of the aircraft.
That software, by doing exactly what it was designed to do, acted on the same false positive over and over, resulting in an unacceptable loss of life. And it continues to cause Boeing's customers (the airlines) lost profits, lost business and other logistical nightmares.
"In cybersecurity, many solution providers claim zero false negatives but never mention false positives," says John Pescatore, director of emerging technologies at SANS Institute. "More training is required on how to deal with potential false positives before taking actions that will cause business impact."
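The tradeoff Pescatore describes can be sketched with a toy detection threshold (the scores and numbers below are hypothetical, purely for illustration): tuning an alerting rule low enough to guarantee zero false negatives typically floods analysts with false positives, while quieting the noise starts missing real attacks.

```python
# Toy illustration of the zero-false-negative tradeoff in alerting.
# Suspicion scores (0-100) assigned by a hypothetical detector.
benign_scores = [5, 12, 18, 22, 27, 31, 36, 44, 52, 61]
malicious_scores = [30, 48, 55, 70, 88]

def alert_counts(threshold, benign, malicious):
    """Count false positives and false negatives at a given alert threshold."""
    false_positives = sum(1 for s in benign if s >= threshold)
    false_negatives = sum(1 for s in malicious if s < threshold)
    return false_positives, false_negatives

# To guarantee zero false negatives, the threshold must sit at or below
# the lowest malicious score -- which also sweeps in many benign events.
zero_fn_threshold = min(malicious_scores)  # 30
fp, fn = alert_counts(zero_fn_threshold, benign_scores, malicious_scores)
print(f"threshold={zero_fn_threshold}: {fn} false negatives, {fp} false positives")

# A higher threshold quiets the noise, but now real attacks slip through.
fp2, fn2 = alert_counts(50, benign_scores, malicious_scores)
print(f"threshold=50: {fn2} false negatives, {fp2} false positives")
```

With these made-up numbers, the zero-false-negative setting raises five spurious alerts, and the quieter setting misses two of the five attacks: there is no free lunch, which is why Pescatore stresses training on how to handle the false positives before acting on them.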
Barbara Filkins, research director for the SANS Analyst Program, agrees, adding that this example should serve as a wake-up call to the dangers of automation without proper engineering, training and contingency planning.
"Security and development engineers and professionals are not trained the way pilots are. They need to be aware of both known and unknown contingencies that can occur on the flight deck, contingencies that can demand immediate decision and reaction," she explains.
Assessing the unintended consequences of applications doing exactly what they are supposed to do would help reduce this type of risk, and is an important review element of any development or engineering plan, Filkins adds.
Boeing has openly discussed software and training fixes. The company is also proceeding cautiously with the fix for this and another related software problem, a sign that these lessons, learned the hard way, will hopefully lead to workable solutions.
"Aviation is multidimensional, and the industry is more risk averse because they are in the business of transporting human souls," says Filkins. "In that respect, they're more like healthcare, where security must play well with safety."
Deb Radcliff, Creative Director of the SANS Analyst Program, is considered one of the first consistent cyber crime reporters for the business community. Since 1994, her articles in Byte, SC Magazine, Computerworld, Networkworld, CSO, The Register, CNN Online and other publications have won her two Neal awards for investigative reporting. Her work has been translated into many different languages, cited in research and law journals, and used verbatim in college textbooks. In addition to her SANS work, Deb has spoken on business radio and at West Point, H.O.P.E. (Hackers on Planet Earth) and elsewhere.