Good and bad sides of data-driven decision-making
As we enter the middle of the third decade of the millennium, the impact of this shift has become apparent. Of course, there are many benefits, including greater customer satisfaction and improved product reliability, along with increased efficiency and cost savings. However, there’s a dark side to the current obsession with numbers and data.
For example, internal Boeing emails made public during investigations following the two fatal 737 Max crashes revealed financial and marketing data winning out over the warnings of design experts whose knowledge came from deep in the trenches. To accommodate the changed aerodynamics of the redesigned airframe and its larger, forward-mounted engines, a novel Maneuvering Characteristics Augmentation System (MCAS) was introduced.
Knowledge based on best practices and risk management dictated that pilots undergo additional simulator training in order to become familiar with the new system. Instead, senior management overruled that approach and mandated faster, cheaper desktop training.
Let’s take a look at other recent disasters that might have been prevented if long-established knowledge had been applied along with the numbers.
Wherever a myriad of laws and regulations enacted by multiple agencies exists, the likelihood of conflicts, contradictions, and errors increases. For example, in the mid-1980s, the National Institute for Occupational Safety and Health (NIOSH), part of the U.S. Centers for Disease Control and Prevention, published a series of guidelines and recommendations aimed at reducing injuries and fatalities on farms and in food processing facilities. Included among these were safety measures for evacuating excess methane produced by anaerobic wastewater treatment processes. This was a good thing, as this colorless, odorless gas can act as an asphyxiant, is flammable, and, in sufficient concentration, is explosive.
Fast-forward 20 years. The U.S. Environmental Protection Agency mandated the use of anaerobic lagoons and tanks that employ microbes to treat manure-laden wastewater. Other than the added cost, this doesn't necessarily pose a safety problem on small farms. That is not the case for large operations such as the South Fork Dairy Farm in Dimmitt, Texas, where in April 2023 a methane explosion and the resulting fire killed roughly 18,000 cows.
This was a perfect example of how connecting the dots between two or more regulating bodies via knowledge graphs or other KM approaches might have identified and mitigated the inherent risk. In fact, all that was really needed to prevent the disaster was the application of a few simple chemical engineering formulas based on established principles (knowledge), very little data (pressure, volume, temperature), and the probability of even a tiny spark. Note that in a purely data-driven world, a significant number of mishaps would need to occur to produce a sample size large enough to conclude that substantial risk was present.
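To make the point concrete, the "few simple formulas" amount to little more than the ideal gas law plus methane's well-established flammability range in air (roughly 5–15% by volume). The sketch below uses entirely hypothetical numbers for the headspace volume and methane generation rate; it illustrates the kind of back-of-the-envelope estimate that flags the risk without waiting for a sample of mishaps.

```python
# Illustrative sketch with hypothetical numbers: estimating when methane
# accumulating in an enclosed headspace enters its flammable range,
# using only the ideal gas law and known flammability limits.

R = 8.314  # universal gas constant, J/(mol*K)

# Methane's approximate lower and upper explosive limits in air (by volume).
LEL, UEL = 0.05, 0.15

def methane_volume_fraction(moles_ch4, headspace_volume_m3,
                            temperature_k=293.15, pressure_pa=101_325):
    """Volume fraction of methane in an enclosed headspace.

    Ideal gas law: V_ch4 = n * R * T / P, then divide by headspace volume.
    """
    v_ch4 = moles_ch4 * R * temperature_k / pressure_pa
    return v_ch4 / headspace_volume_m3

def is_flammable(fraction):
    """True when the mixture sits inside methane's explosive range."""
    return LEL <= fraction <= UEL

if __name__ == "__main__":
    # Hypothetical scenario: a 500 m^3 headspace above a covered lagoon,
    # methane accumulating at 100 mol/hour with no ventilation.
    rate_mol_per_hr = 100
    headspace_m3 = 500.0
    for hours in (1, 12, 24, 48):
        frac = methane_volume_fraction(rate_mol_per_hr * hours, headspace_m3)
        print(f"{hours:3d} h: {frac:6.2%} methane, flammable={is_flammable(frac)}")
```

Under these assumed numbers, the headspace crosses the lower explosive limit within roughly half a day, at which point any spark suffices; exactly the conclusion a handful of established formulas deliver with almost no data.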
Titan was a deep-sea submersible designed to explore sites such as the sunken RMS Titanic, resting on the ocean floor at a depth of 12,500 feet. The key word here is "designed." Instead of titanium or carbon steel, as is used in most Navy submarines, Titan's main structural elements consisted of novel combinations of carbon fiber, epoxy, and other materials, with titanium used only in the end caps.
The missing word in this case is "testing." Granted, some minimal nondestructive testing was performed using acoustic monitoring devices to detect changes in hull stress. But these tests were performed with no payload (people, in this case) and were not sufficient to establish acceptable safety margins. This was contrary to standard engineering practice, especially given the expectation of operating with human passengers in the extreme, unforgiving environment of the ocean floor. But, as in the case of the 737 Max, speed and profit margins won out over safety margins, with a similarly disastrous outcome.