Crossing the epistemic divide
This article continues the investigation into the distinction between natural and artificial systems: why it’s unnatural to force one to behave like the other, and how the two together can be better than either alone.
The relentless growth in digital processing speed, storage capacity, and bandwidth has fueled the illusion that we can solve today’s “wicked” problems through raw computational power alone. This has led to the current obsession with data analytics. The approach quickly breaks down in the face of the speed, volume, and complexity of the real world, much of which is amplified by that very same growth in technology. Yet that hasn’t stopped people from trying to use data to manage everything from elections to the global supply chain to the economy itself.
For example, in recent years, the U.S. Federal Reserve has adopted a data-driven approach to carrying out its congressional mandate of achieving maximum employment, stable prices, and moderate long-term interest rates. To this end, it has determined it needs to focus on two independent variables (interest rates and reserve requirements) and two dependent variables (consumer price inflation and unemployment). At present, the targets for the dependent variables are 2% inflation and 4.5% unemployment. If you have your KM hat on, you might be asking: “How the heck did they decide that?”
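The Fed’s framing can be pictured as a simple feedback loop: nudge the policy levers whenever the measured variables drift from their targets. The sketch below is a deliberately naive illustration of that mindset; the adjustment rule, coefficients, and starting values are assumptions made for exposition, not the Fed’s actual model:

```python
# Deliberately naive sketch of a "data-driven" policy loop.
# The adjustment rule and all coefficients are illustrative assumptions,
# not an actual model of Federal Reserve policy.

TARGET_INFLATION = 0.02      # 2% inflation target
TARGET_UNEMPLOYMENT = 0.045  # 4.5% unemployment target

def adjust_interest_rate(rate: float, inflation: float, unemployment: float) -> float:
    """Nudge the policy lever toward whichever target is being missed."""
    # If inflation runs above target, raise rates; if unemployment runs
    # above target, lower them. Real economies are not this separable.
    rate += 0.5 * (inflation - TARGET_INFLATION)
    rate -= 0.5 * (unemployment - TARGET_UNEMPLOYMENT)
    return max(rate, 0.0)

# One step: inflation at 3%, unemployment at 4% -> rate nudged upward.
new_rate = adjust_interest_rate(0.05, 0.03, 0.04)
print(f"{new_rate:.4f}")
```

The point of the caricature is that the whole economy gets compressed into two dials and two gauges; everything the model omits is, by construction, invisible to it.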
Fed officials have openly admitted to being “data-dependent,” focusing on data analytics as a key driver of policy decisions. Many organizations, both public and private, have followed suit.
Such efforts have often proven futile. One reason is that complex, adaptive systems made up of billions of minds (and, in the coming years, trillions of intelligent devices) can never be modeled accurately, given the sheer combinatorics involved. Despite this, many organizations still try to reduce everything to a few key variables and control levers, hoping for the best.
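To make the combinatorics concrete: even if we ignore higher-order effects and count only pairwise interactions among agents, the number of possible connections grows quadratically, and the number of possible interacting subsets grows exponentially. A minimal sketch (the population figures are illustrative round numbers):

```python
from math import comb

def pairwise_interactions(n: int) -> int:
    """Number of distinct pairs among n agents: n * (n - 1) / 2."""
    return comb(n, 2)

# A town of 10,000 people already has ~50 million possible pairs.
print(pairwise_interactions(10_000))         # 49995000

# Roughly 8 billion minds yield ~3.2e19 pairs -- and that is before
# considering groups of three or more, which number 2^n in total.
print(pairwise_interactions(8_000_000_000))  # 31999999996000000000
```

No fixed set of “key variables” can summarize a state space of that size, which is why reducing such systems to a handful of control levers keeps disappointing.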
As the world races ahead, purely data-driven approaches will become less attractive. Instead, we need to gain a deeper understanding of how to bridge the great divide that separates the artificial and the natural. That means striking a balance between algorithms and insight, and between trend analysis and foresight. This is where epistemology comes into play.
Epistemology deals with what is known, how it is known, and how it is expressed. By its very nature, the epistemology of a particular domain is bounded. Different epistemologies can even emerge within the same domain, which can give rise to serious miscommunication and misunderstanding. We see this in the seemingly endless debates between opposing political parties, or in trade negotiations between countries with totally different cultures and economic systems.
Epistemic gaps result when the human understanding of a domain, especially when derived by reasoning and inference over long periods of time, is at odds with “what the data is telling us.”
The field of medicine is a good example. Like economics, the medical profession tends toward data-driven approaches to the diagnosis and treatment of disease. Blood pressure, blood glucose, and cholesterol are common dependent variables, while doses of diuretics, DPP-4 inhibitors, and statins, respectively, are the independent variables. But the results have been inconsistent, even deadly. At more than 250,000 deaths per year, medical errors are now the third leading cause of death in the U.S., according to a recent study by Johns Hopkins Medicine.
Part of the problem results from the practice of establishing a standard set of protocols based purely on lab results obtained for a predefined (and narrow) set of variables. But when doctors have a “sense” that something needs to be done that’s contrary to those protocols (an epistemic gap), they hesitate to act out of fear of punishment. Over time, this has resulted in the emergence of other branches such as functional medicine, osteopathy, homeopathy, and integrative medicine. Each of these branches has its own bounded epistemology, which only adds to the confusion.
Education is another example, given its obsession with standardized numeric test scores and one-size-fits-all approaches to learning, such as the Common Core mathematics curriculum. The point is this: if humans have such a difficult time crossing epistemic barriers, how can we possibly expect computers to do any better?