
Dispatches from the edge


In this column, we’ve explored the tendency of our global socio-economic systems and enterprises to expand and contract, moving back and forth between the massively centralized and the widely decentralized. In the March 2016 installment, when big data and cloud computing were all the rage, we went so far as to predict the emergence of small data: tight, compact, and localized. But little did we know that not only would the data go small and local, but that processing would quickly follow. Welcome to the wild, wonderful world of edge computing. 

From the micro to the macro and back 

Edge computing simply means pushing information processing from a remote server farm somewhere out in the cloud back to the endpoints at which information is generated and/or applied. Those endpoints include the more than 30 billion IoT devices projected to be connected to the internet by 2025. 

Some obvious benefits of this shift include reductions in latency, bandwidth usage, and power consumption. This doesn’t mean that edge computing devices are disconnected from the cloud. Rather, the edge and the cloud play equally important roles according to the strengths and weaknesses of each. For example, latency is a weakness of the cloud that is offset by the proximity of edge computing. Yet information is still sent to the cloud, where greater processing and storage capacity make it possible to aggregate data from many thousands of other devices, refine the algorithms, and pass them back to the edge. 
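This division of labor can be sketched in a few lines of code. The sketch below is purely illustrative, assuming a hypothetical sensor scenario: the names (`EdgeNode`, `CloudAggregator`), the alert threshold, and the 1.5x recalibration rule are all invented for the example, not drawn from any real edge platform.

```python
# Illustrative sketch of the edge/cloud split described above.
# All names and parameters here are hypothetical.

class EdgeNode:
    """Processes raw readings locally; ships only compact summaries upstream."""
    def __init__(self, threshold):
        self.threshold = threshold  # decision parameter, periodically tuned by the cloud
        self.buffer = []

    def ingest(self, reading):
        # Low-latency local decision: act immediately, no round trip to the cloud.
        alert = reading > self.threshold
        self.buffer.append(reading)
        return alert

    def summarize(self):
        # Send an aggregate instead of raw data, conserving bandwidth.
        if not self.buffer:
            return None
        summary = {"count": len(self.buffer),
                   "mean": sum(self.buffer) / len(self.buffer)}
        self.buffer.clear()
        return summary


class CloudAggregator:
    """Pools summaries from many edge nodes and recalibrates their parameters."""
    def __init__(self):
        self.summaries = []

    def collect(self, summary):
        if summary:
            self.summaries.append(summary)

    def updated_threshold(self):
        # Global view the edge lacks: recompute a threshold across all nodes.
        means = [s["mean"] for s in self.summaries]
        return 1.5 * (sum(means) / len(means))


node = EdgeNode(threshold=10.0)
cloud = CloudAggregator()

for reading in [4.0, 6.0, 12.0]:  # 12.0 triggers an immediate local alert
    node.ingest(reading)

cloud.collect(node.summarize())             # only the summary crosses the network
node.threshold = cloud.updated_threshold()  # cloud pushes the refined parameter back
```

The point of the sketch is the shape of the loop, not the arithmetic: time-critical decisions happen at the endpoint, while the cloud's role shifts to aggregation and periodic recalibration.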

To be clear, cloud computing and big data are not going away anytime soon. And the teraflops and zettabytes will continue along their exponential growth trajectory, at least as long as Moore’s Law refuses to recognize Malthusian limits. But as literally billions of devices continue to come online, it makes little sense to house their data and processing capacity in some faraway place. 

Drivers and the need for KM 

A main driver of this trend has been a shift in the means of value production from the physical to the intellectual. This implies spending less time on widgets and instead focusing on connectivity and intelligence. As such, solutions have gone from independent stovepipes to complex and interwoven systems of systems, where small changes can cause large effects and vice versa. This demands increased awareness of complexity and its impacts on the entire ecosystem. 

This, in turn, drives systems architecture from the complicated (e.g., an aircraft carrier) to the simple (e.g., mobile apps). It’s truly a brave new world. Yet many will still attempt to apply the old classical systems engineering model of long, linear planning and design cycles with many moving parts and top-down control. But this will likely lead to significant system failure. 

Conversely, complex endeavors supported by resilient infrastructure will continue to succeed. Edge computing plays a major role in achieving that much needed resilience. 

The ever-increasing rate of change is another driver. Richard D’Aveni, a professor at the Tuck School of Business at Dartmouth, notes that our previously stable, predictable world is giving way to hyper-turbulent “edge of chaos” behavior. This compresses solution development time from the nearly infinite float (lag time) of classical systems engineering to a nearly negative float that cannot tolerate even agile-scale lags. 
