Beyond the hype: cognitive computing and your business, your job, your life
The noise level in the press about cognitive computing is rising dramatically—we can see this from Google Trends, which shows that the popularity of searches for “cognitive computing” on Google is up roughly 100 percent from a year ago. But a fair question about all this attention is: “Is the hype shedding more light on the issues and impacts surrounding cognitive computing? Or is it merely generating hot air and lots of careless page views?”
In this column, we will take a stab at evaluating the realistic impact of cognitive computing at a number of levels that affect us as professionals: at the level of our businesses, at the level of our jobs and at the level of our lives, both professional and personal. Each of those levels is the subject of ongoing research at the newly launched Cognitive Computing Consortium. (See the sidebar following this article.)
Starting at the business level, it’s immediately apparent that this is where we see a majority of the press attention at the moment. IBM, in particular, has almost single-handedly created the name and the category “cognitive computing.” Earlier this year, IBM CEO Ginni Rometty adopted the phrase “cognitive business” to describe the direction IBM and its clients would be going to turn information assets from big data to digital intelligence. While there is clearly an element of hype in IBM’s corporate message, it is not simply a public relations gesture. IBM has invested more than $1 billion in its own cognitive computing programs, created cognitive computing curricula in collaboration with leading educational institutions across the globe, established a venture arm to invest in cognitive computing startups, built a Watson partner community of some 500 third-party organizations and reorganized its business to create a Watson-branded set of products and services that it estimates at a revenue level of $18 billion in 2015.
New competition at cognitive level
Like IBM, other technology businesses will need to come up with near- to medium-term strategies to compete in new markets that leverage big data and software innovation to deliver cognitive-style intelligent applications across a wide range of business processes and industry functions. Software giants HP and SAS have recently started to use the term “cognitive,” and Facebook, Microsoft and Google have open sourced machine learning toolsets to developers. The tech industry has already become the lead early adopter of cognitive technologies. While most enterprise firms are still absorbing the last decade’s developments—cloud, big data and analytics—a new competition at the cognitive level is beginning in earnest.
Outside the tech industry, mainstream businesses will be feeling the impact of cognitive applications in the near term as well. For example, intelligent assistants of various kinds in the healthcare industry (a core focus for IBM and others) are already changing the way caregivers and healthcare payers go about their daily business. Similarly, the rapidly approaching arrival of self-driving vehicles is shaking many of the core assumptions of the auto industry. In financial services, cognitive trading and investment advisory services and multiple insurance-oriented applications are helping streamline formerly all-manual transactions.
As businesses try to develop their strategies for transitioning to the cognitive era, we need to consider the more personally pressing issue many pundits are raising: What happens to jobs? To my job? Pundits presumably feel immune from cognitive competition, but plenty of other professional roles appear to be in line for extinction. A lot has been written in the past six months about how cognitive software can be expected to make whole swaths of white collar professional workers extraneous: from lawyers and engineers to traders and investment advisors to programmers and educators. A forthcoming book by Tom Davenport and Julia Kirby, Only Humans Need Apply, is a thoughtful consideration of that issue. But many of the news pieces tend to disregard the many blue-collar jobs that are in line to be snuffed out by automation involving such innovations as intelligent robots, self-driving vehicles and delivery drones.
In our view, the claims about the demise of the carbon-based professional worker can be described by the famous Mark Twain misquote: “The reports of my death are greatly exaggerated.” At the same time, the claims that any role in which a cognitive application replaces a human worker will open up other tasks for that worker are Pollyannaish to a fault. There is no question that job contents and alignments will face a period of significant change, but in the near to medium term, machines will continue to support the work people do, not the other way around.
Overall, what impact might cognitive computing have on our lives as consumers, citizens, parents, etc.? We’ve already been alerted to trends by the entertainment world, in movies like 2013’s Her, which depicts the development of a man’s relationship with his super-intelligent digital assistant/love object. Let’s call that a vision firmly in the realm of fantasy at this stage. But the development of digital assistants across more and more areas of life is a trend already firmly established.
In Apple’s Siri, Google’s Now, Amazon’s Echo and Microsoft’s Cortana, we see shipping semi-cognitive products engaged with mass audiences. While those products currently have trouble demonstrating competence, much less super-intelligence like the operating system depicted in Her, their improvements have been steady and impressive in specific areas. At the same time, major advances in technology around challenging cognitive functions, like image recognition and machine translation, have enhanced the capabilities of a variety of consumer and business applications.
Cognitive computing today is like the embryonic chick beginning to peck its way out of the egg. No amount of overheated journalism will bring it out faster than its natural cadence. In what we read and hear today, neither the most optimistic visions nor the direst prophecies of doom will come to pass. What we can state will happen is that smart people will get better ideas about how these cognitive technologies and approaches can be bent to solve intrinsic human problems—natural ones and those that people have created. The trick will be to be smart about the new problems that cognitive tools will create, as well as enthusiastic about the advances they will bring.