Cognitive computing: Life imitates popular culture

The pivotal moment in The Imitation Game when viewers knew that Alan Turing and his Bletchley Park colleagues were going to win the day was also the moment they saw the first foreshadowing of cognitive computing: processes that paired man and machine to attack problems beyond the understanding of either alone.

Turing, the subject of the film (and of Andrew Hodges’ biography, Alan Turing: The Enigma, on which it is based), was the first to theorize that machines would someday become intelligent. In the subsequent six decades, television, books and movies have reflected popular culture’s underlying fascination with, and fear of, intelligent machines. Skynet tries to kill off the human race in the Terminator movies. The Machines enslave humanity in The Matrix trilogy. WOPR almost starts a thermonuclear war in WarGames, and in A.I., the child robot David’s slavish devotion to his human mother nearly leads him to kill her human son.

In stark contrast to those portrayals, the goal of cognitive computing is to help humans function better and more effectively. Though often lumped in with artificial intelligence, cognitive computing differs in that it works in conjunction with humans rather than independently of them. Whatever the label, the quest to make computers more human-like continues, and as the field has advanced, so, too, has its reflection in popular culture.

In 1968, moviegoers watching 2001: A Space Odyssey met HAL 9000, a computer that Roger Ebert described as “made by man in his own image and likeness … ” HAL’s seemingly human ability to reason might have felt familiar because Americans had already seen thinking computers on television, notably in Star Trek. In HAL, though, we see early artificial intelligence theory reflected on screen.

According to Michael Negnevitsky in his book Artificial Intelligence, researchers in the 1960s were trying to simulate the complex thinking processes of the human brain by inventing general methods for solving broad classes of problems. Questions remained about what complex processes lay behind human thinking, and HAL’s inability to resolve contradictory program commands reflects that uncertainty.

The 2013 movie Her gives us a different kind of intelligent computer. Samantha is an operating system that interacts with her human user and adapts to changing circumstances. Where HAL is unable to resolve a conflict with his programming, Samantha confronts conflicting information head on, resolving it dynamically as new information arises and situations shift.

Samantha begins life, if you will, after Theodore answers a few questions about himself. Her knowledge grows from that point as she learns through conversations, questions, reading and interactions, not just with Theodore, but with the world she operates in.

Samantha goes far beyond HAL’s simple ability to learn only from the information humans have given him. Instead, Samantha seeks information on her own, deciding not only what kinds of information to consume, but how to use that information to best effect. She draws Theodore into conversations in which he explores his own humanity and becomes a better version of himself.

Part of what makes Samantha feel like a real possibility is that the field of cognitive computing has produced accomplishments, in those broad classes of problems Negnevitsky talks about, that touch our everyday lives. Samantha communicates in natural language, joining the scores of computer voices that increasingly play a role in our day-to-day routines. Apple’s iPhone assistant, Siri, is a ubiquitous example, and voice recognition is now a fixture of customer engagement systems. Financial institutions run fraud detection programs built on artificial intelligence, and our phones use facial recognition to detect when we are looking at them and to help us take photos.

As yet, no machine can perform all of the functions that a cognitive computer should: adapt, interact, iterate and understand context. But every year, more machines and programs handle parts of the job. While the field continues to advance, we have the movies to remind us of the possible.
