The future is now: cognitive computing throughout the enterprise today


Long before it began, 2017 was being heralded as the year of artificial intelligence. Amid sweeping predictions from industry analysts and a growing number of horizontal use cases projected for cognitive computing in the near term, the social implications of these technologies threatened to rapidly dwarf their practical application to the enterprise today.

In fact, each new headline about autonomous vehicles, healthcare research, and virtual assistants only raised those expectations higher while obscuring one glaring fact that has been overlooked in all the zeal for the social transformation ascribed to cognitive computing.

Prakash Nanduri, co-founder and CEO of Paxata (paxata.com), notes that people in the industry say that if they can do cognitive computing, they can solve cancer. Nanduri, however, cautions, “I’m sorry, you cannot even begin to solve that problem until you have turned data into information.”

That is precisely what cognitive computing is doing for the enterprise today, and what it has been doing for the past couple of years. It plays a fundamental role in accelerating and automating the core disciplines of data management (data modeling, data quality, transformation, and integration) to fuel the applications and analytics that deliver the value data is prized for.

More importantly, it delivers those benefits for the most repetitive yet vital prerequisites of analytics and applications, at scale, and quickly enough to keep pace with the speed at which big data is generated. To that end, the elements of cognitive computing that are most widely deployed and most valuable to the enterprise today include:

  • Machine learning—Machine learning is arguably the most common manifestation of cognitive computing, so it is not surprising that it appears in so many forms. There is supervised and unsupervised machine learning (the former requires human intervention in the form of labeled examples, while the latter learns from the data on its own, according to Nanduri; a brief sketch of the distinction follows this list), as well as machine learning centered on automation and machine learning centered on recommendations. “When we think about how we’re going to build machine learning into a workflow, we try to think hard about whether this is a recommendation problem or an automation problem,” Eliot Knudsen, data science lead at Tamr (tamr.com), says. “It’s a little subtle but tends to be important in framing the work we do.”
  • Deep learning—Deep learning and neural network techniques resemble machine learning techniques, yet they rely on inference and learning from examples rather than on training against predefined rules, and that difference is profound. According to Jack Porter, president and CEO of Razorthink (razorthink.com), “The three big differentiators between machine learning and deep learning are machine learning can only handle a few variables—it kind of runs out of gas at about 20 variables; the accuracy levels of the predictions are in about the 50 percent range; and it has to detect something that’s linear, not non-linear.”
  • Natural language processing—NLP enables cognitive systems, and systems endowed with cognitive capabilities, to understand language as it is commonly used, which is invaluable for data quality work such as disambiguation, de-duplication, record matching, and keeping data fresh (the record-matching sketch after this list illustrates the idea). It also provides the basis for speech recognition and other language-enabled AI capabilities.
  • Semantic text analytics—Algorithms for semantic text analytics read and decipher text (in both structured and unstructured data) to glean its underlying meaning regardless of data type or format.
  • Computational algorithms—According to Nanduri, these algorithms rely on traditional mathematics to solve problems and may be used in rules engines for use cases such as determining credit scores.
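
To make the supervised/unsupervised distinction above concrete, the short Python sketch below fits a classifier on human-labeled records and a clustering model on the same records with no labels at all. It is a minimal illustration only; scikit-learn and the toy spending figures are assumptions of this sketch, not tools or data cited in the article.

```python
# Minimal sketch: supervised vs. unsupervised learning on the same records.
# scikit-learn and the toy spend/visit figures are illustrative assumptions,
# not tools or data cited in the article.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Toy customer records: [monthly_spend, site_visits]
X = np.array([[20, 1], [25, 2], [30, 2], [400, 30], [420, 28], [390, 25]])

# Supervised: humans supply the labels (0 = casual, 1 = frequent buyer),
# and the model learns the mapping from features to labels.
y = np.array([0, 0, 0, 1, 1, 1])
classifier = LogisticRegression().fit(X, y)
print(classifier.predict([[35, 3], [380, 27]]))  # expected: [0 1]

# Unsupervised: no labels at all; the model groups similar records on its own.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(clusters)  # two clusters corresponding to the two spending patterns
```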
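
The de-duplication and record-matching chores mentioned under natural language processing can be sketched just as briefly. The example below leans on simple string similarity from Python's standard library rather than a full NLP stack; the sample customer records and the 0.75 cutoff are illustrative assumptions, not values from the article.

```python
# Minimal sketch of record matching / de-duplication via string similarity.
# difflib stands in for the heavier NLP matching a production system would use;
# the sample records and the 0.75 cutoff are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

records = [
    "Acme Corporation, 12 Main St.",
    "ACME Corp., 12 Main Street",
    "Global Widgets LLC, 40 Elm Ave.",
]

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity between two record strings, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Flag likely duplicates for review (or for automatic merging when confidence is high).
for left, right in combinations(records, 2):
    score = similarity(left, right)
    if score >= 0.75:
        print(f"Probable duplicate ({score:.2f}): {left!r} <-> {right!r}")
```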

Those facets of cognitive computing are regularly found in the enterprise today, particularly in self-service options for preparing data for analytics and applications. They attest to the fact that, whatever else cognitive computing may achieve, it is already having a considerable impact on business users' data-driven practices.

“To reach an information-driven world where we get the outcome we want, we have to start with data and use smart cognitive computing techniques to make data information,” Nanduri says.

Data modeling

The most practical effect of cognitive computing on data preparation may well be in data modeling, the foundation of many data-centric processes, which it influences in multiple ways. Knudsen says, “Modeling can mean one of two things. It can mean building a machine learning or predictive analytics model, or it can mean the exercise of what goes through to build a data model to build the datasets that you need to do your analytics.” Machine learning in particular is critical to expediting data modeling across datasets because it can recommend the quickest and most effective way to map data for merging. Often, users simply have to approve those mappings, which provides an overarching form of human oversight for the cognitive process. “Mapping different columns to different attributes is very repetitive,” Knudsen explains. “You have to do it over and over again. We build recommendations into a workflow like that to make that process more efficient.”
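
A minimal sketch of the mapping-recommendation workflow Knudsen describes might look like the following. The canonical attributes, the alias lists, and the similarity scoring are hypothetical stand-ins for what a data-unification product would actually learn from training data; the point is simply that the repetitive matching is automated while a person approves the result.

```python
# Minimal sketch of a column-to-attribute mapping recommender with human approval.
# The canonical attributes, alias lists, and similarity scoring are hypothetical
# stand-ins for the learned models a data-unification product would use.
from difflib import SequenceMatcher

CANONICAL_ATTRIBUTES = {
    "customer_name": ["cust name", "client", "customer"],
    "postal_code": ["zip", "zipcode", "post code"],
    "order_total": ["amount", "total", "order value"],
}

def recommend_mapping(source_column: str) -> tuple[str, float]:
    """Recommend the canonical attribute whose known aliases best match a source column name."""
    best_attr, best_score = "", 0.0
    for attr, aliases in CANONICAL_ATTRIBUTES.items():
        for alias in aliases + [attr]:
            score = SequenceMatcher(None, source_column.lower(), alias.lower()).ratio()
            if score > best_score:
                best_attr, best_score = attr, score
    return best_attr, best_score

# The repetitive matching is automated; the analyst only confirms or corrects it.
for column in ["Cust_Name", "ZipCode", "Order Value"]:
    attribute, confidence = recommend_mapping(column)
    print(f"Recommend mapping {column!r} -> {attribute!r} (confidence {confidence:.2f})")
    # In a real workflow, the user approves or rejects the suggestion at this point.
```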

In other instances, AI can automate facets of data modeling outright, particularly when building predictive models at scale in real time. In those cases, machine learning and deep learning can evolve data models nearly instantaneously. Given the speed and scale at which AI can carry out that process, it makes data modeling flexible and agile, “based on the data themselves,” Porter says. “It’s the only way to do it. You can’t base it on the schema of the data; you can’t base it on anything else. There’s no rule of thumb. The thing that drives the architecture of the model is the actual data, what the events are in the data themselves.”
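
Porter's point that the data themselves drive the architecture of the model can be illustrated with a small schema-inference sketch: instead of starting from a predefined schema, the structure below is derived from whatever fields appear in the sample event records. The events and the use of pandas are assumptions of this sketch, not details from the article.

```python
# Minimal sketch: let the data, not a predefined schema, drive the model.
# The sample events and the use of pandas for type inference are illustrative assumptions.
import pandas as pd

events = [
    {"user": "a17", "action": "click", "ts": "2017-03-01T09:12:00", "value": 0.0},
    {"user": "b42", "action": "purchase", "ts": "2017-03-01T09:13:30", "value": 59.99},
    {"user": "a17", "action": "click", "ts": "2017-03-01T09:14:10", "value": 0.0, "device": "mobile"},
]

# Flatten whatever fields actually appear in the events; a new field (like "device")
# simply extends the inferred model instead of breaking a fixed schema.
df = pd.json_normalize(events)
df["ts"] = pd.to_datetime(df["ts"])

# The inferred data model: column names and types discovered from the events themselves.
print(df.dtypes)
```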
