
Is Machine Learning the Only Game in Cognitive Town?

New fields almost always go through phases of hype and confusion, and cognitive computing is a case in point. Cognitive computing refers to a new generation of systems that are able to learn from observation, make intelligent decisions, and interact more naturally with humans. An essential technology that makes these three steps possible is natural language processing (NLP), both as a source of insights (for example, scientific insights extracted from research publications, customer insights from social media and surveys, or competitive insights from news) and as a medium for interacting with humans. But, seemingly because vendors rely so heavily on novelty as a source of differentiation, this particular area has lavished attention on machine-learning approaches, creating confusion and a risk of deep disillusionment.

The Universal Applicability Fallacy

A first level of confusion has arisen directly from biased representations of the tradeoffs offered by machine learning, leading to the false notion that it applies universally well to all use cases.

In fact, a well-known drawback of machine learning is its black-box nature, which creates governance issues: It is difficult to explain why the system arrives at a particular conclusion, and to correct it when it is erroneous. A concrete example occurred when a leading IT vendor's machine-learning-based chatbot had to be taken offline less than 24 hours after launch because it learned and reproduced other users' racist language. The potentially serious consequences of erroneous conclusions in areas such as healthcare or homeland security suggest that machine learning alone may not always offer the right tradeoffs for these use cases. A second characteristic of machine learning is its labor intensity: it requires large training sets that must be built and maintained over time to ensure quality results. Such labor intensity may not align well with common enterprise use cases, which are typically more TCO-focused.

The Sole Solution Fallacy

A second layer of confusion in the market derives from the sheer share of voice associated with machine learning, possibly leading the unsuspecting observer to believe it might well be the only available path to delivering on cognitive goals.

But in fact, NLP is hardly a recent development and already has a track record. Like cognitive computing itself, which some simply view as a new stage in the artificial intelligence market, it has been developing for several decades, and has arguably already hit the mainstream, with enterprise-grade platforms in production at leading organizations in the information industry, pharmaceuticals, and government, for example.

A proven approach to NLP relies at its core on a deep understanding of language built from two key components. The first is an engine that reads text in much the same way humans do: identifying each word and its role in the sentence in order to recognize the logical structure of sentences and ultimately understand their meaning. The second is a knowledge graph that maps words to unique concepts and relates those concepts to one another. The interplay between these two components allows such a platform to overcome the ambiguity of language and provides the machine with structured information (also called triples) mapped to a conceptual model (an ontology or taxonomy), addressing the learning stage and enabling the decision-making stage of cognitive computing.

From the standpoint of its users, this approach offers realistic out-of-the-box performance rather than the labor intensity associated with machine learning. Since NLP hit the mainstream, a fundamental expectation has been usability, requiring a deliberate effort by vendors to integrate their solutions more seamlessly and to provide tools that make the configuration, governance, and maintenance of their platforms as easy and predictable as possible, avoiding the black-box approach. Platforms that meet this requirement are already providing their users with real-world cognitive capabilities today, within the governance and low-TCO expectations of mission-critical enterprise deployments.
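To make the interaction between the two components concrete, the following minimal Python sketch shows how a toy "engine" might map the words of a simple sentence to concepts in a small knowledge graph and emit a structured triple. The lexicon, concept names, and naive subject-verb-object assumption are illustrative inventions for this example only; they do not describe Expert System's Cogito technology.

# Minimal, hypothetical sketch of the two components described above:
# (1) an engine that identifies each word's role in a simple sentence, and
# (2) a knowledge graph that maps words to unique, disambiguated concepts.

# A toy knowledge graph: surface words mapped to concept IDs, each related
# to a broader concept (a tiny ontology/taxonomy).
KNOWLEDGE_GRAPH = {
    "aspirin":  {"concept": "Aspirin",  "is_a": "Drug"},
    "inhibits": {"concept": "inhibits", "is_a": "Relation"},
    "cox-2":    {"concept": "COX-2",    "is_a": "Enzyme"},
}

def extract_triple(sentence: str):
    """Read a simple subject-verb-object sentence and return a structured
    triple whose members are concepts rather than raw keywords."""
    tokens = [t.strip(".,").lower() for t in sentence.split()]
    concepts = [KNOWLEDGE_GRAPH[t] for t in tokens if t in KNOWLEDGE_GRAPH]
    if len(concepts) == 3:                        # naive SVO assumption
        subj, pred, obj = (c["concept"] for c in concepts)
        return (subj, pred, obj)
    return None

if __name__ == "__main__":
    print(extract_triple("Aspirin inhibits COX-2."))
    # -> ('Aspirin', 'inhibits', 'COX-2')

A production platform would, of course, perform full syntactic analysis and disambiguation across millions of concepts; the point here is only the shape of the output: concepts and relations mapped to a conceptual model, rather than raw keywords.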

A Hybrid Path Forward

Granted, machine learning holds promise. But delivering on that promise will require adapting it to the governance and TCO constraints of enterprise use cases, within which proven NLP solutions already operate today. The path forward may in fact leverage "the genius of the and" to deliberately combine, rather than replace, proven technologies with elements of machine learning in a hybrid platform, one that draws on the distinct strengths of these respective NLP strategies to give end users a richer set of capabilities and cost-benefit tradeoffs matching their particular use cases. [Disclosure: Expert System has chosen precisely this path.]
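As a purely illustrative sketch of how such a hybrid routing could work (the function names, rules, and thresholds below are assumptions for this example, not a description of any vendor's product), a document might first pass through an explainable, knowledge-based categorizer, with a statistical model invoked only when the rules are not confident:

# Illustrative sketch of "the genius of the and": route each document through
# a knowledge-based (explainable) categorizer first, and fall back to a
# machine-learning model only when the rules are not confident.

def knowledge_based_categorize(text: str):
    """Explainable, rule/taxonomy-driven categorization (stub)."""
    rules = {"invoice": "Finance", "diagnosis": "Healthcare"}
    for keyword, category in rules.items():
        if keyword in text.lower():
            return category, 0.95           # high confidence, auditable rule
    return None, 0.0

def ml_categorize(text: str):
    """Stand-in for a trained statistical classifier (stub)."""
    return "General", 0.60                   # lower, unexplained confidence

def hybrid_categorize(text: str, threshold: float = 0.8):
    category, confidence = knowledge_based_categorize(text)
    if confidence >= threshold:
        return category, "knowledge-based"   # governed, explainable path
    category, _ = ml_categorize(text)
    return category, "machine-learning"      # fallback path

print(hybrid_categorize("Quarterly invoice for services rendered"))
print(hybrid_categorize("An unrelated note about travel plans"))

The design choice illustrated here is the cost-benefit tradeoff discussed above: the governed, auditable path handles what it can explain, while the machine-learning path broadens coverage where the investment in training data justifies it.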


Expert System is a leading provider of cognitive computing and text analytics software based on the proprietary semantic technology of Cogito. The products and solutions based on Cogito’s advanced analysis engine and complete semantic network exceed the limits of conventional keywords, and offer a complete set of features including: semantic search, text analytics, development and management of taxonomies and ontologies, automatic categorization, extraction of data and metadata, and natural language processing.

 
