Next-generation natural language technologies: The deep learning agenda
The most appreciable advancements in statistical AI, those with the greatest potential to improve data's worth to the enterprise, are deep learning deployments of computer vision and natural language technologies.
The distinctions between these applications involve much more than image recognition versus speech or language recognition. Horizontal computer vision use cases pertain to aspects of inter-machine intelligence, e.g., scanning videos or production settings for anomalies and generating alerts that initiate automated procedures to address them.
Conversely, natural language technologies provide the most effective cognitive computing application for furthering human intelligence, decision making, and the action required to extract business value from such perceptivity.
While the utility derived from image recognition largely varies according to the vertical, the capability for machines to understand natural language—for humans to interact with databases in layperson’s terms across sources—strikes at the core of converting the unstructured data of language into informed action.
Few organizations, regardless of their industry, could not benefit from this capacity. The application of deep neural networks and other machine learning models for this universal use case presents the greatest win for the enterprise, resolves the issue of unstructured data, and is currently taking the form of the following capabilities:
♦ Natural language generation: According to Forrester, natural language generation systems (such as those associated with Alexa and conversational AI systems) leverage “a set of rules, templates, and machine learning to generate language in an emergent, real-time fashion.” Accomplished solutions in this space rely on basic precepts of deep learning to generate text for an array of use cases.
♦ Smart process automation: The impact of equipping bots and other means of process automation with algorithms from cognitive statistical models is unprecedented. Instead of simply implementing the various steps necessary for workflows, such approaches can actually complete them by rendering decisions conventionally relegated to humans.
♦ Spontaneous question-answering: Answering sophisticated, ad hoc questions across data sources has always posed a challenge for machine intelligence options. When backed by deep learning techniques and other aspects of AI, organizations can overcome this obstacle to profit from any unstructured, written data they have.
No one can deny the merits of deploying cognitive computing to accelerate data preparation or make back-end processes easier. However, the aforementioned applications of natural language technologies shift that ease and expedience to the front end. They're the means of directly empowering business users with the peerless predictions of deep learning and, more importantly, redirecting its business value from fringe use cases to those impacting mission-critical business processes.
Natural language generation
When applied to natural language technologies, deep learning’s chief value proposition is the capacity to issue predictions—with striking accuracy, in some cases—about language’s composition, significance, and intention. Models involving deep neural networks facilitate these advantages with a speed and facility far surpassing conventional, labor-intensive methods of doing so. According to AX Semantics CTO Robert Weissgraeber, “Neural networks, trained with deep learning, are used in the generation process for grammar prediction, such as finding the plural of ‘feature’ or ‘woman.’”
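As a rough illustration, the grammar prediction Weissgraeber describes can be mimicked with a rule-based stand-in. This is a hypothetical sketch only; AX Semantics uses trained neural networks, not hand-written rules like these:

```python
# Hypothetical rule-based stand-in for neural grammar prediction,
# e.g., finding the plural of "feature" or "woman".
IRREGULAR_PLURALS = {"woman": "women", "man": "men", "child": "children"}

def predict_plural(noun: str) -> str:
    """Return the predicted plural form of an English noun."""
    if noun in IRREGULAR_PLURALS:
        return IRREGULAR_PLURALS[noun]
    if noun.endswith(("s", "x", "z", "ch", "sh")):
        return noun + "es"
    if noun.endswith("y") and noun[-2] not in "aeiou":
        return noun[:-1] + "ies"
    return noun + "s"

print(predict_plural("feature"))  # features
print(predict_plural("woman"))    # women
```

A trained model generalizes to unseen words from character patterns, which is precisely what makes the neural approach preferable to maintaining rule tables like this one.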
Natural language generation has swiftly become one of the most useful facets of natural language technologies. Both Gartner and Forrester have recently developed market reports monitoring its progress. More importantly, it’s also revamping BI by accompanying everything from visualizations to reports with natural language explanations. Perhaps even more significantly, natural language generation-powered systems have expanded from conversational AI applications to include “product descriptions, automated personalized messaging like personalized emails, listing pages like the Yellow Pages, and select journalism applications like election reporting, sports reporting, and weather,” Weissgraeber noted.
Natural language generation’s rise can be partly attributed to its extension of natural language processing (which is transitioning from being trained by rules to being trained by machine learning models) to include responses. Specifically, natural language generation employs natural language processing components such as dependency parsing and named entity extraction to analyze what the user writes, and then creates hints that make the user’s configuration faster, Weissgraeber explained.
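A minimal sketch of that analyze-then-hint loop, assuming a naive stand-in for named entity extraction (production systems use trained parsers and entity extractors; every name and rule here is illustrative):

```python
def extract_entities(text: str) -> list[str]:
    # Naive stand-in for named entity extraction: treat capitalized,
    # non-sentence-initial tokens as entities.
    tokens = [t.strip(".,") for t in text.split()]
    return [t for t in tokens[1:] if t[:1].isupper()]

def suggest_hints(text: str) -> list[str]:
    # Turn each detected entity into a configuration hint for the user.
    return [f"Map '{e}' to a data field" for e in extract_entities(text)]

hints = suggest_hints("Describe the Acme portfolio for Berlin clients")
```

Here `hints` proposes mapping "Acme" and "Berlin" to data fields, approximating how an NLG system can speed up configuration by reacting to what the user has already written.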
The principal components of natural language generation systems include:
♦ Data extraction and generation: This tool chain handles what Weissgraeber termed “classic data processing.”
♦ Topic- and domain-dependent configurations: Natural language generation systems rely on this component to analyze data’s meaning.
♦ Word/phrase configurations: These configurations are used to select different phrases based on the desired meaning.
♦ Textual management: These elements bring the “text together, with grammar prediction, correct sentences, text length, and formatting,” Weissgraeber said.
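The four components above can be wired together as a toy pipeline. This is a hypothetical sketch under assumed data and phrasing rules, not any vendor's actual architecture:

```python
def extract_data(record: dict) -> dict:
    # 1. Data extraction and generation: "classic data processing".
    return {"name": record["name"], "change": record["close"] - record["open"]}

def analyze_domain(data: dict) -> str:
    # 2. Topic-/domain-dependent configuration: assign meaning to the data.
    return "gain" if data["change"] > 0 else "loss"

def choose_phrase(meaning: str) -> str:
    # 3. Word/phrase configuration: select a phrase for the desired meaning.
    return {"gain": "climbed", "loss": "slipped"}[meaning]

def render_text(data: dict, verb: str) -> str:
    # 4. Textual management: assemble a grammatical, formatted sentence.
    return f"{data['name']} {verb} {abs(data['change']):.2f} points."

record = {"name": "ACME", "open": 101.5, "close": 103.0}
data = extract_data(record)
sentence = render_text(data, choose_phrase(analyze_domain(data)))
print(sentence)  # ACME climbed 1.50 points.
```

In a real system, the phrase selection and grammar steps are where the deep learning models described earlier do their work; the hard-coded dictionaries here simply mark where those models plug in.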
When combined with a system push or API for delivery, these natural language generation characteristics utilize deep learning for stunningly sophisticated use cases. Forrester indicates that in finance, this “technology can review data from multiple sources, including external market conditions and a client’s investment goals and risk profile, to produce a personalized narrative for each of an advisor’s clients.”
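The Forrester scenario can be sketched in the same spirit: merge market data with a client's goals and risk profile to produce a personalized narrative. All field names and the advice rule are illustrative assumptions, not the product Forrester describes:

```python
def advise(market: dict, client: dict) -> str:
    # Combine an external source (market conditions) with a client profile
    # to produce one personalized narrative per client.
    tone = ("stay the course" if client["risk"] == "conservative"
            else "consider rebalancing")
    return (
        f"Dear {client['name']}, the market moved {market['trend']} this quarter. "
        f"Given your goal to {client['goal']}, we suggest you {tone}."
    )

note = advise(
    {"trend": "upward"},
    {"name": "Jordan", "goal": "retire by 2040", "risk": "conservative"},
)
```

Run for every client in an advisor's book, a loop like this yields a distinct narrative per client, which is the personalization Forrester highlights.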