
The expanding world of knowledge management


Text analytics and natural language processing

With the volume of unstructured content continuing to increase, the ability to extract key information is more important than ever. Text analytics is used to gain insights from sources such as reports, technical documentation, maintenance records, blogs, client correspondence, and social media. It can be used for fraud detection, risk management, and sentiment analysis, and to augment quantitative results from BI applications. According to Mordor Research, the text analytics market will grow from $5.46 billion in 2019 to $14.84 billion in 2025, a growth rate of more than 17% per year.

A combination of rules-based analysis and machine learning is advocated by many vendors. Machine learning alone requires so much training data that many organizations cannot use it effectively. Some vendors run the machine learning step first and then impose rules, while others pre-categorize the content with a set of rules and then apply machine learning to this more focused set of content.
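To make the rules-first variant concrete, here is a minimal sketch assuming a keyword rule set and scikit-learn for the machine learning step; the rule patterns, toy training data, and fraud-screening framing are illustrative assumptions, not any particular vendor's pipeline.

```python
# Hypothetical sketch: pre-categorize documents with keyword rules,
# then apply a machine learning classifier only to the focused subset.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative rule set: route payment-related documents to fraud screening.
RULES = [re.compile(p, re.I) for p in (r"\binvoice\b", r"\brefund\b", r"\bwire transfer\b")]

def rule_precategorize(docs):
    """Keep only documents matching at least one rule (the focused set)."""
    return [d for d in docs if any(r.search(d) for r in RULES)]

# Toy training data for the ML step (labels: 1 = suspicious, 0 = routine).
train_docs = [
    "duplicate invoice submitted twice for the same wire transfer",
    "refund requested to an account that does not match the payer",
    "routine invoice for monthly office supplies",
    "standard refund processed per the return policy",
]
train_labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_docs, train_labels)

incoming = [
    "customer praised the support team on social media",    # filtered out by rules
    "third refund this week routed to a new wire transfer",  # reaches the ML step
]
focused = rule_precategorize(incoming)
print(list(zip(focused, model.predict(focused))))
```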

One application of text analytics is converting unstructured information into structured data that can be analyzed to show correlations between different measures. Sentiment analysis is a good example: text analytics classifies comments as positive, negative, or neutral and seeks to relate them to customer retention, lifetime revenue, or other measures, with more granular analyses for different customer segments. These metrics fuel business decisions.
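As a rough illustration of that conversion, the sketch below tags comments with a naive lexicon-based sentiment label and rolls the labels up into a per-segment table that could then be joined to retention or revenue figures; the lexicon, segments, and comments are invented for the example.

```python
# Illustrative sketch: derive structured, per-segment sentiment metrics
# from free-text comments using a naive keyword lexicon (not a production model).
from collections import Counter, defaultdict

POSITIVE = {"great", "love", "helpful", "fast"}
NEGATIVE = {"slow", "broken", "cancel", "frustrating"}

def label(comment: str) -> str:
    words = set(comment.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

# (customer segment, raw comment) pairs -- hypothetical data.
comments = [
    ("enterprise", "support was fast and helpful"),
    ("enterprise", "the new release is broken and frustrating"),
    ("smb", "love the product, great value"),
    ("smb", "thinking about whether to cancel"),
]

by_segment = defaultdict(Counter)
for segment, text in comments:
    by_segment[segment][label(text)] += 1

# Structured output, ready to correlate with retention or revenue measures.
for segment, counts in by_segment.items():
    total = sum(counts.values())
    print(segment, {k: round(v / total, 2) for k, v in counts.items()})
```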

Natural language processing is critical to interpreting text and supporting conversational AI. It can be applied equally well to internal operations and to external customers. Chatbots are often the gateway for contacting a company through its website; they can also identify potential customers on social media and initiate an interaction with them. Research and Markets predicts this market will grow at nearly 20% per year, from $8.3 billion in 2018 to $22.9 billion in 2024.

AI and machine learning

AI has a long and storied history that shows no sign of reaching an end point. From ambitious projects in the 1980s that exceeded the available computing power to current implementations embedded in nearly every aspect of daily life, it continues to evolve. According to data from the National Venture Capital Association, AI startups raised more than $18 billion in 2019, even as overall venture capital funding declined. Among these 1,356 companies were one automating contract management, a service helping small businesses select insurance policies, and another simplifying the writing of distributed programs. The field of AI innovators is dynamic and thriving.

When it comes to implementing an enterprisewide AI program, however, it's a different story. According to an Accenture study of 1,500 C-level executives, 84% are convinced that they need to use AI to achieve their growth objectives. However, 76% report that they struggle to scale such projects. A roughly equal percentage believe they will go out of business within 5 years if they do not scale up.

Yet many companies, about 80% according to Accenture, get stuck before they can scale. The ones that succeeded in implementing a full program were those that had a clear AI strategy, an operating model linked to the company's business objectives, and a team to implement the strategy. The most successful companies had developed an AI culture supported by enterprise digital platforms, in essence a digital transformation, and they had the ROI to prove it.

Most knowledge management technologies are incorporating AI into their functionality to improve efficiencies and extend capabilities. One of the enabling technologies for AI is machine learning, which uses algorithms to either predict outcomes or cluster information into meaningful groups. Supervised learning is generally used to build and test models that predict outcomes, while unsupervised learning is used to detect patterns. Many tools are now available to facilitate machine learning, and it is a fast-growing sector. Grand View Research estimates that the machine learning market, valued at about $7 billion in 2018, will grow at an annual rate of 44% from 2019 to 2025.
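A minimal scikit-learn sketch of that distinction, on invented numeric data: the supervised model is fit on labeled examples to predict an outcome, while the unsupervised model groups unlabeled points into clusters.

```python
# Toy contrast between supervised prediction and unsupervised clustering.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Hypothetical feature matrix: [support tickets per month, logins per week].
X = np.array([[1, 20], [0, 25], [8, 2], [9, 1], [2, 18], [7, 3]])

# Supervised: labels are known (1 = churned), so the model learns to predict them.
y = np.array([0, 0, 1, 1, 0, 1])
clf = LogisticRegression().fit(X, y)
print("churn prediction for [6, 4]:", clf.predict([[6, 4]])[0])

# Unsupervised: no labels; the algorithm simply detects groupings in the data.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster assignments:", clusters)
```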

Governance and compliance

The global enterprise governance, risk, and compliance market is an established one that has undergone more evolutionary than revolutionary change, but recently enacted privacy regulations have shaken up this sector. The complexities of the global economy, in which regulations vary by country and fines can be high, have made compliance an imperative. The ability to identify individuals across multiple databases and to forget them on request poses new challenges for companies in almost every vertical. This is a large and steadily growing market; a variety of research reports predict the 2025 market will be about $60 billion, with an annual growth rate of about 12% from 2019 through 2025.
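To picture the "forget on request" challenge, here is a purely illustrative sketch that looks up one individual's identifiers across several in-memory stores standing in for databases and erases the matching records; real implementations must also cover backups, logs, and downstream copies, which this toy omits.

```python
# Illustrative right-to-erasure sketch: find and delete one data subject's
# records across several stores keyed in different ways (toy in-memory dicts).
crm = {"cust-001": {"email": "jane@example.com", "name": "Jane Doe"},
       "cust-002": {"email": "bob@example.com", "name": "Bob Ray"}}
billing = {"jane@example.com": ["inv-17", "inv-33"],
           "bob@example.com": ["inv-21"]}
support = [{"email": "jane@example.com", "ticket": "T-9"},
           {"email": "bob@example.com", "ticket": "T-4"}]

def forget(email: str) -> dict:
    """Remove every record tied to `email` and report what was erased."""
    erased = {"crm": 0, "billing": 0, "support": 0}
    for cust_id in [k for k, v in crm.items() if v["email"] == email]:
        del crm[cust_id]
        erased["crm"] += 1
    if email in billing:
        erased["billing"] = len(billing.pop(email))
    before = len(support)
    support[:] = [t for t in support if t["email"] != email]
    erased["support"] = before - len(support)
    return erased

print(forget("jane@example.com"))  # audit trail of what was removed
```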

Despite the motivating factors, the kind of data-driven culture needed to succeed in governance continues to be slow to emerge.

A 2019 survey of leading corporations by NewVantage Partners indicated that a significant majority had neither a data culture nor a data-driven organization. In fact, the percentage identifying themselves as data-driven declined from 37% to 31% over a period of 3 years. Barely half even claimed to treat data as a business asset. These results are consistent with an earlier AIIM study in which only a tiny fraction of respondents thought their information governance programs reached a level of excellence, and only one-quarter felt that information governance and security were high on senior management's agenda.

End-to-end data governance is required for data-driven digital transformation, but technology can take this change only so far. In the NewVantage survey, 7% of respondents cited technology as an obstacle, versus 93% who cited people and process. Technology can help by auto-classifying data, implementing retention policies, and employing robotic process automation for activities such as master data management. But designing and implementing a governance program is a long-term commitment that requires full participation of all stakeholders and dedication to ensuring data quality, along with a mindset that treats data as an asset. At that point, the organization can have a real edge over those that are not making this commitment.
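The auto-classification and retention piece can be sketched with a small rule-driven example; the record types, patterns, and retention periods below are invented for illustration, not policy guidance.

```python
# Illustrative auto-classification: map documents to record types and
# retention periods using simple patterns (hypothetical categories and terms).
import re

RETENTION_RULES = [
    # (record type, content pattern, retention in years)
    ("contract",  re.compile(r"\b(agreement|party of the first part)\b", re.I), 7),
    ("invoice",   re.compile(r"\b(invoice|amount due)\b", re.I),                7),
    ("hr_record", re.compile(r"\b(performance review|offer letter)\b", re.I),   5),
]
DEFAULT_TYPE, DEFAULT_YEARS = "general", 2

def classify(text: str):
    """Return (record_type, retention_years) for a document's text."""
    for record_type, pattern, years in RETENTION_RULES:
        if pattern.search(text):
            return record_type, years
    return DEFAULT_TYPE, DEFAULT_YEARS

docs = [
    "This agreement is entered into by the undersigned parties...",
    "Invoice #2211, amount due within 30 days.",
    "Meeting notes from the weekly project sync.",
]
for d in docs:
    print(classify(d), "<-", d[:40])
```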
