Successfully implementing natural language processing and GenAI with knowledge management

The explosive popularity of AI has bled into every facet of modern business, including knowledge management. In particular, integrating natural language processing via large language models (LLMs) and AI with private knowledge platforms offers significant value.

Experts from Datavid joined KMWorld’s webinar, Integrating LLMs With a Private Knowledge Platform, to discuss the benefits, latest advancements, and best practices for integrating AI/LLMs into private knowledge platforms.

Clive Smith, sales director, Datavid, explained that while the hype around generative AI (GenAI) and LLMs has made adoption a daunting prospect for many enterprises, it has also encouraged widespread education about these technologies.

Datavid, a data consultancy firm that specializes in extracting business value from structured and unstructured data, is responsible for putting that education into practice. With a firm focus on complex data problems and creating information assets, Datavid recognizes GenAI and LLMs as notable strategies for connecting employees to their data when they need it.

Smith explained that Datavid has identified three use cases well suited to GenAI:

  • Enhanced search
  • Process automation
  • Knowledge worker productivity or “AI assistants”

Tim Padilla, director, sales and consulting North America, Datavid, offered an interesting caveat about exploring those GenAI/LLM use cases, explaining that, “as we adopt the technology, it's important to stay within our processes. This will help ensure that you get to something that will provide usable value.”

Padilla further added that, “generative AI and predictive machine learning are here, but [are] still emerging technologies.” While this technology is beginning to be seen as essential and inescapable, it is still rapidly changing and evolving with business needs.

AI initiatives can easily get off-track, noted Padilla, so managing the emerging technology with a foundational focus on business value, knowledge management, and data strategy will be key to producing tangible value.

Padilla then broke down what AI technology really consists of: the AI model, a computational representation of knowledge that learns from data during its training phase and builds a representative map of that data.

Knowledge, then, is essential for AI models to solve complex problems, according to Padilla. Your knowledge management platform is crucial to solving data problems and surfacing transformative insights, he explained.

Citing a case study in which Datavid helped a large scientific society and publisher build an app that enabled researchers to find relevant content and uncover insights, Padilla introduced these best practices for achieving that reality with AI:

  • Frame the problem of implementation with “how might we” questions (or in non-technical terms) that enhance the competence of AI answers.
  • After defining and understanding the problem, align these ideas to your data strategy with action (identify data that supports solving the framed problem and determine data usage).
  • Use knowledge graphs to organize data and as the enabling technology for AI models.
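To make that last point concrete, the sketch below shows how a small knowledge graph might organize a handful of facts and hand them to an LLM as grounding context. It is a minimal illustration rather than anything shown in the webinar: the rdflib library, the example entities, and the prompt format are all assumptions made for demonstration purposes.

```python
# Minimal sketch (assumption: rdflib installed; entities are illustrative, not
# from the webinar) of a knowledge graph organizing facts for an LLM prompt.
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

# Organize a few domain facts as triples (subject, predicate, object).
g.add((EX.Aspirin, RDF.type, EX.Compound))
g.add((EX.Aspirin, EX.treats, EX.Inflammation))
g.add((EX.Aspirin, RDFS.label, Literal("aspirin")))

# Retrieve the facts relevant to a user question with SPARQL.
results = g.query("""
    PREFIX ex: <http://example.org/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?label ?condition WHERE {
        ?compound a ex:Compound ;
                  rdfs:label ?label ;
                  ex:treats ?condition .
    }
""")

# Turn the retrieved triples into plain-text context for the model prompt.
context = "\n".join(
    f"{label} treats {condition.split('/')[-1]}" for label, condition in results
)
prompt = f"Answer using only these facts:\n{context}\n\nQuestion: What treats inflammation?"
print(prompt)
```

The same pattern scales from a toy graph to an enterprise knowledge platform: the graph organizes the data, and the query layer decides which facts reach the model.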

Silvia Chirila, co-founder, CTO, Datavid, dove into the technical aspects of integrating AI and LLMs with the knowledge base. There are three steps to getting started with AI/LLM and knowledge integration:

  1. Decide upon the model you want to use.
  2. Decide upon the usage/querying pattern.
  3. Measure the success of your selection.
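As one way to approach step 3, the snippet below sketches how a retrieval-style querying pattern could be scored against a small hand-labelled test set using hit rate and mean reciprocal rank. It is a minimal illustration, not drawn from the webinar; `fake_search` is a hypothetical stand-in for whichever model and querying pattern were chosen in the first two steps.

```python
# Minimal sketch of step 3 (measuring success) for a retrieval-style pattern.
# `search` is whatever model/querying pattern was selected in steps 1 and 2.
from typing import Callable

def hit_rate_and_mrr(test_set: list[tuple[str, str]],
                     search: Callable[[str], list[str]],
                     k: int = 5) -> tuple[float, float]:
    """Return hit-rate@k and mean reciprocal rank over (question, expected_doc_id) pairs."""
    hits, rr_sum = 0, 0.0
    for question, expected in test_set:
        ranked = search(question)[:k]          # top-k document ids from the chosen pattern
        if expected in ranked:
            hits += 1
            rr_sum += 1.0 / (ranked.index(expected) + 1)
    n = len(test_set)
    return hits / n, rr_sum / n

# Toy usage: a fake search function and two hand-labelled questions.
def fake_search(question: str) -> list[str]:
    return ["doc-2", "doc-7", "doc-1"]

tests = [("What treats inflammation?", "doc-7"),
         ("Which compounds are approved?", "doc-9")]
print(hit_rate_and_mrr(tests, fake_search))    # (0.5, 0.25) for this toy data
```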

Chirila then walked webinar viewers through different models and usage/query patterns, comparing their level of complexity, the resources they require, their maintainability, time-to-market, and features. She also demonstrated Datavid’s unified sample architecture, which leverages data, semantic/cognitive search engines, and GenAI-based chat.
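For readers who did not see the demo, the outline below gives a rough sense of how such a search-plus-chat pattern fits together. It is a structural sketch under stated assumptions, not Datavid’s actual architecture: `semantic_search` here is a toy keyword ranker standing in for a real semantic/cognitive search engine, and `generate` is a placeholder for a real LLM endpoint.

```python
# Structural sketch of a search-plus-chat (retrieval-augmented) pattern.
# NOT Datavid's architecture: the ranker and the LLM call are placeholders.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

def semantic_search(query: str, index: list[Document], k: int = 3) -> list[Document]:
    """Placeholder ranking by keyword overlap; a real system would use embeddings."""
    scored = sorted(index, key=lambda d: -sum(w in d.text.lower() for w in query.lower().split()))
    return scored[:k]

def generate(prompt: str) -> str:
    """Hypothetical LLM call; swap in the chosen model's client here."""
    return f"[model answer based on prompt of {len(prompt)} chars]"

def answer(query: str, index: list[Document]) -> str:
    """Retrieve supporting passages, then ask the chat model to answer from them."""
    passages = semantic_search(query, index)
    context = "\n".join(f"- {d.text}" for d in passages)
    return generate(f"Context:\n{context}\n\nQuestion: {query}")

docs = [Document("d1", "Knowledge graphs link entities across research papers."),
        Document("d2", "The publisher's archive spans decades of journals.")]
print(answer("How are entities linked across papers?", docs))
```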

Ultimately, Chirila pointed to these key takeaways:

  • A lean approach to an AI-based business case is still the way to go.
  • Leverage your existing knowledge management systems, or start designing one.
  • Do not fall under the spell that “AI will solve it all”; collaboration is key.

For an in-depth discussion of successfully integrating knowledge repositories with AI/LLMs, you can view an archived version of the webinar here.