The latest AI/ML technologies for optimizing knowledge management
Managing knowledge in a modern enterprise is as multifaceted as it is ever-evolving. AI and machine learning (ML) technologies can deliver game-changing optimization in a data landscape that demands agility, efficiency, and innovation. In the realm of knowledge management (KM), AI/ML solutions that tackle the finding, capturing, and sharing of knowledge can supercharge enterprises both big and small.
Experts from Access, Verint, and Lucidworks joined KMWorld’s webinar, Game-changing technologies in machine learning and AI, to explore the latest and greatest AI/ML solutions and strategies that empower, fortify, and maximize the value of knowledge management throughout an enterprise.
According to Jason Butler, VP of conversion services at Access, indexing is key to effective KM: from faster, easier access to accurate classification, risk mitigation, and cost reduction, great knowledge indexing is a hallmark of successful KM.
Accordingly, AI technologies have a critical role to play in knowledge indexing through data discovery, retrieval, and automation, explained Butler.
Traditional indexing requires significant manual labor, and the resulting access is muddled by scattered Excel files and slow retrieval. Fortunately, AI-powered indexing, using optical character recognition (OCR) techniques, allows organizations to capture more data, faster.
Additionally, AI can mitigate data risk, increasing overall trust in indexed data by raising both the frequency with which the machine correlates record types to known information and the rate at which the machine is confident in its selections.
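One way to picture this confidence-driven approach is a classifier that assigns each captured document a record type and a confidence score, routing low-confidence items to human review. The keyword scoring, threshold, and function names below are illustrative assumptions, not the Access implementation.

```python
# Hypothetical sketch of confidence-gated indexing: documents whose
# predicted record type scores below a threshold go to human review.

def classify_record(text: str, keywords: dict[str, list[str]]) -> tuple[str, float]:
    """Score each known record type by keyword hits; return the best type and its confidence."""
    tokens = text.lower().split()
    scores = {
        record_type: sum(tokens.count(w) for w in words) / max(len(tokens), 1)
        for record_type, words in keywords.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]

def index_documents(docs: list[str], keywords: dict[str, list[str]],
                    threshold: float = 0.05):
    """Split documents into auto-indexed and human-review buckets."""
    auto, review = [], []
    for doc in docs:
        record_type, conf = classify_record(doc, keywords)
        target = auto if conf >= threshold else review
        target.append((doc, record_type, conf))
    return auto, review
```

In practice the scorer would be a trained model rather than keyword counts, but the routing logic (auto-index when confident, escalate otherwise) is the part that builds trust in the indexed data.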
Butler also emphasized that AI can lower enterprise costs through automation, reducing onboarding time and overall expenses. According to Access research, automation can cut total onboarding time by as much as 12 months while dramatically reducing expenses.
John Chmaj, senior director of KM strategy at Verint, centered his conversation on the Verint platform with Da Vinci AI, an open CCaaS (Contact Center as a Service) offering built on a shared data infrastructure further fueled by AI-based classification and search.
The Verint platform and Da Vinci AI inject AI into business workflows, introducing more automation and elevated CX to an organization. The platform delivers several components of AI-powered innovation, including knowledge automation, simplified KM, cognitive search, contextual knowledge, application integration, and more.
Chmaj further delved into the principles by which Verint Da Vinci AI (the core of the Verint platform, according to the company) delivers an open approach spanning both proprietary and industry models, keeping pace with the ever-changing AI industry.
Firstly, Da Vinci AI operates from an AI philosophy that emphasizes the augmentation of the human workforce for maximum impact, rather than the replacement of humans. Da Vinci empowers humans and bots to work together to drive impact from CX automation, ultimately seeing greater results than a simplistic “get rid of humans” approach.
Da Vinci AI also benefits from the following design principles:
- Data-driven, with continuous improvement models that use real engagement data and evolve over time
- Open and secure, embedding proprietary and commercial AI models into business workflows with appropriate guardrails
- Responsible, using tools and governance processes to eliminate bias
Patrick Hoeffel, head of partner success at Lucidworks, acknowledged that genAI is great at a few things: synthesizing existing text into an answer, understanding existing semantic meaning, writing specific snippets of code, and sounding convincingly authoritative.
However, Hoeffel added that genAI remains weak in some crucial areas, including providing specific factual details and adhering to copyright, cybersecurity, and data privacy requirements.
To mitigate genAI's weak points, Hoeffel emphasized a few best practices for adopting this popular technology, including:
- Answer difficult questions with your data using RAG (Retrieval Augmented Generation).
- Leverage a co-pilot or digital assistant.
- Clean up dirty data as it’s being loaded.
- Classify or group data according to its type.
- Add meaningful descriptions in various languages to enrich data.
Hoeffel then explained terms that are particularly relevant when discussing genAI, including prompt engineering (the art and science of influencing the results LLMs produce in terms of length, style, tone, format, content, and more) and RAG (a technique that grounds genAI output in retrieved data and is commonly used as a remedy for hallucinations).
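Prompt engineering in the sense described above often amounts to wrapping a question with explicit constraints. The helper below is a simple illustrative sketch (the function name and defaults are assumptions, not from the webinar) showing how length, tone, and format instructions are prepended to a question.

```python
# Illustrative prompt-engineering sketch: one question, explicit
# constraints on length, tone, and output format.
def engineer_prompt(question: str, *, max_words: int = 50,
                    tone: str = "neutral", fmt: str = "bulleted list") -> str:
    """Wrap a question with length, tone, and format instructions."""
    return (f"Answer the question below in at most {max_words} words, "
            f"in a {tone} tone, formatted as a {fmt}.\n"
            f"Question: {question}")
```

Varying these parameters and comparing the model's responses is the day-to-day work of tuning LLM output for length, style, tone, and format.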
For an in-depth review of the latest AI/ML technologies and strategies, including demos, use cases, case studies, and more, you can view an archived version of the webinar here.