The ability to personalize is an indicator that an organization is managing its content well enough to deliver what each individual needs.
Modern content platforms must support digital form factors, bridge information silos, deliver insight and intelligence, handle new content types, provide governance and security, accelerate cloud adoption, operate at extreme scale, and make it easy to build new apps.
What distinguishes the KM programs that stand the test of time is their sheer determination and dogged approach to evolving ever closer to the business.
Many real-world studies include analyses of data from sources such as anonymized electronic medical records (EMR) and insurance claims.
Understanding natural language processing, the common obstacles it presents, and methodologies to overcome them.
The easiest way for marketers to create a unified communications strategy is by using a content management system.
Compliance with new U.S. data privacy laws requires the right information management strategy.
For RPA to progress beyond automating simple repetitive tasks with fixed rules, enterprises will need to turn their RPA robots into "smarter" robots that can process a wide variety of unstructured content.
Given the increased negative media exposure that comes from project failure, organizations need more tightly integrated, intelligent project management systems, in addition to people who have the requisite skills. This need will grow as systems continue to become more complex and timelines more tightly compressed.
No matter how much "intelligence" is programmed into a computer, it will very likely never understand the results it produces. Understanding them takes human cognition, intuition, judgment, and the other ways we humans make sense of data.
In the field of knowledge management, of course, the idea of turning data into information into knowledge has been a foundational concept for knowledge managers. But frankly, the ability to achieve this alchemy of data to knowledge has not been broadly demonstrated in practice. A next-generation information refinery is required to make something meaningful and valuable out of the raw data flying around the firm and throughout the internet economy.
We're familiar with the near-term portion of the time spectrum, from femtosecond lasers used in eye surgery to high-frequency trading in milliseconds on the major securities exchanges. Unfortunately, the extreme opposite end of the time spectrum, the "deep future," receives little if any attention. Decisions in fields such as genetic engineering, nuclear energy, geopolitics, and the like can have serious implications for human civilization. But the impact of those decisions might not become apparent for many thousands of years and hundreds of generations.