Achieving success in KM
I was struck by the text-centric nature of the presentations at the Content Management Strategies Conference in April in Chicago. An interesting talk addressed the challenge of document versions in an organization. The presentation deck is available at http://bit.ly/1KHreVn.
“DITA Branch and Merge: A Dream or a Nightmare?” focused on an information access challenge I frequently mention. Most enterprise information access systems struggle to provide one-click access to, say, the PowerPoint deck the president used for the analyst meeting on a specific date. Most enterprise search systems simply generate a list of documents matching the user’s query. The president’s name is no help; the result set is too broad. The file type is of little use; PPT and PPTX files are the new black in most organizations. Plugging in the name of the analyst firm may work, were it not for the president’s habit of leaving his PowerPoint decks undated and without helpful labels such as the name of the financial services company.
According to Tristan Mitchell of DeltaXML: “Run DeltaXML Merge on changed files and transform the resulting merge document to let the user select the changes to apply. Save the merged file in the content management system.”
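The merge-then-select workflow Mitchell describes can be sketched with Python’s standard difflib. To be clear, this is my own minimal illustration of the idea, not DeltaXML’s API: the function name and the per-change accept callback are invented for the example.

```python
import difflib

def merge_with_selection(original, revised, accept):
    """Merge two versions of a document (as lists of lines).

    `accept` is called once per detected change and decides whether the
    user takes the revised lines (True) or keeps the original (False).
    """
    sm = difflib.SequenceMatcher(a=original, b=revised)
    merged = []
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == "equal":
            merged.extend(original[i1:i2])      # unchanged text passes through
        elif accept((tag, original[i1:i2], revised[j1:j2])):
            merged.extend(revised[j1:j2])       # user accepts this change
        else:
            merged.extend(original[i1:i2])      # user keeps the old text
    return merged

v1 = ["Title", "Old paragraph", "Footer"]
v2 = ["Title", "New paragraph", "Footer", "Appendix"]
take_all = merge_with_selection(v1, v2, lambda change: True)
take_none = merge_with_selection(v1, v2, lambda change: False)
```

The point of the sketch is the `accept` hook: the mechanics of merging are routine; the hard part, as the column goes on to argue, is getting “the user” to make those decisions at all.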
The words that arrested my attention were “the user.” In most of the organizations with which I have some experience, getting “the user” to do certain knowledge-centric tasks is a difficult hurdle. Asking a person to make a decision about changes can create anxiety or require a meeting. Often changes are requested by another person, and the “owner” of the document is a colleague in a design, graphics or marketing role.
I understand that once “a user” makes those decisions, the system can create a properly coded container and display the needed version to the person looking for the document.
From one perspective, an organization implementing a document architecture such as DITA (Darwin Information Typing Architecture) has made an important information governance decision. When the methodology is correctly implemented and applied by trained users, manual Easter egg hunts for documents and for specific versions of those documents are eliminated. If the organization does work for a government entity requiring formal information typing, a DITA-compliant system will meet the requirements for document management.
Wikipedia offers a helpful explanation of DITA and points the user to a number of supporting topics. References to XML are brief. As I worked through the explanation, it was clear the authors of the Wikipedia entry assumed the reader already had knowledge of XML, metadata and elements.
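For readers without that background, the assumption can be made concrete. A minimal DITA concept topic looks like the following; the id, title and dates are invented for illustration, not taken from any system mentioned above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE concept PUBLIC "-//OASIS//DTD DITA Concept//EN" "concept.dtd">
<concept id="analyst-deck-2015-q2">
  <title>Analyst Meeting Deck, Q2 2015</title>
  <prolog>
    <critdates>
      <created date="2015-04-15"/>
    </critdates>
  </prolog>
  <conbody>
    <p>Slides presented to the analyst firm on 15 April 2015.</p>
  </conbody>
</concept>
```

Note what the markup buys: the creation date and the topic identity are explicit elements, not something a searcher must infer from a filename, which is exactly the gap in the undated PowerPoint scenario described earlier.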
Let’s shift gears. Consider this silver bullet: XML. Extensible Markup Language evokes text tagging, text operations and text-centric authoring tools. Yet XML can also be applied to rich media such as videos, digital images, audio recordings and binary files.
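What “XML applied to rich media” means in practice is usually an XML metadata record that describes the binary asset and makes it findable. Here is a small sketch using Python’s standard library; the element names and the sample file path are illustrative assumptions, not a published schema.

```python
import xml.etree.ElementTree as ET

def describe_media(path, media_type, duration_s, keywords):
    """Build an XML metadata record for a rich-media asset.

    The binary file itself stays on disk; the XML carries the
    searchable facts about it. Element names are illustrative only.
    """
    asset = ET.Element("asset", attrib={"type": media_type})
    ET.SubElement(asset, "location").text = path
    ET.SubElement(asset, "duration", attrib={"unit": "s"}).text = str(duration_s)
    kw = ET.SubElement(asset, "keywords")
    for word in keywords:
        ET.SubElement(kw, "keyword").text = word
    return ET.tostring(asset, encoding="unicode")

record = describe_media("media/analyst_2015.mp4", "video", 1800,
                        ["analyst meeting", "Q2 results"])
```

Once a video or image carries a record like this, it can be indexed and retrieved with the same machinery used for text documents.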
In my files is a paper written six years ago by Qing Li and four co-authors titled “Rich Media Indexing and Retrieval in an Object XML Database System.” (A version of the paper is available from the IEEE via DOI 10.1109/JCPC.2009.5420063.) The paper explains that with the application of the Structural Join Index Hierarchy (SJIH) indexing technique, the researchers were able to deal with rich media such as Adobe Flash objects and PowerPoint presentations.
In 2015, vendors are talking about “fusion” and “federation.” Every few years, a trend from the Yesterday Store is located, dusted off and presented as the next big thing.
I keep a copy of “Enterprise Data Architecture Principles for High-Level Multi-Int Fusion” in my ready reference file. (The jargon “multi-int” means “multiple intelligence sources.”) Typically those include text, numerical data such as geo coordinates, log files, imagery (still or video) and the assorted types of files found in offices. One diagram (page 14, KMWorld, Volume 24, Issue 8, or download the chart) has made this 2012 article by Marco A. Solano and Gregg Jernigan, both of Raytheon, a touchstone for me.
The diagram helps me put the discussion about the importance of XML in context. It makes clear that “architectural segments”—the decisions about hardware, software, network infrastructure and access controls—are the core of a complex, interdependent beast with two arms. There are the applicable standards: ISO 9001:2008, XML 1.0, ECMA-376, etc. Then there is the second arm, labeled “transitional processes.” The arms embrace vision, strategic direction and principles.
The diagram triggers memories of dozens of discussions with in-house content processing teams. The meetings have taught me that there is a five-stage process through which the team moves. The concept mirrors the Kübler-Ross model, which may have influenced my thinking about my own information access work experiences.
The first stage is denial. The teams usually balk at thinking about information access, content management and content processing as “big picture,” top management problems. The majority of the engagements exposed me to individuals who wanted a quick-and-easy solution. The operative concept was that the problem was “easy.” Technology had made significant advances. Therefore, how difficult would it be to license a system and get an integrator to plug in the standard functions required to deal with the available content?
The second stage is anger. After I presented my worksheets, checklists and case studies with costs and time lines, most of the teams were grumpy if not downright mad. The reality of dealing with digital content, versions and security issues irritated team members. If a senior manager were privy to the discussions, the work became more difficult. Denial by a new hire working on the website is different from denial voiced by a senior vice president. When fantasy and hope collide with reality and dollars, the crash can be painful.
The third stage is wheeling and dealing. Kübler-Ross called it “bargaining.” The idea is that tradeoffs will reduce the magnitude of the job. I recall one situation in which the solution was to use separate Google Appliances to index each unit’s content. Then the person with the idea asked, “How difficult is it for a 22-year-old to write some scripts that run the query across 11 different Google Appliances?” The person then answered his own question, “Not too tough.” Once the shortcuts and tradeoffs have been analyzed, the teams admit that the narrow path through the thornbush of downsides is not going to be very pleasant.
The fourth stage is depression. I call it the “sticking brake problem.” The team works less and less enthusiastically. The magnitude of the job dulls the team’s enthusiasm. The concern about what to tell top management adds to the uncertainty team members often experience. Software is complex; enterprise software can be perceived as mind-numbing complexity. The team grinds along at the speed a glacier would find sluggish. The spur to the team is often the concern that the company seems to want a “decision or else.” Cheerleading from one or two go-getters is an unlikely source of renewed enthusiasm.
The fifth stage is acceptance. At this point, the team is willing to do the very difficult work of figuring out exactly what content has to be managed, processed and made available. The specifics are narrowed into must-have and nice-to-have requirements. The hardware, software and network resources are identified and itemized. Armed with that nuts-and-bolts data, the team meets with senior management to find out exactly what the organization’s commitment to the system will be. If there is no commitment, the status quo will remain in place until another content crisis or a new senior manager decides that action must be taken to address the challenges information access presents.
Stepping back, is it possible to identify the underlying barrier between the content an organization wants to manage and the realization of a system that delivers what employees, managers and contract employees need?