Data and Content Integration: The Inflection Point for Better Business

I have a fantasy, and it goes like this: If all the money that is currently spent from corporate IT budgets on integrating new applications into the enterprise suite could be recovered, and that money could be re-directed toward something noble like affordable housing or universal healthcare, it would significantly change the world for many, many people.

That would work, right?

“Nah, it would never happen,” says John Bellegarde. “Think of it this way. Back in the old days, everyone wrote their own accounts-payable system. Now they don’t—they buy off the shelf. And IT costs haven’t gone down. They just spend their money elsewhere. So data integration tools can reduce some particular costs. But other demands will crop up to take their place.”

So much for my fantasy.

But John, who is Vice President of Product Management for Hummingbird, definitely has a point: There always has been, and always will be, a dynamic pushing organizations to create greater and greater value, because competition constantly drives them that way. So data integration, also called enterprise application integration (EAI), which I have always thought of as a cost-avoidance technology, now has street cred as a value-creating tool? When did that happen?

“The fact is, it’s both,” says John, accurately. “We have customers who have bought for either one of the reasons, but have gained both benefits. They buy for a tactical reason, but are pleased to find out they can gain a strategic advantage.

“The business users want technology to help them,” John continues. “But they don’t want to know all the details. That’s one of the great things that data integration technologies can offer: they let IT people integrate systems, and make everything work together, while the businesspeople can do their thing without worrying about the details.”

“It’s hard to separate the process from the tool,” adds Mathias Evin, Hummingbird’s product manager for Data Integration Solutions. “Here’s a typical example. A business user will tell you ‘I’m typing this invoice on one system. Then for processing I need to enter the same invoice into another system. Would it be possible for IT to make it so I only have to enter that invoice once?’ This is a typical case where people are investing in technology to optimize system efficiency.

“Then there’s the second example: The business user needs to extract, from the many repositories of raw data, just the necessary information into a report that will help him make the right decision. In this case, the investment in data integration is to improve a decision-making process.” Two very different drivers for the purchase of similar technology.
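Mathias's first example, the invoice keyed into two systems, can be pictured as a tiny point-to-point integration: a record captured once in a source system is mapped into the target system's shape and pushed across automatically. The sketch below is purely illustrative; the systems and field names are invented, not any vendor's API:

```python
# Minimal sketch of point-to-point integration: an invoice entered once in a
# "billing" system is mapped and propagated to a "processing" system, so the
# business user never has to re-key it. All names here are invented.

def map_invoice(source_record):
    """Translate the billing system's field names into the processing system's."""
    return {
        "inv_no": source_record["invoice_id"],
        "vendor": source_record["supplier_name"],
        "total": source_record["amount"],
    }

billing_system = [{"invoice_id": "INV-001", "supplier_name": "Acme", "amount": 250.0}]
processing_system = []

# One automated pass replaces the second round of manual data entry.
for record in billing_system:
    processing_system.append(map_invoice(record))

print(processing_system)
```

The mapping function is where the real integration work lives; everything else is plumbing.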

These simple illustrations have another thing in common—they refer only to integrating two systems with one another. But in the real world, organizations have 10, 12, 25 systems that they want to link together in the same way. The resulting spaghetti of interrelations can be more frustrating than the problem it set out to solve.

“That’s why our strategy is to generate a tool that will encompass all business requirements, and integrate all applications, all types of data that you have. That’s why we have a scripting language that can go beyond the limitations of straight mapping between point A and point B, and why we have expanded the scope of our tool to provide a single solution that will solve all business requirements,” says John.

It’s a tall order, but approaching the many integration challenges with a universal application layer is one way to solve a tactical problem while having a strategic view at the same time.

When users buy such an integration platform, they have to look down the road at potential conflicts, and hope to avoid integration efforts and costs that might occur in the future. “When they buy a product, they have to have the vision to buy one that is extensible along the dimensions that their IT organization finds important,” says John. So, despite the tendency for business owners to avoid the dirty details of IT integration, they have to have some inkling of the goals and drivers for IT.

Is the reverse true? Does IT need to be more “business-process” savvy as a result of this partnership between church and state? “Well, they have to code the rules for the business process, so they have to understand part of the process. But they don’t set the rules; those need to be translated by the departmental managers in such a way that IT will understand,” says John. Luckily, some of that work is usually already done—more on that in a minute.

Meanwhile, the burden does seem to fall more heavily on the IT side these days. “If it’s a loan-origination application, the IT people have to understand loan origination. There’s no free lunch for IT in this matter,” says John.

If there’s any break IT gets, it’s that the business processes are usually already codified in the line-of-business systems in place. In most cases there’s a well-established jumping-off point, and, typically, the rules already work pretty well. The goal is not to rip and replace, but to use the existing logic in a new, improved way.

Leveraging for Success

The systems in place, like the invoice example mentioned earlier, are probably already automated in SAP or another ERP system. So—luckily for IT—it’s a matter of translating, not inventing. The degree to which the ERP system can successfully expose an interface to the invoice system will determine the success of the integration effort. Like almost everything, it’s about leverage: If you have already done two things, you can more easily do the third by building on the first two.

“If you’re combining two different kinds of repositories, that’s one problem. But if you’re combining 10, and five have a common element, you only have to be concerned when you’re deriving information from the five that it’s the right common element. From a metadata perspective, there can be a capability built into the tool that proves the lineage of data. The problems can get more complex, but there are still tools to solve them,” says John.
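John's point about proving lineage can be pictured with a toy structure in which every derived value carries a trail of the repository and field it came from, so a consumer can check that a report figure traces back to the right common element. This is a sketch under invented names, not a description of any actual tool:

```python
# Toy illustration of data lineage: each derived value keeps a trail of the
# repository and field it came from, so a consumer can verify that a figure
# in a report was built from the intended common element. Invented names.

class TracedValue:
    def __init__(self, value, source):
        self.value = value
        self.lineage = [source]  # list of (repository, field) steps

    def derive(self, new_value, step):
        """Return a new value that remembers this one's history plus a new step."""
        derived = TracedValue(new_value, step)
        derived.lineage = self.lineage + [step]
        return derived

revenue = TracedValue(1200.0, ("erp", "invoice.total"))
grossed_up = revenue.derive(revenue.value * 1.25, ("tax", "gross_up"))

print(grossed_up.value)    # 1500.0
print(grossed_up.lineage)  # [('erp', 'invoice.total'), ('tax', 'gross_up')]
```

The point is simply that lineage rides along with the data, rather than being reconstructed after the fact.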

If there are common data elements that cross departmental disciplines, it seems logical that the vendor would take advantage of those elements to “cross-sell” and encourage the organic growth of integration tools throughout the organization. The old foot in the door, right?

“Yes, but it’s really more the IT department itself that comes to that conclusion,” says John. “Once they’ve solved one tactical problem, they look for other areas to tie in. They know that if they can use data integration technology, they can get it done in ‘X’ time. But if they use standard off-the-shelf program development tools—SQL and C—they know it will take them some order of magnitude of ‘X’ to do the same work.”

“There is no way for a technology to solve the entire integration problem,” says Mathias. He uses the example of something simple, like a P&L chart, or, as they say over there, a turnover matrix. “In the dictionary, they’re the same thing. But in practice, there are so many variations based on cultural differences, various regulatory conditions between countries, etc., that there HAS to be a person” who makes the decisions about the business rules, translating and documenting these very subtle yet critical differences among systems.

So it’s clear that the IT professional of today plays a very different role than his counterpart of, say, 10 years ago. He is no longer just the guy who keeps the computers and networks operating. Instead of buying tools, he buys infrastructures. He has a far more strategic position in the organization ... correct?

“The organizations that are successful see it that way,” agrees John. “Because they see their information repositories as assets. They make the latent information in the organization actionable. That’s what ETL (extract, transform and load) tools do for you. The technology enables it, but it’s the executive who then acts on this new data.”
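The ETL pattern John names can be reduced to three small functions: extract rows from an operational source, transform them into the reporting shape, and load them into a target store. The following is a minimal sketch with made-up names; a real ETL tool adds scheduling, error handling, and metadata on top:

```python
# Minimal extract-transform-load (ETL) sketch: pull raw rows from a source,
# reshape them for the reporting store, and load them. All names are invented
# for illustration only.

def extract(source):
    """Read raw records from an operational system."""
    return list(source)

def transform(rows):
    """Normalize the raw rows into the reporting schema."""
    return [
        {"customer": r["cust"].strip().title(), "revenue": float(r["amt"])}
        for r in rows
    ]

def load(rows, warehouse):
    """Append the transformed rows to the reporting store."""
    warehouse.extend(rows)
    return warehouse

operational_db = [{"cust": " acme corp ", "amt": "1200"}, {"cust": "globex", "amt": "800"}]
warehouse = load(transform(extract(operational_db)), [])
print(warehouse)
```

The transform step is where latent information becomes actionable: inconsistent names and stringly-typed amounts come out as clean, comparable records.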

Content in Context

Some years back I lived in the East Village of New York City. In reference to the standards of fashion and dress, we had a saying: “Black is the new navy blue.” So I want to know: to what degree is content management the “new middleware”? With ECM companies increasingly acting like infrastructure companies, often downplaying their specific focus on unstructured data, the typical large ECM vendors (John’s bosses at Hummingbird being no exception) seem to be implicitly taking over the role of integration-layer vendors.

I have tried this theory on several ECM vendors, and haven’t gotten many bites. There’s a reluctance for them to claim competitive stature with the BEAs and Microsofts of the world. But they come close. “All of these content and data technologies are converging. No doubt about it,” says John. “It’s all about providing information as rapidly as possible to make decisions.” The form that raw data takes has little relevance when you see the ultimate goal as “better decision-making” (today’s IT mantra) versus “up-time” (yesterday’s less strategic technical mission).

“Content in and of itself needs structure added to it to be that middleware layer you talk about,” John argues. “The structure that you apply to content—not the content—is its hook into data integration. That’s the point of inflection.”

Mathias adds: “On top of the structured data integration tools, there are metadata management tools that will help organizations build the relationships between the structured data and unstructured documents and content. This helps people who are navigating through the many stores of information.” Metadata is the link between the data integration and the content management sides of the house.
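The metadata layer Mathias describes can be pictured as an index that relates structured records to the unstructured documents that mention them, so a user can navigate from one side of the house to the other. A toy sketch, with hypothetical names:

```python
# Toy metadata index linking structured records to unstructured documents.
# A shared customer ID in the metadata is the bridge; the documents themselves
# never have to be parsed. All names and IDs here are hypothetical.

crm_records = {
    "C-42": {"customer": "Acme Corp", "open_balance": 1200.0},
}

documents = [
    {"doc_id": "D-1", "title": "Acme contract renewal", "customer_id": "C-42"},
    {"doc_id": "D-2", "title": "Office move memo", "customer_id": None},
]

def documents_for(customer_id):
    """Navigate from a structured record to its related documents via metadata."""
    return [d["doc_id"] for d in documents if d["customer_id"] == customer_id]

print(documents_for("C-42"))  # the contract, but not the unrelated memo
```

Nothing here inspects document content; the relationship lives entirely in the metadata, which is exactly why it can span both sides of the house.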

Speaking of inflection points, the “document” appears to be winning out over the “report.” “From the end-user/decision-maker point of view,” Mathias points out, “what they look at are documents. It’s up to someone else to care about where the point of origin for the content in those documents is located. But for the end-user, the form that exposes structured AND unstructured data is a document.” Within those documents are bits of data from ERP systems, ECM systems, CRM systems ... that’s the role of data integration. But the end result: documents.

“You get a lot of value by blending these technologies,” John reminds us. “Because when the unstructured data is exposed to the decision-maker in the form of documents, you can now apply knowledge management technologies to something that used to be data.”

Where does this leave portals? “Remember, portals used to be thin veneers...windows into other applications,” says John. “Portals have grown up. In some cases they still provide that thin veneer, but the important portals are the ones that are contextual.” For example? “When you’re acting on information in one aspect of your portal, the rest of your user interface refreshes itself to reflect the work you are doing. Today’s portals provide much more utility to the users, by guiding them to information they should be seeing, based on the work they are doing at that particular time. In this way, portals are a way of exposing what the content management system is able to discover. So the portal and the ECM system work inextricably together.”

Preparing for our talk, I had expected to find these data integration guys rooted to the structured side of the house. “Documents, schmocuments,” is what I expected to hear. But to the contrary, EAI seems poised to erase the boundaries between data and documents in ways we have yet to see. This White Paper contains further evidence of this convergence, and it is in this “context” that I trust you’ll read on.
