Designing and managing the database
By Kim Ann Zimmermann
The database is the blueprint for business data, and it can be key to whether a business succeeds or fails.
Databases must be structured to provide quick and easy access to all of the information necessary to run an organization. That becomes extremely challenging as more and more data is fed into the database from various sources and systems, including enterprise resource planning (ERP) and customer relationship management (CRM).
Data warehouses built in the 1990s were simply large repositories of all types of knowledge, but the data wasn’t always easy to access. Now the trend is toward data integration in which the information that is being collected is easier to find and analyze.
"What people want is to be able to see a snapshot of their company at a particular time," says Kay Hammer, president and CEO of Evolutionary Technologies International, a provider of enterprisewide data integration solutions. "In the ’90s, we thought data warehousing was going to be the solution," she says. "We now have a new wrinkle to database management. Everyone wants to get more efficient at using the Internet. They want to use just-in-time supply chain methodologies. Companies want to use customer relationship marketing systems and have wireless access to corporate data. Instead of updates once a day, they want to be able to take a snapshot of their business more frequently."
The Internet has also added another dimension to database management, because knowledge management becomes more crucial as online customers require accurate, real-time information.
"One of the problems with business-to-consumer e-commerce has been accuracy of data," Hammer says. "Some consumers were finding that they could not reliably order from online retailers, and that really hurt the business."
ETI recently introduced Accelerator for SQL/Teradata, which speeds the transformation and exchange of data between data-warehousing applications hosted on Teradata's relational database management system (RDBMS). With the push toward more personalized customer service and targeted marketing campaigns, many organizations are adding customer-centric data warehouses, says Juan-Carlos Martinez, VP of marketing for Evolutionary Technologies. "ETI Accelerator gives them the means to maximize their investment in Teradata by taking full advantage of its unique, parallel functionality to speed implementation of new applications," he says.
In most integration projects, for data to be shared, it must first be extracted from the source application and transformed into a format compatible with the target application. The ETI software automates the generation of SQL scripts that extract, transform and load data between applications residing on Teradata, thereby leveraging the Teradata engine for high-performance data processing.
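The extract-transform-load sequence described above can be illustrated with a minimal sketch. The tables, columns and unit conversion below are invented for illustration only; ETI's actual product generates SQL scripts that run on the Teradata engine rather than moving rows through Python.

```python
import sqlite3

# Hypothetical schemas: a source "orders" table in one format and a
# target "warehouse_orders" table that expects a different one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount_cents INTEGER)")
conn.execute("CREATE TABLE warehouse_orders (order_id INTEGER, customer TEXT, amount_dollars REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "Acme", 1999), (2, "Globex", 45000)])

# Extract: pull rows from the source application's table.
rows = conn.execute("SELECT id, customer, amount_cents FROM orders").fetchall()

# Transform: convert each row into the format the target application expects.
transformed = [(oid, cust, cents / 100.0) for (oid, cust, cents) in rows]

# Load: write the transformed rows into the target table.
conn.executemany("INSERT INTO warehouse_orders VALUES (?, ?, ?)", transformed)

print(conn.execute("SELECT * FROM warehouse_orders").fetchall())
# → [(1, 'Acme', 19.99), (2, 'Globex', 450.0)]
```

The point of automating this pattern, as the article notes, is that the extract, transform and load steps are mechanical once the source and target formats are known, so they can be generated rather than hand-coded.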
Database & capture unite
And high-performance processing is key to effective database management, according to Reynolds Bish, CEO of Captiva Software. The latest version of the company's forms processing software, FormWare 4.0, offers an optional, fully integrated database module built on Microsoft's (microsoft.com) SQL Server.
"Our system uses a database to store all information as soon as it is captured, and that has a lot of benefits," Bish says. "By integrating database technology into our capture software, rather than capturing information and then sending it to a database, we can have event-driven workflow rather than processing things sequentially. We can use the information captured and stored in the database to route work to specific types of operations and operators, supervisors and administrators."
Historically, capture systems have been built on proprietary file systems using a directory structure and file naming. That method, Bish says, does not allow documents to be routed as they are scanned. Rather, the routing takes place only after a batch has been scanned and the information is downloaded to a database. The Captiva system can scan particular rows or columns crucial to indexing and workflow, or the entire document can be indexed as it is scanned.
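The contrast Bish describes, routing each document the moment it is captured rather than after a whole batch finishes scanning, can be sketched as follows. This is a hypothetical illustration; the table, queue and function names are invented and do not reflect Captiva's actual design.

```python
import sqlite3

# Each captured document lands in a database row immediately, and a
# routing rule assigns it to an operator queue at that moment, rather
# than waiting for the batch to finish and be downloaded.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE captured_docs (doc_id INTEGER, doc_type TEXT, assigned_queue TEXT)")

def route(doc_type):
    """Event-driven routing rule: map a document type to an operator queue."""
    queues = {"claim": "claims_examiners", "provider_form": "provider_coding"}
    return queues.get(doc_type, "general_review")

def on_document_scanned(doc_id, doc_type):
    """Fires as each document is captured; stores and routes it right away."""
    conn.execute("INSERT INTO captured_docs VALUES (?, ?, ?)",
                 (doc_id, doc_type, route(doc_type)))

# Documents are dispatched one by one as the scanner produces them.
for doc_id, doc_type in [(1, "claim"), (2, "provider_form"), (3, "memo")]:
    on_document_scanned(doc_id, doc_type)

print(conn.execute("SELECT doc_id, assigned_queue FROM captured_docs").fetchall())
# → [(1, 'claims_examiners'), (2, 'provider_coding'), (3, 'general_review')]
```

In the file-system approach the article contrasts this with, nothing could be routed until the entire batch directory existed; here the routing decision rides along with each insert.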
Wisconsin Physician Service Insurance Corp. (WPS) is using Captiva's database module to improve performance in claims processing. The database module allows batches of work to be grouped and processed in as many as eight user-defined levels. It also provides a more flexible event- and data-driven workflow, improved reporting and tracking capabilities and greater data integrity and security.
"When we have a batch of 100 claims, it's hard to break that up for specialized processing," says Bob Shultz, imaging manager for WPS. "With the database module, a single document image can be routed to an expert, for example, in provider coding. Now we can do the coding on the front end in a single process without having to train every operator. This means less processing time and faster payment of claims."
Building more flexible databases is key to the success of any knowledge management system, says Jeff Canter, executive VP of operations for Innovative Systems. "We really need to be asking questions about the purpose of the database," he says. "What problem is the database trying to solve?" The trend is toward building a central system of record that allows companies to store multiple views of their data.
"Companies want to be able to put a marketing-oriented spin on their data or pull out specific, operationally oriented information such as what products are selling," Canter says. "What we've seen is the merging of these two spheres of data to form an operational data store. We're taking information that is analytical in nature and operationalizing it, making it available to everyone in the company, including the customer service representative on the front lines."
L.L. Bean boots up
L.L. Bean is using Innovative Systems software to manage its knowledge database more efficiently.
"We want to make sure customers get exactly what they want, and, therefore, we have to know who they are, where they live and what they like," says James Beckwith, programmer analyst for L.L. Bean. "We can't convince customers we have their best interests at heart if there are five files in our system with the same name. Even the best-trained operators can't get through a clutter of names without offending the customer."
To improve customer service and contain the costs of distributing its catalog, L.L. Bean launched an extensive customer information management project during the 1990s. The project was designed to focus on individual customers rather than households.
"If we miss customers or combine names that shouldn't be combined, we end up not sending catalogs to people who want to buy our products," Beckwith explains. "The reverse is just as bad, due to increasing postal costs. If you send two catalogs to a household and only one is used, then money is wasted. Even a small percentage improvement in accuracy means a lot of dollars saved."
Facilitating Bean's cleanup project are several Innovative Systems products, including the Innovative-Scrub System. The company also provided technical assistance and training to help Bean's systems and marketing staffs. The two companies collaborated, too, on "extra parameters designed to bring together records that reflect the L.L. Bean definition of a household," Beckwith says. "We now have total freedom to match any kind of data we want." Simultaneously, the company began using an online householding tool that Innovative Systems designed.
"I wanted to be able to evaluate new customers one at a time to see if there's an existing household they belong to," Beckwith says. "Timeliness is very important. In batch mode, it could take weeks to find the answer. We need to know who the person is and what group he or she belongs to now. With the online piece Innovative Systems created, we're able to achieve the desired results."
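An online householding check of the kind Beckwith describes, matching each new customer against existing households immediately instead of in a weekly batch, might look like this in outline. The normalization and matching rules here are simplistic placeholders, not Innovative Systems' actual algorithm.

```python
def normalize(text):
    """Crude normalization: uppercase, drop punctuation, collapse whitespace."""
    kept = "".join(c for c in text.upper() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

households = {}  # maps (normalized surname, normalized address) -> household id

def assign_household(surname, address):
    """Match one incoming customer against existing households immediately."""
    key = (normalize(surname), normalize(address))
    if key not in households:
        households[key] = len(households) + 1  # no match found: new household
    return households[key]

# "Smith" and "SMITH" at the same address fall into one household;
# a different address starts a new one.
a = assign_household("Smith", "12 Main St.")
b = assign_household("SMITH", "12 Main St")
c = assign_household("Smith", "9 Oak Ave")
print(a, b, c)  # → 1 1 2
```

The same matching logic run in batch mode would wait until thousands of records accumulated; running it per record is what gives the call-center operator an answer "now," as Beckwith puts it.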
Kim Ann Zimmermann is a free-lance writer, 732-636-3612, e-mail kimzim2764@yahoo.com.