Data Integration: Best Practices in Business Intelligence and Data Warehousing
Data Integration
Customer Data Integration
Master Data Management
Data Quality
Special Section: Data Warehouse Appliances

FEATURE
Operational Data Integration
Philip Russom, TDWI Research

TDWI RESEARCH
BI Search and Text Analytics, page 38
Predictive Analytics, page 49

VOLUME 23
TDWI is proud to offer you our new, topically focused What Works in Data Integration. This collection of customer success stories and expert perspectives provides a resource for better understanding the tools, technologies, and methods that are central to data integration. To help you make your way through the many powerful case studies and “lessons from the experts” articles, we have arranged them into specific categories: general data integration, customer data integration (CDI), master data management (MDM), data quality, and a special section on data warehousing appliances.

The goal of What Works is to provide a snapshot of the most innovative business intelligence and data warehousing implementations in the industry today. The case studies included in this volume demonstrate the power of data integration technologies and solutions for industries ranging from telecommunications to stock trading.

www.tdwi.org
General Manager: Rich Zbylut
Director of Research: Wayne Eckerson
Managing Editor: Jennifer Agee
Graphic Designer: Bill Grimmer
FEATURE
2 Operational Data Integration
Philip Russom, Senior Manager, TDWI Research
An introduction to one of the hottest and fastest-growing practices in data integration today.

DATA INTEGRATION CASE STUDIES AND LESSONS FROM THE EXPERTS
6 Data Integration Defined

DATA INTEGRATION
7 Real-Time Intelligence Helps Traders Stay Ahead of a Bull Market
8 Corporate Express: Using Business Intelligence to Improve Efficiencies and Enhance Customer Value
9 Exploring the Next Generation of Information Dashboards
10 Iron Mountain Conquers Data
11 Data Integration: One Size Does Not Fit All
12 S&H Solutions: A Competitive Edge in Retail Analytics
13 Powering Integration and Analytics When the Data Is the Business
14 Effective Long-Lived Data Management
16 Full-Service Bank Gets it Done Faster with the Right Tool

CUSTOMER DATA INTEGRATION
19 Pitney Bowes Improves Total Cost of Ownership
20 Worldwide Leader in Software Achieves Accurate, Complete Views of B2B Organizational Hierarchies

MASTER DATA MANAGEMENT
30 MedSolutions: Improving Healthcare through Information Access and Streamlined Processes
32 Leverage a Disciplined, Standard Approach to Achieve “One-Source” Truth
33 Master Data Management: Extracting Value from Your Most Important Intangible Asset
34 LexisNexis: Enhancing Sales Campaign Support and Improving Data Quality
35 Master Data Management: Myth of the Mega-Vendor Platform
36 Gaining New Insights with Master Data Management
37 Master Data Management and Business Performance Management

DATA QUALITY
42 Emerson Process Management Unifies Account Data for a Stronger Revenue Stream
43 Avoid the Pitfalls of Poor Data Quality
44 Intellidyn Uses Data Integration to Manage Industry-Leading Consumer Information
45 Data Integration Drives Healthy Customer Relationships
46 M&S Money Meets the Compliance Deadline for Basel II through an Enterprisewide Data Quality Initiative
48 Data Quality Monitoring: The Basis for Ongoing Information Quality Management

SPECIAL SECTION: DATA WAREHOUSE APPLIANCES
53 Data Warehouse Appliance Combines Unconstrained Analytics with Energy Efficiency for UK’s Leading Mobile Carrier
54 Data Warehouse Appliances Help Tackle the Data Center Power Crisis
55 HP-Oracle Reference Configurations for Data Warehousing
56 Aquascutum Innovates in Retail
57 Investing in Data Warehouse Query Performance
58 TEOCO Increases Performance and Customer Satisfaction
59 Appliances—Data Mart or Enterprise Data Warehouse?

TDWI RESEARCH: BEST PRACTICES REPORTS
38 BI Search and Text Analytics
49 Predictive Analytics
This article introduces one of the hottest and fastest-growing practices in data integration today: operational data integration. Before we look at different manifestations of operational data integration—like database consolidations, migrations, upgrades, and so on—let’s distinguish it from analytic data integration and explain why you should care.

Distinguishing Enterprise, Analytic, and Operational Data Integration Practices
Enterprise data integration—defined as every imaginable use of data integration—divides into two broad practice categories (see Table 1):

• Analytic data integration is applied most often to data warehousing (DW) and business intelligence (BI). It’s also applied (less often) to initiatives like customer data integration (CDI) or master data management (MDM).

• Operational data integration manifests as implementations, projects, or initiatives commonly described as the consolidation, collocation, migration, upgrade, or synchronization of operational databases. This article will look at each of these in detail later.

As a rule of thumb, the database targets distinguish the two practices of analytic data integration and operational data integration. For example, if data integration feeds an analytic database like a data warehouse or mart, then it’s most likely an analytic practice. If it feeds or creates a database supporting an operational application, then it’s probably operational data integration. Of course, data integration tools and techniques enable both practices, and some enterprise initiatives—like operational business intelligence and master data management—straddle the fence to bring the two practices together.

As a quick aside, let’s remember that data integration uses a variety of techniques and tool types, including enterprise application integration (EAI); enterprise information integration (EII); extract, transform, and load (ETL); replication; and miscellaneous utilities.
IT professionals implement these techniques with vendor tools, hand coding, or functions within database management systems. It’s true that some techniques and tools are closely associated with particular initiatives, like ETL with data warehousing or replication with database synchronization. Yet all the techniques and tool types under the broad rubric of data integration operate similarly: they copy data from a source, merge data coming from multiple sources, and alter the resulting data model to fit the target system it will be loaded into. Because of the similar operations, industrious users can apply just about any tool (or combination of tools) to any data integration implementation, initiative, or project.

You should care about the distinction between analytic and operational data integration practices because the two have different technical requirements and organizational support:

• Technical requirements. Analytic data integration usually involves hefty transformational processing to create time series and other dimensional data structures required of a data warehouse. Analytic data integration also supports dozens of data sources and targets coordinated via a hub-and-spoke architecture. By comparison, operational data integration is simple, involving light data model transformations between relatively few data sources and targets (sometimes only one source and one target) connected via point-to-point interfaces. These are the main reasons why ETL and EII—with their unique transformational capabilities and hub architectures—are the preferred techniques for analytic data integration, as well as why the simple interfaces with light transformations found in replication, EAI, and hand-coded practices are sufficient for most implementations of operational data integration.

• Organizational support. Analytic data integration is usually staffed by the data warehouse team and funded by its sponsor, whereas operational data integration projects are often staffed by a data management or applications group within IT, with sponsorship from line-of-business managers and others associated with the applications. As the number of operational data integration projects has increased in recent years, IT management and others have drawn data integration specialists from the data warehouse team to perform operational work. This is a problem, because it sets back the goals of data warehousing and goes against the grain of organizational structures and funding. Staffing both camps with data integration specialists is a common solution, despite the resulting redundancy of personnel. To avoid such redundancy, some corporations create a data integration competency center, which staffs both analytic and operational data integration practices as a single resource via shared services.

Problems that Operational Data Integration Addresses
Before describing operational data integration implementation types, let’s step back and consider why these are necessary.

Redundant data and non-standard databases are the main problem. For example, when data repeats across multiple databases, it’s hard to keep the databases synchronized. Likewise, data may reside in a legacy database that is beyond its prime or in a database brand that is simply not the corporate standard. To put it another way, these are problems because they increase IT costs and inhibit unified visibility into business processes. In a related trend, many organizations today try to “do more with less” and centralize both IT and business operations. These situations eschew redundancy and promote standards; thus, they seek solutions via database consolidations, migrations, and so on.

Redundant and non-standard applications are a problem, too. An implementation sometimes focuses on an operational application that’s a legacy needing migration to a more modern brand, or an application instance that’s redundant and should be consolidated with other instances. Because there’s a database in the application’s technology stack, some form of operational data integration is required to migrate or consolidate the database.
Implementations of Operational Data Integration
Now that we’ve defined analytic and operational data integration practices and explained why you should care, let’s look at some of the benefits and challenges of common implementations.

Database Consolidation. This is where IT personnel consolidate multiple, similar databases (often with redundant content) into a single one, with a single data model. If a single database isn’t a realistic goal, then data is at least consolidated into fewer databases. The most common project is probably the consolidation of redundant CRM or ERP applications and their databases. As another example, many organizations have multiple mid-tier databases containing slightly different views of customer data, and these are ripe for consolidation. On the analytic side of data integration, consolidations are a popular way to reduce the number of redundant data marts.

The upside of database consolidation is that it reduces IT maintenance costs by consolidating data into fewer servers. Furthermore, it increases visibility into business processes by putting “data eggs” in fewer baskets. The downside is that consolidation is extremely intrusive, in that it kills off the consolidated systems. Their owners, sponsors, and users may resist passing control, typically to a central organization, and these people are usually forced to change business processes associated with the consolidated systems.

Database Collocation. Collocation and consolidation are similar. In a database consolidation, diverse data models are merged into a single data model in a single database instance. This is hard work, so sometimes it’s faster and easier to collocate databases. Collocation copies multiple databases into a single instance of a database management system (DBMS), typically on a single hardware server, without merging the data models. In other words, one DBMS instance manages multiple databases. While collocation reduces IT maintenance costs, it doesn’t solve the information silo problem like database consolidation does.

Database Migration. In a migration, database brand A gets migrated to brand B. Typical examples include migrating a database from a mainframe to a less expensive open system, from a legacy platform to a more easily supported one, or from a non-standard brand to one that’s a corporate standard. Migrating to a newer or standard database increases the flow of information among systems by making data access and integration easier. In this way, database migration contributes to advanced integration goals, like real-time or on-demand information delivery.

Like consolidation, migration is intrusive, because it kills off the original system, forcing changes to business processes and applications. With database migrations off of legacy platforms (especially the mainframe), you need to check contracts before committing. Sometimes issues of licensing, leasing, depreciation, or amortization can halt the retirement of an old or non-standard system. And beware that database migration is something of a myth; it’s more like new development when the target system doesn’t exist and must be designed and built from scratch.

Database Upgrade. This is where version X of a database brand is upgraded to version X+1 of the same brand. A database upgrade can be a discrete project or an interim step within larger operational data integration implementations. For instance, it might make sense to upgrade all databases to the same release before consolidating them. And upgrading a legacy or non-standard database to the most recent version first can make it easier to migrate. Sometimes multiple upgrades are required, say from version 6 to 7, then from version 7 to 8.

But is a database upgrade really data integration? Yes, it can be. For instance, when customizing a packaged application alters its database’s data model, upgrading the application requires operational data integration to upgrade the database. Furthermore, when a new version of a DBMS changes how it stores data, a database upgrade can require transformational processing to get the data into a model optimized for the new version. Finally, when IT decides to improve a data model in tandem with an upgrade, some form of operational data integration is required.

Database Synchronization. The types of operational data integration we’ve seen so far—database consolidations, collocations, migrations, and upgrades—move whole databases in one-off projects that rarely require that integration infrastructure be left in place. Database synchronization is different, in that it leaves databases in place and exchanges data among them, which requires a permanent integration infrastructure for daily data feeds. The downside of synchronization is the initial cost and maintenance cost of the infrastructure. The upside, however, is that synchronization is non-invasive, because it leaves database investments and the business processes that depend on them intact. In fact, when it’s not possible to consolidate or migrate databases, managing redundant data usually involves keeping it synchronized across related databases.

High-end implementations of database synchronization include transformational processing and many-to-many data movement, and so ETL is a popular choice. But the reality is that most implementations move data only from one database to one other with little or no transformation. For these, replication and even EAI are suitable techniques.

All other implementation types considered here are inherently one-way and latent. But database synchronization can address unique requirements for two-way data movement or real-time information delivery. Two-way data synchronization has the added burden of resolving conflicting data values, which is best done with high-end replication or ETL tools, or possibly an EAI tool. Real-time data synchronization is usually done with replication, or sometimes with EAI.

Operational Data Integration Implementations Compared
So far, we’ve seen that database consolidations, migrations, and upgrades can address the problems of redundant and non-standard databases, thereby reducing IT administrative costs and increasing business visibility into operational data. When these aren’t possible, database synchronization and collocation can alleviate some problems in a less invasive way.

That’s the good news, and it’s great to have options. The bad news is that—given the long list of options—it’s sometimes difficult to decide which type of operational data integration implementation (or combination) is most appropriate for a particular situation.

Table 2 assesses the benefits and challenges of the operational data integration implementation types discussed in this article. The assumption is that selecting one or more implementation types involves a trade-off between benefits and challenges. For example:

• Database migration and consolidation are great immediate fixes that can add value to data, despite high risk because of their highly intrusive nature.

• Collocation and upgrades are low-risk because they’re cheap, fast, and relatively non-intrusive. But they add little or no visible value to data and business processes.

• Database synchronization improves data content tremendously in a non-intrusive way, but suffers high initial costs (especially when you first put in integration infrastructure) and administrative costs over time (because you must maintain the infrastructure).

Table 2. Benefits and challenges of the operational data integration implementation types: migration, consolidation, collocation, upgrade, and synchronization.

You needn’t agree with all the assessments in Table 2; you can adapt it to reflect your own assessments and those of your colleagues. You might also expand the table to include other implementation attributes that matter to you, like which data integration techniques (ETL, EII, EAI, etc.) are best suited to each implementation type and how applicable each is in your organization (perhaps based on preexisting tools and expertise). Regardless of how you adapt it, a table based on consensus-driven assessments can give the decision-making process more consistency and efficiency. In addition, it will serve as an established vocabulary to get communication and collaboration started.

Recommendations
Segregate operational data integration from analytic data integration. The two involve different types of implementations (or projects, if you will), although both can be done with similar techniques and tool types, like ETL, replication, EAI, and so on. And the two usually have separate sponsorship, funding, and staffing. A possible exception to this recommendation is to bring the two practices together in a data integration competency center.

Address redundant or non-standard databases. This is done with various implementations of operational data integration, namely database consolidation, collocation, migration, upgrade, and synchronization. These projects are worth doing, because reducing the number of redundant databases and moving data to standard databases reduces IT administrative costs and increases visibility into business processes.

Expect to apply multiple implementations of operational data integration. For example, database collocations and upgrades are typical prerequisites for database migrations and consolidations—or vice versa, in some cases. And some databases may need synchronization after they’ve been migrated or consolidated.

Realize that database consolidations and migrations aren’t always possible. After all, these are highly invasive. But database collocation and synchronization get similar results less intrusively. Apply your adapted version of Table 2 when selecting an operational data integration implementation type or a combination of types. ●
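As a rough illustration of the conflict problem in two-way synchronization mentioned above, the sketch below reconciles a conflicting value between two databases with a last-writer-wins rule. The row images and timestamp convention are hypothetical; production replication and ETL tools offer much richer conflict-resolution policies.

```python
from datetime import datetime

# Hypothetical row images from two synchronized databases, keyed by id.
db_east = {101: {"balance": 500, "updated": datetime(2007, 3, 1, 9, 0)}}
db_west = {101: {"balance": 650, "updated": datetime(2007, 3, 1, 9, 5)}}

def resolve(a, b):
    """Last-writer-wins: keep whichever row image was updated most recently."""
    return a if a["updated"] >= b["updated"] else b

for key in db_east.keys() & db_west.keys():
    winner = resolve(db_east[key], db_west[key])
    db_east[key] = db_west[key] = winner  # propagate the winning value both ways

print(db_east[101]["balance"])  # 650: the later write wins on both sides
```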
…do they apply to your organization?

CUSTOMER DATA INTEGRATION
More than ever, companies need to be able to recognize different individuals—often encompassed by the term “party”—across diverse business lines and touch points. CDI allows them to uniquely identify parties by reconciling data across different siloed systems. This means coalescing diverse data to identify a single customer across products, purchase transactions, sales channels, salespeople, subsidiaries, and geographies. This implies generation of the so-called “golden record”—the authoritative and certified profile about an individual customer. … It’s apt that the processing platform that reconciles and integrates data is known as a “hub.”

MASTER DATA MANAGEMENT
pages 30–37
Master data management is the practice of defining and maintaining consistent definitions of business entities, then sharing them via integration techniques across multiple IT systems within an enterprise, and sometimes beyond to partnering companies or customers. Many technical users consider MDM to be an integration practice, enabled by integration tools and techniques for ETL, EAI, EII, and replication. When the system of record is a hub that connects many diverse systems, multiple integration technologies may be required, including newer ones like Web services and service-oriented architecture (SOA). More simply put: MDM is the practice of acquiring, improving, and sharing master data.

SPECIAL SECTION: DATA WAREHOUSE APPLIANCES
A strict definition of data warehouse appliance is: “Server hardware and database software built specifically to be a data warehouse platform.” A looser definition allows appliances to be hardware and software designed for any purpose, though bundled and pre-integrated for data warehousing. In a February 2007 TDWI Technology Survey, roughly half of respondents chose the strict definition, a quarter the loose one. However, the focus of data warehouse appliances is shifting from proprietary to commodity hardware, as well as more generally from hardware to software components. In fact, some of the newer data warehouse appliance vendors openly describe their products as software-based accelerators, not hardware boxes. When added to a user organization’s existing BI technology stack (or another vendor’s appliance), these accelerate BI development, and—once in place—they accelerate query performance.

References
Dyché, Jill, and Evan Levy [2006]. Customer Data Integration: Reaching a Single Version of the Truth, John Wiley & Sons.

Russom, Philip [2006]. Master Data Management: Consensus-Driven Data Definitions for Cross-Application Consistency, TDWI Best Practices Report, October.

——— [2006]. Taking Data Quality to the Enterprise through Data Governance, TDWI Best Practices Report, March.

TDWI Marketplace: Data Integration
www.tdwi.org/MarketPlace/category.aspx?catId=37
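The “golden record” described in the CDI definition above can be sketched with simple survivorship rules: for each attribute, keep the first value supplied by the most trusted source. The field names and source precedence here are hypothetical; real CDI hubs perform matching, standardization, and governance before any such merge.

```python
# Hypothetical records for one customer ("party") from three silos,
# listed in descending order of source trustworthiness.
records = [
    {"source": "crm",     "name": "J. Smith",   "email": None,             "phone": "555-0100"},
    {"source": "billing", "name": "John Smith", "email": "js@example.com", "phone": None},
    {"source": "web",     "name": None,         "email": "js@example.com", "phone": "555-0199"},
]

def golden_record(records, fields):
    """Survivorship: for each field, take the first non-null value
    from the most trusted source that supplies one."""
    return {f: next((r[f] for r in records if r[f] is not None), None) for f in fields}

print(golden_record(records, ["name", "email", "phone"]))
# {'name': 'J. Smith', 'email': 'js@example.com', 'phone': '555-0100'}
```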
CASE STUDY
Real-Time Intelligence Helps Traders Stay Ahead of a Bull Market
Commentary by Sharief Zaman
IT Director, SwiftTrade
When regulatory changes revolutionized the business of day trading in April 2001, SwiftTrade made a smooth transition to proprietary trading, a new type of enterprise in which a contract trader works on a single corporate account. This trading model reduces the individual account costs that were the demise of day trading practices. In order to manage the rapid growth that ensued, we sought a business intelligence and data integration solution to manage a rapidly expanding network of traders at dozens of offices around the world. Our business has grown from 20 branches two years ago to more than 70 branches today, with an additional 125 planned in the near future.

All of this growth motivated SwiftTrade to build a more effective system for delivering information to every member of our organization. In an industry where every second of the workday is precious, we knew our traders would insist on nothing short of accurate, real-time information—delivered dynamically when they needed it.

To create a new data integration layer and BI architecture, SwiftTrade purchased Information Builders WebFOCUS Business Intelligence Platform and iWay Universal Adapter Suite. These software products allowed us to build a new information-delivery environment that works efficiently with vast amounts of operational data, from invoicing to branch-wide trading statistics, and makes information available to the right people at the right time. Trading information that was formerly stored in proprietary back-office applications is now accessible via a reporting database that is continually refreshed with current data.

The integration architecture is straightforward. First, we transferred operational data from a Sybase database on a Sun Solaris platform to a Microsoft SQL Server database on an Intel platform, which consists of Dell blade servers running the Microsoft Windows 2003 operating system. Then we established a security architecture that governs each user’s ability to access certain layers of information based on their roles, positions, and security clearances.

Trading Up
Once the basic integration and security infrastructure was in place, we created a reporting portal using the WebFOCUS Business Intelligence Dashboard that features about a dozen standard reports, including branch enumeration figures, daily and historical profit and loss reports, settlement reports, account reconciliation reports, invoicing and payment reports, and a comprehensive bird’s-eye view of the entire company from the head office perspective. These reports are delivered to the dashboard via WebFOCUS Reporting Server, which accesses Microsoft SQL Server data via the iWay SQL Server Adapter.

The new integration layer is fast, flexible, and scalable. Each iWay adapter consists of a communications interface, a SQL translator, and a database interface that translates American National Standards Institute (ANSI) SQL into native application programming interface (API) calls. The iWay Universal Adapter Suite includes more than 300 adapters to connect almost any information system, so we have plenty of flexibility should we decide to adopt a different computing platform or database management system in the future.

With the data integration solution now in place, trading information that was formerly stored in proprietary, back-office applications is accessible via a reporting database that is continually refreshed with current data. We can isolate the information that is pertinent to just one individual trader, or to one branch, or to the entire firm, depending on what we’re looking for.

In addition to the nightly data migration procedures, we used the integration technology to create a trigger-based system that automatically updates key reports whenever designated revenue and trading milestones are reached. Real-time reports are ideal, because traders don’t like waiting around or wasting time looking for the information they need. Now we can put that information right in front of them, and use the system to recognize outstanding traders. It’s a great motivational tool. ●

Click here to request more information about Information Builders.
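A toy version of the trigger-based milestone system SwiftTrade describes might look like the following, with SQLite standing in for the production DBMS and a hypothetical schema and threshold.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (branch TEXT, revenue REAL);
CREATE TABLE milestones (branch TEXT, note TEXT);

-- Hypothetical milestone: record a report-refresh event when a branch's
-- cumulative revenue first reaches 10,000.
CREATE TRIGGER revenue_milestone AFTER INSERT ON trades
BEGIN
  INSERT INTO milestones
  SELECT NEW.branch, 'refresh key reports'
  WHERE (SELECT SUM(revenue) FROM trades WHERE branch = NEW.branch) >= 10000
    AND (SELECT SUM(revenue) FROM trades WHERE branch = NEW.branch)
        - NEW.revenue < 10000;
END;
""")

conn.execute("INSERT INTO trades VALUES ('Toronto', 6000)")
conn.execute("INSERT INTO trades VALUES ('Toronto', 5000)")   # crosses 10,000 here
print(conn.execute("SELECT * FROM milestones").fetchall())
# [('Toronto', 'refresh key reports')]
```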
CASE STUDY
Corporate Express: Using Business Intelligence to Improve Efficiencies and Enhance Customer Value
Commentary by Matt Schwartz
Director of Business Analysis, Corporate Express
Corporate Express, Inc., a Buhrmann company, is one of the world’s largest business-to-business suppliers of essential office, facilities, furniture, and computer products and services, with 2005 sales of approximately $4.6 billion in North America and $7.3 billion worldwide. Corporate Express’s product offerings include office and computer supplies, imaging and computer graphics supplies, office furniture, document and print management, desktop software, promotional products, and other similar products. Corporate Express has operations in Australia, Austria, Belgium, Canada, France, Germany, Hungary, Ireland, Italy, Luxembourg, the Netherlands, New Zealand, Poland, Spain, Sweden, the United Kingdom, and the U.S. Through exclusive strategic partnerships, Corporate Express can deliver products to China, Finland, Hong Kong, Malaysia, Mexico, Norway, Portugal, Singapore, Slovenia, Switzerland, and Taiwan. Corporate Express’s North American operations have more than 200 facilities, including 38 distribution centers, and the company employs 10,775 people. The company’s Web site is www.CorporateExpress.com.

Challenge: Creating Business Reviews for Customers
Every quarter, the Corporate Express sales organization creates about 15,000 business reviews for customers using Microsoft PowerPoint. These presentations allow Corporate Express customers to examine their purchasing patterns by product category and evaluate ways in which they can improve their purchasing efficiency. This used to be a very time-consuming process. In the past, there were 28 versions of the PowerPoint template (instead of a single, centralized template); data had to be pulled manually from multiple reports and from ad hoc queries; and it took up to six hours to prepare each business review.

Solution
Matt Schwartz, director of business analysis at Corporate Express, learned about MicroStrategy Office when he attended the MicroStrategy World 2005 user conference in Las Vegas. MicroStrategy Office is a plug-in to Microsoft Excel, PowerPoint, and Word, which gives Microsoft users easy access to any MicroStrategy report. MicroStrategy Office users can run MicroStrategy reports and documents directly from their Microsoft Office applications and refresh the reports with a single click. Since MicroStrategy Office is built on the MicroStrategy platform, all MicroStrategy platform security features, report formatting, and prompts work as well for MicroStrategy Office users as they do for MicroStrategy Web and Desktop users.

After returning from the conference, Schwartz worked with Matthew Cryer, a business intelligence analyst/developer at Corporate Express, to modify an existing PowerPoint business review template to embed MicroStrategy reports. After approximately two weeks of development, Schwartz showed the new, 25-slide PowerPoint document to some sales executives to get their buy-in and feedback.

“The sales executives were really excited about the new tool,” Schwartz said. “The time savings and customer value were immediately obvious.”

Impressive Time Savings
Using the new, centralized template, salespeople who need to create business reviews can simply hit the refresh button on the MicroStrategy toolbar and answer two prompts to identify the customer and date ranges of interest. At that point, MicroStrategy Office takes over and automatically refreshes the presentation with the latest MicroStrategy reports. The typical refresh and formatting time is less than one hour.

“The time savings that we will realize are truly significant,” Schwartz said. “We now have an approach that creates professional business reviews quickly and efficiently. Additionally, using a single template for our reviews means that all of our customers will receive consistent information.”

Corporate Express originally piloted the MicroStrategy Office–enabled process with 7 of 28 U.S. divisions, and later rolled the process out to all 28 U.S. divisions.

“The PowerPoint Business Review Tool has been fully deployed to all geographic markets in the U.S. with great adoption,” Schwartz said. “MicroStrategy has been instrumental in achieving greater efficiencies and improving our business performance.” ●
CASE STUDY
Iron Mountain Conquers Data
Commentary by Dave Weldon
Vice President of Technology, Iron Mountain
Customer Profile
Iron Mountain stores and protects digital information, physical records, and artifacts for more than 90,000 customers worldwide. Specializing in offsite data storage, disaster recovery, comprehensive records management, and business continuity services, Iron Mountain’s annual revenue tops $2.2 billion.

Challenge
Changing business models, new regulatory requirements, and rapid growth through acquisitions posed data access and reporting challenges for Iron Mountain. “We were consolidating 11 different billing systems to try to understand our customers and what they were buying,” says Dave Weldon, vice president of technology at Iron Mountain.

With so many billing systems, it was difficult to get a consolidated view of customers. Further, Iron Mountain tapped into demographic data from Dun & Bradstreet (D&B) to enrich its understanding of customers and prospects—adding another layer to the data complexity. Finally, Iron Mountain aimed to sustain a growth rate of 10 to 15 percent per year, which meant the company needed to be able to access and understand their data in order to identify new customers, as well as sell more to existing ones.

“Adding D&B information to the data warehouse told us a lot more about our customers, their locations, what industries they are in, and how big they are,” says Weldon. “Unfortunately, we weren’t able to get that information into the hands of our business community. We had a very large database and only about 10 people—mostly marketing folks—that used it.”

Turnaround time was also an issue. The sprawling complexity of data sources and sheer volume meant IT analysts took up to two months to deliver hundreds of reports to the business-user community. “This cycle took anywhere from a day to 60 days, and it was happening at a rate of about 700 to 800 report requests a month,” says Weldon. The company needed faster and more accurate reporting, as well as the ability to provide tailored reports drawn from the 180 databases.

The Result
Iron Mountain looked to Business Objects to help them integrate, manage, and report on data stored in systems across the organization. Solutions implemented included BusinessObjects Data Integrator, BusinessObjects Data Federator, BusinessObjects RapidMarts, BusinessObjects Enterprise XI, BusinessObjects Web Intelligence, InfoView, and Crystal Reports.

Real-Time Access to Integrated, Trustworthy Data
Iron Mountain uses Data Integrator to consolidate summarized invoice information from their 11 billing systems into their data warehouse, and Data Federator to supplement the historical information with real-time access to production databases.

“Getting a worldwide snapshot of a customer and what they were buying used to take two months, and the results were not very accurate. Now it takes two minutes—and it contains three years of historical buying trends,” says Weldon. “BusinessObjects Data Federator was the only technical solution we could find that did the job quickly and easily.”

Increased End-User Adoption and Ease of Use
Iron Mountain implemented Crystal Reports and InfoView to support standard reports from its IT group. “These tools made information accessible to about 85 percent of our business community,” says Weldon, “a far cry from the handful of people who were able to use our reports in the past.”

Weldon praises Crystal for its prompting, drill-down, and graphic features that make standard reports “easy to use and meaningful,” and InfoView for its reporting folders, batch scheduling, file exporting, and e-mail distribution.

Powerful Ad Hoc Reporting
Iron Mountain has trained more than 850 people in the use of standard reports, and another 120 in ad hoc reporting. End users include analysts in marketing and finance, as well as business users from sales and account management. For super users delivering sales, financial, and marketing analysis, Weldon notes, “Web Intelligence is a friendly and powerful ad hoc reporting tool that gives them the level of sophistication and detail they require.”

Quality of Service and SLA Performance
Looking ahead, Iron Mountain plans to use Business Objects to manage service-level agreement (SLA) performance. “As we get a more complete picture of our customers and their buying habits, we want to continue filling out that picture by looking at how well we are able to perform to the SLAs we have with our customers,” says Weldon.

Access to, and use of, trusted information is truly driving change and business performance at Iron Mountain. ●
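The federation pattern in this story, which leaves data in production systems and queries it in place alongside warehouse history, can be sketched generically. SQLite's ATTACH stands in for the federation layer here, with hypothetical table names; this illustrates the idea, not how BusinessObjects Data Federator works internally.

```python
import sqlite3, tempfile, os

# Two hypothetical sources: a warehouse of historical invoices and a
# live billing database. A federated query reads both without copying data.
path_wh = os.path.join(tempfile.mkdtemp(), "warehouse.db")
path_live = os.path.join(tempfile.mkdtemp(), "billing.db")

wh = sqlite3.connect(path_wh)
wh.execute("CREATE TABLE invoices (customer TEXT, amount REAL)")
wh.execute("INSERT INTO invoices VALUES ('Acme', 1200.0)")
wh.commit(); wh.close()

live = sqlite3.connect(path_live)
live.execute("CREATE TABLE invoices (customer TEXT, amount REAL)")
live.execute("INSERT INTO invoices VALUES ('Acme', 75.0)")
live.commit(); live.close()

# The "federation layer": one connection that attaches both sources and
# answers a single query spanning them, leaving the data in place.
fed = sqlite3.connect(":memory:")
fed.execute(f"ATTACH DATABASE '{path_wh}' AS wh")
fed.execute(f"ATTACH DATABASE '{path_live}' AS live")
row = fed.execute("""
    SELECT customer, SUM(amount)
    FROM (SELECT * FROM wh.invoices UNION ALL SELECT * FROM live.invoices)
    GROUP BY customer
""").fetchone()
print(row)  # ('Acme', 1275.0): history plus live activity in one answer
```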
Before your company charges into a data integration initiative, it is important for the project team to carefully identify your requirements and consider what is driving the initiative. Is it discrepancies between results from financial systems and sales? Concerns about the accuracy of information in a compliance report? Lost market share because of a lack of timely information? The end result you are seeking must drive your data integration strategy and design.

The Challenge of Gaining Trusted Information
According to a 2006 survey conducted by Business Objects, only about 10 percent of information workers always have all the information they need to confidently make business decisions.¹ If this is the case, then organizations that successfully provide their employees with access to quality information can truly achieve a competitive advantage.

Data integration is the key to ensuring that trusted information is readily available for end users. Before you select data integration techniques and technologies, a number of important considerations must be weighed to ensure you choose an approach that is best for the current and future needs of your business.

Batch versus Real Time
Historically, the term data integration has been synonymous with batch ETL (extract, transform, and load), an approach used to move large data sets from one enterprise application or database to another. This technique is especially valuable in initiatives where data volumes are high, such as data warehousing, systems migrations, and mergers and acquisitions.

But there are drawbacks. The explosion in data volumes is reducing the frequency with which data can be physically integrated, extending update times—for example, from days to weeks. Increasingly, businesses need more agile techniques for integrating data to ensure up-to-the-minute information to support global or 24x7 operations and to address the exponential growth in, and management of, corporate data.

For organizations that depend on immediate information, real-time data integration techniques should be considered. But how fast is fast enough? The “right time” depends on business needs. For example, a company that manages a city rail system must have accurate and immediate information regarding on-time or delayed arrivals and must be able to drill down to individual trains that are experiencing technical problems. Changed data capture (CDC) can be deployed in this scenario to detect data changes in real time and move only the updated information.

Physical versus Virtual Data Integration
Data integration teams must also consider whether data needs to be physically moved or whether a virtual or “in-place” approach to accessing and aggregating data makes more sense. The technique most often applied here is known as enterprise information integration (EII) or data federation. Data federation employs a variety of specialized query techniques to draw the right information from operational systems quickly for immediate analysis without negatively impacting the performance of source systems. Data federation is ideal for gaining timely access to changing information, and is extremely valuable in industries such as financial services or government agencies, where strict compliance regulations prevent them from replicating certain data.

One Size Does Not Fit All
When was the last time you purchased a “one-size-fits-all” item? Chances are the item didn’t really fit or suit you and ended up at your favorite charity. Don’t make the same mistake with data integration.

Analyze your business requirements carefully to ensure your proposed solution is the right fit. But don’t be afraid to start with one approach and add others as your business needs evolve. Many organizations marry a number of techniques to get exactly what they need. A solid data integration strategy can help you make more effective decisions based on trusted, complete data and transform the way your business works. ●

For a free white paper on this topic, download “Data Integration: The Key to Effective Decisions,” or click here for more information about Business Objects.

¹ Based on a survey of information workers in the U.S., United Kingdom, France, and Germany, commissioned by Business Objects and conducted by Harris Interactive, June 2006.
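A minimal sketch of the changed data capture idea described above: compare current row versions against the last synchronized snapshot and move only what changed. The rail-status rows are hypothetical, and real CDC tools usually read database logs rather than diffing snapshots.

```python
# Hypothetical snapshots of a source table, keyed by primary key.
last_sync = {1: ("Train 12", "on time"), 2: ("Train 19", "on time")}
current   = {1: ("Train 12", "delayed"), 2: ("Train 19", "on time"),
             3: ("Train 23", "on time")}

# Capture only the inserts and updates since the last synchronization.
changes = {key: row for key, row in current.items()
           if key not in last_sync or last_sync[key] != row}
deletes = set(last_sync) - set(current)

print(changes)  # {1: ('Train 12', 'delayed'), 3: ('Train 23', 'on time')}
print(deletes)  # set(): nothing was removed from the source
```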
CASE STUDY
S&H Solutions: A Competitive Edge in Retail Analytics
Commentary by Sam Morales
Data Services Manager, S&H Solutions
S&H Solutions was looking to expand their hardware capacity to manage significant growth as well as provide more flexible reporting. Via their Sybase IQ environment, S&H Solutions was able to increase end-to-end performance by 500 percent. This environment has given S&H the ability to grow and to undertake new business intelligence initiatives.

As the original creator of customer loyalty, S&H Solutions began over 110 years ago with the nationally recognized customer rewards program S&H Green Stamps. Today, S&H Solutions has revolutionized their retail business offerings by providing sophisticated technological capabilities and knowledge tools that enable retailers to deliver in-store, real-time, customer-specific messaging.

S&H Solutions has recently added an entire new layer of buying-pattern analytics to the established loyalty formula. This gives S&H Solutions a leading competitive position in providing retailers with proprietary customer loyalty solutions. Retail partners can now better analyze the precise buying patterns of repeat customers.

The technical challenges are many for S&H Solutions: it must manage integrating data from a variety of sources, all in different formats; run this data through sophisticated data analysis; and deliver meaningful information to its customers—all in a high-volume environment. Sam Morales, data services manager at S&H Solutions, briefly describes the service provided to retail partners: “In short, we provide our partners [the retail store owners] the ability to target their customers [in] real time and on a one-to-one basis. The S&H system interfaces directly to the retailer’s point-of-sale system and provides detailed reports on shopper behavior and promotions. The customers’ unique loyalty card enables the one-to-one communication and data capture.”

Customer purchase data is captured in detail and stored in the S&H proprietary data warehouse, regardless of the customers’ membership status. This information can be reviewed historically to provide retailers with information on customer shopping behavior patterns and promotion acceptance. Data collection can also allow for product-type reporting, including inventory management and customer response effectiveness.

The cornerstone of the analytics solution is information—massive amounts of it. The S&H data warehouse stored in Sybase IQ is close to 1.5 terabytes when compressed, and would require at least 3 terabytes if it were uncompressed in a raw format. The main individual item detail table has 4 billion rows, and joins with a 500,000-record Universal Product Code (UPC) table of product lookups. This transaction table also cross-references 10 million customers selected across different combinations of hundreds of individual stores.

This volume of information not only allows global trend analysis; it also provides analytics at the individual product and consumer level. Based on report results, customers can be individually identified, segmented, and offered behavior-specific messaging and promotions. Each customer is identified with a member number through the S&H system located at the point-of-sale register. The system will notify the cashier in real time with shopper-specific details, such as whether the customer is a loyal or high-spending shopper. This real-time connectivity and one-to-one customer communication allows the retail partner to track customers on a continuous basis, and develop high-level customer promotions and messaging to increase shopper frequency and loyalty.

Through high-performance technology and ongoing improvements, S&H Solutions is able to provide competitive customer loyalty solutions to its retail partners, delivering an integrated data set, high-volume consumer intelligence, and timely information. S&H Solutions’ loyalty programs for its retail partners have proven effective, with high-level reports that highlight virtually every aspect of each unique buyer segment. Sybase IQ’s high-volume capacity brings a critical richness to the data and an enhanced analytical capability. That information creates consumer loyalty for the stores, and retailer loyalty for S&H Solutions. ●
Powering Integration and Analytics When the Data Is the Business
By Sumit Kundu
Director, Product Management, Sybase
Data aggregators are information brokers, collecting industry- or society-wide data to provide value-added services to customers and subscribers over the Web. The services provided by data aggregators go by many names: mortgage risk intelligence, audience measurement services, market research providers, national statistical agencies, online shopping price comparison, and many others. They may be government agencies, independent companies, or an enterprise division.

Complex Challenges
A data aggregator faces the challenge of converting vast stores of data into a product or service, since their operations depend on reselling data and/or related analytics. Some aggregators manage the most complex and demanding data integration and analytics challenges imaginable: capturing millions of data points at the most detailed level from hundreds or thousands of locations every day, only to turn that data around for customer use in very short order. Others face more modest requirements. Typically, all will face one or more of the following challenges:

• Integration of multiple, disparate data sources

• Large numbers of complex, ad hoc queries

• Large data sets, frequently more than a terabyte of data

• Large numbers of concurrent users

• The need for an “active warehouse” of data

Integrating the Data Aggregator’s Environment
Sybase Data Integration Suite addresses the complex data integration needs of data aggregators by combining a modular set of technologies with advanced modeling, development, and management tools. The five core integration technologies provided by the suite are replication, data federation, ETL, real-time events, and search. This comprehensive approach ensures that any data integration requirement, or set of requirements, can be addressed by the suite. Supporting these component technologies is an integrated modeling and metadata layer that leverages the data modeling capability of Sybase PowerDesigner, the industry’s number one data modeling solution. In addition, the suite’s development tool, WorkSpace, provides an integrated development environment (IDE).

The Key: Advanced Analytics Capability
Sybase IQ empowers data aggregators not only to overcome their data management challenges, but to thrive in the face of the most demanding environments and business requirements. Sybase IQ’s column-based architecture enables ultra-high performance to support large numbers of ad hoc and complex queries. Sybase IQ’s data compression algorithms cost-effectively support the very large databases that are typical in data aggregator environments. In addition, Sybase IQ’s multi-node architecture helps data aggregators to simultaneously support a large clientele of concurrent users while allowing continuous data feeds into their data warehouses.

In a benchmark exercise, Sybase IQ was loaded with one trillion rows of data, the equivalent of 155 terabytes of raw input data. Sybase IQ used only 55 terabytes to store the 155 terabytes of input data. Conventional databases would require up to one petabyte (1,000 terabytes) of storage for the same amount of input data. In other words, thanks to its unique compression capability, Sybase IQ can reduce data storage requirements by up to 94 percent. Even more impressively, Sybase IQ showed no slowing in query or data loading speed as query submission rates increased five-fold—demonstrating linear scalability. And with customer databases of over 40 terabytes in production, it’s evident that these performance numbers aren’t limited to benchmarks.

In summary, Sybase IQ and Sybase Data Integration Suite provide the capability and performance necessary for data aggregators to integrate complex environments, as well as to recognize and extract an in-depth understanding from large reservoirs of information—making it quickly available to those who need it the most. ●

For a free white paper on this topic, download “Right-Time Business Intelligence: Optimizing the Business Decision Cycle,” or click here for more information about Sybase.
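The storage arithmetic above comes from column-oriented compression: a column holds many values of one type, often with long runs of repeats, which compress extremely well. The run-length encoding below is a generic illustration of why this works, not Sybase IQ's actual algorithm.

```python
from itertools import groupby

# A column of low-cardinality values, as a column store would hold it.
status_column = ["shipped"] * 4 + ["pending"] * 2 + ["shipped"] * 3

# Run-length encoding: store each value once with its repeat count.
encoded = [(value, len(list(run))) for value, run in groupby(status_column)]
print(encoded)  # [('shipped', 4), ('pending', 2), ('shipped', 3)]

# Decoding restores the original column exactly.
decoded = [value for value, count in encoded for _ in range(count)]
assert decoded == status_column
```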
CASE STUDY
Effective Long-Lived Data Management
LESSONS FROM THE OIL & GAS INDUSTRY
By David Archer
Vice President, Petris Technology, Inc.
Amy Price
Director, Petris Technology, Inc.
An Introduction to Oil and Gas Information
Throughout the 20th century, technological advances have increased the ability to find and produce oil and gas. Petroleum lies waiting in reservoirs below the surface of the earth, to be produced through deep wells through which oil and gas flow to the surface, where they are collected, transported to refineries, and converted into any number of end products that the modern world requires for its current existence.

Since the early 1980s, information technology in the form of complex sets of technical geoscience applications has helped companies discover, define, and develop their oil and gas reservoirs. Because hydrocarbons are “way down there” where we can’t go, industry specialists have come up with a sophisticated array of indirect methods to determine where the hydrocarbons are, how much might be there, how we might drill wells to get to them, and how to extract them. All of these indirect methods rely upon volumes of data that must be properly managed, making this industry one of the most information-intensive in the modern world.

Long-Lived Assets with Long-Lived, Large-Volume Data
The productive life of a petroleum reservoir may span many decades, and the reservoir’s data has a lifetime that is at least as long as that of the physical asset. Data accumulation begins at the earliest phases of exploration, with volumes of data growing until the property is fully depleted. The data itself is complex to acquire, understand, and interpret. It’s produced by and used in a variety of processes in very dynamic workflows, over long time frames subject to changing asset ownership, technical/business teams, and technological paradigms.

A major exploration and producing company has many terabytes of data under management—with another terabyte per day of operational data being added to the already overflowing data stores. Not to be neglected are the rooms full of paper documents (and their electronic counterparts).

Data Management Barriers
Oil and gas data is complex. It is generated and used over a long time frame by many people working across several technical disciplines, and it is managed by a variety of means that are generally related to how the data was originally captured—without regard to future needs. It is not uncommon to have the same information in several different places—either replicated or used inconsistently. The data itself is in a variety of formats bound to different technical applications and organizations—few of which follow common access methods or use common representations. Attempts to manage all the data in central repositories designed to provide universal access to people and process have proven difficult to create and maintain.

More than 15 years of attempts to standardize industry data models have had limited success; comprehensive data models have been developed, but data tends to remain in the applications that created it. It has proven too costly for existing applications to change their underlying data access mechanisms to be tightly bound to industry data standards. Furthermore, changes in energy and information technologies result in maintenance nightmares for most standardization initiatives—as by their nature, standards are resistant to change.

Promising standards efforts in energy (as in other industries) have centered on the definition of XML specifications for information exchange via Web services and service-oriented architectures (SOA). However, these efforts alone are not enough.

Two recent studies of oil and gas information management and integration highlighted some of the barriers to data management in energy. These barriers are not unique to energy. (See Table 1.)

A Systems Approach to Data Management
Previous attempts at data management have failed because the problems are systemic; any solution to one aspect will also affect the others. A systems solution—one that considers each of these aspects—provides optimal results.

• Data should remain in its original repository, while making it visible and accessible to users of the original and other applications. Underlying applications should not be changed.

• Use Web-based conventions to search for and transfer needed data sets. Web-based maps provide a convenient way to locate geo-referenced data (much of the oil and gas data is connected to a […]
CASE STUDY
Background
With the right tool, data integration can be quick and painless. The following is an account of one company’s search for that right tool. The company, a full-service bank that we will call FSB, operates banking offices in a number of states. In addition to a wide range of traditional banking services, FSB offers a comprehensive array of investment, mortgage, and insurance services, and has a network of loan origination offices for small businesses nationwide.

Challenge
FSB developed a production system to identify and monitor a customer’s total relationship with the bank. The system warehouses data primarily to support the lending function of the bank. To populate the data mart that supports this system, data is taken from multiple disparate sources, including mainframe legacy and Oracle systems, converted to a common format (ASCII flat files), and then processed with an ETL tool. The result is that users of the system can view multiple obligations—commercial loans, mortgages, credit cards, deposit accounts, and so on—of an individual customer as a single lending relationship. The system provides debt and deposit totals at both […]

Solution
The development team at FSB was tasked with evaluating different tools and selecting a solution that could handle this process. The decision came down to three options. The first option was to convert the text files to Oracle tables, perform a series of joins in Oracle, then extract the data back into text files. The second option was to use Ascential DataStage, the ETL tool they were previously using. The third option was Syncsort’s DMExpress. At the time, DMExpress was fairly new, but FSB had used other Syncsort products in the past, so they decided to evaluate their latest software.

Having dismissed the first option as too cumbersome, FSB decided to evaluate DataStage and DMExpress through a proof of concept. The proof of concept involved files of fairly large sizes, as well as complex join and filter criteria. Unable to handle large volumes of data, DataStage would often crash before finishing. When it did complete, it would take three to four times longer than with DMExpress.

FULL-SERVICE BANK PROCESSES DATA MORE THAN 3 TIMES FASTER
Figure 1. FSB was able to process data from four banks in less than an hour.

Conclusion
After a successful proof of concept, FSB decided there was no other option. “Wherever we find the need to process large amounts of data in a short amount of time, DMExpress is the obvious choice,” the lead developer of FSB commented. When it comes to speed, DMExpress gets it done faster. ●
Alleviating Data
Integration Headaches
By Lia Szep
Senior Technical Analyst, Syncsort Incorporated
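The workload FSB describes boils down to joining and filtering large delimited flat files. As a rough, minimal sketch of that kind of job (not FSB's actual implementation; the file layouts, field names, and filter rule here are hypothetical), a hand-coded version might look like this:

    import csv

    # Hypothetical stand-in for the flat-file join-and-filter step described
    # above; a high-volume tool would use external sort/merge rather than an
    # in-memory lookup table.
    def join_obligations(customers_path, obligations_path, output_path):
        # Build a lookup of customer records keyed on customer_id.
        with open(customers_path, newline="") as f:
            customers = {row["customer_id"]: row for row in csv.DictReader(f)}

        with open(obligations_path, newline="") as f_in, \
             open(output_path, "w", newline="") as f_out:
            writer = csv.DictWriter(
                f_out, fieldnames=["customer_id", "name", "product", "balance"]
            )
            writer.writeheader()
            for row in csv.DictReader(f_in):
                cust = customers.get(row["customer_id"])
                # Example filter criterion: keep only open obligations that
                # match a known customer.
                if cust and row["status"] == "OPEN":
                    writer.writerow({
                        "customer_id": row["customer_id"],
                        "name": cust["name"],
                        "product": row["product"],
                        "balance": row["balance"],
                    })

The point of FSB's proof of concept was precisely that, at large volumes, this kind of logic lives or dies on the engine underneath it.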
CASE STUDY
Pitney Bowes Improves Total Cost of Ownership
Commentary by William Duffy
Data Warehouse Project Manager, Pitney Bowes

OVERVIEW

Industry
High Tech

Oracle Products
• Oracle Business Intelligence Applications
• Oracle Business Intelligence Suite Enterprise Edition
• Oracle Data Warehouse

Key Benefits
• Improved total cost of ownership with comprehensive BI solution
• Enhanced sales productivity through better customer insight
• Increased responsiveness and customer satisfaction
• Improved marketing effectiveness with better segmentation and targeting

The Challenge
Pitney Bowes, a $5.5 billion company, is the clear market leader for postage meters, with 80 percent market share.

With a global customer base of 2 million, one of the top challenges at Pitney Bowes was making sure they maintain their existing customer base while capitalizing on new opportunities for growth.

This meant that the sales organization needed to better understand their existing customers, to ensure renewals and discover up-sell and cross-sell opportunities. They further required information about the potential of new customers by analyzing the behavior of their existing customers.

With such a large customer base, the company had the challenge of ensuring that their service representatives had all the information required to deliver customer satisfaction—whether it be to repair a piece of equipment or to answer billing inquiries.

The challenge in the marketing organization was creating campaigns based on an understanding of the buying and service behaviors of the customer, such that the right products could be offered to them at the right time.

Pitney Bowes needed a solution that could help them overcome these challenges. Furthermore, they needed a technology that could draw insight from more than 10 different legacy systems.

The company embarked on a vision, called "Power of One," to provide its customer base with a consistent experience across all business units. At the core of making this vision successful was providing information about customers across all customer-facing associates—four call centers with over 1,250 agents and 1,500 field service repair representatives in North America.

The Solution
Pitney Bowes chose Oracle, as its comprehensive solution provided best-of-breed data warehouse, BI platform, and prebuilt BI application products. Furthermore, its hot-pluggable architecture provided the ability to integrate business intelligence into their business processes, derive insight from their legacy systems, and analyze information from both historical and real-time data sources.

By standardizing its customer-facing associates onto the same Oracle BI solution, Pitney Bowes was able to create a 360-degree view of their customers and provide real-time access to this information to sales, service, and marketing organizations. They also achieved a faster time to value by implementing the Sales Analytics, Service and Call Center Analytics, and Marketing Analytics modules of Oracle BI Applications. Pitney Bowes has standardized on an Oracle data warehouse that can easily scale as their volume of information grows. This solution has helped them improve performance and user satisfaction, and provides an infrastructure to grow their business.

Enhancing Business Processes with Pervasive Insight
Pitney Bowes has integrated Oracle business intelligence solutions into their core business processes, enabling their customer-facing associates to take actions based on customer insight.

Enhanced Sales Productivity
Oracle Sales Analytics has completely turned the tide in their sales organization. It enables them to analyze where sales agents are spending time, ensure efficient customer follow-up, perform better forecasting, analyze opportunities, and track renewals to close. "Oracle BI provides laser-like visibility into the performance of every sales rep," said William Duffy, data warehouse project manager. "Sales operations can now pinpoint potential issues right away, which earlier would have taken three weeks."

Improved Customer Satisfaction
Pitney Bowes receives approximately 30,000 calls a day from its customers. With Oracle Service and Contact Center Analytics, they can now measure the performance of their call centers and effectively manage the productivity of their 1,250 agents, thereby ensuring responsiveness and improving customer satisfaction.

Improved Marketing Effectiveness
Oracle Marketing Analytics helps the company drive important customer retention campaigns, targeting customers with expiring leases. It further helps them segment customers and understand the potential of new customers, so that the right customers are called at the right time to improve overall campaign effectiveness.

Superior TCO
"One of the most important values of Oracle BI is its total cost of ownership," said William Duffy. "With Oracle BI, we were able to deliver over 400 reports to a very large organization with just one person—now that's cost effective!" ●

For a free white paper on data quality, download "Transforming Data into Quality Information," or click here for more information about Oracle.
CASE STUDY

Worldwide Leader in Software Achieves Accurate, Complete Views of B2B Organizational Hierarchies

TRANSACTIONAL HUB DELIVERED IN LESS THAN SIX MONTHS WITH ACCURATE, SCALABLE SOLUTION

Commentary by Jim Cushman
Vice President of Architecture, Initiate Systems, Inc.

Company Overview
A worldwide leader in software, services, and solutions that has achieved success with B2B organizational hierarchies.

The Challenge
This company strived to attain a composite view of all individual and organization data about customers, partners, and suppliers. The organization wanted to define the true value of each customer and identify valuable customers within organizations, even when purchasing through multiple channels. Additionally, this company needed to accurately assign and recognize revenue for accounting and sales commission processing through efficient territory management, ensure accurate license positions, assess customer risk and loyalty instantly and accurately, and address regulatory compliance concerns on a global scale. Finally, they wanted to dramatically reduce the internal costs associated with attaining data quality, alignment, and governance.

Solution
This software leader selected Initiate customer data integration (CDI) software for a major project to completely reform its global data infrastructure and better identify individuals and organizations.

Initiate Systems developed a B2B data stewardship application to support the business requirements. Initiate Hierarchy provides search, retrieval, editing, visual hierarchy navigation and management, and advanced compositing capabilities.

Implemented in less than 12 months, the project aligns data records from five countries and individual data from internal customer management systems and external master reference databases, such as Dun & Bradstreet, Experian, InfoUSA, and others. By building on top of enterprise-class, 64-bit products, such as Microsoft SQL Server 2005 and Windows Server 2003, the project was able to align more than 50 million organizational hierarchy records in under six hours, making them available for real-time consumption with the ability to handle thousands of concurrent transactions with sub-second response times.

"Initiate's advanced matching technology created a global solution for our CDI needs."
Senior Director, Worldwide Leader in Software

Benefits
The company has enjoyed many benefits since deploying Initiate Hierarchy.

• Customer satisfaction—provides a consistent customer experience at all touch points by improving self-service and enabling complete, accurate, real-time views of all of a customer's relationships with the company.

• Financial reporting—streamlines reporting on unparented and misparented accounts, which allows the company to understand and identify its most valuable customers, more accurately recognize revenue, reduce commission errors, and ratify license positions. (A simple sketch of this kind of hierarchy audit follows this case study.)

• Field productivity—ensures synchronization of data across the entire organization, including customer relationship management (CRM), software licensing, and financial accounting applications, resulting in greater potential for cross- and up-selling.

• Privacy and legal compliance—as the compliance landscape continues to evolve, this company is poised to adapt with it. With organizational hierarchy management, privacy preferences are easier to track and respect.

The Future
This company will soon introduce multi-byte data from additional countries and be poised to include their full global organization base, which will approach 150 million organizations spanning 50 countries and 20 languages. They will also roll out a worldwide data stewardship tool that was developed as a thin-client application. And they will integrate in real time with data standardization and business process management tools and services that offer automatic routing, workflow, and decision making.

Later this year, Initiate Hierarchy will be integrated in real time with large-scale applications such as Siebel and SAP, where it will serve as the global customer master. Integrated applications will leverage Initiate Hierarchy to search and retrieve organizations and hierarchies; update, link/unlink, and merge/unmerge organizations; and much more. This offers a centralized, service-oriented architecture (SOA) for customer data quality and management, and will dramatically increase the accuracy and reliability of the data for a fraction of the cost of attempting to perform the same functions in over 50 unique customer repositories.

Within the next year, this company will integrate its entire global individual customer base, amounting to approximately 500 million records, and will use Initiate Identity Hub software to identify all its unique individual, household, and individual-to-organization relationships. ●
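The financial reporting benefit above turns on detecting unparented and misparented records in an organizational hierarchy. Here is a minimal sketch of such an audit, using generic logic rather than Initiate's actual matching technology; the record fields and the "alignment rule" are invented for illustration.

    def audit_hierarchy(records):
        """Flag unparented and misparented organization records.

        records: dicts with 'id', 'parent_id' (None for roots), and
        'country' (an invented stand-in for a real alignment rule).
        """
        by_id = {r["id"]: r for r in records}
        unparented, misparented = [], []
        for r in records:
            pid = r["parent_id"]
            if pid is None:
                continue  # a legitimate hierarchy root
            parent = by_id.get(pid)
            if parent is None:
                unparented.append(r["id"])      # parent missing entirely
            elif parent["country"] != r["country"]:
                misparented.append(r["id"])     # violates the alignment rule
        return unparented, misparented

    orgs = [
        {"id": "1", "parent_id": None, "country": "US"},
        {"id": "2", "parent_id": "1", "country": "US"},
        {"id": "3", "parent_id": "9", "country": "US"},  # unparented
        {"id": "4", "parent_id": "1", "country": "UK"},  # misparented
    ]
    print(audit_hierarchy(orgs))  # (['3'], ['4'])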
CASE STUDY

Embarq Improves CRM While Increasing Employee Efficiency
By Frederick Ferguson
Associate Partner, IBM Global Business Services
John de Wit
CEO, Nimaya Inc.

[Figure: 10+ data sources feed XML through ETL into the IBM WebSphere management solution.]
Q: How can we efficiently tap into our data to enhance the decision-making process?

A: Effective decision making requires a user to view a sequence of interrelated data. With reporting technology, a series of 5 to 10 reports may be needed to make analytically based decisions. This process is time-consuming, both for IT to design the reports and for the businessperson to uncover critical insights.

Dynamic dashboards with relational online analytical processing (ROLAP) functionality offer an advanced approach to decision making. These informational dashboards collapse data from dozens of reports into one dashboard, and provide intuitive and rapid navigation across the data. With ROLAP's "drill-anywhere" capability, users can easily surf the data warehouse to find the data without requiring an explicit report to be designed by IT.

ANALYST VIEWPOINT
A key culprit behind inefficiency in tapping data is user reluctance to embrace sophisticated business intelligence tools. Rather than attempting to force-feed BI down users' throats, many organizations have benefited from lowering the barrier to entry through third-party Microsoft Excel add-ons that enable analysts to continue using a favored tool but with richer ad hoc analysis and reporting functionality. Similarly, dynamic, graphical dashboards with easy-to-read metrics and drill-through, workflow-driven decision making, as well as collaborative training involving both business and IT, can help improve user efficiency. A business requirements analyst can help understand user needs and complaints and tailor solutions accordingly.

Q: Data warehouse appliances have been acknowledged as successful at supporting data mart analytics. But can they handle the requirements of enterprise-class data warehousing?

A: Increasingly, large organizations are implementing Netezza data warehouse appliances to handle enterprise-class data warehouse applications because of their ability to easily manage large workloads of varying complexity. These systems have proven they can deliver high performance against large, mixed workloads across many business applications and for hundreds of concurrent users. This capability provides organizations with faster, deeper insight into data from multiple departments across the enterprise.

ANALYST VIEWPOINT
Appliances are steadily gaining credibility as scalable platforms for enterprise-class data warehouse systems—one reason why IDC predicts the appliance market will grow from its current estimated $50–75 million to roughly $500 million in five years. Continued R&D by DW appliance vendors and the advent of multicore processors are combining to increase appliance scalability to support large data volumes, complex queries, and high numbers of concurrent sessions. Prospective buyers should conduct rigorous proof-of-concept testing that mirrors the production environment to ensure performance matches expectations. If performance is up to par, an appliance may be a smart choice, especially for organizations that need to reduce expenses for deployment and maintenance.

Q: What can I accomplish with enterprise data mash-ups (EDM) that I can't from application integration?

A: Enterprise data mash-ups (EDM) are lightweight, Web-based SOA applications that leave source data in its original location and state. They do all the work on the fly, usually without the need for "big iron." The smaller footprint of EDM enables companies to quickly construct new dynamic and flexible applications for specific tasks by assembling ingredients from existing applications, Web services, and databases. This empowers companies to solve problems incrementally, realizing ROI gains in small, measurable steps and avoiding the huge burden of a larger enterprise development project that could take months and millions. Management, developers, and end users all benefit from this approach.

ANALYST VIEWPOINT
Enterprise data mash-ups represent an intriguing convergence between bare-knuckle data integration and the rich interactivity of Web 2.0 Internet sites. Using such standards as JavaScript, DHTML, and XML for rapid development and integration in a service-oriented architecture, mash-ups provide a virtual, noninvasive tool to enrich certain applications, such as employee portals or supplier extranets, with dynamic structured or unstructured content without impacting source system performance. While mash-ups are gaining popularity in certain niches, security vulnerabilities and relative immaturity mean that they should not (at least for now) be considered a replacement for a proven and robust data management infrastructure underpinning mission-critical systems.
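To make the mash-up idea concrete, the sketch below assembles a small composite view at request time from a Web service and a local database, copying nothing into a new store. The URL, database file, table, and field names are all hypothetical.

    import json
    import sqlite3
    from urllib.request import urlopen

    def order_status_mashup(customer_id):
        """Assemble a view on the fly from two live sources.

        Source 1: a (hypothetical) REST service returning JSON shipments.
        Source 2: a local orders database, queried in place.
        """
        with urlopen(f"https://example.com/api/shipments/{customer_id}") as resp:
            shipments = json.load(resp)  # e.g. [{"order_id": ..., "eta": ...}]

        conn = sqlite3.connect("orders.db")
        rows = conn.execute(
            "SELECT order_id, product, amount FROM orders WHERE customer_id = ?",
            (customer_id,),
        ).fetchall()
        conn.close()

        eta_by_order = {s["order_id"]: s["eta"] for s in shipments}
        # The "mash-up": merge both sources in memory, for presentation only.
        return [
            {"order_id": oid, "product": p, "amount": a,
             "eta": eta_by_order.get(oid, "n/a")}
            for (oid, p, a) in rows
        ]

Nothing is extracted, staged, or loaded; both systems remain the systems of record, which is the noninvasive property the viewpoint above describes.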
Q: Certain IT departments are still finding it challenging to build the right business case for master data management initiatives. How do you recommend creating a compelling business case?

A: This is an important question—MDM is no longer exclusively an IT-driven project. MDM can provide significant business benefits, and gaining business sponsorship up front is critical. We recommend doing a quick ROI audit to identify the specific set of business drivers by function—whether in marketing, sales, contracting, procurement, or compliance. The key is to identify the hard cost savings associated with an MDM project that will capture the attention of business managers. For example, one manufacturer identified over $11 million in cost savings from their sales operations over a five-year period, effectively paying for the MDM project. However, it is equally important not to leave out the soft costs, which are essential to sell the larger vision.

ANALYST VIEWPOINT
Most IT professionals, much less business sponsors, have little insight into the magnitude of data quality and inconsistency problems across multiple business units. A sound first step in building a business case for MDM is to document the extent of the problem through a data life-cycle audit (and in many cases, the problem will be greater than expected). A thorough assessment of the issue supplies a foundation to 1) document the cost of data discrepancies to the business, and 2) quantify business benefits and ROI that can be realized through MDM, ideally over a 5- or 10-year period. It may be helpful to document the larger industry trend toward a comprehensive data infrastructure—and the opportunity costs of falling behind.

Q: What are the most important criteria for creating a usable data aggregation environment?

A: There are five requirements that stand out. A data aggregation environment should:
1. Support multiple data integration technology requirements, sharing common administration, design, and metadata.
2. Meet rigorous standards for ad hoc querying and data warehouse load performance, well beyond what can be delivered by traditional relational database systems.
3. Be significantly easier to deploy and maintain than what is possible through traditional database systems.
4. Enable near-linear user and data scalability to support thousands of users and terabytes of data and, in some cases, also the flexibility to allow multiple grades of SLAs.
5. Provide a cost-effective solution that drives data aggregators' top-line growth while containing data warehouse infrastructure costs.

ANALYST VIEWPOINT
Throughout the process, it's important to keep the overarching objective in mind—building measurable and sustainable business value, which can become diluted amid the complexity and time pressures of a large-scale data aggregation project. Key criteria supporting that principle include 1) executive-sponsored collaboration between business and IT, 2) improving the quality and consistency of enterprise data, 3) ensuring performance and scalability, and 4) implementing a standards-based system that may be rapidly and affordably extended to support future initiatives or mergers/acquisitions. Establishing a "center of excellence" is a proven way to coordinate competing priorities and share best practices to help ensure that business value is realized.

Q: How can I improve the quality of my data throughout my data integration project?

A: A thorough data quality program includes two phases. First, data is either captured in a standard, error-proof way, or it is cleansed in preparation for loading. Then, data can be enhanced for further analysis. For example, demographic and/or lifestyle information can be added to customer records before the actual load. Enhancing data usually entails combining multiple sources of data, and this data is often held in multiple databases on disparate platforms. Using a data manipulation tool makes coordinating all of these different sources much easier.

ANALYST VIEWPOINT
With growing recognition of the risk that poor data quality poses to the business, many organizations are taking the smart step of incorporating data profiling and cleansing into broader data integration initiatives, often multisource integration into a warehouse or migration from legacy systems into a modern, Web-based application. Data profiling is a key first step that assesses the content and structure of data—an information reconnaissance mission that is a prerequisite to thorough cleansing and reconciliation across multiple sources. Ideally, data quality is viewed not as a collateral project, but as integral to a broader integration initiative, with appropriate scoping, tools, and resources allocated to help ensure its success.
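A toy sketch of the two-phase program described in the last answer: cleansing a record into a standard form, then enhancing it with demographics from a second source. The field names and rules are invented for illustration.

    def cleanse(record):
        """Phase 1: standardize a customer record before loading."""
        return {
            "customer_id": record["customer_id"].strip(),
            "name": " ".join(record["name"].split()).title(),
            "zip": record["zip"].strip()[:5].zfill(5),
        }

    def enhance(record, demographics_by_zip):
        """Phase 2: append demographic attributes from another source."""
        extra = demographics_by_zip.get(record["zip"], {})
        return {**record, **extra}

    demographics = {"02101": {"median_income": 74000, "urban": True}}
    raw = {"customer_id": " C-42 ", "name": "  aDA   LOVELACE ", "zip": "2101"}
    print(enhance(cleanse(raw), demographics))
    # {'customer_id': 'C-42', 'name': 'Ada Lovelace', 'zip': '02101',
    #  'median_income': 74000, 'urban': True}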
CASE STUDY

MedSolutions: Improving Healthcare through Information Access and Streamlined Processes
Commentary by Steve Wise
CIO, MedSolutions

Company Overview
MedSolutions is a leading radiology management company that provides customized, proprietary management of high-tech imaging to a national client base.

Business Challenge
MedSolutions has recently undergone significant growth, which led to a dramatic increase in call center volumes. In addition, the company continuously strives for faster, better ways to work with physicians who request imaging studies, and to streamline the authorization process for them. Furthermore, MedSolutions was looking for a solution to provide members with increased transparency of information, and to support health plan customers in delivering this and other consumer-driven healthcare information. As a result, the organization's plans included a complete overhaul of its prior authorization and notification request process for imaging studies online.

Solution
To improve service delivery to physicians, support client health plan needs, and proactively manage increased demand on call center staff, MedSolutions executives decided to develop a self-service portal that would enable customers and other users to obtain immediate adjudication of imaging requests, and determine approvals or recommended clinical reviews.

To create the portal, MedSolutions partnered with Collaborative Consulting. The team expanded MedSolutions' proprietary neural modeling (a competency built in to the existing call center application), enabling the portal to immediately adjudicate most "cases." The expansion was also designed to automate approvals to physicians' offices without human interaction.

As the project began, Collaborative sent a team of user experience design professionals to MedSolutions. The team studied the organization's business processes—from customer through call center and interactions with managed care companies—as well as its customer requirements and business rules. After evaluating the environment, Collaborative and MedSolutions technology professionals Web-enabled the process that providers (i.e., physicians) undertake to receive authorization, all the while mirroring the existing call center process to foster consistency.

Next, Collaborative introduced experienced project managers to keep the project on time and within budget. Other Collaborative contributors included a group of technology professionals with specific portal architecture and systems integration expertise, and several data warehouse architects. These experts synchronized three disparate data sources—claims, insurance eligibility, and member-imaging authorizations—making patient imaging information readily available to providers while increasing the portal's functionality significantly.

Benefits
The robust application offers "My Portal" functionality to members, providers, imaging centers, and health plans that want to conduct ad hoc reporting and customize services based on personal desires. Sophisticated scripting and application business rules support provide comprehensive member redirection to lower-cost, high-quality imaging centers. The portal also allows MedSolutions' health plan customers to enhance their brand via their own customer-facing Web sites, featuring information from MedSolutions' back-end databases. Other services are being delivered as the rollout continues. These include eligibility verification, claim status, electronic claims submission, and educational programs.

The self-service portal provides a clean and modern visual environment and a strong design that reflects accuracy, simplicity, timeliness, professionalism, and efficiency. These attributes adhere to MedSolutions' corporate design standards, as well as to its objectives of higher quality and lower diagnostic imaging costs for customers, and reduced call center volumes. In addition, the portal features an intuitive user interface, and delivers immediate access to imaging authorization status, claims, and provider profile information.

Further, it synchronizes patient claims, eligibility, and member imaging authorization data. For example, by integrating fax and mail technology with online processes, providers receive immediate authorizations for treatment and state-regulated letters. For more clinically complex cases, detailed clinical information or attached progress notes allow nurses and doctors to process cases efficiently.

From a data perspective, the benefits are abundant. The portal can quickly and easily provide reports to specific internal and external users without deploying software or establishing other means of delivery. Internally, data can be leveraged in numerous ways. For example:

• The sales team can identify opportunities to cross-sell by comparing demographics, behavioral data, product and service use, and survey information.

Externally, partners can use the portal to leverage data and enhance care. For instance:

• Health plans have access to key radiology utilization and cost data for their members, provider reporting, and a platform for private labeling radiology approval processes. With private labels, they can personalize and offer role-based views of core business functions and information to a variety of audiences.

• Providers have access to clinical guidelines and can review ordering patterns and benchmarks.

• Members have access to their imaging information, allowing them to make informed decisions regarding their treatment, including choosing providers based on quality ratings and out-of-pocket expenses.

The portal's architecture is leveraged as follows:

• Reusability. A services-based approach to components within the solution's data layer facilitates the reuse of components by leveraging existing legacy systems, resulting in a consolidated, enterprise data layer.

• Reliability. The asynchronous messaging layer provides a consistent user experience, including the ability to continue to submit authorization requests even during back-end system outages.

• Loose coupling. Loose coupling between the business logic layer and data services insulates the business and presentation logic from database schema changes. This approach facilitates the implementation of a major data quality initiative and a new claims processing system with minimal impact to the self-service portal solution. (See the sketch following this case study.)

• Composite application design. By designing the presentation layer as a granular set of interoperating portlets, business processes and application flow can be tailored for different user experiences, or can be based on specific client requirements.

Thus far, the portal has received widespread approval among medical professionals. In addition, it is streamlining call center volume internally, while its user-centric design and immediate access to radiology management data improve customer relationships. Critical first steps of understanding user needs and critical business functions have also paid off. MedSolutions has realized its adoption goals and recouped costs as a direct result of these critical first steps. As time goes on, patients, imaging facilities, and healthcare benefits administrators will benefit from its functionality, and customer satisfaction will continue to increase. ●
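The loose-coupling principle named above can be sketched in a few lines: presentation and business logic talk only to a data service, and only the data service knows the schema. The classes, table, and queries below are illustrative, not MedSolutions' code.

    import sqlite3

    class ClaimsDataService:
        """Data-service layer: the only code that knows the schema."""
        def __init__(self, conn):
            self.conn = conn

        def claims_for_member(self, member_id):
            # If the claims schema changes, only this query is rewritten;
            # callers keep receiving the same plain dictionaries.
            rows = self.conn.execute(
                "SELECT claim_id, status, amount FROM claims "
                "WHERE member_id = ?", (member_id,),
            )
            return [{"claim_id": c, "status": s, "amount": a}
                    for (c, s, a) in rows]

    def render_claim_summary(service, member_id):
        """Presentation logic depends on the service, not the database."""
        claims = service.claims_for_member(member_id)
        total = sum(c["amount"] for c in claims)
        return f"{len(claims)} claims, {total:.2f} total"

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE claims (claim_id, status, amount, member_id)")
    conn.execute("INSERT INTO claims VALUES ('CL1', 'PAID', 120.0, 'M1')")
    print(render_claim_summary(ClaimsDataService(conn), "M1"))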
Leverage a Disciplined, Standard Approach to Achieve "One-Source" Truth
By Mark Holmes
National Operational Consulting Practice Director
John Williams
Senior Vice President, National Practice Director, Collaborative Consulting

Today's best-run companies can identify and repair problems and replicate best practices throughout the organization very quickly. Most of the time, these organizations measure and manage their supply chains with exceptional skill and vigilance.

Excellence in supply chain management means achieving superior levels of visibility into systems and processes, while optimizing the effectiveness of key performance indicators (KPIs). Companies can improve their visibility into systems and process execution by ensuring that their KPIs are well defined and based on accurate, timely data. When organizations accomplish these levels of introspection, they can create effective plans and quickly implement changes that make a real difference.

Here's the catch: KPIs rely on multiple, usually disparate sources of data. These feature a wide range of quality and accuracy, making it difficult to draw precise conclusions about their intended measurements. In fact, defining the proper KPI and gathering its associated data is usually incredibly complex and challenging. And the more functions and divisions an organization has, the more complicated the task.

Consider a particular data value. An organization's enterprise resource planning system, its third-party logistics system, and its general ledger may all provide this value. However, each could come by that value differently, which invites inconsistency. Figuring out how to rein in all that information, determine what it means, and represent it in an accurate and timely fashion is indeed a tall order. How can a company achieve "one-source" truth?

The best way is to adhere to a common, disciplined approach to KPI development. Proper KPIs are the result of a mathematical equation used and integrated throughout the enterprise, and adopted by trading partners. This level of standardization helps KPIs link with an effective data architecture that enables timely, accurate visibility. When that occurs, underperforming processes are highlighted very quickly, as are those that ought to be implemented throughout the enterprise.

In addition, to optimize the effectiveness of its KPIs, a company must identify those that are most critical, i.e., the ones that drive the business. Usually, this is a relatively small but highly important group.

Because most companies have only a few critical KPIs, it makes sense to begin KPI enhancement initiatives with them. Doing so, however, is complex; a company must uncover, organize, and expose the data that creates visibility into KPIs.

Other important elements of KPI visibility include:

• Data source. KPIs are most effective when all functions and divisions within an organization "pull" a particular value from a standardized source, such as an ERP system. When the same value is extracted from different sources, inconsistencies and inaccuracies creep in.

• Data users. If only a small group of executives needs to review KPI data, a relatively simple application can be developed to share that information. If, however, several manufacturing plants are also heavy KPI data users, and a global executive team requires the information, a portal may be more appropriate.

• KPI value. KPIs must align with business objectives. In addition, senior executives must ensure their adoption throughout the enterprise.

• Measurement time frames. By determining how often to measure KPIs, a company can figure out which data sources it must access easily. Of course, measurement times vary greatly from company to company. An e-fulfillment organization will want its "out-of-stock" KPI measured daily; another enterprise may need that metric only weekly or monthly. (A minimal example of such a KPI follows this article.)

KPIs require a sound definition, disciplined data gathering, and continuous alignment of that data with business objectives. By adhering to a rigorous plan and maintaining data discipline, companies can create extremely robust KPIs, and access powerful, accurate, optimized assessments of their performance. ●

For free white papers on this topic, download "Data Governance: Developing a Foundation for Long-Term Stability and Strategic Business Advantage" or "Data Visibility and Supply Chain Intelligence: Creating an Optimal KPI Model," or click here for more information about Collaborative Consulting.
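As a minimal illustration of the single-formula, single-source discipline argued for above, here is an out-of-stock KPI defined once and computed from one agreed source; the data and the formula are invented for the example.

    def out_of_stock_rate(inventory):
        """KPI defined once, as a single agreed formula:
        the share of SKUs with zero on-hand quantity."""
        skus = len(inventory)
        empty = sum(1 for qty in inventory.values() if qty == 0)
        return empty / skus if skus else 0.0

    # One standardized source (e.g., the ERP item master), rather than
    # three systems each deriving the number differently.
    erp_inventory = {"SKU-1": 14, "SKU-2": 0, "SKU-3": 3, "SKU-4": 0}
    print(f"out-of-stock rate: {out_of_stock_rate(erp_inventory):.0%}")  # 50%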
Master Data Management: Extracting Value from Your Most Important Intangible Asset
By Seth Halpern
Senior Principal of Value Engineering, SAP

In a 2006 best practices survey of the Americas' SAP Users' Group, 93 percent of respondents experienced data management issues during their most recent projects. And data management was identified as the root cause of problems in process improvement projects. Part of the problem stems from the fact that many organizations believe they are already using master data. But the reality is that they are operating within the confines of disconnected silos of data—the information contained in multiple systems, applications, and spreadsheets. Once master data is generated and trapped in silos, inaccurate and inconsistent information is perpetuated throughout the organization—and beyond. This creates an incomplete view of the business and limits your ability to aggregate and distribute data, which hampers even the most carefully planned and executed initiatives. Achieving master data consistency for all systems within a distributed IT environment has traditionally proven difficult.

An organization's primary data entities generally include customers, products, employees, and suppliers. As the volume of data in any or all of these grows, the importance and complexity of managing the data increases. For example, wholesale distribution companies may sell hundreds of thousands or even millions of different products. Maintaining accurate data on each of these products, SKUs, and components across the inventory is challenging enough, but imagine the problems across suppliers, customers, and third parties when naming conventions are inconsistent.

Effective MDM offers a number of benefits to a wide range of companies—particularly those with large volumes of data used for a variety of purposes by multiple organizations. They include operational efficiencies, enhanced revenue opportunities, better insight into business operations, and tighter compliance with regulatory requirements—helping avoid fines, internal misbehavior, and the potential for losses in shareholder value. In virtually every respect, increased attention to master data management can lead to quantifiable increases in business value.

Market and operating incentives, combined with the advent of innovative technology, make a compelling case for deploying a master data management program. This is especially apparent when you consider the potential for improvement across all business processes.

Overcoming Barriers to MDM Excellence
To achieve effective master data management and improve operating performance, you must adopt a solution that addresses the following three elements:

Master data consolidation: matching, normalizing, cleansing, and storing master data imported from client systems. The principal activities of master data consolidation are (a simple sketch of these activities appears at the end of this article):

• Identifying identical or similar objects spread across local systems
• Building consolidated master data
• Providing ID mapping for unified, company-wide analytics and reporting

Master data harmonization ensures that master data is synchronized across heterogeneous system landscapes. Extending the scope of master data consolidation, harmonization also encompasses the distribution of consolidated, globally relevant information, and the enrichment of client application systems with locally relevant information.

Central master data management speaks to the maintenance and storage of master data and the development of distribution mechanisms for delivering master data to the systems that need it. This activity differs from master data harmonization in that master data is created centrally using a rich client. You can then interactively distribute information to clients as required.

The ideal solution integrates seamlessly with your organization's existing infrastructure and those of your partners. Additionally, the solution is intelligent enough to ensure ongoing harmony of accurate and up-to-date information from disparate sources and is readily accessible to ensure it supports the needs of the entire business ecosystem.

Leading-edge technology helps you streamline and improve the aggregation of master data from disparate sources. The ideal solution manages the entire process, including deduplication and normalization, ID mapping, matching and merging, staging, change tracking, interactive data-quality analysis, and ad hoc consolidation. You can then analyze consolidated data using a business intelligence solution. Ideally, users experience near-real-time search performance, with multiple search mechanisms and every dimension interlinked. You should be able to search an entire repository easily with any item or group of items, and partial strings and equivalents should be indexed to increase positive results. Performance should be measured in milliseconds—even when managing repositories containing millions of records.

The Case for Master Data Management
Ultimately, master data management, when done correctly, enables reliable cross-system, enterprisewide business processes and analytics, ensuring that everyone involved in the process has access to the same information and knowledge. A solution that enables the consolidation of master data, as well as the availability and free flow of consistent data across system boundaries, offers the most promising opportunity for improving business processes and a decisive competitive advantage. ●

Learn more: http://www.sap.com/usa/platform/netweaver/components/mdm/index.epx

Click here to request more information about SAP.
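A minimal sketch of the consolidation activities listed above: matching similar objects across local systems, electing a consolidated master, and keeping an ID map back to each source. The string-similarity rule here is a deliberately crude stand-in for real matching technology.

    from difflib import SequenceMatcher

    def consolidate(records, threshold=0.85):
        """Group similar records and build an ID map to the survivors.

        records: (source_system, local_id, name) tuples. Returns
        (masters, id_map), where id_map sends each (source_system,
        local_id) pair to its consolidated master ID.
        """
        masters, id_map = [], {}
        for source, local_id, name in records:
            for master_id, master_name in masters:
                sim = SequenceMatcher(
                    None, name.lower(), master_name.lower()).ratio()
                if sim >= threshold:
                    id_map[(source, local_id)] = master_id  # matched
                    break
            else:
                master_id = f"M{len(masters) + 1:04d}"      # new master
                masters.append((master_id, name))
                id_map[(source, local_id)] = master_id
        return masters, id_map

    records = [
        ("ERP", "100", "Acme Industries Inc."),
        ("CRM", "A-7", "ACME Industries, Inc"),
        ("ERP", "101", "Globex Corporation"),
    ]
    masters, id_map = consolidate(records)
    print(masters)  # one master for both Acme variants, one for Globex
    print(id_map)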
CASE STUDY

Master Data Management: Myth of the Mega-Vendor Platform
By Anurag Wadehra
Vice President, Marketing and Product Management, Siperian, Inc.

Industry analyst firm IDC is predicting that the master data management (MDM) market will grow to $10.4 billion by the year 2009, with a compound annual growth rate of 13.8 percent. Other analyst firms such as Gartner and Forrester claim that MDM provides significant business benefits and is a critical foundation for managing enterprise information assets. Not surprisingly, the market hype from the mega-vendor triad (IBM, SAP, and Oracle) has been deafening as each attempts to position its single enterprisewide platform as a complete solution to MDM strategy. Despite the flurry of activity, no one is talking about MDM in the same way, and there is no consensus on the crucial questions that CIOs and executive management teams want to address. While MDM has come to the fore as a critical area of data management practice with different data domains, MDM requirements have not yet coalesced into a coherent market.

Is a Single Vendor Platform Necessary?
At the organizational level, MDM is a cohesive strategy for managing all master data domains, whether product, customer, employee, asset, or financial. Each of these master data domains differs greatly. For instance, customer master data often originates from multiple sources, including several from outside the company. Customer master data is often structured and well understood, while typically following a "business party" model. On the other hand, product master data is usually generated internally and shared externally among suppliers. The characteristics of product master data—such as in a product catalog or an item master—are both structured and unstructured, while requiring a hierarchical data model.

It is being debated whether any mega-vendor today can handle these differing requirements in a single integrated MDM platform. But here is the more pertinent question to ask: is centralizing on a single MDM vendor even necessary? If the service-oriented architecture (SOA) promises of mega-vendors are indeed true, then why not select different best-of-breed solutions that work together to deliver the most suitable solution? Shouldn't each MDM solution leverage your past investments in data integration infrastructure, legacy data hubs, and external data sources? Perhaps the most critical question to ask is: where is your organization most likely to derive business value from an MDM platform?

The Path Forward: Adaptive Master Data Management
Many companies are finding that the simplest route to move forward with MDM is to initially deploy one master data domain for a specific business need, and then extend to other data domains over time. Which data domain to start with will differ by industry. For example, it may be customer data in high-tech, doctor or hospital data for a pharmaceutical company, counterparty reference data in institutional banking, or product data in retail.

By starting with one data domain, companies are able to achieve immediate ROI in an identified area, typically in less than six months' time. However, the key to this approach is to select an adaptive MDM platform that can easily be extended to different data domains over time to meet future requirements. An adaptive MDM platform should not only support SOA and allow you to coexist with existing data hubs, data sources, and the larger application platforms, but also allow you to evolve the architectural style over time across the organization to implement a comprehensive master data management strategy.

Remember, MDM is a strategic process, and while mega-vendors are vying for dominance, each master data domain presents unique challenges. An adaptive approach is necessary to meet these challenges and enable you to start down the path to MDM without compromising your ability to evolve toward an enterprisewide master data management platform. ●

For a free white paper on this topic, download "How to Sell Skeptics on the ROI of Master Data Management," or click here for more information about Siperian, Inc.
CASE STUDY

Master Data Management and Business Performance Management
By José Villacís
Senior Product Marketing Manager, Hyperion Solutions Corporation

A recent TDWI survey revealed that 51 percent of respondent organizations use a master data management (MDM) solution to support business intelligence (BI) and transactional applications, and 32 percent use it for BI alone.

Moreover, 83 percent of respondents reported that their organization had suffered problems due to poor master data; specifically, 81 percent cited inaccurate reporting as a problem.

But bad reports aren't the only potential issue if you don't have your master data under control.

When it comes to business performance management (BPM), you're not working just with operational reports, but also with financial information, analytics, KPIs, etc. This information is usually contained in several data stores, such as a data warehouse or multiple data marts, and in applications for financial consolidation, planning and budgeting, strategic scorecards, analytic applications, BI tools, and, of course, transactional and other systems that feed the BPM layer.

Under this scenario, the master data problem evolves from simply trying to centralize and synchronize master data from a system of record, into a problem of how to effectively manage changes in master data. This is because the frequency of change, the nature of the change, the type of downstream system, the number of attributes, and the applicable business rules can vary widely. Most important, these changes involve different departments and stakeholders, depending on the type of master data (employee, vendor, customer, product, financial, etc.).

Transactional or operational MDM tools typically ensure that the right product, supply chain, or customer master data exists in ERP, SCM, CRM, or other transactional system modules. This information needs to be readily available, usually in real time, to ensure the successful operation of these systems. The key requirement is consistency and timeliness.

But transactional MDM systems are not built to support the master data management needs of BI and BPM—the challenge is managing the changes that are required in master data structures over time. There are companies today that process thousands of changes per month in their master data structures!

MDM solutions that can effectively support BPM (sometimes called Analytic MDM), while complementary to transactional MDM solutions, must rely heavily on highly flexible but robust business rules engines that can handle the varied nature of changes in BPM-related master data. A key requirement is business user empowerment; otherwise, IT remains a bottleneck. Today most IT organizations still use spreadsheets, messages, and calls to confirm and agree on changes with business users before applying them. So the overhead cost and risk of errors from this manual, iterative process are high.

Business users know why and how changes in master data and reporting structures should be applied. Whether set off by typical business decisions, or by disruptive events like mergers, acquisitions, or reorganizations, it is business organizations, not IT, who initiate change. For years we've advocated in the BI industry that IT should empower business users to do their own analyses and reporting, allowing IT to focus on more productive tasks such as user enablement and systems administration and controls. MDM responsibility for performance management is analogous; business users should be responsible for making changes and maintaining master data, while IT enforces business rules and manages the environment.

So an MDM application must be built for end users—not only IT—with an intuitive user interface that includes all the functionality required for managing master data changes and the organization's master data lifecycle. For example, it must have real-time validations, immediate user feedback, and approval levels to control changes. In addition, it must provide robust auditing, versioning, rollback, and integrity check capabilities for managing the environment. Robust import and blending, as well as export features, are necessary, as the type, frequency, and format of updates will vary widely. (A small sketch of these change controls follows this article.)

Managing master data is essential for ensuring the quality and integrity of the information in our organizations. Ensuring that transactional systems share the correct master data in a timely manner is critical. But making sure that the BPM layer or "management system" layer provides the right information for decision makers is a priority, too.

What's important is to identify the biggest pains first, then determine the right MDM solution(s). Start small, but outline a path to evolve your MDM solution to the enterprise level. ●

For a free white paper on this topic, download "The Business Case for Information Management," or click here for more information about Hyperion Solutions Corporation.
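The change-management requirements named above (real-time validation, approval levels, an audit trail) can be caricatured in a short sketch; the rules, roles, and field names are invented for illustration.

    class MasterDataChange:
        """A change request that is validated, approved, and audited."""
        def __init__(self, entity, field, new_value, requested_by):
            self.entity, self.field = entity, field
            self.new_value, self.requested_by = new_value, requested_by
            self.audit_log = [f"requested by {requested_by}"]

        def validate(self, business_rules):
            # Real-time validation: immediate feedback to the business user.
            for rule_name, rule in business_rules.items():
                if not rule(self.field, self.new_value):
                    self.audit_log.append(f"rejected by rule {rule_name}")
                    return False
            self.audit_log.append("validated")
            return True

        def approve(self, approver):
            # Approval level: the requester may not approve their own change.
            if approver == self.requested_by:
                raise PermissionError("requester cannot self-approve")
            self.audit_log.append(f"approved by {approver}")

    rules = {
        # Invented rule: cost centers must carry a "CC-" prefix.
        "cost_center_format":
            lambda f, v: f != "cost_center" or v.startswith("CC-"),
    }
    change = MasterDataChange("dept-42", "cost_center", "CC-9001", "j.smith")
    if change.validate(rules):
        change.approve("m.jones")
    print(change.audit_log)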
BY PHILIP RUSSOM
QUANTIFYING THE DATA CONTINUUM
Before drilling into BI search and text analytics, we need to review the spectrum of available data sources. After all, the "data continuum" has direct import on the scope of reports and other documents indexed by search or mined by text analytics.

The data continuum breaks into three broad areas.

• Structured data. At one extreme of the data continuum, structured data is commonly found in database management systems (DBMSs) of various types.

• Unstructured data. The other extreme includes documents of mostly natural-language text, like word-processing files, e-mail, and text fields from databases or applications.

• Semi-structured data. The area between the two extremes includes semi-structured data in spreadsheets, flat files in record format, RSS feeds, and XML documents. Many of these media are used with cross-enterprise data-exchange standards like ACORD, EDI, HL7, NACHA, and SWIFT.

Some data sources are hybrids that are hard to categorize. Despite the three broad types of data sources, the continuum includes sources that can manage both structured and unstructured data. For example, a row in a database table has a well-defined record structure that defines fields of mostly numeric data types. Yet, the same record may also have fields that are character data types, like text fields or binary large objects (BLOBs). Likewise, a report may contain structured data (or a query that fetches structured data), as well as report metadata and text in headings that can be searched. RSS feeds are especially problematic, since they can transport a variety of information, ranging from prose (unstructured) to transactions (semi-structured).

In recent years, market research conducted by various software vendors and consulting firms has attempted to quantify the relative percentage split between structured and unstructured data in the average user organization. Most estimates name unstructured data the unqualified winner at 80–85%, leaving structured data in a distant second place at 15–20%.

However, TDWI Research finds that unstructured data is not as overwhelming in volume as previously thought. In an Internet survey conducted in late 2006, TDWI asked each respondent to estimate "the approximate percentages for structured, semi-structured, and unstructured data across your entire organization." (See the top bar in Figure 1.) Averaging the responses to the survey puts structured data in first place at 47%, trailed by unstructured (31%) and semi-structured data (22%). Even if we fold semi-structured data into the unstructured data category, the sum (53%) falls far short of the 80–85% mark claimed by other research organizations. The discrepancy is probably due to the fact that TDWI surveyed data management professionals, who deal mostly with structured data and rarely with unstructured data. All survey populations have a bias, as this one does from daily exposure to structured data. Yet, the message from TDWI's survey is that unstructured data is not as voluminous as some claim.

Now that we have a new and different quantification of the unstructured segment of the data continuum, what should we do about it? We should all pare down our claims about unstructured data volumes, but we should not change our conclusions about what needs to be done. In other words, regardless of how the numbers add up, we all know that the average user organization has a mass of textual information that BI and DW technologies and business processes are ignoring. And this needs to change.

Why can't data warehousing professionals go on ignoring unstructured data? Among the many good reasons, two stand out:

• The view of corporate performance seen from a data warehouse is incomplete unless it represents (in a structured way) facts discovered in unstructured and semi-structured data.

• BI platforms today commonly manage thousands of reports, and techniques borrowed from unstructured data management (i.e., search) can make reports a lot more accessible.

To quantify the situation, TDWI asked each survey respondent to estimate "the approximate percentages for structured, semi-structured, and unstructured data feeding into your organization's data warehouse or BI processes." (See the bottom bar in Figure 1.) The survey responses reveal that structured data accounts for a whopping 77% of data in the average data warehouse or other BI data store, darkly overshadowing semi-structured (14%) and unstructured data (9%). Indeed, little data originating in unstructured or semi-structured form makes its way into data warehouses today, despite large quantities of it elsewhere in an organization. (Figure 1 compares these.)

Figure 1. Little unstructured or semi-structured data makes its way into data warehouses today. Based on 370 respondents.

The dearth of unstructured data in the warehouse isn't surprising, considering that almost all best practices in data warehouse modeling demand structured data. Likewise, we analyze and report off of data warehouse data using tools that see data only through the eyes of SQL, which in turn demands data in relational or multidimensional structures. As we'll see in detail later in this report, you have to impose structure on unstructured data before it's usable with a BI/DW technology stack.

NEW DATA WAREHOUSE SOURCES FROM THE DATA CONTINUUM
As we've seen, the data continuum divides into three broad segments for structured, semi-structured, and unstructured data. In turn, each of these segments is populated by various types of systems, files, and documents that can serve as data sources for a data warehouse or other BI solution. These range from flat files, to databases, to XML documents, to e-mail, and so on.

To understand which of these are feeding data into data warehouses today—and in the near future—TDWI asked, "Which types of data and source systems feed your data warehouse?" Survey respondents selected those in use today, as well as those they anticipate using in three years. Figure 2 charts survey responses for both today and the future; it also calculates the expected rate of change (or "delta"). Judging by users' responses to this question, the kinds of data sources for the average data warehouse will change dramatically in the next few years:

• Unstructured data sources will soon be more common for data warehouse feeds. The survey predicts the greatest increases with technologies that convey natural language information in text (aka unstructured data), like voice recognition (up 81% in three years), wikis (81%), content management systems (72%), taxonomies (70%), instant messaging (69%), and RSS feeds (68%). Admittedly, some of these show a high rate of change because they're starting from almost nothing, as with voice recognition and wikis (11% and 12% today).

• Semi-structured data sources will increase moderately. This includes stalwarts like XML and EDI documents (up 32% and 18% in three years, respectively). The new kid on the block is the RSS feed, which contains both semi- and unstructured data. Most RSS feeds transport prose (unstructured data as text), but are beginning to carry transactions as semi-structured data in markup documents. Either way, 22% of survey respondents claim that their data warehouse accepts RSS feeds today, and 90% anticipate integrating data from RSS feeds in three years. This makes sense, because RSS feeds operate in near real time over the Web, and many organizations are looking for faster and broader ways to deliver alerts, time-sensitive data, and transactions.

• Miscellaneous unstructured sources will increase moderately, too. These are mostly files containing text, like e-mail (up 47% in three years), word-processing files (35%), Web pages (35%), and Web logs (27%). Their increase will be moderate because they're already established.

• Some sources of structured data may decline, but the category will keep its hegemony. Survey respondents anticipate reducing data extraction from various older types of database management systems (DBMSs), namely those that are hierarchical (-15% in three years), mainframe (-30%), legacy (-46%), and flat files in record format (-31%). Indeed, these are legacy platforms that are ripe for retirement or migration. But survey respondents also anticipate extracting less data from spreadsheets (-21% in three years) and relational DBMSs (-22%). While the decline of legacy databases as data warehouse sources seems plausible, TDWI Research is deeply skeptical about the decline in relational databases and spreadsheets claimed by survey respondents. Since these are so deeply ingrained in BI and in IT in general—and are spawning new instances constantly—their decline seems very unlikely.

Figure 2. Which types of data and source systems feed your data warehouse? (Respondents selected all that apply, for both today and in three years.)

The general trend—toward more unstructured data sources. Survey responses show that priorities along the data continuum will soon shift relative to data warehouse sources, with some data sources declining and others rising. Although respondents may have been overly optimistic about the rate of change they will embrace, the survey clearly signals a shift toward using more semi-structured and—especially—unstructured data sources. The trend is plotted conceptually in Figure 3, and the shift can be visualized as an increase in the types of data sources plotted in the middle or on the right side of the graph. Another way to see it is that the wide majority of data warehouse feeds today come from the left end of the graph. These won't go away, but instead will be joined incrementally by more data sources toward the right end.
Figure 3. The data clearly signals a shift toward using more semi-structured and–especially–unstructured data sources.
RAMIFICATIONS OF INCREASING UNSTRUCTURED DATA SOURCES
The evolving list of data sources means changes for DW/BI practices. Data warehousing professionals should be aware of these and prepare for them:

• Unstructured and semi-structured data must be transformed into structured data. Note that sources of unstructured and semi-structured data will be increasingly tapped for data warehousing, but that doesn't mean that much of this raw data will actually go into a data warehouse. In most cases, this source data will need to be parsed for entity extraction or otherwise transformed into structures that are meaningful in a data warehouse or to a reporting tool. (A brief sketch of this kind of parsing follows this list.)

• Expect only modest adjustments to accommodate the structured data coming from unstructured data sources. Since the data is usually structured by the time it arrives in the data warehouse environment, adjustments should be slight. Similar adjustments are required when users want to copy unstructured data into a warehouse.

• Training—and learning—are in order. Data warehousing professionals currently have little or no experience with unstructured or semi-structured data sources. Likewise, experience is rare with search and text analytic tools. So additional training is needed, and—due to minimal experience—the learning curve will be long and flat. ●
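As a hedged illustration of the entity extraction mentioned in the first bullet above, the short Python sketch below parses free-text messages into relational rows suitable for a staging table. The regular expressions, field names, and sample message are invented for illustration; they are not drawn from the survey or from any particular product.

    import re

    # Hypothetical patterns; real entity extraction is far richer.
    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
    ORDER_RE = re.compile(r"order\s+#?(\d{6,})", re.IGNORECASE)

    def extract_entities(message_id, text):
        """Parse one unstructured message into a structured, loadable record."""
        emails = EMAIL_RE.findall(text)
        orders = ORDER_RE.findall(text)
        return {
            "message_id": message_id,
            "sender_email": emails[0] if emails else None,
            "order_number": orders[0] if orders else None,
            "body_length": len(text),
        }

    row = extract_entities(
        "m-001", "From jane@example.com about order #1042778: item arrived damaged.")
    print(row)  # one relational row, ready for a staging table

Once text has been reduced to rows like this, it can flow through the same ETL and modeling practices as any other structured source.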
C A S E S T U DY
Emerson Process Management Unifies Account Data for a Stronger Revenue Stream
Commentary by Nancy Rybeck
Customer Data Warehouse Strategy Architect,
Emerson Process Management
A Need for Coordination and Consolidation
Emerson Process Management, a leading global supplier of process-oriented products and services, is a $3 billion operating unit of St. Louis-based Emerson, the Fortune 500 global technology and engineering firm.

Responding to global competition and a drive to increase profitability, Emerson Process Management has evolved from a product-centric organization to a customer-focused enterprise. Consequently, the organization required technology that could enhance its knowledge about the customers and partners of its global sales and manufacturing divisions. Understanding each customer fully and completely required accurate information instantly available to all reaches of the Emerson organization.

This was a daunting challenge, as Emerson Process Management's customer name and address data—representing over 75 business units—resided in multiple operational systems. It was found that 80 to 90 percent of those records could potentially be duplicates. To make matters worse, each division around the globe had its own method of managing its customer data. Each maintained its own database with a unique representation of customer names and addresses. This uncoordinated approach to customer relationship management could not support the firm's goals.

Gaining a Single Customer View
Emerson Process Management now uses Group 1 Software's flagship data quality solution, the Customer Data Quality (CDQ) Platform, which provides enterprisewide correction, validation, and enhancement of existing customer, prospect, and supplier data. The CDQ Platform is helping Emerson Process Management meet its goal of obtaining a single view of each customer and partner. In a short time, 70 percent of its customer and partner address records had been identified as duplicates and eliminated. Also, nearly 85 percent of the unique business sites within its database had also been deduplicated.

"As a long-time Group 1 customer, our decision to adopt the CDQ Platform was guided by the solution's unparalleled global address data support," said Nancy Rybeck, customer data warehouse strategy architect at Emerson Process Management. "And with a majority of our customers and partners doing business with many of our divisions, we required a solution such as the CDQ Platform that could perform powerful, user-defined record consolidation."

The CDQ Platform is an integral component of the organization's marketing efforts, most notably cross-selling and up-selling initiatives developed to expand penetration of accounts. With higher-quality data and an increased customer focus, Emerson Process Management has seen tangible benefits, most significantly a double-digit percent increase in purchases per customer, a reduction in selling costs, and improved customer satisfaction.

Emerson is an excellent example of how improved data quality can help increase the agility of a company, thus contributing to a strong revenue stream, better operational efficiency, and increased shareholder value. They are one of thousands of progressive organizations that understand you are only as good as your data.

An Essential Component of Customer Data Quality
This solution combines leading-edge data quality functions in a single dataflow environment so you can make more informed business decisions. Sophisticated parsing, standardization, matching algorithms, and advanced address validation tools work together seamlessly so you can unify account data, eliminate redundancies, link members of the same household, and expand your customer knowledge. ●
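To picture the parsing, standardization, and matching described above, here is a minimal Python sketch in which addresses are normalized before records are compared, so that obvious duplicates collapse to a single survivor. The abbreviation table, field names, and sample records are invented; Group 1's actual algorithms are proprietary and far more sophisticated.

    # Toy standardize-then-match pass; "first record wins" is a naive
    # survivorship rule used only to keep the sketch short.
    ABBREVIATIONS = {"street": "st", "avenue": "ave", "suite": "ste"}

    def normalize_address(raw):
        tokens = raw.lower().replace(",", " ").replace(".", " ").split()
        return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

    def dedupe(records):
        """Keep one record per normalized (name, address) key."""
        seen = {}
        for rec in records:
            key = (rec["name"].lower().strip(), normalize_address(rec["address"]))
            seen.setdefault(key, rec)
        return list(seen.values())

    records = [
        {"name": "Acme Corp", "address": "12 Main Street, Suite 4"},
        {"name": "ACME CORP", "address": "12 Main St. Ste 4"},
    ]
    print(dedupe(records))  # one surviving record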
Without clean data, there is no CRM. Poor data quality can lead to serious business problems.

Several years ago, the introduction of new customer relationship management (CRM) systems from the world's largest software houses was heralded as the panacea for the wayward enterprise. The formula for the success of these systems seemed simple: interact with customers, collect valuable data, compile 360-degree views of individual customers, and utilize the information to build rich relationships that improve over time.

After a while, however, many enterprises found they were not getting the value from these systems that they expected, and they have arrived at similar conclusions: CRM systems are really just front-end graphical user interfaces (GUIs) and canned business processes (best practices) that rely on data. A superior CRM system does not guarantee data quality and is unable to generate return on investment on its own. It is the quality of the data that is fed into the system that makes all the difference.

Defining Data Quality
Data quality applies to more than just customer name and address data. It applies to product numbers and associated descriptions, part numbers and units of measure, medical procedure codes and patient identification numbers, telephone numbers, e-mail addresses, commodity codes, vendor numbers, and vehicle identification numbers, to name just a few. Ensuring data quality requires:

• Understanding the nature of the data and the degree of "trusted authority" from which it is derived

• Understanding the intended use of that data

• Identifying factors that both determine and impact the data's fitness for use

• Establishing the policies, people, processes, and technologies to manage the quality of data

Given the current emphasis on the need to maintain a 360-degree view of the customer, data quality involves being able to link all of a given customer's records together—a task that can only be accomplished with identifiers for the records associated with each customer.

An Absolute Necessity
Poor data quality can impact an organization's ability to increase customer retention and loyalty, limit exposure, and increase operational efficiencies. For example, the inability to eliminate redundant name and address records can result in additional mail-order campaign costs, customer dissatisfaction—even legal concerns. Imprecise data on the total business conducted with a single vendor can result in missed opportunities for better rates with suppliers.

Recent regulatory and Homeland Security initiatives such as the U.S. Department of Treasury's Office of Foreign Assets Control (OFAC), Sarbanes-Oxley, the U.S.A. Patriot Act, and the Health Insurance Portability and Accountability Act (HIPAA) can quickly spur a company to establish a solid data foundation. These regulations will cause even lagging organizations to recognize that an effective data quality program is quickly becoming a near-absolute requirement.

Steps for Improving Data Quality
Improving data quality and using data more strategically can be achieved by:

1. Conducting a data quality assessment to help recognize the severity of database quality issues (a minimal profiling sketch follows this article).

2. Adopting a well-defined data governance plan, including a definition of who owns the data, who is authorized to access the data, and which specific standards should apply to the data.

3. Developing a corporate-wide agreement on data standards for master reference data that describes common business entities like products, customers, and suppliers.

4. Choosing a technology to serve as the backbone for preparation of relevant customer data that includes name and address and non-name and address cleansing, change-of-address processing, tax jurisdiction assignments, personalized messaging, tables and dictionaries, batch and real-time processing, and more. ●

For a free white paper on this topic, download "Data Quality: The Foundation of Operational Effectiveness," or click here for more information about Pitney Bowes Group 1 Software.
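As a hedged illustration of step 1 above, the sketch below computes two simple profile measures per column: completeness and distinct counts. The column names and sample rows are invented, and real assessments measure many more dimensions (validity, consistency, timeliness).

    # Minimal data quality assessment over rows represented as dicts.
    def assess(rows, columns):
        report = {}
        for col in columns:
            values = [r.get(col) for r in rows]
            filled = [v for v in values if v not in (None, "")]
            report[col] = {
                "completeness": len(filled) / len(values) if values else 0.0,
                "distinct": len(set(filled)),
            }
        return report

    rows = [
        {"customer_id": "C1", "phone": "555-0100"},
        {"customer_id": "C2", "phone": ""},
    ]
    print(assess(rows, ["customer_id", "phone"]))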
C A S E S T U DY
The Business
Intellidyn is an award-winning provider of the nation's most complete direct-response marketing and multi-channel database marketing and analytics solutions, delivering strategic insight and end-to-end marketing services to consumer businesses. In addition to compiling one of the industry's most "battle-proven" prospect databases, Intellidyn also maintains the largest, most in-depth repository of transaction, credit, demographic, and behavioral databases in the nation. The company provides unparalleled expertise in data analytics and integrated marketing to consistently boost ROI.

The Challenge
Building a successful consumer-centric data warehouse has been a challenge for many companies. The majority of these attempts have failed to meet expectations. Since data management was the very core of Intellidyn's business, failure was not an option for the company. Intellidyn's experience was that data warehousing and customer relationship management were not the sole domain of IT. Success was in allowing savvy database marketers to drive the IT solution.

More importantly, the competitive environment was becoming more fluid and dynamic. As consumers embraced new channels, such as online shopping and voice-activated transactions, these radically increased not only the volume of data that Intellidyn clients were handling, but also the speed at which it had to be processed. Intellidyn needed a data integration solution that could accurately and consistently identify individuals—and all data elements relating to those individuals—across 60 terabytes of data on more than 190 million U.S. customers in real time, while being transparent to "power users."

Since Intellidyn continuously updates its data warehouse from multiple sources as fresher information becomes available, the solution view had to span all disparate data sources to compile complete, accurate records of individuals, households, and addresses from over 2,000 unique attributes. This new platform also needed to integrate seamlessly with the company's existing IT platform. Additionally, to address the changing marketplace, Intellidyn would require an agile system that could answer complex queries in seconds, while converting, scoring, and arraying massive master file refreshes.

The Solution
Intellidyn implemented an enterprise data warehouse using DataFlux data management technology to handle data profiling, data quality, data integration, and data enrichment. DataFlux provided Intellidyn with the flexibility to create match codes based on fuzzy logic algorithms. The integration of this logic allowed Intellidyn to create different degrees of matching accuracy, doubling the company's matching efficiency.

DataFlux technology gave Intellidyn the ability to perform phonetic matching, cleanse common data errors, and standardize the entire production process so that all consumer data could be linked consistently over time at three levels: individual, household, and address. This solution enables Intellidyn to generate more than 1 billion linking keys in less than 24 hours.

The Results
The DataFlux solution has contributed to an impressive increase in the efficiency of Intellidyn's data management. Access time to multiple master files decreased from 22 hours to 6. Intellidyn clients now have current, accurate, and reliable views of their customers within hours, instead of the days or even weeks that the same process once took. The enterprise data management solution offers Intellidyn a "360-degree view" of every consumer in the United States.

"It all comes down to what consumers demand from our marketing clients—'know me… or my loyalty drops to zero,'" said Peter Harvey, president of Intellidyn. "DataFlux provides the ability to build the customer-centric views enabling smart marketers to demonstrate to their customers that they do in fact 'know them.' In return, these marketers realize continuous boosts in performance. Without this critical data foundation, all other data warehousing and CRM functionality is compromised." ●
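Match codes of the kind mentioned above can be pictured with a simplified phonetic code. The Python sketch below is a stripped-down Soundex variant offered purely for illustration; DataFlux's actual match-code and fuzzy logic algorithms are proprietary and considerably more nuanced.

    # Simplified Soundex-style match code: similar-sounding names share a code.
    CODES = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4", "m": "5", "n": "5", "r": "6"}

    def match_code(name):
        name = "".join(ch for ch in name.lower() if ch.isalpha())
        if not name:
            return ""
        digits = [CODES.get(ch, "") for ch in name]
        out = [name[0].upper()]
        prev = CODES.get(name[0], "")
        for d in digits[1:]:
            if d and d != prev:
                out.append(d)
            prev = d
        return ("".join(out) + "000")[:4]

    print(match_code("Smith"), match_code("Smyth"))  # both S530

Comparing codes rather than raw spellings is one way a system can offer "different degrees of matching accuracy": exact-string matching is the strictest level, and code-level matching is a looser one.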
Several years ago, a large manufacturing company lost a key distribution center to a fire. The fire destroyed not only the building, but also thousands of shipments destined for a global customer base.

Naturally, the CEO of this company wanted to send a letter to customers to explain the situation and to provide a timetable for when operations would return to business as usual. The CEO passed the request to the vice president of customer relations, who in turn asked IT to generate a list of all customers for that particular center.

Pervasive Problem
The IT staff pulled reports from its customer relationship management (CRM), enterprise resource planning (ERP), and billing and supply chain management systems. What they found reveals a distressing but pervasive problem that exists in most organizations.

Each list contained different, overlapping, and confusing views of the customer base. Some names were listed or spelled slightly differently; some addresses were a bit off. The end result was that this company could not create an accurate and inclusive list of customers affected by the loss of the distribution center.

For years, enterprise applications—in particular, CRM systems—have been promising a "single view of the customer." However, the proliferation of systems has led to more confusion at a data level about the customer base. In fact, marketing and customer relations executives are struggling to understand even the most basic questions: Who exactly are our customers? Which customers are we trying to target? Who are our best customers? Which customers represent our best opportunities?

Uncertainty about customers' identities can severely compromise efforts to build stronger relationships. In today's competitive marketplace, if customers don't feel valued, they will take their business elsewhere. Customers are hard to acquire, but easy to lose.

Adding to the Customer Equation
Customer data integration, or CDI, is an emerging method for compiling the most authentic customer information from all applications, databases, and customer touchpoints into one centralized data source. By bringing the best information about customers to the surface, CDI strives to deliver consistent, accurate, and reliable information—regardless of the originating application.

The benefit is that the data itself—not the applications—is the focus. Each business unit can view the same information about customers, which improves support and service across business functions.

Companies are now turning to CDI solutions with two added components: robust data quality capabilities and sophisticated identity logic, also known as identity management. With these components, users can improve the quality of data while also identifying and managing the same customer sets across sources and applications.

The data quality component typically begins with an in-depth data profiling phase. The company then builds in business rules to standardize and verify addresses and other attributes, reconcile conflicting information, validate name and address information, and add demographic data to enhance the value of information.

The second component is identity logic—a crucial phase of any successful CDI effort. This determines whether customers listed in different sources are indeed the same customer, and intelligently integrates customer information from multiple applications and databases.

The various records for Michael William Smith, Mike Smith, and Michael W. Smith, for example, are determined by identity logic to indeed be the same individual, provided other data points are similar. Companies can flag information for linking customers across applications and sources, and isolate the best data from multiple sources.

Building Lasting Customer Relationships
CDI solutions are helping companies create consistent, accurate, and reliable data—and deliver a truly unified view of their customers that builds a firm foundation for sales, support, and marketing functions.

Thanks to CDI, organizations are developing healthier, more lasting relationships with their customers, and can market more intelligently—and profitably—to them. ●

For a free white paper on this topic, download "Semantics, Metadata and Identifying Master Data," or click here for more information about DataFlux.
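To make the identity logic concrete, the sketch below groups records by a deliberately crude identity key (first initial, surname, and a shared postal code), so the three Smith records land in one cluster. Real identity management weighs many more data points and handles conflicts; the key, fields, and sources here are hypothetical.

    from collections import defaultdict

    # Toy identity key: a real CDI engine uses weighted, multi-attribute rules.
    def identity_key(rec):
        parts = rec["name"].lower().replace(".", "").split()
        return (parts[0][0], parts[-1], rec["postal_code"])

    def link(records):
        clusters = defaultdict(list)
        for rec in records:
            clusters[identity_key(rec)].append(rec)
        return clusters

    records = [
        {"source": "CRM",     "name": "Michael William Smith", "postal_code": "02139"},
        {"source": "ERP",     "name": "Mike Smith",            "postal_code": "02139"},
        {"source": "Billing", "name": "Michael W. Smith",      "postal_code": "02139"},
    ]
    for key, recs in link(records).items():
        print(key, "->", [r["source"] for r in recs])  # one cluster of three records

Once records are clustered, a survivorship rule can pick the best value per field across sources, which is the "isolate the best data" step described above.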
C A S E S T U DY
The Challenge
M&S Money is a leading U.K. financial services company, a wholly owned subsidiary of global banking leader HSBC. It retains a close working partnership with UK-based Marks & Spencer and collaborates on customer activities. Like all financial services providers operating in the E.U., M&S Money was faced with the Basel II Capital Accord, requiring the tracking and reporting of ongoing exposure to credit, market, and operational risks. The compliance deadline was January 1, 2007. Basel II is a demanding piece of legislation that requires extensive expertise and sophisticated data management capabilities if the quality and integrity of data is to be consistently assured.

M&S Money therefore needed to deliver a data quality initiative that ensured data was managed according to a set specification throughout the whole information environment, imposing quality controls at many points throughout the architecture while retaining centralized control and management. Neil Hershaw, information management officer for M&S Money, saw this mandate as an opportunity to improve data quality enterprisewide. "Basel II's compliance deadline presented us with both a challenge and an opportunity—delivering high-quality data to ensure successful risk management, but also improving our data to drive improvements in many other areas of our operations."

In addition to the compliance requirements, M&S Money sought broader data management benefits. These centered on the need to avoid potential customer frustration caused by inaccurate data, enabling the company to undertake business analysis and, therefore, sales activities with greater confidence. In selling to Marks & Spencer customers, M&S Money has adopted the policy that when it is not completely confident about data accuracy, and therefore a customer's likely interest, it will not approach the customer at all, rather than risk any frustration. "We cannot ever risk upsetting a customer because of problems with our data," Hershaw crystallized the issue.

"There was obviously the need to satisfy the data quality elements of Basel II, but also taking this enhanced overall approach to data management would give us greater confidence in our activities to create sales and therefore help drive competitive advantage in the market."
Neil Hershaw, M&S Money

As part of the project to ensure Basel II compliance, therefore, M&S Money planned and delivered a data quality project to accomplish a number of operational benefits, namely the assurance of customer data accuracy, reduction of time taken to complete business analysis, and increased assuredness in the quality of data from which business intelligence is created.

The Solution
M&S Money chose the Informatica PowerCenter data integration solution as the cornerstone of its project. Informatica Data Quality was specifically selected to assure the quality of data input and extraction from its Oracle data warehouse. Given the impending Basel II deadline, M&S Money created a specification in mid-2004 for a data quality initiative that would meet the regulatory requirements and the Financial Services Authority (FSA) mandates, as well as deliver a platform for enhanced organization-wide data quality management. Informatica was chosen because of PowerCenter's technical attributes, its scope for customization, and its ability to integrate a multitude of disparate data feeds. Moreover, the Informatica team demonstrated a wealth of knowledge of how best to achieve the level of data quality required for Basel II compliance and brought the experience of managing similar scales of assignments for major banks and financial institutions. "Basel II sets the bar high on required data management practices. Compliance dictates the ability to correlate a significant history of consistent, accurate, and granular data. Equally, as an organization, we wanted to ensure we could always undertake quantitative measurement of our data to be utterly assured of its accuracy," said Hershaw.

Constant Monitoring of Data Quality
The primary operational driver was that M&S Money needed to be constantly aware that its data was of the appropriate quality and that it could always demonstrate clear management of the data. Informatica Data Quality was deployed as a data quality management platform alongside the data warehouse to ensure that data feeds comply with the required quality levels, ensuring that BI data remains wholly accurate. Equally, the solution had to support M&S Money's new data management procedures, which were introduced as part of the project, so that any changes that could negatively impact quality would be immediately addressed and nullified.

Ensuring Data Quality from Third Parties
In creating this caliber of data quality management platform, the company also had to consider the need to assure the sustained accuracy of data derived from third-party feeds, such as those from insurance policy providers. Again, Informatica Data Quality provided the required support for these disparate data feeds, and therefore enabled all data quality to be managed within a single framework.

Regular Evaluation
The methodology for the project began with defining data quality rules for the relevant files and tables, coding those rules, running the data, then analyzing the data and creating an action plan for quality assurance. Once the project is completed, data quality levels are evaluated formally on a quarterly basis.

The Results
Basel II Compliance
The main driver for the business was compliance with Basel II's stipulations; this was completed successfully a year ahead of the deadline. The company has created an assured level of data quality that has enabled it to undertake business analysis and financial reporting with greater confidence.

Forty Percent Reduction of the Analysis Cycle
A further gain has been the reduction of the analysis cycle by up to 40 percent, meaning faster comprehension and validation of information thanks to less time spent assessing the accuracy of the data.

Flexibility and Support
"We evaluated several potential solutions," Hershaw summarized, "and chose Informatica because they have vast experience in Basel II compliance and Informatica PowerCenter was the right solution for our needs. It offered ease of use, support for our existing IT environment, and the ability to be customized to meet our precise requirements. We needed to ensure that all data feeds, be they internal or external in origin, were in line with our quality specifications. The Informatica platform has given us the ability to provide that assurance to our business and to enable our analysts to create rules for how the data should be treated, without having to consult a program or make changes to our systems."

An Engine to Drive Business Development
M&S Money's IT team is now able to provide quantitative measurement of the data held by the organization, which was a broader goal of the initiative. For a financial services company, that resource has proven to be invaluable in providing the "engine room" that drives business development. After Informatica Data Quality went live, it took business analysts just four days to develop 20 Basel II business rules on the fly, deliver a data accuracy scorecard, create profiles on account history tables, and develop other business rules that were then added to the scorecard. "Basel II has been a blessing in disguise. Improved data quality and the ability to measure that quality have impacted our business for the better in several areas," said Hershaw. "It is clear that business analysis has benefited, but also there are factors such as how we evaluate our sales and marketing. We are now able to ascertain with confidence what activities have caused a sale.

"We live by the mantra that customers must be presented with no surprises, but we also experienced no surprises of our own in meeting the requirements of Basel II. It was a tough assignment, but the detailed planning and execution enabled us to minimize the scale of the challenge. We're now even more confident that we will always deliver on our commitments to customers and can assure high levels of data quality to assist in running our business," Hershaw concluded. ●
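The define-then-run cycle described under "Regular Evaluation" can be pictured with a small sketch: business rules are expressed as named predicates, run against a data feed, and failures are tallied for the action plan. The rules and fields below are invented examples, not M&S Money's or Informatica's actual rules.

    # Hypothetical rule set; real platforms let analysts define these
    # without writing application code.
    RULES = {
        "account_id_present": lambda r: bool(r.get("account_id")),
        "exposure_non_negative": lambda r: r.get("exposure", 0) >= 0,
    }

    def run_rules(rows):
        failures = {name: 0 for name in RULES}
        for row in rows:
            for name, rule in RULES.items():
                if not rule(row):
                    failures[name] += 1
        return failures

    feed = [{"account_id": "A1", "exposure": 1200.0},
            {"account_id": "",   "exposure": -50.0}]
    print(run_rules(feed))  # {'account_id_present': 1, 'exposure_non_negative': 1}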
Continuous Measurement
Data quality must be tracked, managed, and monitored if it is to improve business efficiency and transparency. Therefore, being able to measure and monitor data quality throughout the lifecycle and compare the results over time is an essential ingredient in the proactive management of ongoing data quality improvement and data governance.

Organizations need a formalized way of setting targets, measuring conformance to those targets, and effectively communicating tangible data quality metrics to senior management and data owners. Standard metrics provide everyone (executives, IT, and line-of-business managers) with a unified view of data and data quality, and can also provide the basis for regulatory reporting in certain circumstances, such as Basel II, where there are specific data quality reporting requirements.

A well-defined set of metrics should be used to get a baseline understanding of the levels of data quality; this baseline should be used to build a business case to justify investment in data quality. Beyond that, the same metrics become central to the ongoing data quality process, enabling business users and data stewards to track progress and quickly identify problem areas that need to be addressed.

Metrics to Suit the Job
Ultimately, data quality monitoring and reporting based on a well-understood set of metrics provides important knowledge about the value of the data in use, and empowers knowledge workers with the ability to act on it. Breaking down data issues into these key measures highlights where best to focus your data quality improvement efforts by identifying the most important data quality issues and attributes based on the lifecycle stage of your different projects. For example, early in a data migration, the focus may be on completeness of key master data fields, whereas the implementation of an e-banking system may require greater concern with accuracy during individual authentication.

Scorecarding
Inherent in the metrics-driven approach is the ability to aggregate company-wide results into data quality scorecards. A scorecard is the key visual aid that helps to drive the data quality process. Metrics and scorecards that report on data quality, audited and monitored at multiple points across the enterprise, help to ensure data quality is managed in accordance with real business requirements. They provide both the carrot and the stick to support ownership, responsibility, and accountability. But, beyond the data quality function itself, the metrics used for monitoring the quality of data can actually roll up into higher-level performance indicators for the business as a whole. ●

For free white papers on this topic, download "Profiling: Calculating Return on Investment for Data Migration and Data Integration Projects" or "Monitoring Data Quality Performance Using Data Quality Metrics," or click here for more information about Informatica Corporation.
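A minimal sketch of the baseline-and-trend idea: per-metric pass rates are computed for a baseline period and the current period, then rolled up into a simple scorecard. The metric names and counts below are fabricated for illustration.

    # Toy scorecard rollup comparing current pass rates to a baseline.
    def score(pass_counts, totals):
        return {m: pass_counts[m] / totals[m] for m in pass_counts}

    baseline = score({"completeness": 880, "validity": 910},
                     {"completeness": 1000, "validity": 1000})
    current  = score({"completeness": 950, "validity": 940},
                     {"completeness": 1000, "validity": 1000})

    scorecard = {m: {"baseline": baseline[m],
                     "current": current[m],
                     "trend": current[m] - baseline[m]}
                 for m in baseline}
    for metric, row in scorecard.items():
        print(f"{metric}: {row['current']:.1%} ({row['trend']:+.1%} vs. baseline)")

Rolled up one more level, the same figures can feed the higher-level performance indicators mentioned above.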
BY WAYNE W. ECKERSON
APPLICATIONS. Predictive analytics can identify the customers most likely to churn next month or to respond to next week's direct mail piece. It can also anticipate when factory floor machines are likely to break down or figure out which customers are likely to default on a bank loan. Today, marketing is the biggest user of predictive analytics with cross-selling, campaign management, customer acquisition, and budgeting and forecasting models top of the list, followed by attrition and loyalty applications. (See Figure 3.)

[Figure 3: Applications for predictive analytics (survey chart)]

NO SILVER BULLET. However, it's important to note that predictive analytics is not a silver bullet. Practitioners have learned that most of the "intelligence" in these so-called decision automation systems comes from humans who have a deep understanding of the business and know where to point the tools, how to prepare the data, and how to interpret the results. Creating predictive models requires hard work, and the results are not guaranteed to provide any business value. For example, a model may predict that 75% of potential buyers of a new product are male, but if 75% of your existing customers are male, then this prediction doesn't help the business. A marketing program targeting male shoppers will not yield any additional value or lift over a more generalized marketing program.

THE PROCESS OF PREDICTIVE MODELING
METHODOLOGIES. Although most experts agree that predictive analytics requires great skill—and some go so far as to suggest that there is an artistic and highly creative side to creating models—most would never venture forth without a clear methodology to guide their work. In fact, process is so important that in 1996 several industry players created an industry standard methodology called the Cross Industry Standard Process for Data Mining (CRISP-DM).

Regardless of methodology, most processes for creating predictive models incorporate the following steps:

1. PROJECT DEFINITION: Define the business objectives and desired outcomes for the project and translate them into predictive analytic objectives and tasks.

5. DEPLOYMENT: Apply model results to business decisions or processes. This ranges from sharing insights with business users to embedding models into applications to automate decisions and business processes.

6. MODEL MANAGEMENT: Manage models to improve performance (i.e., accuracy), control access, promote reuse, standardize toolsets, and minimize redundant activities.

Most experts say the data preparation phase of creating predictive models is the most time-consuming part of the process, and our survey data agrees. On average, preparing the data occupies 25% of total project time. However, model creation, testing, and validation (23%) and data exploration (18%) are not far behind in the amount of project time they consume. This suggests that data preparation is no longer the obstacle it once was.
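For readers who want to see the skeleton of the model-creation and deployment steps in code, the sketch below trains a toy churn model and scores a new customer. The features, data, and choice of scikit-learn's logistic regression are illustrative assumptions, not recommendations from the report or survey.

    from sklearn.linear_model import LogisticRegression

    # Each row: [months_as_customer, support_calls_last_quarter]; label: churned?
    # Fabricated data; real projects spend most of their time preparing this step.
    X = [[24, 0], [3, 5], [36, 1], [2, 7], [18, 2], [1, 6]]
    y = [0, 1, 0, 1, 0, 1]

    model = LogisticRegression().fit(X, y)          # model creation
    scores = model.predict_proba([[4, 4]])[:, 1]    # deployment: score a customer
    print(f"churn probability: {scores[0]:.2f}")

In practice the scoring line would run inside an operational process (a campaign system, for example), which is the embedding step described under deployment.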
[Figure 4: What is the business value of predictive analytics to your organization? Based on 166 respondents who have implemented predictive analytics.]

[Survey charts: How do you measure success? How do you deliver predictive analytics?]

WHAT NOW? While some organizations have discovered the power of predictive analytics to reduce costs, increase revenues, and optimize business processes, the vast majority are still looking to get in the game. Today, most IT managers and some business managers understand the value that predictive analytics can bring, but most are perplexed about where to begin.

"We are sitting on a mountain of gold but we're not mining it as effectively as we could," says Michael Masciandaro, director of business intelligence at Rohm & Haas, a global specialty materials manufacturer. "We say we do analytics, but it's really just reporting and OLAP."

Rohm & Haas has hired consultants before to build pricing models that analyze and solve specific problems, but these models lose their usefulness once the consultants leave. Masciandaro says building an internal predictive analytics capability could yield tremendous insights and improve the profitability of key business areas, but he struggles to understand how to make this happen.

"How do you implement advanced analytics so they are not a one-off project done by an outside consultancy?" asks Masciandaro. "How do you bring this functionality in-house and use it to deliver value every day? And where do you find people who can do this? There are not too many of them out there."

RECOMMENDATIONS
Now that we've defined predictive analytics, assessed its business value, and stepped through key trends and processes, it's important to provide specific recommendations to BI managers and business sponsors about how to implement a predictive analytics practice. This section offers five recommendations that synthesize best practices from various organizations that have implemented predictive analytics.
1. HIRE BUSINESS-SAVVY ANALYSTS TO CREATE MODELS
Every interviewee for this report said the key to implementing a successful predictive analytics practice is to hire analytic modelers with deep understanding of the business, its core processes, and the data that drives those processes.

"Good analysts need strong knowledge of narrow business processes," says Keith Higdon of Sedgwick CMS. In claims processing, Higdon says good analysts "understand the business of claims handling; the interplay of variables across claim, claimant, and program characteristics; and what data they can and cannot rely on." Only then, he adds, can analysts create "meaningful models" that result in positive business outcomes.

2. NURTURE A REWARDING ENVIRONMENT TO RETAIN ANALYTIC MODELERS
Since good analytic modelers are difficult to find, it's imperative that managers create a challenging and rewarding environment. Of course, money is requisite. Top-flight analytic modelers often command higher salaries than classic business or data analysts. But good analytic modelers are motivated by things other than money and status, says Tom Breur, principal of XLNT Consulting in the Netherlands.

"You don't attract analytic modelers with the same incentives as other people," he says. "They want an opportunity to demonstrate their skills and learn new things, so you have to increase their training budgets. I've struggled with human resources departments on this issue."

3. FOLD PREDICTIVE ANALYTICS INTO THE INFORMATION MANAGEMENT TEAM
THE INSIDE TRACK. Traditionally, analytics teams are sequestered away in a back room somewhere and report to an ambitious department head (usually sales or marketing) who is seeking to ratchet up sales and get an edge on the competition. Unfortunately, this approach is not optimal, according to most practitioners. Analytic modelers are voracious consumers of data, and must establish strong alliances with the data warehousing team to maintain access to the data.

"Since I work in the data warehousing department within IT, I go to my colleagues and say, 'I need this,' and I usually get it. I get to the head of the queue faster than someone from outside, which means I get more [storage] space, quicker access to data, and more privileges. As a result, my projects get pushed faster. Those are the unwritten rules," says an analytic modeler.

4. LEVERAGE THE DATA WAREHOUSE TO PREPARE AND SCORE THE DATA
Once you've hired the people and established the organization, the next important task is to provide a comprehensive solution for managing the data that the analytic modelers may want to use. While it is not necessary to build a data warehouse to support the analytic process, a data warehouse can make the process infinitely easier and faster.

SAVING TIME. A data warehouse pulls together all relevant information about one or more domains (e.g., customers, products, suppliers) from multiple operational systems and then integrates and standardizes this information so it can be queried and analyzed. With a data warehouse, analysts only have to query one source instead of multiple sources to get the data they need to build models.

5. BUILD AWARENESS AND CONFIDENCE IN THE TECHNOLOGY
One of the toughest challenges in implementing analytics is convincing the business that this mathematical wizardry actually works. "Building confidence is a big challenge," says one analytics manager. "It takes a while before business people become confident enough in the models to apply the results. The ultimate litmus test is when business people are willing to embed models within operational processes and systems. That's when analytics goes mainstream in an organization."

But getting to a lights-out analytical environment is not easy. Most organizations, even those with large analytic staffs, still only apply analytics in pockets and have yet to truly operationalize the results.

CONCLUSION
Applying these five recommendations should enable any organization to implement predictive analytics with a good measure of success. While many people seem intimidated by predictive analytics because of its use of advanced mathematics and statistics, the technology and tools available today make it feasible for most organizations to reap value from predictive analytics. ●

Wayne W. Eckerson is the director of research and services for TDWI and author of Performance Dashboards: Measuring, Monitoring, and Managing Your Business (John Wiley & Sons, 2005). He can be reached at weckerson@tdwi.org.

This article was excerpted from the full, 32-page report by the same name. You can download this and other TDWI Research free of charge at www.tdwi.org/research.

The report was sponsored by MicroStrategy, Inc., OutlookSoft Corporation, SAS, SPSS, Sybase, Inc., and Teradata, a division of NCR.
C A S E S T U DY
Data Warehouse Appliance Combines Unconstrained Analytics with Energy Efficiency for UK's Leading Mobile Carrier
Commentary by Stephen Hawkins
Senior Data Architect, Orange UK
Customer Profile
Orange UK is the United Kingdom's most popular mobile phone service, with more than 13 million active subscribers. Orange UK is at the forefront of efforts by mobile communications carriers to develop sophisticated marketing programs that deliver targeted products and services to customers.

Business Challenge
Orange's success and future growth depend on in-depth insight into customer calling patterns and recognition of behavior trends through the analysis of call detail records (CDRs). As data volume continued to increase, Orange determined that it needed a new data warehousing platform. Orange needed to be able to analyze billions of CDRs in a fraction of the time used by other systems, while reducing its equipment footprint.

Deeper, Wider, Faster
With its Intelligent Query Streaming architecture, the Netezza Performance Server (NPS) data warehouse appliance has allowed Orange to broaden and deepen its BI efforts while using its existing analytical tools. Orange is now able to run more queries than it could before, with faster query times. As a result, Orange's BI team has been able to dramatically increase its responsiveness to internal business users. Orange can now analyze billions of CDRs to offer highly targeted promotions, while improving the ability to detect fraud and capture cross-carrier billing.

The system is also able to keep pace with rapidly rising data volumes. "Since we first installed the Netezza system, we have added at least half as much data again to the database, yet still achieve the same speed of response," explains Sylvia Sawkins, manager of data control, Orange UK.

Substantial Space and Power Savings
With space and power at a premium within Orange's data centers, the NPS system saved large amounts of both. With its previous system, Orange was consuming almost 25,000 watts and 85,000 BTU per hour. Once it implemented a three-terabyte Netezza data warehouse appliance, Orange saw its power consumption drop to 7,000 watts, and cooling requirements were reduced to 24,000 BTU per hour.

Even as the amount of information continues to grow, Netezza's compact data warehouse appliance has allowed Orange to significantly reduce the equipment footprint in its data center, with the number of cabinet spaces dropping from 26 to 9. The speed and ease of deployment were an added bonus.

Market Leader Orange Raises the BI Bar
For Orange UK, Netezza's data warehouse appliance has significantly improved data mining on CDRs, enabling highly targeted marketing to ensure that the highest value is obtained from each customer, while maximizing retention. And by reducing complexity, power consumption, and space requirements—three of the biggest costs facing IT departments—the Netezza system provides an efficient platform for future growth.

Now that the NPS appliance enables timely access to terabytes of data, Orange is able to turn its attention to new business challenges. "The system started by making our lives easier; this led to an ability to handle a greater capacity. Culturally, now the business demands more from its data, expectations are set much higher, so there's a great focus on quality," Sawkins adds. The goal: data that is not just faster, but better. ●
Data Warehouse Appliances Help Tackle the Data Center Power Crisis
By Phil Francisco
Director of Product Marketing, Netezza Corporation
Increasing power costs and increasingly constrained power availability are prompting many companies to reexamine their IT strategies and budgets. Companies are facing a power dilemma—do they scale back growth plans, pay the cost of building an additional facility, or relocate the data center where costs are lower?

Smaller… Faster… Hotter
Data center servers used for general-purpose computing are dominated by Intel/AMD, or x86, architectures. This chip technology is at the heart of blade servers and rack servers—including those used in other data warehouse systems—and contains features designed to improve performance for general-purpose computing. But the increasing data volumes and performance demands of data warehousing are putting the CPU under tremendous strain. CPU designs based on x86 architectures are moving more data through smaller components at faster speeds than ever before. With so much processing activity and input/output traffic condensed into a very small space, the devices produce much more heat, and require significantly more cooling, than their predecessors of only a few years ago.

Green and Greener
The industry is responding to this in several ways. Chip manufacturers are announcing CPUs that deliver higher performance than previous top-of-the-line chips, yet draw less power. Industry consultants and experts abound with advice on how best to cool the data center or improve its energy efficiency.

But for data warehousing, greater energy and space efficiency has already arrived in the form of a data warehouse appliance—the Netezza Performance Server (NPS) system. The NPS system is built specifically for high-performance data warehousing, while consuming significantly less power and generating less heat than other solutions. Netezza's architecture uses intelligent storage nodes to process data as it streams off the disk, increasing performance. Each of these nodes uses an embedded PowerPC chip that consumes 4.5 watts of power (compared to 70 or more watts in x86-based systems), reducing the power typically required by other data warehouse solutions. The architecture of Netezza's data warehouse appliance—rather than expensive components—makes the difference.

Low Power, High Performance
Netezza's architecture takes a different approach to processing queries than architectures developed for general-purpose computing. This not only provides performance gains; it also lowers power consumption and heat.

Unlike many traditional data warehousing systems, Netezza's asymmetric massively parallel processing (AMPP) architecture is built for streaming processing of data. The system's architectural approach uses commodity field programmable gate arrays (FPGAs) to do the bulk of the data filtering, with an embedded PowerPC chip handling the remainder. The AMPP architecture, rather than expensive components, provides the performance difference—along with substantial energy savings.

The Energy-Efficient Data Warehouse
The Netezza data warehouse appliance, with streaming processing on its energy-efficient intelligent storage nodes, consumes a fraction of the power required by systems based on other leading processors and general computing architectures. Customers process massive queries in significantly shorter periods of time, with much lower electricity consumption and minimal space requirements that translate into lower cost of ownership. ●

For a free white paper on this topic, download "Large Scale Use of the Netezza Performance Server System," or click here for more information about Netezza Corporation.
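The "filter as it streams off the disk" idea can be approximated in ordinary code: if the predicate runs inside the scan, only qualifying rows ever travel upstream. The Python generator below is a conceptual analogy only, not Netezza's FPGA implementation; the file name and record layout are invented.

    # Conceptual contrast with "move all rows first, then filter."
    def scan(path):
        """Yield rows lazily, like data streaming off a disk."""
        with open(path) as f:
            for line in f:
                yield line.rstrip("\n").split(",")

    def streamed_filter(path, predicate):
        # Filtering inside the scan means only qualifying rows move upstream.
        return (row for row in scan(path) if predicate(row))

    # Hypothetical usage: only calls longer than 60 minutes leave the "storage node."
    # long_calls = streamed_filter("cdrs.csv", lambda r: float(r[2]) > 60)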
Data Warehousing

By Chris Buss, World Wide Manager, BCS BI Marketing, HP
Bill Nee, Senior Director, Database Product Marketing, Oracle Corporation

HP and Oracle jointly offer a rich set of reference configurations as part of the Information Appliance Initiative. These configurations draw upon both companies' extensive lab and performance benchmark investments as well as global experience …

[Chart: workload size and raw data size plotted against raw performance and price/performance]
C A S E S T U DY
Aquascutum Innovates in Retail
Commentary by Robin Coles
Business Development Manager, Prologic
Customer Overview
Aquascutum was founded in 1851 by a tailor in the heart of Mayfair, London. Aquascutum's early business was focused on developing an innovative new attribute for its coats: the "shower-proofing" of wools. Today, Aquascutum is a leading fashion house that offers designer clothing for men and women, luggage, and accessories.

Business Challenge
Aquascutum relies on CIMS, a dedicated retail management application from Prologic, to manage key areas of the business, including merchandising, warehousing, distribution, allocation, replenishment, and sourcing. CIMS is an end-to-end enterprise resource planning (ERP) solution that has been deployed at more than 35 fashion and lifestyle companies in the U.K.

In the dynamic world of fashion, retailers such as Aquascutum need timely access to detailed business information, enabling them to react to trends while managing inventory and making a host of other critical business decisions. Aquascutum uses Prologic's sophisticated reporting tool—CIMS Vision—to deliver the information required within its dynamic business. However, as the data volumes and requirements have grown, the company began to find that the system lacked the back-end power required to deliver timely information. As a result, a number of reports slowed to the point where basic information could take several hours to generate.

In addition, Aquascutum was anxious to better exploit the information CIMS Vision provided by generating ad hoc and "what if" analyses; this had proved impractical due to the sheer volume of data contained within its data warehouse.

The Solution
To meet this need, Prologic integrated HyperRoll into the Aquascutum CIMS Vision application. Aquascutum can now provide its users with lightning-fast reporting capabilities and enable users to perform increasingly sophisticated queries. HyperRoll combines patented query acceleration, caching, and compression algorithms to deliver immediate and substantial improvements in business intelligence performance. HyperRoll's technology works seamlessly alongside CIMS Vision to accelerate data loading by up to factors of 200, while enabling access and analysis of data in unparalleled detail.

"Using HyperRoll's data aggregation software, we are now able to offer substantially increased analysis capabilities to our customers. Enormous amounts of data can now be summarized at the touch of a button, improving business visibility and enabling users to become more proactive in their approach to merchandising," said Robin Coles, business development manager of Prologic. "We anticipate wide adoption of the HyperRoll-enabled CIMS Vision solution, both within our customers' organizations, and the market as a whole."

The Results
By integrating HyperRoll into the CIMS Vision reporting environment, Aquascutum has dramatically improved its query performance and thus empowered analysts and decision-makers to achieve rapid business insight. For example, a weekly business key performance indicator (KPI) report that used to take 80 minutes to open and up to 30 minutes to print now takes only 2 minutes to open and 30 seconds to print.

"If our queries take hours to run, it is very difficult to analyze our business and make strategic decisions fast enough to make a difference," said Craig Kayes, business projects manager of Aquascutum. "Prologic and HyperRoll have allowed us to analyze our key business drivers on an ongoing basis and proactively use our data in our decision-making process. We anticipate that these improved reporting capabilities will continue to have a dramatic impact on our business."

Such reductions in query response times are already providing benefits to multiple areas of the business. Aquascutum is now able to report in detail on every single style in its business, perform ad hoc queries, and conduct extensive "what-if" analyses to better inform a range of business decisions. The company is now able to respond to situations more quickly, while identifying issues and creating strategies to address current opportunities or threats. In addition, they are able to drill down into detail, identifying the root causes and rapidly resolving them. ●
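Why pre-aggregation makes a once-slow KPI report fast can be seen in miniature: compute a rollup once when data loads, and let later queries read the small summary instead of rescanning detail rows. The sketch below is a toy analogy with invented field names; HyperRoll's acceleration, caching, and compression algorithms are proprietary.

    from collections import defaultdict

    # Build the summary once at load time; queries then hit the rollup.
    def build_rollup(sales, key):
        totals = defaultdict(float)
        for row in sales:
            totals[row[key]] += row["amount"]
        return dict(totals)

    sales = [{"style": "trench", "amount": 900.0},
             {"style": "trench", "amount": 450.0},
             {"style": "scarf",  "amount": 60.0}]

    by_style = build_rollup(sales, "style")  # computed once
    print(by_style["trench"])                # later queries read the summary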
Investing in Data Warehouse Query Performance
By Eric Rogge
Vice President of Marketing, HyperRoll
There are many challenges with data warehouses, not the least of which is estimating the investment in software, hardware, and effort to optimize the performance of a data warehouse. It is important that data warehouse architects, managers, and other stakeholders enter into the data warehouse investment cycle with honest and realistic assessments of performance need. Of course, the best plans can go awry. Plus, there is always a political aspect to the investment, where commitments are made and expectations are set by vendors, by external consultants, by internal architects, and by IT managers. Any unmet expectation by the data warehouse or by these stakeholders can damage business relationships and careers.

With all of these factors in play, defining the level of investment needed for adequate data warehouse performance can be a slippery fish. To capture the fish, or in other words, to identify adequate investment, key stakeholders should itemize and plan for the different data warehouse capabilities that are impacted by its performance. These include:

• Static to dynamic report transition
Report development backlogs are often addressed by converting static report sets to smaller, more versatile prompted or ad hoc reporting capabilities for users. The engineering trade-off is that user query rates and complexity become less predictable and manageable.

• Query complexity
To compete in the market, businesses look to increasingly complex analytics to manage the subtle and complex aspects of competing for customer attention and engagement. These analytics drive increasingly complex and performance-demanding queries to the data warehouse.

• Aggregates
Most queries to a data warehouse are aggregate queries, requiring extra computational work by the relational database management system (RDBMS). Supporting a few aggregates is easy, but adding dimensional permutations can quickly scale the number of aggregates to calculate to thousands or more. (The arithmetic is sketched after this article.)

• Schema complexity
As with query complexity, schemas are now more complex because they must describe more complex business operations. This is especially true as users demand new and different views of the data from the data warehouse.

• User scalability
Often business expects a nonlinear scaling of user communities. Early user counts may be in the low teens or hundreds, but full-scale deployments may be in the thousands.

Clearly, data warehouse managers need to forecast the rate and nature of change for these and other similar factors that impact data warehouse performance. The most common approach is to size the data warehouse system for near-term needs and then hope that traditional data warehouse tuning solutions will cover any performance gaps in the foreseeable future. The unfortunate truth is that this approach quite often falls short of meeting the need for data warehouse query performance. Combining the impact of a few of the above performance factors can require a tenfold or even a hundredfold improvement in query performance. With this level of query performance risk, data warehouse stakeholders have to take a progressive approach to assuring adequate performance.

That approach requires a fundamental understanding of all possible data warehouse hardware and software architecture approaches for assuring performance. Much as successful fishermen often have secret and unique ways of catching fish based upon deep knowledge of fish behavior and environment, data warehouse stakeholders should cultivate broad understanding of performance enhancement options and use innovative ways to meet their data warehouse performance demands. This means moving beyond the mainstream methods and seeking alternatives that are not in use by business competitors.

An evaluation framework for assuring data warehouse query performance should start with a broad search for query performance enhancement alternatives. This should be followed by a comparative assessment of the various alternatives based upon an organization's performance-impacting factors (like those described here) along with the other usual technology and vendor selection criteria. Organizations should evaluate their need to differentiate from their competitors via speed and ease of data access. As the need for differentiation grows in priority, so should the need for differentiating technology and methodologies. Most importantly, a structured business case should be created for any initiative to increase data warehouse query performance. ●

For a free white paper on this topic, download "Ensuring Data Warehouse Query Performance," or click here for more information about HyperRoll.
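The combinatorial pressure behind the aggregates bullet above is easy to quantify: with d dimensions there are 2 to the power d possible group-by combinations to pre-compute. The sketch below counts them for ten hypothetical dimensions.

    from itertools import combinations

    # Invented dimension names; any ten will produce the same count.
    dims = ["day", "store", "product", "customer_segment", "region",
            "channel", "promotion", "supplier", "size", "color"]

    groupings = [c for r in range(len(dims) + 1)
                 for c in combinations(dims, r)]
    print(len(groupings))  # 1024 candidate aggregates from just 10 dimensions

Add a few more dimensions and the candidate count reaches the "thousands or more" the article warns about, which is why aggregate selection and acceleration technologies matter.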
CASE STUDY

TEOCO Increases Performance and Customer Satisfaction

Commentary by John Devolites
Vice President and General Manager, Communications and Entertainment Solutions, TEOCO
About TEOCO
Founded in 1994, TEOCO (The Employee Owned Company) is a leading provider of strategic business solutions to large and small enterprises. TEOCO’s Communications and Entertainment Solutions group is the premier provider of network cost and revenue management solutions to the communications industry. TEOCO combines the strength of its renowned software applications with in-depth auditing expertise to achieve unrivaled market penetration and industry leadership in the communications space.

Business Problem
As continued competitive pressures drove consolidation in the communications industry, TEOCO found many of its customers merging to create larger enterprises with sharply increased transaction volumes. Originally, TEOCO’s systems were designed to support up to $1 billion in monthly billing transaction volume, and even less for reporting. However, with increasing customer transaction volumes, the company was experiencing substantial run-time issues and large-report issues that resulted in delayed customer response. While transaction processing was still adequate, the reporting delays were significant and continually degrading.

“Just to give you a very simple overview: AT&T merged with SBC, raising the transactions from $1 billion worth to $2 billion worth per month. Now, with the addition of BellSouth and Cingular, there’s another $600 to $700 million of transactions per month, resulting in our application processing nearly $3 billion in transactions per month. But our systems were designed only to handle things in the billion dollars, with even less range for reporting,” says John Devolites. To better serve its customers and drive competitive advantage, TEOCO decided that an alternate data warehouse architecture and environment must be implemented.

The Solution
TEOCO did its research and chose to evaluate data warehouse appliances on three major components, the first of which was performance. Devolites says, “At the top end of the spectrum, we wanted something that would run at least 40 to 100 times faster on the same types of queries that we were dealing with today on Oracle.”

TEOCO set out to find a vendor that could offer a flexible, scalable, yet cost-effective solution. Therefore, a proof of concept was essential to the company’s ability to validate the reliability and performance of any data warehouse appliance. “We basically built up 5 terabytes of data to be able to run our benchmarks against. Those benchmarks were the same that we were running on the Oracle side of the equation. With the DATAllegro appliance, across the board, we saw a 50 to 100 times performance improvement on the queries that we were running. What that means is that a query that was taking 20 to 22 hours to run, ran in minutes,” comments Devolites.

Tangible Benefits
By implementing a data warehouse appliance, TEOCO is now able to mine data in innovative ways. Devolites says, “So this new technology allows us to mine this information in a way we’ve never seen it before. We can look at in-office and call routing patterns. We can look at fraud. We can look at different things that we have never before been able to audit for a communications carrier.

“For one of the carriers, we ran through a whole series of cell phone billing information and identified their top 100 users. Within the top hundred users, there were 10 with over 25,000 minutes per month. Because of the DATAllegro appliance, we were able to look at data in a way that we’ve never been able to before and determine that it was a cab company in Seattle abusing a service plan. Before DATAllegro, there would have been no other way to determine that information,” says John Devolites. ●
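The usage audit Devolites describes reduces to a simple aggregate-and-rank computation. The Python sketch below is purely illustrative: the record layout and sample values are invented, and only the “top 100” and “25,000 minutes per month” thresholds come from the commentary above.

# Illustrative audit: rank subscribers by monthly minutes and flag
# extreme usage. Record layout and sample values are invented; only
# the top-100 / 25,000-minutes thresholds come from the case study.

from collections import defaultdict

call_records = [                      # (subscriber_id, minutes)
    ("sub-001", 1200.0), ("sub-002", 26500.0),
    ("sub-001", 800.0),  ("sub-003", 430.0),
]

minutes_by_subscriber = defaultdict(float)
for subscriber_id, minutes in call_records:
    minutes_by_subscriber[subscriber_id] += minutes

# Top 100 subscribers by total monthly minutes.
top_100 = sorted(minutes_by_subscriber.items(),
                 key=lambda kv: kv[1], reverse=True)[:100]

# Flag anyone over 25,000 minutes per month for fraud review.
flagged = [(sub, mins) for sub, mins in top_100 if mins > 25_000]
print(flagged)   # e.g. [('sub-002', 26500.0)]

The logic is trivial; the point of the case study is that running it over billions of fine-grained call records was impractical before the appliance made full-detail scans fast enough.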
Appliances—Data Mart or Enterprise Data Warehouse?
By Stuart Frost
CEO, DATAllegro, Inc.

Appliances are now becoming well established in the data warehousing market. Some companies and analysts have positioned appliances as “just” suitable for data marts (DM). Is this true, or can they also be used for large-scale enterprise data warehouse (EDW) projects?

The answer is yes, they can—under certain circumstances. While few people would claim that appliances are currently ready to handle a complex EDW, they are finding an interesting niche as an integral part of many EDW infrastructures.

DM and EDW Differences
Definitions of DMs and EDWs vary, but the most common differences lie in the number of business processes supported by a given system. A DM typically supports only one business process or subject area, whereas an EDW supports several, and in some cases is a true enterprisewide system. In addition, DMs are often fed summarized information from the EDW in a hub-and-spoke architecture, although this varies across the industry.

EDW Challenges
A significant majority of Global 2000 companies have deployed data warehouses in the last 10 years, establishing the overall business value of analytics. However, many companies are now struggling to keep up with new demands on their data warehouse systems. For example:

• Significant data growth due to:
 – New legislation (Sarbanes-Oxley, E.U. data retention laws, etc.)
 – Mergers and acquisitions
 – The need to analyze growing volumes of point-of-sale or telecom transactions to remain competitive

• Business demands for reduced latency, which translates into faster query times

• Larger user bases

• Demand for ever more complex, ad hoc queries to address fraud detection and anti-money laundering

As a result, many previously successful EDW installations on platforms such as Teradata, DB2, and Oracle are becoming overwhelmed by the need to support hundreds of users with a broad mix of query types against tens of terabytes of data. Upgrade quotes for these platforms can easily be tens of millions of dollars—and even then, they may not meet the business needs!

Using Appliances to Divide and Conquer the Problem
Since high-performance data warehouse appliances are now available at prices as low as $20,000 per terabyte, a number of EDW users are now turning to this new technology as a potential solution. However, they are not relegating appliances to the role of mere data marts. Instead, they are using appliances as a low-cost front end to the EDW itself.

In a typical scenario, large-volume, fine-granularity transaction records are stored directly on the appliance. The appliance then handles tasks like the following (a sketch of the final aggregation task appears after the list):

• Data cleansing

• Long-term storage of transaction details for compliance

• Ad hoc queries

• Applications such as fraud detection that require access to data at very fine granularity

• Exports to external analytics systems such as SAS

• Building large-scale aggregation or summary tables and exporting them to the EDW
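As a concrete illustration of that last task, the following minimal Python sketch builds a summary table over fine-grained transactions and hands only the compact result to the EDW. It uses sqlite3 purely as a stand-in for the appliance and EDW connections, and the schema is an invented example, not DATAllegro’s interface.

# Aggregate-and-export sketch: build a daily summary on the appliance,
# then ship only the compact result to the EDW. sqlite3 is a stand-in
# for both systems; the schema is an invented example.

import sqlite3

appliance = sqlite3.connect(":memory:")   # stand-in for the appliance
appliance.execute(
    "CREATE TABLE transactions (txn_day TEXT, account_id TEXT, amount REAL)")
appliance.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("2007-03-01", "A1", 19.99), ("2007-03-01", "A2", 5.00),
     ("2007-03-02", "A1", 42.50)])

# The large-scale aggregation runs where the detail data lives...
summary_rows = appliance.execute(
    "SELECT txn_day, account_id, COUNT(*), SUM(amount) "
    "FROM transactions GROUP BY txn_day, account_id").fetchall()

# ...and only the small summary crosses over to the EDW.
edw = sqlite3.connect(":memory:")         # stand-in for the EDW
edw.execute("CREATE TABLE daily_summary (txn_day TEXT, account_id TEXT, "
            "txn_count INTEGER, total_amount REAL)")
edw.executemany("INSERT INTO daily_summary VALUES (?, ?, ?, ?)", summary_rows)

The design point is that the expensive scan over raw detail stays on the cheap, fast appliance, while the EDW receives only rows that are orders of magnitude fewer.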
By offloading these tasks from the EDW to the appliance, companies are greatly reducing the need for an expensive EDW upgrade. In addition, the specialized nature and advanced technology of the appliance enables the above processes to run significantly faster, often by two orders of magnitude. Since appliances are relatively easy and cheap to maintain, any additional complexity introduced by this “divide and conquer” approach is limited in nature and overwhelmed by the huge benefits.

Summary
New data warehouse appliance technologies have the potential to transform the data warehousing market. By acting as a high-performance, high-capacity, and low-cost front end to an established EDW, they can add significant value to an already successful installation—while at the same time avoiding expensive upgrades.

If this all sounds too good to be true, many vendors offer free proofs of concept so you can check out their claims at minimal cost. What do you have to lose, apart from poor performance and a lot of cost? ●

For a free white paper on this topic, download “The Next Generation of Data Warehouse Appliances,” or click here for more information about DATAllegro, Inc.
DataFlux provides end-to-end data quality integration solutions that enable companies to effectively analyze, improve, and control their data, transforming data into valuable corporate information. DataFlux—a wholly owned subsidiary of SAS—enhances the effectiveness of data-driven initiatives across the enterprise, including master data management (MDM), customer data integration (CDI), and product information management (PIM). The DataFlux Data Quality Integration Solution provides a single platform encompassing data profiling, data quality, data integration, data enrichment, and data monitoring, along with methodologies and rules via service-oriented architecture (SOA) for creating consistent, accurate, and reliable data. To learn more about DataFlux, please visit www.dataflux.com.

DATAllegro v3™ is the industry’s most advanced data warehouse appliance utilizing an all-commodity platform. By combining DATAllegro’s patent-pending software with the industry’s leading hardware, storage, and database technologies, DATAllegro has taken data warehouse performance, reliability, and innovation to the next level. DATAllegro v3 goes beyond the low cost and high performance of first-generation data warehouse appliances and adds the flexibility and scalability that only a commoditized platform can offer. Whether you have a few terabytes of user data or hundreds, DATAllegro’s data warehouse appliances deliver a fast, flexible, and affordable solution that allows a company’s data to grow at the pace of its business.

Hyperion Solutions Corporation (Nasdaq Global Select: HYSL) is the global leader in business performance management software. More than 12,000 customers in 90 countries rely on Hyperion both for insight into current business performance and to drive performance improvement. With Hyperion software, businesses collect, analyze, and share data across the organization, linking strategies to plans, and monitoring execution against goals. Hyperion integrates financial management applications with a business intelligence platform into a single management system for the global enterprise. And with Hyperion services, your organization can confidently implement, learn, and run your management system. To find out more, please visit www.hyperion.com.
HyperRoll software is a data warehouse query performance engine that dramatically accelerates the performance of your existing data warehouses and business intelligence applications while also reducing your hardware, storage, and manpower costs. With HyperRoll, query speed improvements between 10x and 100x are typical. HyperRoll combines patented aggregation, memory-based caching, and compression algorithms to provide lightning-fast access to large data volumes while drastically reducing the need for additional system resources. HyperRoll supports 7x24 operating environments, is 100 percent compatible with your existing database and reporting tools, and will dramatically improve information access for your business.

Informatica Corporation delivers data integration software and services to solve a problem facing most large organizations: the fragmentation of data across disparate systems. Informatica helps organizations gain greater business value from their information assets by integrating their enterprise data. Informatica’s open, platform-neutral software reduces costs, speeds time to results, and scales to handle data integration projects of any size or complexity. With a proven track record of success that extends back to 1993, Informatica helps companies and government organizations of all sizes realize the full business potential of their enterprise data. That’s why Informatica is known as the data integration company.

Information Builders created the industry’s most widely deployed business intelligence solution and is the leader in real-time operational reporting. The WebFOCUS business intelligence platform, the company’s flagship product, has the architecture, integration, and simplicity to permeate every level of the extended enterprise. It is the most scalable, secure, and flexible solution in the market, and helps organizations build applications that have no barriers. Information Builders is the only vendor with its own complete integration solution.

iWay Software’s ability to seamlessly integrate with WebFOCUS enables unrivaled information access.
Initiate Systems is the leading provider of customer-centric master data management solutions for companies that want to create the most complete, real-time views of people, households, and organizations from data dispersed across multiple application systems and databases. Initiate Identity Hub™ software is more accurate, scalable, rapidly deployed, and widely used than any other solution. Many organizations in the financial services, government, healthcare, hospitality, and retail sectors are using Initiate™ software. Initiate Systems’ proven experience makes it uniquely qualified to enable strategic initiatives that allow organizations to increase revenue and efficiency and reduce operating costs and risks.

MicroStrategy is a global leader in business intelligence (BI) technology. Founded in 1989, MicroStrategy provides integrated reporting, analysis, and monitoring software that helps leading organizations worldwide make better business decisions every day. Companies choose MicroStrategy for its advanced technical capabilities, sophisticated analytics, and superior data and user scalability. With more than 15 years of industry experience, thousands of customer successes, and a reputation for innovation and leadership, MicroStrategy is the clear choice for your business intelligence investment. More information about MicroStrategy is available at: www.microstrategy.com.

Netezza, the global data warehouse appliance market leader, enables enterprises to make their critical data actionable—quickly, simply, and affordably. The Netezza Performance Server® (NPS) family of products delivers breakthrough performance, unmatched ease of deployment and operation, and innovative flexibility and scalability. By architecturally integrating database, server, and storage within a single appliance, the NPS system delivers significantly faster performance than traditional systems, at a much lower total cost of ownership. Customers who have recognized the benefits of Netezza’s approach include Ahold, Amazon.com, Debenhams, the U.S. Department of Veterans Affairs, Neiman Marcus, and Orange UK.
Together, Nimaya and IBM help companies realize higher levels of customer satisfaction at lower costs, turning customers into advocates and increasing retention and revenue. Nimaya’s CustomerGrid, a service-oriented architecture (SOA) application running on IBM WebSphere, leverages data residing in systems across divisions, functions, and extended enterprises. By capitalizing on investments in existing applications, Nimaya and IBM deliver inexpensive, easy-to-use solutions, addressing corporate needs for becoming customer-centric by providing consistent and accurate data. Key to this is Nimaya’s business rules engine, which delivers insight and generates proactive leads, alerts, and tasks through unified customer views and enterprise data mash-ups.

Oracle’s business is information—how to manage it, use it, share it, protect it. The world’s largest enterprise software company, Oracle is the only vendor to offer solutions for every tier of your business—database, middleware, business intelligence, business applications, and collaboration. With Oracle, you get information that helps you measure results, improve business processes, and communicate a single truth to your constituents.

Since 1994, Petris has been providing practical, real-world information technology solutions for its energy clients, leveraging its expertise and experience in drilling and wellbore data management and analysis, geospatial information systems, and technical services.

Through a unique vendor-neutral approach and expertise in service-oriented IT architecture, Petris offers advanced data management technology that seamlessly integrates data from diverse data stores. The resulting software tools are intuitive to use and transparent to operate. While competitors offer solutions to address pieces of the data management puzzle, only Petris has the services and solution framework that includes them all.
Pitney Bowes Group 1 Software provides innovative software solutions that enable our clients to better understand and connect with their millions of customers, prospects, and partners. Group 1 helps over 3,000 organizations improve profitability, increase effectiveness, and strengthen customer relationships through consolidating, cleansing, and enriching corporate data, and generating personalized business documents for multichannel delivery, customer care, and efficient business processing. Our comprehensive Customer Communication Management (CCM) solutions span from database to delivery, adding value to every aspect of communication and allowing clients to integrate intelligence throughout their mailstream.

SAP is the world’s leading provider of business software. More than 38,000 customers in more than 120 countries run SAP® applications—from distinct solutions addressing the needs of small and midsize enterprises to suite offerings for global organizations. Powered by the SAP NetWeaver® platform to drive innovation and enable business change, SAP software helps enterprises of all sizes around the world improve customer relationships, enhance partner collaboration, and create efficiencies across their supply chains and business operations. SAP solution portfolios support the unique business processes of more than 25 industries, including high-tech, retail, financial services, healthcare, and the public sector. With subsidiaries in more than 50 countries, the company is listed on several exchanges, including the Frankfurt stock exchange and NYSE, under the symbol SAP. Additional information is available at: www.sap.com/usa/platform/netweaver/components/mdm/index.epx.

Siperian offers the most complete, integrated software platform for adaptive master data management (MDM). Siperian helps companies create and present real-time unified views of their customers, products, and suppliers to business users from distributed data sources for higher profitability, reduced operational costs, and improved compliance. Our award-winning solution, Siperian MDM Hub™, uniquely adapts to different architectural styles to meet companies’ evolving business needs while delivering significantly lower total cost of ownership, faster time-to-value, and superior return on investment. Siperian business solutions for financial services, health and life sciences, high-tech, communications and media, and manufacturing industries improve customer relationship management, marketing, regulatory compliance, and order-to-cash processes.
Sybase is the largest global enterprise software company exclusively focused on managing and mobilizing information from the data center to the point of action, providing open, cross-platform solutions that securely deliver information anytime, anywhere—enabling customers to create an information edge. Sybase addresses the need for business intelligence and data warehousing with world-class data modeling, ultra-high-speed business analytics, and comprehensive data integration technologies. Maximize the potential of your information assets and achieve better intelligence capabilities with Sybase IQ, PowerDesigner, and our Data Integration Suite. When businesses get serious about business intelligence and data warehousing, they get Sybase. For more information, visit: www.sybase.com.

Syncsort Incorporated is a leading developer of high-performance storage management and data warehousing software. For over 35 years, Syncsort has built a reputation for superior product performance and reliable technical support. An independent market research firm named Syncsort one of the top Data Warehouse 100 Vendors seven years in a row. Over 90 percent of the Fortune 100 companies are Syncsort customers, and Syncsort’s products are used in more than 50 countries to back up and protect data in distributed environments, speed data warehouse processing, and improve performance in applications with high data volumes.
TDWI, a division of 1105 Media, is the premier provider of in-depth, high-quality education and research in the business intelligence and data warehousing industry. Starting in 1995 with a single conference, TDWI is now a comprehensive resource for industry information and professional development opportunities. TDWI sponsors and promotes quarterly World Conferences, regional seminars, onsite courses, a worldwide Membership program, business intelligence certification, resourceful publications, industry news, an in-depth research program, and a comprehensive Web site (www.tdwi.org).

Through TDWI Membership, business intelligence and data warehousing professionals learn about the latest trends in the industry while enjoying a unique opportunity to learn, network, share ideas, and respond as a collective whole to the challenges and opportunities in the industry.

TDWI Membership includes more than 5,000 Members who are business and information technology professionals from Fortune 1000 corporations, consulting organizations, and governments in 45 countries. TDWI offers special Membership packages for corporate Team Members and students.

WORLD CONFERENCES
www.tdwi.org/conferences
TDWI World Conferences provide a unique opportunity to learn from world-class instructors, participate in one-on-one sessions with industry gurus, peruse hype-free exhibits, and network with peers. Each six-day conference features a wide range of content that can help business intelligence and data warehousing professionals deploy and harness business intelligence on an enterprisewide scale.

SEMINAR SERIES
www.tdwi.org/seminars

TDWI ONSITE
TDWI Onsite brings TDWI courses to customer sites and offers training for all experience levels. Everyone involved gains a common knowledge base and learns in support of the same corporate objectives. Training can be tailored to meet specific business needs and can incorporate organization-specific information.

CERTIFIED BUSINESS INTELLIGENCE PROFESSIONAL (CBIP)
www.cbipro.com
Convey your experience, knowledge, and expertise with a credential respected by employers and colleagues alike. CBIP is an exam-based certification program that tests industry knowledge, skills, and experience within five areas of specialization—providing the most meaningful and credible certification available in the industry.

WEBINAR SERIES
www.tdwi.org/education/Webinars
TDWI Webinars deliver unbiased information on pertinent issues in the business intelligence and data warehousing industry. Each live Webinar is roughly one hour in length and includes an interactive question-and-answer session following the presentation.

425.277.9126
info@tdwi.org
www.tdwi.org