Conference Paper · March 2013
DOI: 10.1145/2457317.2457319


Data Warehouse Testing
Neveen ElGamal
Faculty of Computers and Information,
Cairo University,
Giza, Egypt
+201002585680
n.elgamal@fci-cu.edu.eg

Supervised by:

Ali ElBastawissy
Faculty of Computers and Information,
Cairo University, Giza, Egypt
alibasta@fci-cu.edu.eg

Galal Galal-Edeen
Faculty of Computers and Information,
Cairo University, Giza, Egypt
Galal@acm.org
Thesis State: Middle

ABSTRACT
During the development of the data warehouse (DW), large amounts of data are transformed, integrated, structured, cleansed, and grouped into a single structure, the DW. These various kinds of changes could lead to data corruption or data manipulation. Therefore, DW testing is a very critical stage in the DW development process.

A number of attempts have been made to describe how the testing process should take place in the DW environment. In this paper, I briefly describe these testing approaches, and a proposed matrix is then used to evaluate and compare them. Afterwards, I highlight the weak points of the available DW testing approaches. Finally, I describe how my PhD will fill the gap in DW testing by developing a DW testing framework, and briefly present its architecture. I then state the scope of work I am planning to address and the limitations I expect to face in this area. In the end, I conclude my work and state possible future work in the field of DW testing.

1. INTRODUCTION
In the data warehousing process, data passes through several stages, each of which applies a different kind of change to the data, before it finally reaches the user in the form of a chart or a report. There should be a way of guaranteeing that the data in the sources is the same data that reaches the user, and that data quality is improved, not lost.

Testing the DW system outputs against the data in the data sources is not the best way to assess data quality. This type of test is informative and will take place at a certain point in the testing process, but the most important part of the testing process should take place during DW development. Every stage and every component the data passes through should be tested, to guarantee its efficiency and the preservation, or even improvement, of data quality.

©2013 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of the national government of Egypt. As such, the government of Egypt retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.
EDBT/ICDT '13, March 18-22, 2013, Genoa, Italy.
Copyright 2013 ACM 978-1-4503-1599-9/13/03…$15.00

1.1 The Challenges of DW Testing
It is widely agreed that the DW is totally different from other systems, such as computer applications or transactional database systems. Consequently, the testing techniques used for these other systems are inadequate for DW testing. Some of the differences, as discussed in [6-8], are:

1. A DW always answers ad-hoc queries, which makes it impossible to test prior to system delivery. On the other hand, all functions in any computer application realm are predefined.

2. DW testing is data centric, while application testing is code centric.

3. A DW always deals with huge data volumes.

4. The testing process in other systems ends with the development life cycle, while in DWs it continues after system delivery.

5. "Software projects are self contained but a DW project continues due to decision-making process requirement for ongoing changes" [8].

6. Most of the available testing scenarios are driven by some user input, while in DWs most of the tests are system-triggered scenarios.

7. The volume of test data in a DW is considerably large compared to any other testing process.

8. In other systems, test cases can reach hundreds, but the valid combinations of these test cases will never be unlimited. In a DW, by contrast, the test cases are unlimited, due to the core objective of the DW of allowing all possible views of data [7].

9. DW testing consists of different types of tests depending on when the test takes place and the component being tested. For example, the initial data load test is different from the incremental data load test.

One of the core challenges of testing DWs, or of providing techniques for testing them, is their flexible architecture. DW systems can have different architectures according to business requirements, required DW functionalities, and/or budget/time constraints.

1.2 Data Warehouse Architecture
As shown in figure 1, a global DW system consists of a number of inter-related components:
• Data Sources (DS)
• Operational Data Store (ODS)/Data Staging Area (DSA)
• Data Warehouse (DW)
• Data Marts (DM)
• User Interface (UI) applications, e.g., OLAP reports, decision support tools, and analysis tools

Figure 1. DW System Architecture

Each component needs to be tested to verify its efficiency independently. The connections between the DW components are groups of transformations that take place on data. These transformation processes should be tested as well, to ensure data quality preservation. The outputs of the DW system should be compared with the original data in the DSs. Finally, from the operational point of view, the DW system should be tested for performance, reliability, robustness, recovery, etc.

The remainder of this paper is organized as follows. Section 2 briefly surveys the existing DW testing approaches and introduces the DW testing matrices that we used to compare and evaluate them. Section 3 analyzes the comparison matrix to highlight the drawbacks and weaknesses that exist in the area of DW testing, and the requirements for a DW testing framework. Section 4 presents my expected PhD contribution to filling the gap in DW testing by introducing a new DW testing framework. Section 5 presents the architecture of the proposed DW testing framework, and Section 6 states the scope of work and the limitations we expect to face during our work. Finally, I conclude in Section 7.

2. RELATED WORK
2.1 DW Testing Approaches
A number of attempts have been made to address the DW testing process. Some were made by companies offering consultancy services for DW testing, like [5, 13, 14, 17, 22]. Others, to fill the gap left by the absence of a generic DW testing technique, have proposed one as a research attempt, like [1-4, 8, 11, 15, 18, 23]. A different direction was taken by authors and organizations presenting automated tools for the DW testing process, like [12, 16, 22], and, from a different perspective, some authors presented a DW testing methodology, like [2, 14, 21]. The rest of this section briefly introduces these approaches, grouped by similarity; a comparison between them is presented in the following sections.

The approaches presented in [1-3, 14, 15] have adapted some of the software testing types, namely:
1. Unit testing,
2. Integration testing,
3. System testing,
4. User acceptance testing,
5. Security testing,
6. Regression testing,
7. Performance testing,
and extended them to support the special needs of DW testing. A great advantage of these approaches is their solid background: the well-defined and formulated discipline of software testing, which helped them present a DW testing approach that is not far from what system testers already understand and use.

Other attempts, like [4, 13, 23], focused on testing the most important part of the data warehousing process, the ETL process. [13] addressed DW testing from two high-level aspects:
• the underlying data, focusing on data coverage and data compliance with the transformation logic in accordance with the business rules;
• the DW components, focusing on performance, scalability, component orchestration, and regression testing.

What was unique in the approach presented in [4] is that they concentrated on the data validation of ETL and presented 2
alternatives for the testing process: either White Box Testing, where the data is tracked through the ETL process itself, or Black Box Testing, where only the input and output data of the ETL process are validated.

The research attempts presented by the two research groups from DISI, University of Bologna (Matteo Golfarelli and Stefano Rizzi) and the Slovak University of Technology (Pavol Tanuška, Oliver Moravčík, Pavel Važan, František Miksa, Peter Schreiber, Jaroslav Zeman, Werner Verschelde, and Michal Kopček) were the richest attempts, addressing DW testing from various perspectives. In [18], the authors suggested a proposal for basic DW testing activities (routines) as the final part of a DW testing methodology; other parts of the methodology were published in [19-21]. The testing activities can be split into four logical units regarding:
• multidimensional database testing,
• data pump (ETL) testing,
• metadata, and
• OLAP testing.

The authors then showed how these activities split into smaller, more distinctive activities to be performed during the DW testing process.

In [8], the authors introduced DW testing activities (routines) framed within the DW development methodology introduced in [9]. They stated that the components that need to be tested are the conceptual schema, the logical schema, the ETL procedures, the database, and the front-end. To test these components, they listed eight test types that best fit the characteristics of DW systems:
• Functional test
• Usability test
• Performance test
• Stress test
• Recovery test
• Security test
• Regression test

A comprehensive explanation of how the DW components are tested by the above testing routines is then given, showing which type(s) of test suit which component, as shown in table 1.

TABLE 1: DW components Vs testing types [8]

The authors then customized their DW testing technique to present a prototype-based methodological framework [11]. Its main features are (1) earliness in the life cycle, (2) modularity, (3) tight coupling with design, (4) scalability, and (5) measurability through proper metrics. The latest piece of work this research group presented, in [10], was a number of data-mart-specific testing activities, classified in terms of what is tested and how it is tested. The only drawback of this research group's work is that its DW architecture does not include a DW component: the data is loaded from the data sources directly into the data marts, so their DW architecture consists of a number of data marts addressing different business areas.

The approaches presented in [12, 16, 22] are CASE tools specially designed to address DW testing. The TAVANT product named "One-Click Automation Framework for Data Warehouse Testing" is a tool that supports a DW testing process working concurrently with the DW development life cycle [22]. It embeds the test cases in the ETL process and ensures that all stages of ETL are tested and verified before subsequent data loads are triggered. Their DW testing methodology is incremental and is designed to accommodate changing DW schemas arising from evolving business needs.

The QuerySurge CASE tool developed by RTTS [16] assists DW testers in preparing and scheduling query pairs to compare data transformed from the source to the destination; for example, a query pair in which one query runs on a DS and the other on the ODS, to verify the completeness, correctness, and consistency of the structure of the data and of the data transformed from the DS to the ODS.

Inergy is a company specialized in designing, developing and managing DWs and BI solutions [12]. They developed a tool to automate the ETL testing process. They did not build their tool from scratch; instead, they used existing tools like DbUnit (a database testing utility) and Ant Task Extension (a Java-based build tool), and presented an extension to DbUnit that applies some DW-specific logic, such as:
• which ETL process to run at what time;
• which dataset should be used as a reference;
• what logic to repeat, and for which range.
They also developed a PowerDesigner script that extracts the core of the test script based on the data model of the DW.

Unfortunately, not enough data was available for the approaches [5, 17]: both were developed by companies offering consultancy services, and revealing their DW testing techniques could negatively affect their business.

Having introduced the previous work on DW testing, it is now time to exhaustively study, compare and evaluate it. To carry out such a comprehensive study, we defined a group of DW testing matrices in [6].
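The query-pair style of source-to-target checking described above for QuerySurge (and the black-box ETL validation of [4]) can be sketched in a few lines. The tables, column names, and the straight-copy ETL below are invented for illustration; they are not taken from any of the surveyed tools.

```python
import sqlite3

# Minimal sketch of a "query pair" test: one query runs against the data
# source (DS), the other against the ODS, and the results are compared.
# Table and column names are illustrative assumptions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ds_sales (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE ods_sales (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO ds_sales VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
# The ETL here is a straight copy, so each query pair must agree.
cur.execute("INSERT INTO ods_sales SELECT id, amount FROM ds_sales")

def query_pair(source_sql: str, target_sql: str) -> bool:
    """Run a query pair and report whether source and target agree."""
    return cur.execute(source_sql).fetchall() == cur.execute(target_sql).fetchall()

# A record-count check and a value-total check, two common ETL routines.
assert query_pair("SELECT COUNT(*) FROM ds_sales",
                  "SELECT COUNT(*) FROM ods_sales")
assert query_pair("SELECT SUM(amount) FROM ds_sales",
                  "SELECT SUM(amount) FROM ods_sales")
print("query pairs agree")
```

A failing pair would indicate that the transformation dropped, duplicated, or altered data between the two layers.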
2.2 DW Testing Matrices
As we presented previously in [6], the DW testing matrices classify tests (or test routines, for clarity) according to where, what, and when they take place:
• WHERE: the component of the DW that the test targets. This divides the DW architecture shown in figure 1 into the following layers:
  o Data Sources to Operational Data Store: testing routines targeting the data sources, wrappers, extractors, transformations, and the operational data store itself.
  o Operational Data Store to DW: testing routines targeting the loading process and the DW itself.
  o DW to Data Marts: testing routines targeting the transformations that take place on the data used by the data marts, and the data marts themselves.
  o Data Marts to User Interface: testing techniques targeting the transformation of data to the interface applications, and the interface applications themselves.
• WHAT: what these routines test in the targeted component.
  o Schema: focuses on testing DW design issues.
  o Data: concerned with all data-related tests, such as data quality, data transformation, data selection, data presentation, etc.
  o Operational: includes tests concerned with the process of putting the DW into operation.
• WHEN will the test take place?
  o Before system delivery: a one-time test that takes place before the system is delivered to the user, or whenever a change is made to the design of the system.
  o After system delivery: a recurring test that takes place several times during system operation.

The 'what', 'where' and 'when' testing categories result in a three-dimensional matrix. As shown in table 2, the rows represent the 'where' dimension and the columns represent the 'what' dimension; the 'when' dimension is represented by color in the following section. When this matrix is used to compare the existing DW testing approaches, each cell of the table contains the group of test routines that addresses that combination of dimension members.

Table 2: DW Testing Matrices

                       Schema   Data   Operation
  Backend    DS→ODS
             ODS→DW
             DW→DM
  Frontend   DM→UI

2.3 Comparison and Evaluation of Surveyed Approaches
After studying how each proposed DW testing approach addressed DW testing, and according to the DW testing matrices defined in the previous section, a comparison matrix is presented in table 3, showing the test routines that each approach covers. The DW testing approaches are represented in the columns, and the 'what' and 'where' dimensions classify the test routines in the rows. The intersection of a row and a column indicates the coverage of that test routine in that approach, where "√√" represents full coverage and "√" represents partial coverage. Finally, the 'when' dimension, which indicates whether a test takes place before or after system delivery, is represented by color: tests that take place after system delivery are highlighted, while tests that take place during system development, or when the system is subject to change, are left unhighlighted.

We were able to compare only 10 approaches, as not enough data was available for the rest.

As is obvious from table 3, none of the proposed approaches addressed the entire DW testing matrices. This is simply because each approach addressed the DW testing process from its own point of view, without leaning on any standard or general framework. Some of the attempts considered only parts of the DW framework shown in figure 1. Other attempts used their own framework for the DW environment, according to the case they were addressing. For example, [8] used a DW architecture that includes neither an ODS nor a DW layer: the data is loaded from the data sources directly into the data marts, making the data marts layer act as both the DW and the data marts interchangeably. Other approaches, like [1, 14, 15], did not include the ODS layer.

From another perspective, some test routines are not addressed by any approach, such as data quality factors like accuracy, precision, and continuity. Moreover, some major aspects of the DW were not tested by any of the proposed approaches, namely the DM schema and the additivity of measures in the DMs.

3. REQUIREMENTS FOR A DW TESTING FRAMEWORK
Based on a careful study of the available DW testing approaches, using the DW testing matrices presented previously to analyze, compare and evaluate them, it is evident that the DW environment lacks the following:
1. A generic, well-defined DW testing approach that could be used in any project. Each approach presented its testing techniques based on its own DW architecture, which limits the reusability of the approach in other DW projects with different DW architectures.
2. None of the existing approaches included all the test routines needed to guarantee a high-quality DW after delivery.
Table 3: DW Approaches Comparison
[Full-page coverage matrix comparing the ten approaches [1-4, 8, 13-15, 18, 23] against the backend (DS→ODS, ODS→DW, DW→DM) and frontend (DM→UI) test routines, grouped into the Schema, Data, and Operation dimensions, with "√√" marking full coverage and "√" partial coverage.]
3. None of the existing approaches took into consideration the dependencies between test routines. It is sometimes not mandatory to perform all test routines: if some tests pass successfully, others can be skipped.
4. The approaches proposed in [18, 19, 21] were the only ones focusing on both the DW testing routines and the life cycle of the testing process. The life cycle of each test routine includes a test plan, test cases, test data, termination criteria, and test results. Nevertheless, they presented the two aspects independently, without showing how the testing routines fit into a complete DW testing life cycle.
5. In several projects, testing is neglected or reduced due to time or resource problems. None of the existing approaches included any differentiation technique or prioritization of the test routines according to their impact on overall DW quality, which would help testers select the test routines that most affect the quality of the DW when resource or time limitations force them to shorten the testing process.
6. Some of the above test routines could be automated, but none of the proposed approaches showed how these routines could be automated, or given automated assistance using custom-developed or existing tools.

These drawbacks urged us to think of a new DW testing approach that fills the gaps in this area while benefiting from the efforts made by others. The following section describes our proposed DW testing framework.

4. A PROPOSED DW TESTING FRAMEWORK
In my PhD I am planning to develop a DW testing framework that is generic enough to be used in several DW projects with different DW architectures. The framework's primary goal is to guide testers through the testing process by recommending the group of test routines that are required given the project's customized DW architecture.

The proposed framework is supposed to include definitions of the test routines. The main target of our research is to benefit from existing DW testing approaches by adopting the test routine definitions from the available approaches, and to define the test routines that were not addressed or not comprehensively defined by any previous approach.

In our study we will prioritize the test routines according to their importance and impact on the output product, so that the tester can select the tests that most affect the quality of the delivered system if any scheduling or budget limitations are faced.

Part of the test routine definition should include how the test routines can be automated, or can get automatic support if full automation is not applicable. It should also include suggestions for using existing automated test tools to minimize the amount of work needed to get automated support in the DW testing process.

One of the core features that we intend to include in our proposed testing framework is testing along the DW development life cycle. This could be done by stating the pre-requisites of each testing routine with respect to the DW development life cycle stages, in order to detect errors as early as possible.

The framework's infrastructure is planned to take the form of a directed graph. Nodes of the graph will represent test routines, and links between nodes will represent test routine relationships. It is expected that there will be several types of relationships between test routines: for example, dependencies between test routines, tests that are guaranteed to succeed if other tests pass successfully, or tests that should never be performed unless other tests pass.

After defining the DW testing framework and fully documenting it, I am planning in my PhD to materialize the DW testing framework by developing a web service that makes the framework reachable for testers.

Finally, a case study will be used to try out the testing framework in a real-world DW project, and the results of the experimentation will be used to evaluate the proposed DW testing framework.

5. THE ARCHITECTURE OF THE PROPOSED FRAMEWORK
This section presents the architecture of the proposed framework, showing its work flow when put into operation. As shown in figure 2, the key player in the DW testing process is the Test Manager, who feeds the system with the DW architecture under test and the current state of the DW, in other words, which components of the DW have been developed so far. This step is needed because our proposed framework supports testing throughout system development.

The DW Architecture Analyzer component then studies the received data and compares it with the dependencies between test routines from the Test Dependency Graph, with the assistance of the Test Dependency Manager component, and passes the data to the Test Recommender to generate an Abstract Test Plan.

The process of preparing the Detailed Test Plan then splits into two different directions according to the type of test routine: validation tests and verification tests. In the case of validation test types, which are the more complicated testing routines, the Validation Manager involves the business expert(s) and system user(s), in addition to accessing the relevant data from the system Repository, to prepare the part of the test plan that concerns the validation test routines. For the verification test routines, the Verification Manager, along with the Test Case Generator and Test Data Generator modules, helps prepare the Detailed Test Plan for the verification test routines.
Figure 2: Proposed DW Testing Framework Architecture
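The directed-graph infrastructure described in Section 4 could be realized along these lines. The routine names, the single "implies" relationship, and the pruning rule are illustrative assumptions rather than the framework's actual design.

```python
from collections import defaultdict

class TestDependencyGraph:
    """Sketch of a test-dependency graph: nodes are test routines, and an
    edge means "if the source routine passes, the target is guaranteed and
    may be skipped". Other relationship types could be added similarly."""

    def __init__(self):
        self.implied_by = defaultdict(set)  # routine -> routines it implies

    def add_implies(self, routine: str, implied: str) -> None:
        """Record that a pass of `routine` guarantees `implied`."""
        self.implied_by[routine].add(implied)

    def recommend(self, candidates: set, passed: set) -> set:
        """Abstract test plan: candidate routines minus those already
        passed and those implied by a passed routine."""
        skippable = set()
        for routine in passed:
            skippable |= self.implied_by[routine]
        return candidates - skippable - passed

graph = TestDependencyGraph()
# Hypothetical relationship: a full field-to-field comparison subsumes
# the cheaper record-count check.
graph.add_implies("field_to_field_test", "record_count_test")

candidates = {"field_to_field_test", "record_count_test", "threshold_test"}
plan = graph.recommend(candidates, passed={"field_to_field_test"})
print(sorted(plan))  # → ['threshold_test']
```

Such a structure would also support the prioritization requirement of Section 3: each node could carry an impact weight, and the recommender could order the remaining routines by weight when time or resources are short.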

The Verification Manager involves the system tester(s) and DB administrator for assistance in the process of test case preparation and test data generation (in case the DW is still in the development phase and no real data is available yet).

To benefit from the existing test automation tools for DWs, the Verification Manager will include in the Detailed Test Plan the possible use of existing test automation tools for the test routines that could be automated.

6. SCOPE AND LIMITATIONS
In my PhD I will not take into consideration:
1. Testing routines that target unconventional DWs, like temporal DWs, active DWs, spatial DWs, DW 2.0, etc.
2. Testing routines that target the last layer of the DW architecture presented in table 2 (DM to UI), since it covers a broad range of application types with different architectures and implementation techniques, which makes it very hard to define generic testing routines for them.
3. Test routines checking the data quality of the data sources; rather, we will consider all the sources to be of low quality.

In the process of assimilating the existing DW testing approaches into our proposed DW testing framework, we expect to face some difficulties regarding the availability of data and the know-how of the way testing routines are defined and implemented. This is because a considerable number of the existing approaches come from industrial organizations, and revealing such information could highly affect the organization's competitive position in the field of DW testing.

7. CONCLUSIONS AND FUTURE WORK
Some trials have been carried out to address DW testing; most of them were oriented to a specific problem, and none of them were generic enough to be used in other data warehousing projects.

By the end of my PhD, I will present a generic DW testing framework that integrates and benefits from all existing DW testing trials. It will guarantee to the system user that the data quality of the data sources is preserved, or even improved, thanks to the comprehensive testing process that the DW has passed through.

Future work in this field could extend this testing framework to support non-conventional DWs, like active DWs, temporal DWs, spatial DWs, etc. This could be done by defining specializations of the test routines to address the special needs of these DWs.

8. REFERENCES
[1] Bateman, C. Where are the Articles on Data Warehouse Testing and Validation Strategy? Information Management, 2002.
[2] Bhat, S. Data Warehouse Testing - Practical. Stick Minds, 2007.
[3] Brahmkshatriya, K. Data Warehouse Testing. Stick Minds, 2007.
[4] Cooper, R. and Arbuckle, S. How to Thoroughly Test a Data Warehouse. In Software Testing Analysis and Review (STAREAST), Orlando, Florida, 2002.
[5] CTG. CTG Data Warehouse Testing, 2002.
[6] ElGamal, N., ElBastawissy, A. and Galal-Edeen, G. Towards a Data Warehouse Testing Framework. In IEEE 9th International Conference on ICT and Knowledge Engineering (IEEE ICT&KE), Bangkok, Thailand, 2011, 67-71.
[7] Executive-MiH. Data Warehouse Testing is Different, 2010.
[8] Golfarelli, M. and Rizzi, S. A Comprehensive Approach to Data Warehouse Testing. In ACM 12th International Workshop on Data Warehousing and OLAP (DOLAP '09), Hong Kong, China, 2009.
[9] Golfarelli, M. and Rizzi, S. Data Warehouse Design: Modern Principles and Methodologies. McGraw-Hill, 2009.
[10] Golfarelli, M. and Rizzi, S. Data Warehouse Testing. International Journal of Data Warehousing and Mining, 7(2), 26-43.
[11] Golfarelli, M. and Rizzi, S. Data Warehouse Testing: A prototype-based methodology. Information and Software Technology, 53(11), 1183-1198.
[12] Inergy. Automated ETL Testing in Data Warehouse Environment, 2007.
[13] Mathen, M.P. Data Warehouse Testing. Infosys, 2010.
[14] Munshi, A. Testing a Data Warehouse Application. Wipro Technologies, 2003.
[15] Rainardi, V. Testing your Data Warehouse. In Building a Data Warehouse with Examples in SQL Server, Apress, 2008.
[16] RTTS. QuerySurge, 2011.
[17] SSNSolutions. SSN Solutions, 2006.
[18] Tanuška, P., Moravčík, O., Važan, P. and Miksa, F. The Proposal of Data Warehouse Testing Activities. In 20th Central European Conference on Information and Intelligent Systems, Varaždin, Croatia, 2009, 7-11.
[19] Tanuška, P., Moravčík, O., Važan, P. and Miksa, F. The Proposal of the Essential Strategies of Data Warehouse Testing. In 19th Central European Conference on Information and Intelligent Systems (CECIIS), 2008, 63-67.
[20] Tanuška, P., Schreiber, P. and Zeman, J. The Realization of Data Warehouse Testing Scenario. In 3rd International Scientific and Technical Conference (Infokit-3), Part II, Stavropol, Russia, 2008.
[21] Tanuška, P., Verschelde, W. and Kopček, M. The Proposal of Data Warehouse Test Scenario. In European Conference on the Use of Modern Information and Communication Technologies (ECUMICT), Gent, Belgium, 2008.
[22] Tavant Technologies. Data Warehouse Testing. Accessed: Jan 2013.
[23] Theobald, J. Strategies for Testing Data Warehouse Applications. Information Management, 2007.
