
ETL Testing


Data warehousing and its Concepts:

What is Data warehouse?
A Data Warehouse is a centrally managed and integrated database containing data from the operational sources in an organization (such as SAP, CRM, ERP systems). It may gather manual inputs from users determining criteria and parameters for grouping or classifying records.

A source for the data warehouse is a data extract from operational databases. The data is validated, cleansed, transformed and finally aggregated, and it becomes ready to be loaded into the data warehouse.

The data warehouse can be created or updated at any time, with minimum disruption to operational systems. This is ensured by a strategy implemented in the ETL process.

The data warehouse database contains structured data for query analysis and can be accessed by users. A data warehouse is a dedicated database which contains detailed, stable, nonvolatile and consistent data which can be analyzed in the time variant. Sometimes, where only a portion of detailed data is required, it may be worth considering using a data mart. A data mart is generated from the data warehouse and contains data focused on a given subject and data that is frequently accessed or summarized.
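To make the relationship between a data warehouse and a data mart concrete, the short Python sketch below derives a subject-focused mart from a wider warehouse table. The column names and the "sales" subject are assumptions made only for this illustration.

```python
# Hypothetical warehouse rows covering several subject areas.
warehouse_rows = [
    {"subject": "sales", "region": "West", "amount": 1200.0},
    {"subject": "sales", "region": "East", "amount": 800.0},
    {"subject": "finance", "region": "West", "amount": 300.0},
]

# A data mart keeps only the frequently accessed, subject-focused slice,
# here summarised (aggregated) by region for the "sales" subject.
sales_mart = {}
for row in warehouse_rows:
    if row["subject"] != "sales":
        continue
    sales_mart[row["region"]] = sales_mart.get(row["region"], 0.0) + row["amount"]

print(sales_mart)  # {'West': 1200.0, 'East': 800.0}
```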

Data warehouse Architecture:

Data warehouse Architecture (Contd):

Advantages of Data warehouse:
- A data warehouse provides a common data model for all data of interest regardless of the data's source. This makes it easier to report and analyze information than it would be if multiple data models were used to retrieve information such as sales invoices, order receipts, general ledger charges, etc.
- Inconsistencies are identified and resolved prior to loading of data in the data warehouse. This greatly simplifies reporting and analysis.
- Information in the data warehouse is under the control of data warehouse users so that, even if the source system data is purged over time, the information in the warehouse can be stored safely for extended periods of time.
- Because they are separate from operational systems, data warehouses provide retrieval of data without slowing down operational systems.
- Data warehouses facilitate decision support system applications such as trend reports (e.g., the items with the most sales in a particular area within the last two years), exception reports, and reports that show actual performance versus goals.
- Data warehouses enhance the value of operational business applications, notably customer relationship management (CRM) systems.

Disadvantages of Data Warehouse:
- Data warehouses are not the optimal environment for unstructured data.
- Because data must be extracted, transformed and loaded into the warehouse, there is an element of latency in data warehouse data.
- Over their life, data warehouses can have high costs. Maintenance costs are high.
- Data warehouses can get outdated relatively quickly. There is a cost of delivering suboptimal information to the organization.
- There is often a fine line between data warehouses and operational systems. Duplicate, expensive functionality may be developed. Or, functionality may be developed in the data warehouse that, in retrospect, should have been developed in the operational systems and vice versa.

ETL Concept:
ETL is the automated and auditable data acquisition process from a source system that involves one or more sub-processes of data extraction, data transportation, data transformation, data integration, data loading and data cleaning.

E - Extracting data from source operational or archive systems which are the primary source of data for the data warehouse.
T - Transforming the data, which may involve cleaning, filtering, validating and applying business rules.
L - Loading the data into the data warehouse or any other database or application that houses the data.
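To make the three steps concrete, below is a minimal Python sketch of an E-T-L flow. The file names, field names and the business rule in the transform step are assumptions made only for this illustration, not part of any particular tool.

```python
import csv

def extract(path):
    """E - read raw records from a source flat file (a CSV is assumed here)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """T - clean, filter, validate and apply a simple business rule."""
    cleaned = []
    for row in rows:
        # Cleaning: trim whitespace from every field.
        row = {k: v.strip() for k, v in row.items()}
        # Filtering/validation: skip rows with no customer id.
        if not row.get("customer_id"):
            continue
        # Illustrative business rule: amounts are stored in cents in the source.
        row["amount"] = float(row["amount"]) / 100.0
        cleaned.append(row)
    return cleaned

def load(rows, target_path):
    """L - write the prepared records to the target (a CSV stands in for the warehouse)."""
    if not rows:
        return
    with open(target_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    load(transform(extract("sales_source.csv")), "sales_target.csv")
```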

ETL Process:
The ETL process involves the Extraction, Transformation and Loading processes.

Extraction:
The first part of an ETL process involves extracting the data from the source systems. Most data warehousing projects consolidate data from different source systems. Each separate system may also use a different data format. Common data source formats are relational databases and flat files, but they may include non-relational database structures such as Information Management System (IMS), other data structures such as Virtual Storage Access Method (VSAM) or Indexed Sequential Access Method (ISAM), or even data fetched from outside sources through web spidering or screen-scraping. Extraction converts the data into a format ready for transformation processing. An intrinsic part of the extraction involves the parsing of extracted data, resulting in a check whether the data meets an expected pattern or structure. If not, the data may be rejected entirely or in part.

Transformation:
Transformation is the series of tasks that prepares the data for loading into the warehouse. Once data is secured, you have to worry about its format or structure, because it will not be in the format needed for the target. For example, the grain level or data type might be different. Data cannot be used as it is; some rules and functions need to be applied to transform it.
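The parsing and pattern check performed during extraction can be illustrated with a small sketch. The expected record layout here (a numeric id, a date and a decimal amount separated by pipes) is an assumption chosen for this example; a real project would take the pattern from the source system's interface specification.

```python
import re
from datetime import datetime

# Assumed record layout: "<numeric id>|<YYYY-MM-DD date>|<decimal amount>"
RECORD_PATTERN = re.compile(r"^\d+\|\d{4}-\d{2}-\d{2}\|\d+(\.\d+)?$")

def parse_extracted_line(line):
    """Check that an extracted line meets the expected pattern.

    Returns a parsed record, or None if the line is rejected.
    """
    line = line.strip()
    if not RECORD_PATTERN.match(line):
        return None  # rejected: does not match the expected structure
    record_id, date_text, amount_text = line.split("|")
    return {
        "id": int(record_id),
        "date": datetime.strptime(date_text, "%Y-%m-%d").date(),
        "amount": float(amount_text),  # converted to the type the target expects
    }

accepted, rejected = [], []
for raw in ["101|2023-05-31|49.90", "bad-line-without-structure"]:
    parsed = parse_extracted_line(raw)
    (accepted if parsed else rejected).append(parsed or raw)

print(f"accepted={len(accepted)}, rejected={len(rejected)}")
```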

One of the purposes of ETL is to consolidate the data in a central repository or to bring it to one logical or physical place. Data can be consolidated from similar systems, different subject areas, etc. ETL must support data integration for data coming from multiple sources and data coming at different times. This has to be a seamless operation. This will avoid overwriting existing data, creating duplicate data or, even worse, simply being unable to load the data into the target.

Loading:
The loading process is critical to integration and consolidation. It decides the modality of how the data is added to the warehouse or simply rejected. Methods like addition, updating or deleting are executed at this step. What happens to the existing data? Should the old data be deleted because of new information? Or should the data be archived? Should the data be treated as additional data to the existing one? So data has to be loaded into the data warehouse with utmost care, for which a data auditing process can only establish the confidence level. This auditing process normally happens after the loading of data. (A small sketch of this decision logic follows the tool list below.)

List of ETL tools:
Below is the list of ETL tools available in the market, with their vendors:

List of ETL Tools                       ETL Vendors
Oracle Warehouse Builder (OWB)          Oracle
Data Integrator & Data Services         SAP Business Objects
IBM Information Server (Datastage)      IBM
SAS Data Integration Studio             SAS Institute
PowerCenter                             Informatica
Elixir Repertoire                       Elixir
Data Migrator                           Information Builders
SQL Server Integration Services         Microsoft
Talend Open Studio                      Talend
DataFlow Manager                        Pitney Bowes Business Insight
Data Integrator                         Pervasive
Open Text Integration Center            Open Text
Transformation Manager                  ETL Solutions Ltd.
Data Manager/Decision Stream            IBM (Cognos)
Clover ETL                              Javlin
ETL4ALL                                 IKAN
DB2 Warehouse Edition                   IBM
Pentaho Data Integration                Pentaho
Adeptia Integration Server              Adeptia
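As a rough illustration of the load modalities discussed above, the sketch below decides per record whether to insert, update, archive or reject. The key column and the in-memory dictionary standing in for the warehouse table are assumptions made only for this example.

```python
# Hypothetical warehouse table, keyed by a business key.
warehouse = {"C001": {"customer_id": "C001", "city": "Pune"}}

def load_record(record, mode="update"):
    """Decide how an incoming record is added to the warehouse.

    mode="update" : overwrite the existing row (update in place)
    mode="append" : keep the old row as archived data and add the new one
    mode="reject" : refuse any record whose key already exists
    """
    key = record["customer_id"]
    if key not in warehouse:
        warehouse[key] = record          # plain addition
        return "inserted"
    if mode == "update":
        warehouse[key].update(record)    # update existing data
        return "updated"
    if mode == "reject":
        return "rejected"                # duplicate key, load refused
    # append mode: archive the old row instead of overwriting it
    warehouse[f"{key}_archived"] = warehouse[key]
    warehouse[key] = record
    return "archived-and-replaced"

print(load_record({"customer_id": "C001", "city": "Mumbai"}, mode="update"))
print(load_record({"customer_id": "C002", "city": "Delhi"}, mode="reject"))
```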

ETL Testing:
Following are some common goals for testing an ETL application:

Data completeness - To ensure that all expected data is loaded.
Data transformation - To ensure that all data is correctly transformed according to business rules and design specifications.
Data quality - To ensure that the ETL application correctly rejects, substitutes default values, corrects and reports invalid data.
Performance and scalability - To ensure that the data loads and queries perform within expected time frames and that the technical architecture is scalable.
Integration testing - To ensure that the ETL process functions well with other upstream and downstream applications.
User-acceptance testing - To ensure the solution fulfills the users' current expectations and also anticipates their future expectations.
Regression testing - To keep the existing functionality intact each time a new release of code is completed.

Basically, data warehouse testing is divided into two categories: 'Back-end testing' and 'Front-end testing'. The former applies where the source systems data is compared to the end-result data in the loaded area, which is the ETL testing, while the latter refers to where the user checks the data by comparing their MIS with the data displayed by the end-user tools.

Data Validation:
Data completeness is one of the basic ways of data validation. This is needed to verify that all expected data loads into the data warehouse. This includes the validation of all records and fields, and ensures that the full contents of each field are loaded.

Data Transformation:
Validating that the data is transformed correctly based on business rules can be one of the most complex parts of testing an ETL application with significant transformation logic. Another way of testing is to pick up some sample records and compare them for validating data transformation manually, but this method requires manual testing steps and testers who have a good amount of experience and understanding of the ETL logic.
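Picking up sample records and re-applying the business rule independently, as described above, can be partly automated. In the sketch below, the rule (net amount equals gross amount minus discount) and the record layout are invented purely for illustration.

```python
# Sample source records and the corresponding rows loaded into the target.
source_samples = [
    {"order_id": "A1", "gross": 100.0, "discount": 10.0},
    {"order_id": "A2", "gross": 250.0, "discount": 0.0},
]
target_rows = {
    "A1": {"order_id": "A1", "net_amount": 90.0},
    "A2": {"order_id": "A2", "net_amount": 250.0},
}

def expected_net(src):
    """Business rule assumed for this example: net = gross - discount."""
    return src["gross"] - src["discount"]

# Compare the independently computed value with what the ETL actually loaded.
for src in source_samples:
    loaded = target_rows[src["order_id"]]
    status = "OK" if abs(loaded["net_amount"] - expected_net(src)) < 1e-9 else "MISMATCH"
    print(src["order_id"], status)
```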

Data Warehouse Testing Life Cycle:
Like any other piece of software, a DW implementation undergoes the natural cycle of Unit testing, System testing, Regression testing, Integration testing and Acceptance testing.

Unit testing:
Traditionally this has been the task of the developer. This is white-box testing to ensure the module or component is coded as per agreed-upon design specifications. The developer should focus on the following:

a) That all inbound and outbound directory structures are created properly with appropriate permissions and sufficient disk space, and all tables used during the ETL are present with necessary privileges.

b) That the ETL routines give expected results:
i. All transformation logics work as designed from source till target
ii. Boundary conditions are satisfied - e.g. check for date fields with leap year dates
iii. Surrogate keys have been generated properly
iv. NULL values have been populated where expected
v. Rejects have occurred where expected and a log for rejects is created with sufficient details
vi. Error recovery methods
vii. Auditing is done properly

c) That the data loaded into the target is complete:
i. All source data that is expected to get loaded into the target actually gets loaded - compare counts between source and target and use data profiling tools
ii. All fields are loaded with full contents - i.e. no data field is truncated while transforming
iii. No duplicates are loaded
iv. Aggregations take place in the target properly
v. Data integrity constraints are properly taken care of
(Some of these checks are sketched as automated tests after the System testing paragraph below.)

System testing:
Generally the QA team owns this responsibility. For them the design document is the bible and the entire set of test cases is directly based upon it. Here we test for the functionality of the application and mostly it is black-box testing. The major challenge here is preparation of test data. An intelligently designed input dataset can bring out the flaws in the application more quickly. Wherever possible, use production-like data. You may also use data generation tools or customized tools of your own to create test data. We must test for all possible combinations of input and specifically check out the errors and exceptions. An unbiased approach is required to ensure maximum efficiency. Knowledge of the business process is an added advantage since we must be able to interpret the results functionally and not just code-wise.
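Several of the unit-level checks listed above lend themselves to small automated tests. The sketch below uses Python's built-in unittest module against a toy target table; the column names and the leap-year boundary case are assumptions chosen only to mirror the checklist.

```python
import unittest
from datetime import date

# Toy target table standing in for the loaded warehouse table.
TARGET = [
    {"sk": 1, "order_id": "A1", "order_date": date(2020, 2, 29), "amount": 10.0},
    {"sk": 2, "order_id": "A2", "order_date": date(2021, 3, 1), "amount": 15.5},
]
SOURCE_COUNT = 2

class EtlUnitChecks(unittest.TestCase):
    def test_counts_match(self):
        # All source data expected to be loaded actually got loaded.
        self.assertEqual(len(TARGET), SOURCE_COUNT)

    def test_no_duplicates(self):
        keys = [row["order_id"] for row in TARGET]
        self.assertEqual(len(keys), len(set(keys)))

    def test_surrogate_keys_generated(self):
        self.assertTrue(all(row["sk"] is not None for row in TARGET))

    def test_no_unexpected_nulls(self):
        for row in TARGET:
            self.assertIsNotNone(row["amount"])

    def test_leap_year_boundary(self):
        # Boundary condition: a 29 February date must load unchanged.
        self.assertIn(date(2020, 2, 29), [row["order_date"] for row in TARGET])

if __name__ == "__main__":
    unittest.main()
```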

The QA team must test for:
i. Data completeness - match source to target counts
ii. Data aggregations - match aggregated data against staging tables
iii. Granularity of data is as per specifications
iv. Error logs and audit tables are generated and populated properly
v. Notifications to IT and/or business are generated in proper format

Integration testing:
This is done to ensure that the application developed works from an end-to-end perspective. Here we must consider the compatibility of the DW application with upstream and downstream flows. We need to ensure data integrity across the flow. Our test strategy should include testing for:
i. Sequence of jobs to be executed with job dependencies and scheduling
ii. Re-startability of jobs in case of failures
iii. Generation of error logs
iv. Cleanup scripts for the environment including the database
This activity is a combined responsibility, and participation of experts from all related applications is a must in order to avoid misinterpretation of results.

Acceptance testing:
This is the most critical part because here the actual users validate your output datasets. They are the best judges to ensure that the application works as expected by them. However, business users may not have proper ETL knowledge. Hence, the development and test team should be ready to provide answers regarding the ETL process as it relates to data population. The test team must have sufficient business knowledge to translate the results in terms of business. Also, the load windows, refresh period for the DW and the views created should be signed off from users.

Regression testing:
A DW application is not a one-time solution. Possibly it is the best example of an incremental design where requirements are enhanced and refined quite often based on business needs and feedback. In such a situation it is very critical to test that the existing functionalities of a DW application are not messed up whenever an enhancement is made to it. Generally this is done by running all functional tests for existing code whenever a new piece of code is introduced. However, a better strategy could be to preserve earlier test input data and result sets and run the same again. The new results can then be compared against the older ones to ensure proper functionality.
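The regression strategy of preserving earlier result sets and comparing them with a fresh run can be sketched as follows. The CSV file names and the idea of storing the baseline as a flat file are assumptions made for illustration.

```python
import csv

def read_result_set(path):
    """Load a result set (baseline or current run) as a list of rows."""
    with open(path, newline="") as f:
        return [tuple(row) for row in csv.reader(f)]

def regression_compare(baseline_path, current_path):
    """Compare the preserved baseline results with the new run's results."""
    baseline = read_result_set(baseline_path)
    current = read_result_set(current_path)
    missing = [row for row in baseline if row not in current]
    unexpected = [row for row in current if row not in baseline]
    return missing, unexpected

if __name__ == "__main__":
    # Hypothetical file names: the baseline was produced by a previous release.
    missing, unexpected = regression_compare("baseline_results.csv", "current_results.csv")
    if not missing and not unexpected:
        print("Regression check passed: existing functionality intact.")
    else:
        print(f"Regression check failed: {len(missing)} rows missing, {len(unexpected)} unexpected.")
```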

Performance testing:
In addition to the above tests, a DW must necessarily go through another phase called performance testing. Any DW application is designed to be scalable and robust. Therefore, when it goes into the production environment, it should not cause performance problems. Here, we must test the system with a huge volume of data. We must ensure that the load window is met even under such volumes. This phase should involve the DBA team, ETL experts and others who can review and validate your code for optimization.

Summary:
Testing a DW application should be done with a sense of utmost responsibility. A bug in a DW traced at a later stage results in unpredictable losses. And the task is even more difficult in the absence of any single end-to-end testing tool. So the strategies for testing should be methodically developed, refined and streamlined. This is also true since the requirements of a DW are often dynamically changing. Under such circumstances, repeated discussions with the development team and users are of utmost importance to the test team. Another area of concern is test coverage. This has to be reviewed multiple times to ensure completeness of testing. Always remember, a DW tester must go an extra mile to ensure near defect-free solutions.