Project Name

Project Name
TEST PLAN
Draft Version 1.0
November 2009


Table of Contents
1.1. Build and Deployment Process ........ 15
1.2. Configuration Management ............ 15
1.3. Hardware Specification .............. 15
1.4. Software Specification .............. 15
1.5. Reviewers ........................... 21
1.6. Disposition ......................... 21

Page: 2 of 21

College Board Proprietary and Confidential

QUALITY ASSURANCE

REVISION HISTORY

Date         Version   Description     Author
11/12/2009   1.0       Initial Draft   Tester who prepared

PURPOSE
This Test Strategy Document describes the scope, resources, role definitions, and schedule of intended testing activities for the PSAT/AP Integration with EID & EDW Project. It identifies the strategy employed for test preparation and testing, the types of tests performed, the approach, and the identification of associated risks. Additionally, it provides environment definitions. NOTE: As the project progresses, this Test Strategy may change.

INTENDED AUDIENCE
The PSAT/AP Integration with EID & EDW Project team members are the intended audience for this document.

1. INTRODUCTION
This test plan outlines the standard tests that should be applied to PSAT/AP Integration with EID & EDW. This document will outline the strategy and approach that will be used to test the PSAT/AP Integration with EID & EDW project.

1.1. Test Scope
The scope of the work to be performed includes the following:
• Development and implementation of a strategic, or Master, Test Plan (this document).
• Development and implementation of a testing approach based on best practices.
• Establishment and implementation of verifying and validating application(s) compliance with business requirements and Use Cases.
• Identification of all the resources required to support the project testing initiatives.
• Testing efforts focus on validating the integration of PSAT (POS) Student, Admin, Score Reporting and organizational information into the IODS (EID).
• Testing efforts focus on validating the integration of AP (APD) Student, Admin, Score Reporting and organizational information into the IODS (EID).
• Testing efforts include validating integration of PSAT data loaded into IODS (EID) into EDW as a base-layer for subsequent EDW.
• Testing efforts include validating integration of AP data loaded into IODS (EID) into EDW as a base-layer for subsequent EDW.
• The primary scope of testing efforts is to validate the back-end data to be provided by this project to the SDRS Retirement, AP On-line Score Reporting and Cross Program Reporting projects.

1.2. Test Items
Testing will consist of several phases; each phase may or may not include testing of one or more of the following items:

List of All Test Items

Items To Be Tested
• Integration of PSAT (POS) – IODS (EID)
• Integration of AP (APD) – IODS (EID)
• Integration of PSAT data loaded into IODS (EID) into EDW
• Integration of AP data loaded into IODS (EID) into EDW

Items Not To Be Tested   If not, why not
{Usability}              {No usability spec}
{Reliability}            {No way to get meaningful uptime testing in QA}
{White-box testing}      {No white-box resource on the test team}

The POS1 (PSAT) database shall be one of the sources for IODS.

1.3. Test Objectives
The Test Objectives for this effort are:
• To validate the functionality defined as within scope in the Business Requirements, Technical Specification/Requirements, Use Cases, Appendices, and Supplemental information.
• To identify, report, and track through resolution all software problems encountered during the System Integration Test and User Acceptance Test.
• To adhere to the College Board Corporate System Life Cycle Methodology (SLCM) throughout the testing process.
• To test and certify that this project does not adversely affect the baseline functionality of existing applications and that those applications continue to work as they do today.
• Build and retain a test team that is knowledgeable about the PSAT/AP Integration with EID & EDW and its related projects.
• Enforce a controlled test environment that simulates the production environment and preserves the integrity of the tests.
• Provide management with regular updates on the status of the tests.
• Map the requirements to the test cases.

2. ASSUMPTIONS AND RISKS

2.1. Assumptions
• Any changes in project schedule and resources may delay the QA delivery date.
• A stable QA environment exists. Any change to the QA environment must be approved by QA personnel.
• Testing will occur on the most current version of PSAT/AP in the test environment.
• The version of the operating system, application server software, and all service packs match production.
• All functional requirements are properly defined and detailed enough to meet testing needs.
• During the test process, all required data feeds will be available in the test environment.
• Change Control procedures are followed.
• All human resources scheduling will be based upon a five-day, 40-hour workweek.
• The project team will be responsible for the timely resolution of all defects.
• All severity 1 and severity 2 defects are resolved prior to production release.
• Not all defects will be resolved prior to production release. Stakeholders will review each outstanding defect at completion of the QA cycle against go/no-go criteria.

2.2. Risks
The QA team's testing may be impeded by the following risks:
• Functionality changes implemented late in the development cycle will impact the testing cycle.
• Browser/OS/Hardware combinations cannot be exhaustively tested.
• The test environment shares resources with other applications (e.g., memory, processor power, and disk space). In other words, other applications' outages could take our QA instance down.
• Availability/conflict issues in the QA environment may delay the delivery date.
• Obtaining support for third party tools may introduce delays.

2.3. Referenced Documents
The following documents are related to and/or referenced by this document:

Document and Description: PSAT/AP Integration with EID & EDW Scope Statement, Version 2.0
Location: Z:\edmvob\IT Projects\PSAT-AP Integration into EID & EDW\Requirements

3. TEST STRATEGY
The test strategy consists of a series of different tests and testing types that will fully exercise the PSAT/AP Integration with EID & EDW system. The intent of these tests is to verify that all requirements are met, to uncover the system limitations, and to measure the system's full capabilities. The scripts, scenarios, and cases can be reused for regression testing; additionally, they can be modified to test new functionality in future releases.

The following practices will be followed as part of the test strategy:
• Testers can perform testing from their regular workstations where possible.
• Identified testing participants will receive instructions prior to the start of testing regarding the availability of the systems and environments they need to access.
• Test scripts/cases and scenarios will be prepared and shared with the rest of the team.
• Test participants will conduct the tests and document detailed results as testing progresses. Test results must still be coordinated with others.
• Defects, if any, will be logged directly into CQTM and the team notified so that the defects can be rectified in a timely manner.

The tools listed below provide the means by which the tester and business user agree on the accuracy and completeness of the testing of the system's requirements. The following application(s)/software will be used to that end.

3.1. Test Cases
The fundamental purpose of the Test Case is to ensure the accurate mapping of requirements to test items. Business Requirements and/or Use Cases will be used as the basis for developing Test Cases, Test Scenarios, and Test Scripts. The Test Case should also note the type of test or tests performed in order to most completely match the Use Case requirements.
Tool Name                            Purpose / Definition
Rational RequisitePro (ReqPro)       ReqPro is the tool … to create traceability relationships between business requirements / use cases and test cases.
Test Permutation Spreadsheet (TPS)   TPS is the tool to provide a structure for identifying tests, and to facilitate mapping back to the requirements source.

Here are some guidelines for creating Test Cases:
• Test Case to Use Case traceability should be defined to ensure complete and adequate coverage.
• Each Test Case should match the Use Case as closely as possible.
• Each Test Case should address specific functionality.
• Test Cases should include test scenarios that are appropriate to the Use Case.
• Positive and negative test scenarios should be developed.
• Test data for each Test Case should be specified.
• Each Test Case should be developed during the Test Preparation phase.
• Each tester should update Test Cases for use during test execution.
• Upon the completion of each Test Case, the test duration should be specified.

The following figure shows an example of a Test Case to be developed. A complete listing of Test Cases can be found at {alternate location, appendix, link, etc.}

#   Test Case   Type of Test   Use Case Reference   Test Case Description   Priority   Expected Duration

3.2. Test Scenarios
Test Scenarios are collections of test conditions (i.e., setup parameters, permissions, security, customer, functional area, reference data, input files/data, intermediate files/data, and output files/data, etc.). Test Scenario steps will be specific and repeatable. They will include information such as a brief description and the estimated length of time for developing the scenario. The estimations assume that X tester(s) are available for developing the scenarios. See the figure below. A complete listing of Test Scenarios can be found at {alternate location, appendix, link, etc.}

#   Test Scenario   Test Case Reference   Input Data   Action / Process   Output Data   Priority   Actual Duration

Here are some guidelines for creating Test Scenarios:
• Each Test Scenario should address specific functionality within a Test Case.
• Test data for each Test Scenario should be specified.
• Each tester should add/update the Test Scenario for use during test execution.
• Upon the completion of each Test Scenario, the test duration should be specified.

3.3. Test Scripts
Test Scripts are collections of test steps organized in the sequence in which they are performed, along with expected results and the verification process. Test Script steps will be specific and repeatable but, unlike Test Scenarios, contain detailed blocks labeled "Test Step" and "Expected Results". Test scripts are typically used in conjunction with an automated testing tool, or to delineate a particularly complicated or detailed test scenario. See the figure below. A complete listing of Test Scripts can be found at {alternate location, appendix, link, etc.}

#   Test Script   Test Scenario Reference   Test Step   Expected Results

3.4. Test Data
The test team will derive test data from {list data sources here}. {Also list here the data creation strategy, if any} {This section should include any need to refresh data}

Data Types & Source   Database   Data Creation Strategy   Responsible Party   Add'l Considerations

3.5. Test Approach
The PSAT/AP Integration with EID & EDW Phase 1.0 testing approach will include:

3.5.1. Unit Test
Applies to all application code: interface programs, database stored procedures, and other utility code.
• Developers will perform manual unit testing.
• Individual developers are responsible for unit testing their code.
• Unit testing will take place continuously during construction.
• Unit test cases will be added whenever functionality is added, code is reorganized, or defects are fixed.
• Team leads and architects will review.
• Development teams are responsible for spot-checking and enforcing practices and standards for coding and unit testing.

3.5.2. Smoke Test

Applies to code chunks or full code builds delivered from the Development Team. Objectives are to demonstrate that a build intended for further testing is correctly installed on the target test environment and is working as expected. Objectives do not include performance testing.
• Shakeout tests will be performed whenever a new build is installed in the Test, Acceptance and Production environments.
• Shakeout tests will use a limited number of regression test scripts built during string or system testing to confirm proper operation of a build in the target environment.
• Shakeout tests will be selected from a set of string or system test cases prior to the shakeout of a build in the test environment, to focus on areas of anticipated vulnerability for the environment and code.
• Shakeout tests will be executed by the team(s) designated as the responsible party.

3.5.3. System Test
System Test will include validation and verification of the following, but is not limited to:

• Data Completeness. Ensures that all expected data is loaded. One of the most basic tests of data completeness is to verify that all expected data loads into the data warehouse. This includes validating that all records, all fields, and the full contents of each field are loaded. Strategies that will be considered include:
  o Comparing record counts between source data, data loaded to the warehouse, and rejected records.
  o Comparing unique values of key fields between source data and data loaded to the warehouse. This is a valuable technique that points out a variety of possible data errors without doing a full validation on all fields.
  o Populating the full contents of each field to validate that no truncation occurs at any step in the process. For example, if the source data field is a string(30), make sure to test it with 30 characters.
  o Utilizing any data profiling tool that shows the range and value distributions of fields in a data set. This will be used during testing and in production to compare source and target data sets and point out any data anomalies from source systems that may be missed even when the data movement is correct.

• Data Transformation. Ensures that all data is transformed correctly according to business rules and/or design specifications. Validating that data is transformed correctly based on business rules can be the most complex part of testing an ETL application with significant transformation logic. One typical method that will be used in validation is picking some sample records and doing a "stare and compare" to validate data transformations manually.

• External Integration.
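The completeness strategies described here (record-count reconciliation, unique-key comparison, and truncation checks) can be automated with a short script. The sketch below is illustrative only: the record layout and the `id`/`name` field names are invented, and in practice the rows would come from SQL queries against the source system and the warehouse rather than in-memory lists.

```python
# Illustrative data-completeness checks: record-count reconciliation,
# unique-key comparison, and truncation detection. The rows and field
# names ("id", "name") are invented; real rows would come from SQL
# queries against the source system and the warehouse.

def completeness_checks(src_rows, tgt_rows, rejected_count, key, text_field):
    """Return a dict of completeness findings for one table load."""
    findings = {}
    # 1. Record counts: every source record is either loaded or rejected.
    findings["counts_ok"] = len(src_rows) == len(tgt_rows) + rejected_count
    # 2. Unique key values present in the source but missing from the target.
    src_keys = {r[key] for r in src_rows}
    tgt_keys = {r[key] for r in tgt_rows}
    findings["missing_in_target"] = sorted(src_keys - tgt_keys)
    # 3. Truncation: the full field content must survive the load, e.g. a
    #    string(30) source field tested with a 30-character value.
    tgt_by_key = {r[key]: r for r in tgt_rows}
    findings["truncated_keys"] = sorted(
        r[key] for r in src_rows
        if r[key] in tgt_by_key and tgt_by_key[r[key]][text_field] != r[text_field]
    )
    return findings

src = [
    {"id": 1, "name": "A" * 30},   # full-width value to expose truncation
    {"id": 2, "name": "short"},
    {"id": 3, "name": "bad row"},  # assume this one was rejected by the ETL
]
tgt = [
    {"id": 1, "name": "A" * 25},   # truncated during the load
    {"id": 2, "name": "short"},
]
print(completeness_checks(src, tgt, rejected_count=1, key="id", text_field="name"))
# → {'counts_ok': True, 'missing_in_target': [3], 'truncated_keys': [1]}
```

A production version of the same idea would push the count and key comparisons down into the databases as SQL rather than pulling full tables into memory.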

• Data Quality. Ensures that the ETL application correctly rejects, substitutes default values, corrects or ignores, and reports invalid data. For example: reject the record if a certain decimal field has nonnumeric data. To ensure success in testing data quality, QA will include as many data scenarios as possible.

3.6. Test Tools and Techniques
The idea is to create expected results and then compare those with the actual results. The test team has a number of generic tools to accomplish this. The tools used primarily are VBScript, CQTM, Toad and Rational ClearCase. In addition, a number of custom scripts will have to be developed to test project-specific functionality like Data Integration. Simple automated data movement techniques that will be utilized may include:
  o Creating a spreadsheet of scenarios of input data and expected results and validating these with the business customer/Subject Matter Expert (SME). QA may need the help of an ETL developer to automate the process of populating data sets from the scenario spreadsheet to allow for flexibility, because scenarios will change.
  o Creating test data that includes all scenarios.
  o Utilizing data profiling results to compare range and distribution of values in each field between source and target data.
  o Validating correct processing of ETL-generated fields such as surrogate keys.
  o Validating that data types in the warehouse are as specified in the design and/or the data model.
  o Setting up data scenarios that test referential integrity between tables (for example, what happens when the data contains foreign key values not in the parent table).
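A minimal sketch of the expected-vs-actual technique described here, combined with the nonnumeric-decimal rejection rule used as the data-quality example. The `transform` function and the `amount` field are hypothetical stand-ins for the project's real ETL logic; the scenario rows play the role of the scenario spreadsheet agreed with the business SME.

```python
# Illustrative "expected vs. actual" comparison driven by scenario rows,
# plus the data-quality rule cited in the text (reject the record if a
# decimal field holds nonnumeric data). transform() and the "amount"
# field are hypothetical stand-ins for the real ETL logic.
from decimal import Decimal, InvalidOperation

def transform(record):
    """Toy stand-in for the ETL transformation under test."""
    try:
        amount = Decimal(record["amount"])
    except InvalidOperation:
        return {"status": "rejected", "reason": "nonnumeric decimal"}
    return {"status": "loaded", "amount": amount}

# Scenario rows as they might be exported from the scenario spreadsheet:
# (input record, expected result) pairs agreed with the business SME.
scenarios = [
    ({"amount": "10.50"}, {"status": "loaded", "amount": Decimal("10.50")}),
    ({"amount": "abc"},   {"status": "rejected", "reason": "nonnumeric decimal"}),
]

failures = [(inp, expected, transform(inp))
            for inp, expected in scenarios
            if transform(inp) != expected]
print("PASS" if not failures else failures)  # → PASS
```

Because the scenarios live in data rather than in code, new rows can be added to the spreadsheet export without touching the comparison harness.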

4. RESOURCES AND SCHEDULE

4.1. Resources and Responsibilities
This section presents the recommended allocation of QA resources for the test effort and their responsibilities.

ROLE               NAME
Project Manager
Business Analyst
System Test Lead   Sherry Chen
Tester(s)          Krishna Levaku

RESPONSIBILITIES
• Communication with the customer to agree on the scope of QA.
• Agreement of acceptance criteria with the customer prior to commencing System Test.
• Assist QA with the creation of a detailed test plan.
• Assist QA with the creation of detailed test cases.
• Ensure that a detailed test plan is available for QA.
• Ensure that user IDs and passwords for all testers have been created and distributed prior to the start date of System Testing.
• Ensure testing takes place within agreed-upon timeframes.
• Execute test scripts/cases to ensure the application performs at an acceptable level.
• Document testing results.
• Ensure that bugs identified during Functional Testing are logged in the System Test Cases spreadsheet located in ClearCase and that these issues are communicated to the development team on a timely basis.

4.2. Testing Schedule

Milestone Description     Deliverable   Dependency   Duration   RUP Phase (if applies)   Responsible Party
Environment Preparation                              1 day      Construction             IG
Create Test Plan          Test Plan                  1 day      Elaboration              QA
Create Test Cases         Test Cases                 3 days     Elaboration              QA
Smoke Test
Interface Test

Milestone Description      Deliverable                  Dependency   Duration   RUP Phase (if applies)   Responsible Party
System Test                System Test Summary Report                5 days     Construction             QA
Regression Test
Performance Test
Failover & Recovery Test
Intrusion Test
User Acceptance Test

4.3. Requirements Traceability
All the requirements of the PSAT/AP Integration with EID & EDW project will be mapped to corresponding test cases in Microsoft Excel. No requirement will be left without mapping to a test case.
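Once the Excel traceability matrix is exported, the rule that no requirement be left without a test case can be checked mechanically. The requirement and test case IDs below are made up for illustration; in practice the mapping would be read from the Microsoft Excel matrix.

```python
# Illustrative requirements-traceability check. The requirement and test
# case IDs are made up; in practice the mapping would be read from the
# Microsoft Excel traceability matrix.
coverage = {
    "REQ-001": ["TC-01", "TC-02"],
    "REQ-002": ["TC-03"],
    "REQ-003": [],   # not yet mapped -- must be flagged before test execution
}

unmapped = sorted(req for req, cases in coverage.items() if not cases)
print(unmapped)  # → ['REQ-003']
```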

5. TEST ENVIRONMENT
The build will be delivered to the X environment.

1.1. Build and Deployment Process
The Project team receives an email from the X team once the build has been deployed.

1.2. Configuration Management
TBD. CM is responsible for configuration management; the details are beyond the scope of the tester.

1.3. Hardware Specification
TBD. The client machine that will be used to test the application runs a 2.66 GHz Core 2 Duo with 2 GB DDR2 667 MHz RAM, a 160 GB HDD and a DVD R/W drive, among others. The OS is Windows XP for client machines and UNIX for backend servers.

1.4. Software Specification
TBD. The project involves Oracle 11g, WebLogic and Java.

System Component                   Description
EDW Database Instance / Schema     EDW1.QA, EDW1.QA3 / EDW1
POS1 Database Instance / Schema    POS1.QA, POS1.QA3 / PSAT
IODS Database Instance / Schema    EID1.QA, EID1.QA3 / IODS
UNIX Server                        oqsgeninfl20.qa.collegeboard.cb

System Component   Description
Informatica        PowerCentre 8.5.1 client > qa_repo

6. TEST STATUS REPORTING
In the planning stages, test status will be reported via test status reports. During test execution, defect/problem reports will be generated at specified intervals. Issues will be raised, escalated and resolved as needed.

6.1. Defect Reporting
When a problem is discovered, it is logged in ClearQuest. The problem initiator will notify the Development Team Lead or Coordinator immediately of the specific problem encountered. The problem will be reviewed and assigned to a developer for resolution. Both the Development and the Test Teams will update the status of problems in ClearQuest.

Defects will be repaired in the development environment, then promoted to the system test environment. When a software fix has been completed, the Tech Lead / Project Manager must indicate in ClearQuest that the defect is ready for retest. This means that the fix has been done, unit/string tested, and is waiting to be migrated. At that point a fix can be system tested and then closed.

All Emergency, Severity 1 and Severity 2 defects found in the current iteration will be resolved prior to implementing the next iteration of the PSAT/AP Integration with EID & EDW application in the QA environment.

Problems will be classified along the following severities:

Critical – A problem that brings test execution to a halt.
Major – A problem that severely impedes the progress of the test. This problem prohibits completion of a specific function of the test, but does not totally suspend execution of the test.
Average – A problem that is not only cosmetic in nature, but also has an impact on a specific functionality, and needs to be resolved in a timely manner prior to implementation. It does not severely impede the progress of the test.
Minor – A problem that is cosmetic in nature, with a low priority; it does not prohibit the execution or implementation of the release.
Initially, priorities will be assigned by the QA team at the time of submitting the defects; a defect may be reassigned a higher or a lower priority by the Project Manager upon triage. The three priorities that will be assigned are High, Medium and Low.

6.2. Metrics
The QA team will produce the following metrics during the different test phases:

Execution Metrics
• Planned Test Cases executed
• Actual Test Cases executed
• Test Cases passed
• Test Cases failed
• Test Cases not executed / not executable
• Rate of scenario execution

Defect Metrics
• Defects opened, current period
• Defects closed, current period
• Mean time to close a defect
• Rate of defect closure

Risk Assessment Metrics
• Defects outstanding by severity
• Defects outstanding by priority

6.3. Change Control Process
When a software fix has been completed, the Development Manager or Lead must indicate in ClearQuest that the bug is ready for retest. This means that the fix has been applied and unit tested, and is waiting to be deployed. Defects will be repaired in the development environment and then promoted to the system test environment. Once the fix is deployed to the test environment, the Tester is responsible for retesting and closing the problem when it is resolved. If the problem has not been resolved, the tester should update the bug with comments to that effect. The basic happy path is shown in the diagram below:
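The defect metrics named in this section can be computed directly from a defect-tracking export. The defect records below are invented; a real report would read opened/closed dates from a ClearQuest query.

```python
# Illustrative computation of the defect metrics named in the plan. The
# defect records are invented; a real report would be driven by a
# ClearQuest export.
from datetime import date

defects = [
    {"opened": date(2009, 11, 2), "closed": date(2009, 11, 5)},
    {"opened": date(2009, 11, 3), "closed": date(2009, 11, 4)},
    {"opened": date(2009, 11, 6), "closed": None},  # still open
]

closed = [d for d in defects if d["closed"] is not None]
mean_days_to_close = sum((d["closed"] - d["opened"]).days for d in closed) / len(closed)
still_open = len(defects) - len(closed)
print(len(closed), still_open, mean_days_to_close)  # → 2 1 2.0
```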

7. PRODUCTION READINESS

7.1. Delivery Plan
Once all QA test types have been satisfied, the PSAT/AP Integration with EID & EDW production readiness efforts will be able to commence. The start of this activity is the milestone that marks the successful completion of all testing activities defined to support the system development life cycle for the PSAT/AP Integration with EID & EDW project.

7.2. Pre-Deployment Checklist
TBD. Only upon signoff by QA, in the form of a release readiness document, shall the product be deployed in Production.

7.3. Production Validation
TBD
QA will be available to support the application on the morning of deployment to validate essential checkpoints during the process.

7.4. Post-Production Monitoring
TBD. The QA team will allocate a resource part time to ensure that any issues found post-production are tackled in a timely manner.

8. APPENDIX

8.1. Terms and Abbreviations

Term   Definition or Reference
EDW    Enterprise Data Warehouse
IODS   Integrated Operational Data Store
CDC    Change Data Capture (a utility within the Informatica PowerExchange ETL tool)
ETL    Extraction, Transformation and Loading

8.2. Test Type Definitions
Below are listed some examples of test types with a brief description. The PSAT/AP Integration with EID & EDW project may only use a subset of these.

Algorithmic: Testing that validates that all calculations are accurate. Algorithms will be validated against requirements and will be tested with maximum, nominal, minimum, and erroneous input values.

Black-box: Testing based on what an application is supposed to do. Also called functional testing.

Boundary: Boundary tests confirm the minimum and maximum values that a field will accept. Boundary tests entail inputs far below, just below, at, just above, and far above the stated requirement for an input value.

Erroneous: Erroneous input tests check the processing of invalid inputs. Testers select values that test the range and types of input data by inputting values that are known to be invalid (e.g., alphabetic characters in a numeric field).

Integration: Testing that seeks to verify the proper functioning between and among groups of components.

Interface: Testing to verify the accurate exchange of data as well as ensuring that processing times are within an acceptable range. Interfaces may be internal as well as external.

Intrusion: Testing that applies to all application functionality, but focuses on analysis of vulnerability to Internet-based penetration, security access controls, and data encryption techniques.

Load: Testing to ascertain the performance at prescribed user levels. Contrast with performance and stress testing. Load testing is more appropriate when attempting to discover the performance signature of the system.

Performance: Testing to ascertain the performance of the application as it appears to the end user (page response time) and its components (app server, database server, etc.) at varying user levels.

Production Validation: A checklist of items to be done on the morning of deployment on the production environment, but before the application is open to the public.

Regression: Not specifically a standalone test type, but performed for any change to a system component that can place that system and all associated and interfacing systems at risk. Change can have far-reaching effects. Regression testing validates that no previously approved function, system, application, or component has been compromised by a change. This type of testing may or may not use an automated tool.

Smoke: Testing to demonstrate that a build intended for further testing is correctly installed on the target test environment and ready for test. A basic test of the developed components of a system: a smoke test of major functionality, and navigation testing that includes validation that all screens and navigation points can be reached for each specific role. Validates the building blocks or foundation of the change. This type of test is performed during the initial test phase.

Stress: Testing to push a component or a system to its breaking point; a means of ascertaining how many users the system can host before component 'x' fails completely. This type of testing assumes system degradation; it merely determines whether the rate of degradation is acceptable.

System / Functional: Testing the behaviors, functions, and responses once the system is completely built and has passed integration testing. System tests include compatibility testing, and can include negative path testing (i.e., testing of invalid data inputs and user actions).

Unit/Class: Testing individual classes/components/programs/modules or objects of a system.
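The boundary definition above ("far below, just below, at, just above, and far above") maps directly onto a small test-value generator. In the sketch below, the 0..100 field limits and the "far" offset are arbitrary examples, not values from this project's requirements.

```python
# Illustrative boundary-value generator. The 0..100 field limits and the
# "far" offset are arbitrary examples, not values from the requirements.
def boundary_values(lo, hi, far=1000):
    """Values far below, just below, at, just above, and far above each bound."""
    return sorted({lo - far, lo - 1, lo, lo + 1, hi - 1, hi, hi + 1, hi + far})

print(boundary_values(0, 100))
# → [-1000, -1, 0, 1, 99, 100, 101, 1100]
```

Each generated value would then be fed to the field under test, with the in-range values expected to be accepted and the out-of-range values expected to be rejected.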

User Acceptance: Testing includes testers executing production-like scenarios using formal test cases with specified inputs and expected outputs. This type of testing is NOT done with the intent of finding bugs; rather, it is done to demonstrate sufficiency and correctness.

1.5. Reviewers
List of reviewers and their associated roles.

PARTICIPANT'S NAME    ROLE

1.6. Disposition
If the artifact has been reviewed and has successfully met the criteria specified for this artifact, the artifact should be marked as "Approved".

Document Review Status:   Approve / Changes Required
