User Acceptance and System Test Checklist

Your use of this document is subject to the Terms of Use governing Enterprise Business Intelligence Roadmap. The information contained in this document is proprietary information of Cognos Incorporated and/or its licensors and is protected under copyright and other applicable laws. You may use the information and methodologies described in this document "as is," or you may modify them; however, Cognos will not be responsible for any deficiencies or errors that result from modifications you make. Copyright (c) 2005 Cognos Incorporated. All Rights Reserved.

User Acceptance and System Test Checklist
Because User Acceptance Test and System Test are similar (both test the solution in its entirety), this checklist applies to both.

User Acceptance and System Test Steps
Define Approach and Plan (at the end of the Analyze phase)
• Complete the test approach, including the test objectives and scope, risks, test environment requirements, metrics, entry and exit criteria, and test resources and work plan.
  o Ensure the approach identifies all interfaces to data sources and destinations.
  o Ensure the approach addresses entire business functions (i.e., tests the system from a business user's point of view).
  o Ensure the approach addresses quality at the system level: performance, flexibility, usability, and reliability.
  o Ensure all appropriate data and transactions are secure.
  o Make sure processing can be reconstructed from audit trails.
  o Ensure the work plan has users involved in all aspects of the test.
  o Ensure the work plan includes all tasks, resources, and budgets.
  o Make sure all identified risks have mitigation plans.
  o Communicate the test schedule to involved parties.
• Plan and prepare inputs to the test to meet the exit criteria: test approach and requirements.
  o Preliminarily define test cycles.
  o Meet with users to establish conditions. Write conditions in functional terms. Ensure conditions thoroughly test the business functions. Tie conditions to business benefits.
  o Identify functional test conditions. For example, identify:
    Portal Tests: Ensure that the Public Folders structure follows the Report and Roster structure, that all relevant Rosters are displayed and accessible, and that all navigation works correctly.
    Data Completeness Tests: The Completeness Tests verify that all input records are represented in the destination reports. Spot-check values on the reports to ensure that counts and averages show the correct ranges and values. Due to the volume of tests and data, sample test scripts will be used to ensure testing occurs effectively and efficiently.
    Data Accuracy Tests: The Accuracy Tests verify that values are transferred accurately to the destination reports from the source. A random sample of records is selected from each test; because the application summarizes data into reports, testing will be more meaningful if the selected records have some attribute in common, e.g., the same department. Locate and examine the same records in the destination reports, and ensure that data values have been accurately represented in each report component.
  o Identify quality test conditions. For example, identify Security Tests: test to ensure that only authorized users can access the application, and that users with restricted access see only the relevant sections of the Portal and the reports.
  o Identify technical test conditions. For example, identify Access Tests: test to ensure that the user can access the application from any networked PC via a web browser, without any client software installed (other than required third-party tools such as Adobe).
  o Identify conditions regarding compliance with external regulations.
  o Identify any conditions that cannot be tested.
  o Describe expected results for each condition.
• Logically group conditions into test cycles.
  o Ensure all cycles test an appropriate number of conditions.
  o Make sure cycles are tied to the business process.
  o Include quality conditions in one or more cycles.
  o Include technical conditions in one or more cycles.

Prepare and Execute (during Build phase)
• Create scripts.
  o Augment test conditions if necessary.
  o Maintain cross-references between test cycles and test conditions.
  o Make test scripts appropriately modular (high-level, detailed, error processing).
  o Maintain cross-references between input data, output data, and test conditions.
  o Ensure all inputs test specified conditions.
  o Ensure all output references demonstrate the conditions being tested.
  o Ensure the expected results are easy for the executor to understand.
  o Ensure the test is repeatable based on the information given.
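The data completeness and accuracy tests described above can be automated rather than run entirely by hand. The following is a minimal sketch only, assuming the source records and the destination report extract are available as CSV files; the file names, column names, and fixture data are invented for illustration and are not part of this checklist:

```python
import csv
import random

def load_rows(path, key_field):
    """Read a CSV file and index its rows by the given key field."""
    with open(path, newline="") as f:
        return {row[key_field]: row for row in csv.DictReader(f)}

def completeness_gaps(source, report):
    """Completeness: return IDs of source records missing from the report."""
    return sorted(set(source) - set(report))

def accuracy_mismatches(source, report, fields, sample_size=25, seed=0):
    """Accuracy: spot-check a random sample of shared records field by field."""
    shared = sorted(set(source) & set(report))
    rng = random.Random(seed)  # fixed seed keeps the test run repeatable
    sample = rng.sample(shared, min(sample_size, len(shared)))
    return [(rid, f) for rid in sample for f in fields
            if source[rid].get(f) != report[rid].get(f)]

if __name__ == "__main__":
    # Hypothetical fixture data standing in for the real source and report.
    with open("source_records.csv", "w", newline="") as f:
        csv.writer(f).writerows([["record_id", "department", "amount"],
                                 ["1", "Sales", "100"], ["2", "Sales", "250"],
                                 ["3", "HR", "75"]])
    with open("report_extract.csv", "w", newline="") as f:
        csv.writer(f).writerows([["record_id", "department", "amount"],
                                 ["1", "Sales", "100"], ["2", "Sales", "999"]])

    source = load_rows("source_records.csv", "record_id")
    report = load_rows("report_extract.csv", "record_id")
    print("missing:", completeness_gaps(source, report))        # -> missing: ['3']
    print("mismatches:",
          accuracy_mismatches(source, report, ["department", "amount"]))
    # -> mismatches: [('2', 'amount')]
```

Keeping the sample seed fixed makes a failed cycle reproducible, which supports the repeatability requirement above.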

• Define test configuration.
  o Set up test environment hardware.
  o Configure test environment software.
  o Configure environment for system efficiency.
  o Reserve back-up hardware and equipment.
  o Document test configuration.

• Obtain test data.
  o Populate databases.
  o Ensure that any common test data used during the various stages of testing is up-to-date.
  o Establish responsibility for introducing code into the environment.
  o Refine promotion procedures.
  o Define responsibility for running batch jobs.
  o Dedicate technical support to the test environment.
• Run a mini-pilot of the test stage to ensure a stable environment.
• Train the test team in test procedures and tools.
  o Communicate standards for metrics collection to the test team.
  o Communicate standards and procedures for the test to the test team.
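Rebuilding the test data from fixtures at the start of each cycle is one way to keep common test data up-to-date and the environment clean. A minimal sketch, assuming an in-memory SQLite database stands in for the test environment's database; the table name, columns, and fixture rows are illustrative only:

```python
import sqlite3

# Hypothetical fixture data; in practice this comes from controlled test files.
FIXTURE_ROSTERS = [(1, "Northern Region"), (2, "Southern Region")]

def populate(conn):
    """Rebuild the test tables from fixture data so each cycle starts clean."""
    conn.execute("DROP TABLE IF EXISTS rosters")
    conn.execute("CREATE TABLE rosters (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO rosters VALUES (?, ?)", FIXTURE_ROSTERS)
    conn.commit()

conn = sqlite3.connect(":memory:")
populate(conn)
count = conn.execute("SELECT COUNT(*) FROM rosters").fetchone()[0]
print(count)  # -> 2
```

Because `populate` drops and recreates the tables, it can be rerun between cycles without leaving stale data behind.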

User Acceptance and System Test Exit Criteria
This section contains a set of exit criteria that can be used to control the quality of the deliverables related to the test. Use this checklist to understand when the test is considered complete.

Criteria Met?
• Ensure the solution performs all functionality as defined.
• Make sure the solution meets quality requirements.
• Determine whether one can gain a basic understanding of the solution from the documentation.

Documentation Complete?
• Actual results
• Problems remaining to be resolved
• Resolved problems
• Stakeholder consent

Standards Met?
• Ensure the actual results show proof of testing.
• Create cross-references to the test cycles and conditions.
• Ensure the flow of the test is clear.
• Complete point sheets with problems, and include resolutions.
• Ensure the environment is clean for the next cycle.

Process Followed?
• Ensure actual results match expected results.
• Ensure all conditions are successfully tested.
• Submit data and migration requests as necessary.
• Update test cycles and test conditions with testing status.
• Update the status of the system.
• Collect appropriate metrics.
• Obtain final consent from the test lead, project sponsor, or other authority.
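The "actual results match expected results" and status-update checks above can be recorded mechanically per condition. A minimal sketch, assuming expected and actual results are tracked as simple condition-ID/value pairs; the condition IDs and result strings below are invented for illustration:

```python
def cycle_status(expected, actual):
    """Compare expected and actual results per condition; report pass/fail.

    A condition with no recorded actual result is counted as a failure.
    """
    return {cond: ("pass" if actual.get(cond) == exp else "fail")
            for cond, exp in expected.items()}

# Hypothetical condition IDs and result values for one test cycle.
expected = {"PORTAL-01": "all rosters visible", "SEC-01": "access denied"}
actual = {"PORTAL-01": "all rosters visible", "SEC-01": "access granted"}

print(cycle_status(expected, actual))
# -> {'PORTAL-01': 'pass', 'SEC-01': 'fail'}
```

The resulting pass/fail map doubles as the testing status to record against each test cycle and condition.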

11/28/2005
