Project xxxx TEST STRATEGY

DOCUMENT NAME & LOCATION:
DOCUMENT VERSION:
DATE:
READERSHIP:
SUMMARY:

Amendment History

Version  Date  Comment  By  Approved
V0.1

Associated Documents (this document should be read in conjunction with):

Title of Document  Version No/File Name  Date

Approval

Approver: Project Manager  Name:  Date:


CONTENTS

1. Introduction
   1.1 Context
   1.2 Purpose
   1.3 Scope to be Tested
   1.4 Out of Scope (Not Tested)
2. Testing Approach
   2.1 Purpose
   2.2 Test Objectives
   2.3 Traditional Testing Approach
   2.4 Overview of Test Phases
       2.4.1 Component (Unit) Testing
       2.4.2 System Functional Testing
       2.4.3 End to End (E2E) Testing
       2.4.4 Technical (Non-Functional) Testing
       2.4.5 User Acceptance Testing (UAT)
       2.4.6 Operational Acceptance Testing (OAT)
       2.4.7 Regression Testing
   2.5 Proposed Test Approach
       2.5.1 Release Schedule
       2.5.2 Testing Schedule
   2.6 Risk Approach
3. Test Deliverables
   3.1 Testing Deliverables
   3.2 Detailed Test Plans
   3.3 Test Scripts
   3.4 Test Progress Reporting
4. Test Management
   4.1 Resource Management
   4.2 Assumptions and Dependencies
5. Defect Management
   5.1 Defect Management Approach
   5.2 Defect Status and Process
   5.3 Defect Severity
   5.4 Defect Priority
   5.5 Test Progress Reporting Metrics
6. Test Tools
   6.1 Introduction
   6.2 Overview of Testing Tool
   6.3 Test Tool Requirement and Description
APPENDIX A – Example Testing Risk Log
APPENDIX B – Example Detailed Test Phase Description
APPENDIX C – Test Plan Contents
APPENDIX D – Sample Testing Roles and Responsibilities

1. INTRODUCTION

1.1 Context

Project context.

1.2 Purpose

This document sets the strategy for all testing within the scope of the project. It describes:
• the test approach
• the test phases
• the principles governing testing activities

The delivery of the solution and the overall business strategy are excluded from the scope of this document.

1.3 Scope to be Tested

The following key components (sub-systems) will be tested:
•
•
•
• All aspects of the non-functional requirements

1.4 Out of Scope (Not Tested)

The following features and attributes will NOT be tested:
•
•
•

2. TESTING APPROACH

2.1 Purpose

This subsection describes the testing approach that will be adopted by the project.

2.2 Test Objectives

The test objectives are:

• To demonstrate that the solution meets all requirements
• To identify Defects (faults and failures to meet the actual requirements) with an agreed rectification plan
• To mitigate risk and demonstrate that the release is fit for purpose and meets user expectations

2.3 Traditional Testing Approach

The traditional approach to testing uses the "V" model, which maps a type of test to each stage of development, as in the simplified diagram below:

  User Requirements         <->  User Acceptance Testing
  Functional Specification  <->  End to End Testing
  System Design             <->  System Functional Testing
  Component Design          <->  Component Testing
                Component Build

It shows that for each requirement, specification or design document there is an associated testing phase (i.e. Component Design is associated with Component Testing). Where possible, testing should be carried out according to the V-Model approach, using the Requirements Traceability Matrix as a key input to test design and planning.

2.4 Overview of Test Phases

List here the key phases of testing, e.g.:
• Component (Unit) Tests
• System Functional Tests
• End to End Process (E2E) Tests
• Technical (Non-Functional) Tests
• User Acceptance Tests
• Operational Acceptance Tests

Each test phase outlined below should be described, including the following details:
• Owner
• Objective of the phase
• Test approach, execution, data, environments, resources and location
• Scope
• Exclusions
• Entry and exit criteria
• Sign-off procedures
• Testing tools to be used

2.4.1 Component (Unit) Testing

This is the testing that is carried out within the early stages of the development lifecycle. Describe here the key components and the Owners (e.g. THE CLIENT team, Vendor, etc.) responsible for testing each component.

2.4.2 System Functional Testing

System Functional Testing is the testing of the core functional areas of the system against the agreed requirements and technical documents. All the System Functional Tests to be carried out should be documented in the Detailed System Functional Test Plan, to be produced before testing begins.

2.4.3 End to End (E2E) Testing

Once all the functional areas have been successfully tested, the next phase of testing will be the End to End process testing. End to End (E2E) testing covers the testing of the full end-to-end processes as defined in the process model. The key difference between End to End Testing and System Functional Testing is that in E2E Testing we are primarily validating the process, together with the appropriate functions, rather than just the discrete functions. All the E2E processes to be tested will be documented in the E2E Detailed Test Plan.

2.4.4 Technical (Non-Functional) Testing

Technical (non-functional) testing will primarily cover the Performance, Volume and Scalability of the solution. The testing will be based on the requirements, technical and process documents; non-functional requirements should have been gathered in the Requirements Traceability Matrix. A range of test volume scenarios will be specified in the Non-Functional Testing Detailed Test Plan. The scenarios will be comparable with the expected operational volumes, and a set of exceptional volume tests will also be specified to demonstrate the robustness of the solution under exceptional volume conditions.

2.4.5 User Acceptance Testing (UAT)

User Acceptance Testing (UAT) is the testing that is conducted by the End User Representatives to ensure that the delivered system meets the user-defined functional requirements. It is expected that the User Representatives will select a subset of tests from the System Functional and E2E test scripts. These tests will be documented in the UAT Detailed Test Plan by the Test Analysts in advance of the execution of the UAT. During the execution of UAT, the User Representatives will also be allowed an opportunity to carry out un-documented tests. Once the UAT tests are successfully completed, UAT can be signed off by the business team (including the SPA). A subset of these tests will also be executed (i.e. re-run) as part of the Operational Acceptance Testing (OAT).

2.4.6 Operational Acceptance Testing (OAT)

Operational Acceptance Testing is the last major test phase and is executed on the final implemented solution to confirm that it can be supported and meets the operational support requirements agreed in the Support Model. Once these tests are passed, the solution can be promoted to operational status. If there are any unresolved priority 1 or priority 2 defects, the Application Manager may reserve the right not to accept the system into operational support.

2.4.7 Regression Testing

Regression testing becomes necessary when:
• a new release or bug fix is delivered following the resolution of a Defect
• enhancements to the functionality are incorporated in the system, or
• the technical environment is altered.

Regression Testing is performed by re-running a selected set of the test scripts, chosen according to the nature of the change. All test scripts will be designed to be re-run as necessary. (Please note that regression testing tends to be carried out as part of the above phases and is not a separate testing phase in its own right.)
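Selecting the regression subset can be made concrete as a simple filter over the script catalogue. The sketch below is purely illustrative and is not part of the mandated tooling: it assumes Python, and a hypothetical `areas` tag recording which functional areas each script exercises.

```python
# Minimal sketch of regression selection, assuming each test script is
# tagged with the functional areas it exercises. All names here
# (TestScript, areas, select_regression_suite) are illustrative only.
from dataclasses import dataclass

@dataclass
class TestScript:
    script_id: str
    name: str
    areas: frozenset  # functional areas the script exercises

def select_regression_suite(catalogue, changed_areas):
    """Return every script that touches at least one changed area."""
    changed = set(changed_areas)
    return [s for s in catalogue if s.areas & changed]

catalogue = [
    TestScript("ST-001", "Create order", frozenset({"orders"})),
    TestScript("ST-002", "Amend order", frozenset({"orders", "audit"})),
    TestScript("E2E-010", "Order to invoice", frozenset({"orders", "billing"})),
    TestScript("ST-014", "User login", frozenset({"security"})),
]

# A fix in the billing component re-runs only the scripts that touch it.
for script in select_regression_suite(catalogue, {"billing"}):
    print(script.script_id, script.name)  # -> E2E-010 Order to invoice
```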

2.5 Proposed Test Approach

Outline here the likely sequence of testing:

2.5.1 Release Schedule

The following table outlines the delivery schedule of the different code releases:

2.5.2 Testing Schedule

Outline below the proposed high-level schedule for testing:

(A detailed test plan should be produced early in the Execute phase.)

2.6 Risk Approach

It is often impractical to perform a full, exhaustive set of tests for a solution, since this would be very costly in terms of both money and time, and because the vendors should have tested their products prior to release to THE CLIENT. Nevertheless, all major test activities will carry risks, so an impact and likelihood analysis should be carried out to validate the choices being made (a simple scoring sketch follows the risk list below). The objective is to optimise the testing resources and reduce test time without compromising the quality of the final solution.

List all key testing risks below (an example risk log is given in Appendix A):
•
•
•
•
•
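One simple way to perform the impact and likelihood analysis is a numeric scoring model. The sketch below is an illustration only: the 1-3 scale, the exposure formula and the sample risks (taken from Appendix A) are assumptions, not requirements of this strategy.

```python
# Illustrative risk scoring: exposure = probability x impact on a 1-3
# scale (H=3, M=2, L=1). The scale and formula are assumptions.
RATING = {"L": 1, "M": 2, "H": 3}

risks = [  # (risk, probability, impact) examples drawn from Appendix A
    ("Test environment availability", "H", "H"),
    ("Resource constraints", "M", "H"),
    ("Late changes in scope", "H", "M"),
    ("Detailed Test Plans not approved in time", "L", "H"),
]

def exposure(prob, impact):
    return RATING[prob] * RATING[impact]

# Highest-exposure risks first, to prioritise mitigation effort.
for name, prob, impact in sorted(risks, key=lambda r: -exposure(r[1], r[2])):
    print(f"{exposure(prob, impact)}  {name} (prob={prob}, impact={impact})")
```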

3. TEST DELIVERABLES

3.1 Testing Deliverables

This section details the type and structure of the test documentation that needs to be produced. The following documents will be delivered as part of the testing activities:
• Detailed Test Plans
• Test Scripts for all test phases
• Testing Progress reporting

The following sub-sections provide an overview of each of the key deliverables.

3.2 Detailed Test Plans

The core of each Detailed Test Plan is based on the requirements, design documentation and other non-functional criteria. A Detailed Test Plan should cover each of:
• System Functional Testing
• Technical (Non-Functional) Testing
• End to End Process Testing
• User Acceptance Testing
• Operational Acceptance Testing

Each Detailed Test Plan will document the test method to be adopted for its testing phase. Within the Detailed Test Plan, a full description of the following should be provided:
• the test environment
• all required test scripts
• test data
• interfaces (integration) required

3.3 Test Scripts

A test script describes in detail how a test is conducted and what results are expected. Once the Detailed Test Plans have been approved, the test scripts can be documented. A single test script may cover one or more requirements; typically, however, a single requirement is broken down into sub-requirements/test conditions. This allows the Testers to show exactly how requirements have been covered by the test scripts, and enables the Testing team to track issues related to specific test scripts.

Each test script will detail the following (a structured sketch follows this list):
• Test Name – a unique reference number followed by the test name identifying the test
• Requirement Cross Reference – a reference to the requirement(s) and source documentation
• Revision History – original, review and update details relating to specific changes to the test
• Prerequisites – a reference to any scripts that need to be run before this script can be executed
• Test Description – a summary description of the purpose of the test
• Test Data – the test data to be used
• Test Steps – the instructions for running the test, e.g. the actions that need to be performed in order to exercise the piece of functionality being tested
• Expected Results – a definition of the test results that are expected to be observed if the test is successful. Enough information should be supplied to enable the tester to determine unambiguously whether or not the test has passed
• Actual Results – the actual results that were observed, and a reference to any test evidence. As a rule the tester will store evidence of the test results where possible. This will include a record of the build being tested
• Pass/Fail – a record of whether the test passed or failed, and a list of any test observations raised
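These fields map naturally onto a structured record. The following is a minimal sketch, assuming Python as the illustration language; the class and field names are illustrative, not a prescribed schema.

```python
# Illustrative test script record carrying the fields listed above.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestScriptRecord:
    test_name: str                 # unique reference number + test name
    requirement_refs: list         # cross references to requirement(s)
    description: str               # summary of the purpose of the test
    prerequisites: list = field(default_factory=list)  # scripts to run first
    test_data: str = ""            # test data to be used
    steps: list = field(default_factory=list)          # instructions to run the test
    expected_results: str = ""     # unambiguous pass definition
    actual_results: str = ""       # observed results + evidence reference
    build_under_test: str = ""     # record of the build being tested
    passed: Optional[bool] = None  # pass/fail once executed
    revision_history: list = field(default_factory=list)

script = TestScriptRecord(
    test_name="ST-001 Create order",
    requirement_refs=["REQ-014"],
    description="Verify that a standard order can be created",
    steps=["Log in", "Open order form", "Submit a valid order"],
    expected_results="Order is saved and a confirmation number is shown",
)
print(script.test_name, script.requirement_refs)
```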

3.4 Test Progress Reporting

Progress reports will be produced at regular intervals (typically weekly). The report will show:
• Test phase
• System under test
• Test environment
• Number of tests in total
• Number of tests completed
• Number of tests passed
• Number of tests failed

Where appropriate, a detailed report highlighting all outstanding risks and potential business and/or operational impacts will also be produced.

4. TEST MANAGEMENT

4.1 Resource Management

The following is a list of the key testing roles and core responsibilities required during the testing phase:
• Test Manager – responsible for all project testing
• (End to End) Test Manager – responsible for the E2E test activities
• Test Phase Team Lead – responsible for input into the test phases
• Test Analyst – responsible for documenting and executing the tests
• Technical Test Analyst – responsible for technical tests

Depending on the scale and nature of the system (e.g. provided by an external vendor), it may be possible to combine roles, so that a combination of a Test Manager and Test Analysts is able to fulfil all the testing responsibilities.

List the key resources here:

Role  Organisation/Team  Name

4.2 Assumptions and Dependencies

Assumptions

List here any assumptions, e.g.:
• Build and component testing is delivered on time and to a reasonable quality (i.e. all entry criteria are met and the system is stable during the first week of test execution).
• The vendors are responsible for fully testing their software before it is released to THE CLIENT.
• The project Business Analysts are available to input into the creation of the test cases.
• The test documentation will be created by the test analysts.
• Vendors are available to review any test results and defects that the team feel may be associated with the product software.
• It is expected that all users are on IE 7+.

Dependencies

List any key dependencies, e.g.:
• Provisioning of the appropriate environments for each phase of testing.
• Service Level Agreements in place for the testing environments.
• Service Level Agreements in place for performance testing.
• Utilisation and support of instances of the test tool.

5. DEFECT MANAGEMENT

5.1 Defect Management Approach

Defect management requires the Testing team to document and track (i.e. with an audit trail) all defects that have been raised, resolved, and that remain open. It provides transparency to the project and to management on defect status and priorities.

Defects will be logged in the following situations:
• When the actual result does not match the expected result, and the expected result is correct.
• When an expected result does not match an actual result, but the actual result is found to be correct. In this case the action will be to correct the expected result, and the Defect log will provide an audit trail.
• When there is an unexpected outcome to a test that is not covered by the expected result. This may result in the creation of a new entry in the requirements catalogue.
• When a Defect is raised to which no immediate acceptable response is available.

All logged Defects should contain the following information:
• A unique identifier (defect number)
• Title of the defect
• Test phase and test number that identified the defect
• System area – the functional area this defect impacts (best estimate)
• The severity classification of the defect
• Estimated fix time – an estimated timescale for resolution (determining the impact on testing)
• A full description of the Defect and how to recreate it
• An indicator of the status of the Defect
• Level of risk on Go-Live

Wherever possible, the description of the Defect, or the impact of the Defect, will be written in non-technical terms.

5.2 Defect Status and Process

The following table shows the possible statuses of a defect:

Status               Description
Identified           A new incident is identified.
Assigned             An owner has been agreed and a fix is being created.
Fixed                Development (i.e. the Vendor) has a fix for the defect.
Released For Retest  The fix has been released (i.e. a code drop by the vendor) for the test team to re-test.
Closed               The fix has been successfully tested, or it is agreed that no action is required.
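Taken together, the fields and statuses above amount to a simple structured record. A minimal sketch follows, assuming Python; all names are illustrative rather than a prescribed schema (the priority field anticipates section 5.4).

```python
# Illustrative defect record with the attributes listed above and the
# statuses from the table. Names are assumptions, not a required schema.
from dataclasses import dataclass, field
from enum import Enum

class DefectStatus(Enum):
    IDENTIFIED = "Identified"                    # new incident raised
    ASSIGNED = "Assigned"                        # owner agreed, fix in progress
    FIXED = "Fixed"                              # development/vendor has a fix
    RELEASED_FOR_RETEST = "Released For Retest"  # fix delivered for re-test
    CLOSED = "Closed"                            # retested OK or no action needed

@dataclass
class Defect:
    defect_id: str           # unique identifier (defect number)
    title: str
    test_phase: str          # phase and test number that found it
    system_area: str         # functional area impacted (best estimate)
    severity: int            # 1 (Critical) .. 4 (Low), per section 5.3
    priority: int            # 1 (Emergency) .. 4 (Low), per section 5.4
    estimated_fix_time: str  # estimated timescale for resolution
    description: str         # full description + how to recreate
    go_live_risk: str = ""   # level of risk on Go-Live
    status: DefectStatus = DefectStatus.IDENTIFIED
    history: list = field(default_factory=list)  # audit trail of changes

d = Defect("DEF-042", "Order total miscalculated", "System Test / ST-001",
           "Billing", severity=2, priority=2, estimated_fix_time="2 days",
           description="Create an order with two line items; total is wrong.")
print(d.defect_id, d.status.value)
```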

Once the project enters the System Test execution phase, the Testing Team will review all Defects raised since the previous meeting, typically each morning during test execution, to determine any conflicts or impacts across the various phases of test. After each review session, the status of each defect will be updated, and any re-testing of the defect fix and associated regression testing will be carried out under the guidance of the Test Manager.

The following flow chart provides an overview of the Defect management process:

  Raise Defect -> Assign Defect -> Defect Fixed -> Fix Applied and Re-tested
  (on a failed re-test, the defect returns to Assign Defect; on a pass, the Defect is Closed)
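The flow can also be read as a small state machine in which a failed re-test returns the defect to the fix stage rather than closing it. A minimal sketch follows; the transition table is inferred from the chart above, not a specified interface.

```python
# Allowed status transitions inferred from the flow chart above; a failed
# re-test returns the defect to Assigned rather than closing it.
ALLOWED = {
    "Identified": {"Assigned"},
    "Assigned": {"Fixed"},
    "Fixed": {"Released For Retest"},
    "Released For Retest": {"Closed", "Assigned"},  # pass -> Closed, fail -> Assigned
    "Closed": set(),
}

def move(current, new):
    if new not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current} -> {new}")
    return new

status = "Identified"
for step in ("Assigned", "Fixed", "Released For Retest", "Assigned",  # re-test failed
             "Fixed", "Released For Retest", "Closed"):               # re-test passed
    status = move(status, step)
print(status)  # Closed
```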

5.3 Defect Severity

The table below describes the levels of defect severity:

Severity      Description
1 - Critical  The entire system or a key business process is unusable or does not meet the needs of the business. Corruption or loss of data occurs that is not immediately recoverable and prevents the business from continuing.
2 - High      Part of the system or a key business process is unusable or does not meet the needs of the business, possibly with a workaround. Corruption or loss of data occurs that is immediately recoverable and allows the business to continue.
3 - Medium    A non-critical Defect occurs, typically affecting a single user. The Defect affects the ability to provide the best service, but there is a workaround.
4 - Low       Cosmetic errors, documentation anomalies, or requests for information or advice.

5.4 Defect Priority

This table describes the levels of defect priority:

Priority       Description of the Impact on the Testing Activity
1 - Emergency  Incident that prevents all testing from continuing; all testing is suspended. Target resolution: within 4 hours.
2 - High       Incident that severely impacts testing, but testing is able to continue; testing of particular function(s) is possibly suspended; many users are affected and no workaround is available. Target resolution: within 24 hours.
3 - Medium     Incident that inconveniences testing progress; testing of a single function is possibly suspended; few users are affected but a workaround is available; or a test script or procedure error that requires a fix. Target resolution: within 3 days. If the defect cannot be resolved within the specified period, the level of risk on Go-Live will be assessed.
4 - Low        Incident has little or no impact on testing progress; testing is able to continue without much impact, possibly with a workaround. Target resolution: as agreed.

5.5 Test Progress Reporting Metrics

The Key Performance Indicators that will be used to measure the success of testing are:

Test Execution:
• Number of planned test cases (total)
• Number of planned test cases (cumulative)
• Number of passed test cases (cumulative)
• Number of failed test cases (cumulative)
• Number of test cases in progress (cumulative)

Defects:
• Total defects raised (and by priority)
• Total defects fixed (and by priority)
• Total defects in progress (and by priority)
• Total defects closed (and by priority)
• Total defects by functional area
• Defect severity by root cause
• Defect severity by application
• Defect severity by defect type
• Defect state by application
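Most of these indicators are simple counts over the test and defect logs. The sketch below is illustrative only, assuming log entries shaped like the earlier examples; the data is invented for the example.

```python
# Illustrative metric roll-up: defect counts by status and priority, and
# test case execution totals, as listed above.
from collections import Counter

defects = [  # (status, priority) pairs pulled from the defect log
    ("Closed", 2), ("Assigned", 1), ("Released For Retest", 3),
    ("Closed", 4), ("Assigned", 2),
]
test_cases = ["passed", "passed", "failed", "in progress", "passed"]

by_status = Counter(status for status, _ in defects)
by_priority = Counter(priority for _, priority in defects)
execution = Counter(test_cases)

print("Total defects raised:", len(defects))
print("Defects by status:   ", dict(by_status))
print("Defects by priority: ", dict(by_priority))
print("Test execution:      ", dict(execution))
```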

6. TEST TOOLS

6.1 Introduction

This section describes the types of tools that are required to manage the testing activities contained within this document.

6.2 Overview of Testing Tool

Describe here which tool is going to be used and how it allows the user to organise and manage the testing activities. For example, the tool:
• allows the user to catalogue the requirements
• specifies the tests to be executed to validate each requirement
• allows the logging of the test results
• provides a reporting function that produces management reports and metrics on the testing progress

6.3 Test Tool Requirement and Description

The following table shows the test tool(s) that will be used to support the testing activities:
•
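Whichever tool is chosen, the functions listed in 6.2 reduce to a requirement-to-test-to-result mapping. The sketch below illustrates that structure only; the requirement and test identifiers are invented.

```python
# Minimal sketch of requirement -> tests -> results traceability, the core
# structure a test management tool provides. Data here is illustrative.
requirements = {
    "REQ-014": ["ST-001", "ST-002"],  # requirement catalogued with its tests
    "REQ-015": ["E2E-010"],
    "REQ-016": [],                    # no test specified yet
}
results = {"ST-001": "passed", "ST-002": "failed", "E2E-010": "passed"}

# Management report: coverage and outcome per requirement.
for req, tests in requirements.items():
    if not tests:
        print(f"{req}: NOT COVERED")
        continue
    outcomes = [results.get(t, "not run") for t in tests]
    print(f"{req}: {len(tests)} test(s) -> {', '.join(outcomes)}")
```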

APPENDIX A – EXAMPLE TESTING RISK LOG

Ref 1
Risk: Test environment availability for all required testing
Probability: H  Impact: H  Owner: Test Manager
Mitigation: Ensure that test environments are documented and provisioned well in advance of the test execution phase for each of the projects in scope.

Ref 2
Risk: Resource constraints for test preparation and execution
Probability: M  Impact: H  Owner: Project Manager
Mitigation: Management to plan resource requirements for both the test preparation and test execution phases, with sufficient time to secure additional resource where required.

Ref 3
Risk: Late changes in scope
Probability: H  Impact: M  Owner: Project Manager
Mitigation: Advance notice of changes impacting on any in-scope project can feed into any required reprioritisation.

Ref 4
Risk: Inter-dependencies between project streams could hinder progress on a single deliverable required for test preparation or execution
Probability: H  Impact: M  Owner: Project Manager
Mitigation: Contingencies to be considered for potential delays.

Ref 5
Risk: External interdependencies with vendors, where late delivery could severely hinder progress
Probability: H  Impact: M  Owner: Project Manager
Mitigation: Ensure that a close relationship is maintained with external dependency partners, and make provision for delays when encountered.

Ref 6
Risk: Detailed Test Plans not approved prior to the scheduled test start date
Probability: L  Impact: H  Owner: Test Manager
Mitigation: Testing Team to ensure that all test documentation is approved prior to commencement, as this is a key part of the entry criteria to each test phase.

Ref 7
Risk: Infrastructure components tested in isolation may not fully prove the validity of the solution adopted
Probability: M  Impact: H  Owner: Test Manager
Mitigation: Where applicable, test harnesses will be created and managed by each distinct project, but the harness should closely represent the source or target system. The System Integration Test releases will clarify this point, but the sooner the solution components are tested together the better.

APPENDIX B – EXAMPLE DETAILED TEST PHASE DESCRIPTION

System Functional Testing

Accountability: Test Manager

Responsibility: Test Manager

Objectives: The objective of System Testing is to:
• Verify that the whole system performs as described in the functional and technical specification documents.

Approach: Location: the System Testing will be conducted …

Scope: The Testing Team, in conjunction with the users and Project team members, defines the scope of System Functional Testing. The following test types are in scope for this phase of the testing:
• Functional testing
• Usability (user interface) testing
• Security testing
• Error handling
• Regression (if applicable)
• User performance (response time) testing

Exclusions: The following exclusions will apply:
• Some interfaces may not be available to test against
• Penetration testing

Entry Criteria: The following entry criteria must be met before the commencement of System Testing:
• 100% of agreed functionality has been delivered (subject to the functionality contained in the release being tested)
• Vendors have fully tested their developments and are formally delivering the software to THE CLIENT (this includes the installation of the software)
• The System Functional Test Plan has been reviewed and signed off by the agreed reviewers and approvers; these will primarily be the Project Team members
• System Functional Test Scripts completed and approved
• All components of the solution correctly configured in the System Test environment by the vendors
• Any test data either pre-loaded or available to load as required
• Version, Release, Configuration and Defect Management tools and processes defined and implemented
• System Configuration documented and approved

Where entry criteria have not been met, the decision to proceed with test execution is at the discretion of the IT&S Project Manager.

Execution: Entry criteria will be assessed prior to test execution. Test cases will be entered into Test Director or Excel and then executed. Test results (passed or failed) are logged in Test Director or Excel, along with any defects found. Actual results will be compared to expected results; variances will be noted and documented by the Test Manager and System Test Team Lead in a report, along with a risk assessment and a recommendation on whether to proceed. The Test Manager is responsible for monitoring the progress of the System Testing and for ensuring all tests and results are documented. Status reports will be prepared from Test Director or Excel.

Exit Criteria: The System Functional Testing is completed when:
• 100% of pre-agreed system test cases have been executed
• All high-priority test cases have passed successfully
• All defects found are recorded
• All severity 1 and 2 defects are resolved, retested and closed
• All outstanding severity 3 and 4 defects have documented workarounds and an agreed (between business, development, testing teams and vendors) schedule for when they will be corrected
• A set of pre-defined regression tests has been re-run after fixes were applied, with no new errors and/or Defects identified during the regression tests
• Component-tested code, executables and software configuration are under version control
• The Test Summary Report is completed and signed off

Sign-off: Completion is achieved when the exit criteria have been met. Sign-off of the Test Summary Report (TSR) is performed by the Approvers outlined in the Testing Strategy and the System Test Plan, with e-mail sign-off of the System Test.

Tools: Test Director or Excel.

APPENDIX C – TEST PLAN CONTENTS

The Test Plan will have the following contents:

Test Plan Identifier
• Unique identifier associated with this Test Plan document

Approvals
• Names and titles of all persons who must approve this Test Plan

Introduction
• Objective of this Test Plan
• Background (summary of the project)
• Scope of the testing phase this Plan relates to
• References to source material, i.e. Project Plan, Requirements Specification, Functional/Technical Specification, Design Specification, Configuration Management Plan, Technical Solution documents, Policies & Standards

Test Items
• Identification of the Test Items, including version/revision levels
• References to all source documentation, e.g. Requirements Specification, Design Specification, User Guide, Operations Guide

Features to be Tested
• Identification of all hardware features, and all software features and combinations of software features, to be tested (descriptions of the functionality and activities to be tested)

Features not to be Tested
• Identification of all features and combinations of features that will not be tested, and the reasons

Approach
• Description of the overall approach to testing
• For each major group of features or feature combinations, specification of the approach that will ensure that these feature groups are adequately tested
• Specification of the major activities, techniques and tools that will be used to test the designated groups of features
• The approach described in sufficient detail to identify the major testing tasks
• Identification of the techniques that will be applied to judge the comprehensiveness of the testing effort
• Lists of both the entry and exit criteria for the tests
• Any additional completion criteria, e.g. error frequency
• The techniques to be used to trace requirements

Item Pass/Fail Criteria
• The criteria to be used to determine whether each test item has passed or failed testing, and the severity/priority assigned to each class of Defect

Suspension Criteria and Resumption Requirements
• The criteria to be used to suspend all or a portion of the testing activity on the test items associated with this Plan
• The testing activities that must be repeated when testing is resumed

Test Deliverables

• Identification of the deliverable test documents, i.e. Test Plan, Test Specifications, Test Scripts, Test Logs, Test Defect Reports, Test Completion Report, Test Data

Testing Tasks
• The set of tasks necessary to prepare for and perform testing (task, responsibility, effort, predecessor, end date)
• Identification of inter-task dependencies and any special skills required

Environment Needs
• Specification of both the necessary and desired properties of the test environment
• The physical characteristics of the environment, including the hardware, communications and system software, the mode of usage, and any other software or supplies needed to support the test
• The level of security that must be provided for all components of the test environment
• Identification of the tools needed
• The office space, desks, etc. required for the test team

Responsibilities
• Identification of the groups responsible for managing, designing, preparing, executing, witnessing, checking and resolving
• Identification of the groups responsible for providing the Test Items and the Environment Needs identified earlier

Staff and Training Needs
• Specification of the test staffing by skill level
• Training requirements and options for providing the necessary skills

Schedule
• Test milestones identified in the Project plan
• Any additional test milestones needed
• Estimates for the time required to perform each testing task
• The schedule for testing tasks and test milestones

Risks
• Identification of the high-risk assumptions of the Test Plan

NOTES: Within the Test Plan, a full description will be provided of the test environment, i.e. hardware, software and data, all required test scripts and harnesses, and all interfaces required with third-party systems. Where the environment is not fully representative of the live environment, the reasons for the limitation will be provided and a risk assessment undertaken to determine the impact of this on the validity of the results obtained during the tests. The Test Plan will also specify the input data sources and any expected outputs, including the volumes and types of data. An explanation will be provided for each data type and flow, relating it to the predicted or measured live environment.

APPENDIX D – SAMPLE TESTING ROLES AND RESPONSIBILITIES

The following table outlines the test team roles and their responsibilities:

Test Manager
• Responsible for producing the Test Strategy
• Leading the end-to-end testing effort as outlined in this Test Strategy document (ST, E2E, TECH, UAT, OAT etc.)
• Responsible for estimating, planning and ensuring the appropriate level of resourcing for the project testing efforts
• Responsible for managing all project testing related issues, risks and dependencies
• Main escalation point between testing and other teams, i.e. business, Development
• Responsible for ensuring the specified testing entry and exit criteria are met for ST, UAT, OAT etc.
• Management of all testing resources
• Testing management reporting
• Responsible for creating and maintaining the test project plan for all core testing activities (as baselined in MS Project)
• Responsible for ensuring the agreed delivery of all project testing deliverables (as baselined)
• Deliver the High Level Test Plan to be utilised for the delivery of detailed Test Plans
• Deliver Detailed Test Plans for all the respective test areas
• Recruitment of the Test Team (e.g. Test Analysts)
• Accountable for Phase Test Plans, e.g. UAT

Test Phase Team Lead
• Provide input into the Test Strategy
• Responsible for providing input into estimating, planning and ensuring the appropriate level of resourcing for the test phases
• Lead the testing effort, including delivery of the test cases/scripts, data and results
• Manage test preparation and execution risks and issues, raising these according to the agreed issue and risk management procedures
• Ensure test artefacts delivered are stored correctly in Test Director or Excel
• Defect management relevant to the responsible test phase
• Create, maintain and ensure sign-off of the Test Plans
• Create, maintain and ensure sign-off of the Test Summary Reports

Test Analysts (TA)
• Provide input into the Test Plans
• Devise, create and maintain test data
• Create, maintain and execute the test cases in Test Director or Excel
• Analyse and store test results in Test Director or Excel
• Raise, maintain and retest defects in Test Director or Excel
• Provide input into the Test Summary Reports

Technical Test Analysts (TTA)
• Provide input into the Technical Test Plans
• Devise, create and maintain test data
• Undertake technical testing activities, ensuring these meet agreed specifications
• Create, maintain and execute the test cases in Test Director or Excel
• Analyse and store test results in Test Director or Excel
• Raise, maintain and retest defects in Test Director or Excel
• Provide input into the Test Summary Reports

Business Analysts (BA)
• Provide business input into the Test Plans, test cases and test data
• Undertake testing activities, ensuring these meet agreed specifications
• Analyse and store test results in Test Director or Excel
• Raise, maintain and retest defects in Test Director or Excel
• Provide input into the Test Summary Reports, i.e. business workarounds and impact assessment

Technical Lead (Team)
• Provide solution details to Test Analysts
• Review detailed test plans produced by Test Analysts
• Input into and review test cases produced by Test Analysts
• Review and categorise/prioritise test results
• Validate, raise and progress defects to resolution

Vendors
• Input into the test cases
• Review and sign off the DTP and test cases/scripts
• Review of test results
• Ownership of defects associated with the vendor solution
• Responsibility for issue resolution if associated with the vendor product/solution

• Assist in testing and defect reproduction for de-bug information purposes

Global Operations (GO)
• Deliver the OAT Detailed Test Plan
• Delivery and reporting of OAT testing results and progress
• Management of the OAT environments
• Execute OAT tests
• Validate, raise and progress defects to resolution
• Sign off OAT

Business User from THE CLIENT / Legal
• Input into the development of the User Acceptance test scripts
• Review and sign off the User Acceptance Detailed Test Plans
• Review and sign off the User Acceptance test requirements and scripts
• Agree acceptance criteria based on the successful completion of test execution
• Perform User Acceptance Testing
• Sign off UAT