
SRI KRISHNA COLLEGE OF ENGINEERING AND TECHNOLOGY

Kuniamuthur, Coimbatore, Tamilnadu, India


An Autonomous Institution, Affiliated to Anna University,
Accredited by NAAC with “A” Grade & Accredited by NBA (CSE, ECE, IT, MECH, EEE, CIVIL & MCT)

COURSE MATERIAL

Course : 20CS926 – Software Testing


Module – 2: System Testing
Topics : Approach and Techniques, Data Requirements, Test Engineering Manager Role

www.skcet.ac.in
Module-2
System Testing
 System Testing – Approach and Techniques, Data Requirements. Test Engineering Manager Role.

 System Integration Testing – Approach and Techniques, Data Requirements. Test Process Evaluation.

 User Acceptance Testing – Approach and Techniques, Data Requirements. Test Plan Example.

 Operations Acceptance Testing – Approach and Techniques, Data Requirements.

 Example: Test Tours. Stress and Performance Tests.

System Testing
 To establish confidence that the application under test (AUT) will be accepted by its users (and/or operators), that is, that it will pass its acceptance tests.
 During system testing:
• the functional and structural stability of the system will be demonstrated,
• non-functional requirements such as performance and reliability will be verified.
 System testing is conducted by the testing team under the supervision of the test team leader.

The Organization of Testing

Overview
 System testing is performed by a testing team that is independent of the development team, which helps it assess the quality of the system impartially.
 It covers both functional and non-functional testing.
 System testing is black-box testing.
 System testing is performed after integration testing and before acceptance testing.
 System testing tests the high-level requirements of the system without considering the implementation details of component modules.

System Testing and the V Model

System Testing Process

• Test Environment Setup: create a test environment that supports good-quality testing.
• Create Test Cases: generate the test cases for the testing process.
• Create Test Data: generate the data against which the test cases will run.
• Execute Test Cases: once the test cases and test data are ready, the test cases are executed.
• Defect Reporting: defects detected during execution are reported.
• Regression Testing: carried out to check that defect fixes have no side effects on previously working functionality.
• Log Defects: detected defects are logged and tracked until they are fixed.
• Retest: if a test is not successful, it is executed again after the fix (a minimal sketch of one such test case follows this list).
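
To make the execute/report/retest cycle above concrete, here is a minimal, self-contained sketch of one system-level test case in pytest. The `calculate_invoice` function below is a hypothetical stand-in for the AUT's public interface; a real system test would drive the deployed application end to end instead.

    import pytest

    def calculate_invoice(customer_id, items):
        # Hypothetical stand-in for the AUT: sums quantity * unit price.
        return sum(qty * price for _name, qty, price in items)

    @pytest.fixture
    def test_data():
        # "Create Test Data": representative input for this scenario.
        return {"customer_id": 42, "items": [("widget", 2, 9.99)]}

    def test_invoice_total(test_data):
        # "Execute Test Case": drive the AUT through its external
        # interface only (black box) and compare against the expected result.
        total = calculate_invoice(test_data["customer_id"], test_data["items"])
        assert total == pytest.approx(19.98)
        # A failing assertion here feeds "Defect Reporting"; after the fix,
        # the test is re-run ("Retest") along with the full suite
        # ("Regression Testing").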

Overview of System Testing
 The objective of system testing is to establish confidence that
the application under test (AUT) will be accepted by its users
(and/or operators), that is, that it will pass its acceptance tests.
 During system testing, the functional and structural stability
of the system will be demonstrated, as well as nonfunctional
requirements such as performance and reliability.
 System testing is conducted by the testing team under the
supervision of the test team leader.
 Where the AUT has a moderate requirement to interoperate with one or more collaborating software systems, this may be tested during system testing.


 Where the AUT has a significant requirement to interoperate with several collaborating software systems and/or the nature of the interoperation is complex, this will typically be tested during a separate testing phase, termed systems integration testing.

 System testing should employ black box testing techniques and will test high-level requirements of the system without considering the implementation details of component modules.

System Test Approach
The following approach should be followed in conducting
system testing:
 Review the requirements for the system (e.g., by reference to
the requirements specification document for the AUT) and
identify:
• the high-level business requirements
• the requirement for data handling and transaction rates for the live
system
• the requirements for system performance
• back-up and recovery requirements
• any security requirements (one way to record the outcome of this review is sketched below)
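
Illustratively, the outcome of this requirements review can be recorded as a small traceability structure that maps each identified requirement to the test cases planned for it; all IDs and requirement texts below are hypothetical.

    requirements_review = {
        "BUS-01": {"text": "Customer can place an order",
                   "category": "business", "test_cases": ["TC-101", "TC-102"]},
        "PERF-01": {"text": "95% of transactions complete in under 2 seconds",
                    "category": "performance", "test_cases": ["TC-201"]},
        "SEC-01": {"text": "Passwords are stored hashed",
                   "category": "security", "test_cases": []},
    }

    # Any requirement with no planned test case is a coverage gap to resolve
    # before test script writing begins.
    gaps = [rid for rid, req in requirements_review.items()
            if not req["test_cases"]]
    print("Requirements without planned tests:", gaps)  # -> ['SEC-01']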

 Identify any requirements for the AUT to communicate with
other systems and the means of communication (where this
is not addressed by systems integration testing)
 Review the computing environment in which the live system
will run to identify interoperability or compatibility issues
with other systems (where this is not addressed by systems
integration testing)
 Examine the requirement for testing system procedures (such as the installation procedure), system documentation (such as user manuals), and help facilities (both paper-based and interactive), as well as recovery, back-up, and archive procedures.

 Use the previous information to generate a set of test cases to
test the system and incorporate them into a test script,
describing in detail the steps to be followed during testing.

 Where they are available, consider reusing test cases from the earlier unit and integration testing.

 These can be obtained from the appropriate unit and integration test reuse packs (a sketch of what such reuse can look like in practice follows).
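
A brief sketch of reuse in practice, assuming the reuse pack is kept as an importable module; the module path and boundary values below are hypothetical.

    import pytest

    # In a real project these values would be imported from the reuse pack,
    # e.g. from reuse_pack.integration_cases import BOUNDARY_AMOUNTS;
    # they are inlined here to keep the sketch self-contained.
    BOUNDARY_AMOUNTS = [0.00, 0.01, 9999.99, 10000.00]

    @pytest.mark.parametrize("amount", BOUNDARY_AMOUNTS)
    def test_payment_amount_boundaries(amount):
        # The same boundary values exercised during integration testing,
        # now run end to end through the system's external interface.
        assert 0.00 <= amount <= 10000.00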

System Test Data Requirements

 Since the major goal of system testing is to establish confidence that the AUT will pass its acceptance test, the data used for testing must be as accurate and as representative of the live data as possible.

 Similarly, because performance testing will take place at system testing, the quantity of data available for testing must also be of equivalent size and complexity.

 One approach to achieving the previous requirement for test data is to use live data.
 In those circumstances where it is not possible to use live data, either because of risk to the live system (and other applications that rely on the live data) or for security reasons, a copy of the live data should be used after 'sanitization'.

 Where live data or a copy of the live data is used, it may still be necessary to introduce some handcrafted data (for example, to exercise boundary conditions that the live data does not contain). A minimal sanitization sketch follows.
Roles and Responsibilities
 System testing is typically conducted by the testing team
under the supervision of the test team leader, who is
responsible for ensuring that adequate system testing is
performed using appropriate testing techniques and under
quality control and supervision.

 Within the testing team, the test analyst is responsible for designing and implementing (and/or reusing) the test script and component test cases used in testing the AUT.

 The tester is responsible for executing the test cases
documented within the test script.

 In a large test team, it is possible for several test analysts and testers to report to the test team leader, whereas in small test teams the test analyst and tester roles may each be filled by a single member of staff, or even by one person with joint responsibility for the design and execution of tests.

 At system testing, it is essential that the testing process is monitored by an independent test observer who formally witnesses the results of individual test cases.
 The test team leader will liaise with the development team leader
to determine the progress of the development of the AUT and
likely dates for delivery of the AUT for system testing purposes.
 The test team leader will invite the development team leader to informally observe the system test.
 The test team leader should consider inviting a user representative to informally observe system testing.
 System testing provides a valuable opportunity to expose the user representative to aspects of the operation and appearance of the AUT, allowing (informal) feedback to be obtained and helping manage the expectations of the user representative prior to formal acceptance testing.

 The test team leader must liaise with the IT systems
administrator (i.e., the member of staff with responsibility for
administering the corporate IT facilities) to install the AUT
prior to testing.

 The test team leader will file copies of a number of outputs produced as a result of the system testing process, including:
• the completed test result record forms,
• the system test log,
• the comprehensive system test summary report.

Planning and Resources
 System test planning is the responsibility of the test team leader and should be developed with reference to the overall development and testing plan for the AUT, to minimize the impact on the development process caused by system testing and to ensure that no contention for resources will occur.

 The human resources required for system testing will be drawn from the testing team.

 It is assumed that system testing will take place either in a
dedicated test environment or in the live environment.
 The choice will depend on a number of issues including:
• The availability of a suitably specified test environment (i.e., one that
is representative of the live environment)
• An assessment of the commercial or safety critical nature of the live
environment and the likelihood of the system test adversely affecting
it
• The presence of commercially or confidentially sensitive data, or security-critical information, within the live environment.
 It is essential that the system test plan take account of the
time and effort required for any correction of defects and
retesting.
Inputs
 The following items are required as inputs to system testing:

Testing Techniques for System Testing
 The following testing techniques are appropriate for system
testing:

Outputs
The following items are generated as outputs from system testing:
 The fully tested system
 The completed system test certificate (see Appendix H)
 Any revised test scripts and test cases (where appropriate)
 Archived test data
 The completed test result record forms (see Appendix F)
 The system test log (see Appendix G)
 The system test reuse pack (see Appendix I)
 A system test summary report (see Appendix J)
System testing will be considered complete when all previously
mentioned deliverables are complete and copies have been provided
to the test team leader.


 Appropriate deliverables (i.e., all of the previous except the tested system) should be stored in the project file.

 The fully tested system (plus any associated test harness or simulation code and test data) should be backed up and archived.

High-Level Overview of the System Testing Process

To perform software system testing, the following steps need to be executed

 Step 1) The first and most important step is the preparation of the System Test Plan, covering (a skeleton of such a plan, captured as structured data, is sketched after this list):

• Goals and objectives
• Scope
• Critical areas to focus on
• Test deliverables
• Testing strategy
• Testing schedule
• Entry and exit criteria
• Suspension and resumption criteria
• Test environment
• Roles and responsibilities
• Glossary
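
Illustratively, such a plan can be captured as structured data so it can be versioned and checked mechanically; the skeleton below mirrors the sections listed above, and all values are hypothetical.

    system_test_plan = {
        "goals_and_objectives": "Establish confidence that the AUT will pass acceptance",
        "scope": ["order processing", "reporting"],
        "critical_areas": ["payment flow"],
        "test_deliverables": ["test scripts", "test log", "test summary report"],
        "testing_strategy": "black box, requirements-driven",
        "testing_schedule": {"start": "2024-01-08", "end": "2024-02-02"},
        "entry_criteria": ["integration testing complete"],
        "exit_criteria": ["all planned tests run", "no open severity-1 defects"],
        "suspension_criteria": ["test environment unavailable"],
        "resumption_criteria": ["environment restored and smoke test passed"],
        "test_environment": "dedicated test environment",
        "roles_and_responsibilities": {"test team leader": 1,
                                       "test analysts": 2, "testers": 3},
        "glossary": {"AUT": "application under test"},
    }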

 Step 2) The second step is the creation of test cases, typically recorded in a table such as the following (a sketch of driving execution directly from such rows appears after the table):

Test Case ID | Test Suite Name | How to Test? | Test Data | Expected Result | Actual Result | Pass/Fail
 Step 3) Creation of the test data used for this type of testing.
 Step 4) Automated test case execution.
 Step 5) Execution of manual test cases, with test case status updated in the test management tool (if one is used).
 Step 6) Bug reporting, bug verification, and regression testing.
 Step 7) Repeat the testing life cycle (if required).

Systems Integration Testing

High-Level Overview of the System Integration Testing Process

Summary – System Testing
 Overview- System Test Approach
 System Test Data Requirements
 Roles and Responsibilities
 Test Planning and Resources
 Inputs
 Testing Techniques for System Testing
 Outputs

Summary – System Integration Testing

 The process is the same as for system testing
 It additionally addresses compatibility (interoperability) with collaborating systems

System Testing: What? Why? & How?
