
Test Report

Testing Status by Category

[Bar chart: Count of Test Name by Test Category, split by Status (scale 0 to 12); values as summarised in the table below]

Test Category         | Fail | Pass | (empty) | Total Result
Coding Standards      | 3    | 1    |         | 4
Error Handling        |      | 3    |         | 3
Logging               |      |      | 1       | 1
Performance           |      | 1    |         | 1
Workbench Integration |      | 6    | 1       | 7
Total Result          | 3    | 11   | 2       | 16
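The summary above is a straightforward status-by-category pivot of the detail rows listed later in this report. A minimal sketch of how it could be regenerated with pandas is shown below; the file name test_report.csv and the exact column labels are assumptions based on the field list at the end of this report, not part of the template itself.

    # Sketch only: rebuild the "Testing Status by Category" summary from the detail rows.
    # Assumes the detail table has been exported to test_report.csv with at least the
    # columns "Test Number", "Test Category" and "Status" (Status left blank for tests
    # that have not yet been executed).
    import pandas as pd

    detail = pd.read_csv("test_report.csv")

    summary = (
        detail
        .fillna({"Status": "(empty)"})          # keep not-yet-run tests visible
        .pivot_table(index="Test Category",
                     columns="Status",
                     values="Test Number",
                     aggfunc="count",
                     fill_value=0,
                     margins=True,
                     margins_name="Total Result")
    )
    print(summary)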
Test Number | Test Name | Test Description
UT-0001 | Variable Naming Conventions | Ensure that code is following variable naming standards
UT-0002 | Function Naming Conventions | Ensure that code is following function naming standards
UT-0003 | Function Description | Ensure that each function's purpose is well articulated
UT-0004 | Inline Comments | Ensure that the logic flow of the code is well documented
UT-0005 | System Exception Handling | Ensure that system-defined exceptions are trapped and reported (where applicable)
UT-0006 | User Defined Exception Handling | Ensure that any user-defined exceptions are trapped and reported (where applicable)
UT-0007 | Unhandled Exception | Ensure that there are no paths in the logic that will make the code fail behind the scenes without letting the user know about the status; for example, when a negative number is passed to an argument which theoretically only accepts positive values
UT-0008 | Execution Logging | Ensure that each piece of code logs its execution details in a persistent manner
UT-0009 | Module Registration | Ensure that any module code is correctly registered in the workbench configuration database with all required information being captured
UT-0010 | Workflow Registration | Ensure that any workflow is correctly registered in the workbench configuration database with all required information being captured
UT-0011 | Toolbox Registration | Ensure that any toolbox is correctly registered in the workbench configuration database with all required information being captured
UT-0012 | Data Source Registration | Ensure that any new type of data source is correctly registered in the workbench configuration database with all required information being captured
UT-0013 | Chart Type Registration | Ensure that any new chart type is correctly registered in the workbench configuration database with all required information being captured
UT-0014 | Workbench Execution | Ensure that any module, workflow or toolbox that is added to the workbench executes successfully when invoked directly from the workbench
UT-0015 | Performance | Ensure that any new module runs on the benchmark dataset within the standard SLA on the test-bed configuration
UT-0016 | Visualization configuration | Ensure that any new module provides its outputs in the correct format as per the chart requirements
Test Number | Expected Result | Test Category | Test Date
UT-0001 | Variable names follow the project coding standards | Coding Standards | 01-Jan-2001
UT-0002 | Function names follow the project coding standards | Coding Standards | 01-Jan-2001
UT-0003 | Detailed description is included in the code for each function | Coding Standards | 01-Jan-2001
UT-0004 | Comments are available throughout the code explaining each logical functionality, and the comments are easy to follow | Coding Standards | 01-Jan-2001
UT-0005 | System exceptions are trapped and handled in the code | Error Handling | 01-Jan-2001
UT-0006 | User-defined and function-specific exceptions are handled in the code | Error Handling | 01-Jan-2001
UT-0007 | There are no unhandled exceptions in the code | Error Handling | 01-Jan-2001
UT-0008 | Every call to the code saves an execution trail (log) | Logging | 01-Jan-2001
UT-0009 | Module's metadata is captured fully and without any errors | Workbench Integration | 01-Jan-2001
UT-0010 | Workflow's metadata is captured fully and without any errors | Workbench Integration | 01-Jan-2001
UT-0011 | Toolbox's metadata is captured fully and without any errors | Workbench Integration | 01-Jan-2001
UT-0012 | Data source metadata is captured fully and without any errors | Workbench Integration | 01-Jan-2001
UT-0013 | Chart type metadata is captured fully and without any errors | Workbench Integration | 01-Jan-2001
UT-0014 | Module/Workflow/Toolbox executes when run from the workbench | Workbench Integration | 01-Jan-2001
UT-0015 | Performance of the code is found to be satisfactory | Performance | 01-Jan-2001
UT-0016 | Module outputs can be visualized correctly | Workbench Integration | 01-Jan-2001


Test Number | Iteration # | Actual Result | Status | Screenshot No
UT-0001 | 1 | Same as expected | Fail | 1
UT-0002 | 1 | Same as expected | Fail | 1
UT-0003 | 1 | Same as expected | Fail | 2
UT-0004 | 1 | Same as expected | Pass |
UT-0005 | 1 | Same as expected | Pass |
UT-0006 | 1 | Same as expected | Pass |
UT-0007 | 1 | Same as expected | Pass |
UT-0008 | 1 | Same as expected | |
UT-0009 | 1 | Same as expected | |
UT-0010 | 1 | Same as expected | Pass |
UT-0011 | 1 | Same as expected | Pass |
UT-0012 | 1 | Same as expected | Pass |
UT-0013 | 1 | Same as expected | Pass |
UT-0014 | 1 | Same as expected | Pass |
UT-0015 | 1 | Same as expected | Pass |
UT-0016 | 1 | Same as expected | Pass |
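For orientation only, the sketch below illustrates the kind of behaviour that UT-0005 (System Exception Handling) and UT-0008 (Execution Logging) look for in the code under test: system exceptions are trapped and reported rather than failing behind the scenes, and every call leaves a persistent execution trail. The module, function and log-file names are hypothetical and are not defined by this report or the workbench.

    # Illustrative sketch only: run_module and workbench_module.log are hypothetical names.
    import logging

    logging.basicConfig(filename="workbench_module.log",   # persistent execution trail (UT-0008)
                        level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def run_module(input_path: str) -> str:
        """Example entry point whose behaviour the unit tests above would check."""
        logging.info("run_module started with input_path=%s", input_path)
        try:
            with open(input_path) as fh:                    # may raise a system-defined exception
                data = fh.read()
        except OSError as exc:                              # trapped and reported (UT-0005)
            logging.error("run_module failed: %s", exc)
            raise RuntimeError(f"Could not read input: {exc}") from exc
        logging.info("run_module finished, %d bytes read", len(data))
        return data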


Sl No | Screenshot
1     | [screenshot image]
4     | [screenshot image]
5     | [screenshot image]
8     | [screenshot image]
9     | [screenshot image]
Test Category values: Coding Standards, Functionality, Error Handling, Logging, Workbench Integration, Performance
Status values: Pass, Fail, NA
Field Name | Description
Test Number | Unique number for each distinct test. For example, for unit testing a test number will be of the form "UT-XXXX"
Test Name | Name identifying the test
Test Description | Detailed description of the purpose and scope of the test
Expected Result | Elaboration of the expected behaviour when the test is executed
Test Category | Field identifying one of the many different categories of testing
Test Date | Date of testing in DD-MMM-YYYY format
Iteration # | When a test is run multiple times, this field identifies the actual iteration number
Actual Result | Details of the outcome from the test
Status | Pass/Fail. Use NA if the test is not applicable
Screenshot No | A reference to the defect screenshots
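The field list above maps naturally onto a small record type when the report is handled programmatically. The sketch below is one possible representation, assuming Python; the class name and the use of a dataclass are illustrative and not prescribed by this template.

    # Illustrative sketch of one test-report row, mirroring the field list above.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TestCaseResult:
        test_number: str          # e.g. "UT-0001"
        test_name: str
        test_description: str
        expected_result: str
        test_category: str        # e.g. "Coding Standards", "Error Handling"
        test_date: str            # DD-MMM-YYYY, e.g. "01-Jan-2001"
        iteration: int            # Iteration # when a test is run multiple times
        actual_result: str
        status: Optional[str]     # "Pass", "Fail" or "NA"; None if not yet executed
        screenshot_no: Optional[int] = None   # reference to the defect screenshots

    # Example row taken from the tables above:
    ut_0001 = TestCaseResult("UT-0001", "Variable Naming Conventions",
                             "Ensure that code is following variable naming standards",
                             "Variable names follow the project coding standards",
                             "Coding Standards", "01-Jan-2001", 1,
                             "Same as expected", "Fail", 1)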
You can pull the standard unit test cases "Unit Testing Template" from the project folder under /Templates.
