Abstract—Context: Testing is one of the most important phases of software development. However, in industry this phase is usually compromised by the lack of planning and resources. Due to this, the adoption of a streamlined testing process can lead to the construction of software products with desirable quality levels. Objective: Presenting the results of a survey conducted to identify a set of key practices to support the definition of a generic, streamlined software testing process, based on the practices described in the TMMi (Test Maturity Model integration). Method: Based on the TMMi, we have performed a survey among software testing professionals who work in both academia and industry. Their responses were analysed quantitatively and qualitatively in order to identify priority practices to build the intended generic process. Results: The analysis enabled us to identify practices that were ranked as mandatory, i.e. those that are essential and should be implemented in all cases. This set of practices (33 in total) represents only 40% of the TMMi's full set of practices, which sums up to 81 items related to the steps of a testing process. Conclusion: The results show that there is consensus on a subset of practices that can guide the definition of a lean testing process when compared to a process that includes all TMMi practices. It is expected that such a process encourages a wider adoption of testing activities in software development.

Index Terms—software testing process; TMMi practices;

I. INTRODUCTION

Since software became widely used, it has played an important role in people's daily lives. Consequently, its reliability cannot be ignored [1]. In this context, quality assurance (QA) activities should monitor the whole development process, promoting the improvement of the final product quality, and hence making it more reliable. One of the main QA activities is software testing which, when well executed, may deliver a final product with a low number of defects.

Despite the importance of software testing, many companies face difficulties in devising a software testing process and customising it to their reality. A major barrier is the difficulty in adapting testing maturity models to the specific environment of the organisation [2]. Many organisations realise that process improvement initiatives can solve these problems. However, in practice, defining the steps that can be taken to improve and control the testing process phases, and the order in which they should be implemented, is in general a difficult task [3].

Reference models, such as TMMi [4], point out what should be done for the improvement of the software testing process. However, such models do not indicate how to do it. Despite the model organisation in levels (such as in CMMI [5]), which suggests an incremental implementation from the lowest level, TMMi has a large number of practices that must be satisfied, though not all of them are feasible for all sizes of companies and teams. In addition, the establishment of a testing process relying on a reference model becomes a hard task due to the difficulty of model comprehension. Moreover, the models do not define priorities in case of lack of time and/or resources, thus hindering the adoption of the whole model.

According to Purper [6], the team responsible for defining the testing process usually outlines a mind map of the model requirements in relation to the desired testing process. This team manually verifies whether the mandatory practices, required by the model, are addressed. In general, these models indicate some prioritisation through their levels; however, within each level, it is not clear what should be satisfied first.

For better results, the testing process should include all phases of software testing. However, the process should be as minimal as possible, according to the reality of the company and the model used for software development. This adequacy can make the testing process easier to apply, without requiring many resources or a large team. This shall help the company achieve the goal of improving product quality.

Based on this scenario, we conducted a survey in order to identify which practices of TMMi should always be present in a testing process. Our goal was to characterise the context of Brazilian companies to provide them with a direction on how to define a lightweight, yet complete, testing process. Therefore, the survey results reflect the point of view of Brazilian testing professionals. Given that a generic testing process encompasses phases such as planning, test case design, execution and analysis, and monitoring [7, 8], we expected the survey could indicate which are the essential practices for each phase. The assumption was that there are basic activities related to each of these phases that should never be put aside, even when budget, time or staff are scarce.

The remainder of this paper is organised as follows: Section II describes the underlying concepts of this research. Section III presents the survey planning, how the participants were invited, and the data evaluation methods. Section IV shows the survey results and the participants' profile. Section V discusses these results for each stage of the generic testing process. Finally, Section VI presents possible threats to the validity of the survey, and Section VII presents the conclusions.

II. BACKGROUND

TMMi [4] is a reference model that complements CMMI [5] and was established to guide the implementation and improvement of testing processes. It is similar to CMMI in structure,
Authorized licensed use limited to: ULAKBIM UASL - CELAL BAYAR UNIVERSITESI. Downloaded on February 18,2024 at 08:18:10 UTC from IEEE Xplore. Restrictions apply.
[Figure: TMMi specific goals and practices related to test analysis and design. Level 2, PA2.4 Test Design and Execution, SG1 Perform Test Analysis and Design using Test Design Techniques: SP1.1 Identify and prioritise test conditions; SP1.2 Identify and prioritise test cases; SP1.3 Identify necessary specific data; SP1.4 Maintain horizontal traceability with requirements. Level 3, PA3.4 Non-Functional Testing, SG3 Perform Non-Functional Test Analysis and Design: SP3.1 Identify and prioritise non-functional test conditions; SP3.2 Identify and prioritise non-functional test cases; SP3.3 Identify necessary specific data; SP3.4 Maintain horizontal traceability with non-functional requirements.]
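The goal-and-practice hierarchy pictured above is a simple three-level tree: process areas contain specific goals, which contain specific practices. As a rough sketch of that structure (the names and nesting are taken from the figure; nothing beyond the two process areas shown is implied):

```python
# Sketch of the TMMi hierarchy pictured above: process areas (PA) contain
# specific goals (SG), which contain specific practices (SP). Names are taken
# from the figure; the nesting mirrors the TMMi structure described in the text.
tmmi = {
    "PA2.4 Test Design and Execution (Level 2)": {
        "SG1 Perform Test Analysis and Design using Test Design Techniques": [
            "SP1.1 Identify and prioritise test conditions",
            "SP1.2 Identify and prioritise test cases",
            "SP1.3 Identify necessary specific data",
            "SP1.4 Maintain horizontal traceability with requirements",
        ],
    },
    "PA3.4 Non-Functional Testing (Level 3)": {
        "SG3 Perform Non-Functional Test Analysis and Design": [
            "SP3.1 Identify and prioritise non-functional test conditions",
            "SP3.2 Identify and prioritise non-functional test cases",
            "SP3.3 Identify necessary specific data",
            "SP3.4 Maintain horizontal traceability with non-functional requirements",
        ],
    },
}

# Flatten the tree to enumerate every specific practice once.
practices = [sp for sgs in tmmi.values() for sps in sgs.values() for sp in sps]
print(len(practices))  # 8 practices across the two process areas shown
```

A flat enumeration like this is what the survey effectively worked with: the full TMMi tree flattens to the 81 practices mentioned in the abstract.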
Fig. 3. Example of question, structured according to a testing process phase and TMMi Specific Goals and Practices.
of the Identify and prioritise test conditions practice. The
answer distribution for this practice is summarised in Table II.
A. Profile Definition

Figure 6 summarises the level of knowledge of both the subjects and their institutions according to the profile characterisation questions. A description of the charts comes in the sequence.

[Fig. 6. Summary of profile characterisation. The five charts (a)-(e) are not recoverable from the extraction; their contents are described below.]

a) Experience: this chart shows that 46% of the subjects (17 out of 37) have more than three years of experience in testing, either in industry or academia; only 11% (4 out of 37) have less than one-year experience.

b) Testing Process: this chart shows that 65% of the subjects (24 out of 37) work (or have worked) in a company that has a testing process officially implemented (i.e. an explicit testing process). Of the remaining subjects, 22% (8 out of 37) have not worked in a company with an explicit testing process, while around 14% (5 out of 37) did not answer this question.

c) Certification: this chart shows that 59% of the subjects (22 out of 37) work (or have worked) in a company that has been certified with respect to a software process maturity model (e.g. CMMI, MR-MPS). The remaining subjects have never worked in a certified company (24%) or did not answer this question (16%).

d) Type of Certification: of the subjects who reported to work (or have worked) in a certified company (chart (c) of Figure 6), half of them (i.e. 11 subjects) are (or were) in a CMMI-certified company, while the remaining are (or were) in an MR-MPS-certified company.

e) TMMi: this chart reveals that only 8% of the subjects (3 out of 37) have had any practical experience with TMMi. Besides this, 59% of the subjects (22 out of 37) stated to have only theoretical knowledge of TMMi, whereas 32% (12 out of 37) do not know this reference model.

Based on the results depicted in Figure 6, we concluded that the sample is relevant with respect to the goals established for this work. This conclusion relies on the fact that, amongst the 37 subjects who fully answered the questionnaire, (i) 89% of them have good to high knowledge of software testing (i.e. more than one-year experience); (ii) 65% work (or have worked) in companies that officially have a software testing process; (iii) 59% work (or have worked) in a CMMI- or MR-MPS-certified company; and (iv) 67% are knowledgeable of TMMi, at least in theory. For CMMI-certified companies, the maturity levels vary from 2 to 5 (i.e. from Managed to Optimising). For MR-MPS-certified companies, the maturity levels range from G to E (i.e. from Partially Managed to Partially Defined).

To analyse the results regarding the level of importance of TMMi practices according to the subjects' personal opinion, we defined three different profiles as follows:

• Profile-Specialist: composed of 12 subjects who have at least three years of experience with software testing and work (or have worked) in a company that has a formally implemented software testing process.
• Profile-MR-MPS: composed of 20 subjects who are knowledgeable of MR-MPS and use this reference model in practice.
• Profile-TMMi: composed of 25 subjects who are knowledgeable of TMMi.

The choice for an MPS.BR-related profile was motivated by the close relationship between this reference model and the context of Brazilian software companies. Furthermore, these three specific profiles were defined because we believe the associated subjects' tacit knowledge is very representative. Note that the opinion of experts in CMMI was not overlooked at all; instead, such experts' opinions are spread over the analysed profiles. Finally, we also considered the answers of all subjects, in a group named Complete Set.

B. Characterising the Importance of TMMi Practices

As previously mentioned, the results described herein are based on the three profiles (namely Profile-Specialist, Profile-MR-MPS and Profile-TMMi) as well as on the whole survey sample. Within each profile, we identified which practices were mostly ranked as mandatory. The Venn diagram depicted in Figure 7 includes all mandatory practices according to each profile. The practices are represented by numbers and are listed in the table shown together with the diagram.

In Figure 7, the practices with grey background are also present in the set obtained solely from the statistical analysis described in Section III-D. As the reader can notice, this set of practices appears in the intersection of all profiles. Furthermore, practices with bold labels (e.g. practices 5, 7, 22, 31 etc.) are present in the set aimed to compose a lean testing process (this is discussed in detail in Section V). Next we describe the results depicted in Figure 7.

• Complete Set: taking the full sample into account, 31 practices were assigned level 4 of importance (i.e. ranked
2 Identify product risks 42 Implement the test environment
3 Analyse product risks 45 Perform test environment intake test
4 Identify items and features to be tested 46 Develop and prioritise non-functional test procedures
5 Define the test approach 49 Execute test cases
7 Define exit criteria 50 Report test incidents
9 Establish a top-level work breakdown structure 51 Write test log
10 Define test lifecycle 52 Decide disposition of test incidents in configuration control board
11 Determine estimates for test effort and cost 53 Perform appropriate action to close the test incident
12 Establish the test schedule 54 Track the status of test incidents
13 Plan for testing staffing 55 Execute non-functional test cases
15 Identify test project risks 56 Report non-functional test incidents
16 Establish the test plan 57 Write test log (non-functional)
17 Review test plan 58 Conduct peer reviews
19 Obtain test plan commitments 60 Analyse peer review data
20 Elicit test environment needs 63 Monitor test commitments
21 Develop the test environment requirements 66 Conduct test progress reviews
22 Analyse the test environment requirements 67 Conduct test progress milestone reviews
23 Identify non-functional product risks 69 Monitor defects
25 Identify non-functional features to be tested 71 Monitor exit criteria
26 Define the non-functional test approach 72 Monitor suspension and resumption criteria
27 Define non-functional exit criteria 73 Conduct product quality reviews
28 Identify work products to be reviewed 74 Conduct product quality milestone reviews
29 Define peer review criteria 75 Analyse issues
30 Identify and prioritise test conditions 76 Take corrective action
31 Identify and prioritise test cases 77 Manage corrective action
32 Identify necessary specific test data 79 Perform test data management
33 Maintain horizontal traceability with requirements 80 Co-ordinate the availability and usage of the test environments
38 Develop and prioritise test procedures 81 Report and manage test environment incidents
41 Develop test execution schedule
Fig. 7. Venn diagram that shows the intersections of results with respect to the practices ranked as mandatory by the majority of subjects within each profile.
as mandatory) by most of the subjects. The majority of them are also present in the other profile-specific sets, as shown in Figure 7. The reduced set of practices to compose a lean testing process includes these 31 items, and is complemented with practices 5 and 7 (the justification is presented in Section V).
• Profile-Specialist: 49 practices were ranked as mandatory by most of the subjects within this profile. From these, 27 practices appear in the intersection with at least one other set.
• Profile-MR-MPS: subjects of this profile ranked 33 practices as mandatory, from which only 30 are in intersections with the other profiles; only 3 practices are considered mandatory exclusively by subjects of this profile.
• Profile-TMMi: for those who know TMMi, 42 practices are mandatory, from which 41 appear in the intersections with other profiles.

V. ANALYSIS AND DISCUSSION

Before the definition of the intended reduced set of practices, we analysed the results of the second questionnaire, which was designed to resolve some dependencies observed in the initial dataset (i.e. based on the 37 analysed answers).
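The profile comparisons in Figure 7 boil down to set operations over each profile's set of mandatory practices. As a rough sketch of how such intersections can be computed (the practice numbers and profile contents below are placeholders, not the survey data):

```python
# Hypothetical illustration of the Figure 7 comparisons: each profile maps to
# the set of TMMi practice numbers its subjects ranked as mandatory.
# The numbers below are placeholders, NOT the actual survey results.
profiles = {
    "Complete Set": {2, 3, 4, 9, 10, 30, 31},
    "Specialist":   {2, 3, 4, 9, 30, 31, 32, 33},
    "MR-MPS":       {2, 3, 4, 9, 10, 31},
    "TMMi":         {2, 3, 4, 10, 30, 31, 32},
}

# Practices ranked as mandatory by every profile (centre of the Venn diagram).
core = set.intersection(*profiles.values())

# Practices exclusive to one profile (outermost regions of the diagram).
exclusive = {
    name: mand - set.union(*(s for n, s in profiles.items() if n != name))
    for name, mand in profiles.items()
}

print(sorted(core))            # practices every profile agrees on
print(exclusive["Specialist"]) # practices only this profile ranked mandatory
```

With the placeholder data, the core is {2, 3, 4, 31} and only practice 33 is exclusive to the Specialist profile; the survey's analysis is the same computation over the real rankings.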
The dependencies have been identified by Höhn [9], who pointed out some practices that must be implemented before the implementation of others. Based on the feedback of 14 subjects, all included in the initial sample, we were able to resolve the observed dependencies, which are related to the following practices: Analyse product risks, Define the test approach, and Define exit criteria.

Regarding Analyse product risks, the subjects were asked whether this task should be done as part of the testing process. We got 12 positive answers, thus indicating this practice is relevant, for example, to support the prioritisation of test cases. In fact, the Analyse product risks practice was already present in the reduced set of practices identified from the first part of the survey. In spite of this, we wanted to make sure the subjects had a clear comprehension that it should be performed as part of the testing process.

The subjects were also asked whether a testing approach could be considered fully defined when the product risks were already analysed, and items and features to be tested were already defined. This question was motivated by the fact that Define the test approach (practice 5 in Figure 7) was not present in the reduced set of practices derived from the initial questionnaire. For this question, we received 10 negative answers; that is, one cannot consider the testing approach fully defined only by analysing product risks and defining items and features to be tested. Therefore, we included practice 5 in the final set, thus resolving a dependency reported by Höhn [9].

The third question of the second questionnaire addressed the Define exit criteria practice (practice 7 in Figure 7), since it was not identified as mandatory after the first data analysis. Subjects were asked whether it is possible to run a test process without explicit exit criteria (i.e. information about when testing should stop). Based on 9 negative answers (i.e. 65%), this practice was also included in the reduced set.

This second analysis helped us to either clarify or resolve the aforementioned dependencies amongst TMMi practices. In the next sections we analyse and discuss the survey results. For this, we adapted Höhn's mind map [9] (Figures 8-12) according to each phase of a generic testing process. Practices highlighted in grey are identified as mandatory and should be implemented in any testing process.

[Fig. 8. TMMi practices related to Planning (mind-map content not recoverable from the extraction).]

A. Planning

Planning the testing activity is definitely one of the most
defining the test plan. Figure 7 shows that these two practices were mostly ranked as mandatory considering all profiles.

According to the IEEE-829 Standard for Software and System Test Documentation [15], a test plan shall include: a list of what will be and will not be tested; the approach to be used; the schedule; the testing team; test classes and conditions; exit criteria etc. In our survey, the Identify items and features to be tested, Establish the test schedule and Plan for test staffing practices were mostly ranked as mandatory. They are directly related to Establish the test plan, and address the definition of most of the items listed in the IEEE-829 Standard. This is complemented with Define exit criteria, selected after the dependency resolution. This evinces the coherence of the survey subjects' choices of mandatory practices with respect to the Planning phase.

The Planning phase also includes practices that address the definition of the test environment. In this regard, Elicit test environment needs and Analyse the test environment requirements are ranked as mandatory and are clearly inter-related.

To conclude this analysis regarding the Planning phase, note that not all TMMi specific goals are achieved only with the execution of this selection of mandatory practices. Despite this, the selected practices are able to yield a feasible test plan and make the process clear, managed and measurable.

After Planning, the next phase is related to Test Case Design. The input to this phase is the test plan, which includes some essential definitions such as the risk analysis, the items which will be tested and the adopted approach.

B. Test Case Design

Figure 9 summarises the results of our survey for this phase, based on the set of TMMi practices identified by Höhn [9]. As the reader can notice, only two practices were mostly ranked as mandatory by the Complete Set group of subjects: Identify and prioritise test cases and Identify necessary specific test data (both shown in grey background in Figure 9).

[Fig. 9. TMMi practices related to Test Case Design (mind-map content not recoverable from the extraction).]

It is likely that part of the subjects consider that the test plan itself already fulfils the needs regarding test case design, and thus most of the practices are not really necessary. For instance, if we considered solely the Profile-MR-MPS, none of the practices within this phase would appear in the results (see Figure 7 to crosscheck this finding). On the other hand, subjects of the other profiles consider that some other practices of this phase should be explicitly performed in a testing process. For instance, subjects of the Profile-Specialist profile ranked Identify and prioritise test conditions, Identify necessary specific test data and Maintain horizontal traceability with requirements as mandatory. For the Profile-TMMi subjects, Identify and prioritise test cases and Maintain horizontal traceability with requirements should be mandatory.

From these results, we can conclude that there is uncertainty about what should indeed be done during the test case design phase. Moreover, this uncertainty may also indicate that test cases are not always documented separately from the test plan; the plan itself includes the testing approach (and its underlying conditions) and the exit criteria. Thus, the two selected practices for this phase complement the needs to compose a feasible, streamlined testing process.

C. Setup of Test Environment and Data

As discussed in Section V-A, in the Planning phase the test environment requirements are identified and described. The Setup of Test Environment and Data phase addresses the prioritisation and implementation of such requirements. Figure 10 shows the TMMi specific goals and practices for this phase.

[Fig. 10. TMMi specific goals and practices related to Setup of Test Environment and Data (content not recoverable from the extraction).]
the prioritisation of test case execution. The other two practices (i.e. Implement the test environment and Perform test environment intake test) address the environment implementation and ensuring it is operational, respectively. The conclusion regarding this phase is that the four practices are sufficient to create an adequate environment to run the tests.

D. Execution and Evaluation

The next phase of a generic testing process consists of test case execution and evaluation. At this point, the team runs the tests and, eventually, creates the defect reports. The evaluation aims to assure the test goals were achieved and to report the results to the stakeholders [8]. For this phase, Höhn [9] identified 13 TMMi practices, which are related to test execution goals, management of incidents, non-functional test execution and peer reviews. This can be seen in Figure 11. As the reader can notice, only four practices were not ranked as mandatory. This makes evident the relevance of this phase, since it encompasses the activities which are related to test execution and management of incidents.

[Fig. 11. TMMi practices related to Execution and Evaluation. (Mind-map content not recoverable from the extraction.)]

The results summarised in Figure 11 include practices that regard the execution of non-functional tests. However, in the Planning and Test Case Design phases, the selected practices do not address the definition of such types of tests. Although this sounds incoherent, it may indicate that, from the planning and design viewpoints, there is not a clear separation between functional and non-functional testing. The separation is a characteristic of the TMMi structure, but for the testing community these two types of testing are performed in conjunction, since the associated practices as described in TMMi are very similar in both cases.

E. Monitoring and Control

The execution of the four phases of a generic testing process yields a substantial amount of information. Such information needs to be organised and consolidated to enable rapid status checking and, if necessary, corrective actions. This is addressed during the Monitoring and Control phase [7]. Figure 12 depicts the TMMi practices with respect to this phase. Again, the practices ranked as mandatory by most of the subjects are highlighted in grey. Note that there is consensus amongst all profile groups (i.e. Profile-Specialist, Profile-MR-MPS, Profile-TMMi and the Complete Set) about what is mandatory regarding Monitoring and Control. This can be crosschecked in Figure 7.

[Fig. 12. TMMi practices related to Monitoring and Control. (Mind-map content not recoverable from the extraction.)]

Performing the Conduct test progress reviews and Conduct product quality reviews practices means keeping track of both the testing process status and the product quality, respectively. Monitor defects addresses gathering metrics that concern incidents (also referred to as issues), while Analyse issues, Take corrective action and Manage corrective action are clearly inter-related practices. The two other practices considered mandatory within this phase are Co-ordinate the availability and usage of the test environments and Report and manage test environment incidents. Both are important since either unavailability of, or incidents in, the test environment may compromise the activity as a whole.
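Several of these monitoring practices amount to bookkeeping over the incident and test-result data produced during execution. A minimal sketch of Monitor defects and Monitor exit criteria (the record fields, the 95% pass-rate threshold and the zero-open-incidents rule are illustrative assumptions, not TMMi requirements):

```python
# Illustrative sketch of two Monitoring and Control practices:
# consolidate incident records (Monitor defects) and check a hypothetical
# exit criterion (Monitor exit criteria). All field names, the 95% pass-rate
# threshold and the zero-open-incidents rule are assumptions for illustration.
from collections import Counter

incidents = [
    {"id": 1, "status": "open"},
    {"id": 2, "status": "closed"},
    {"id": 3, "status": "open"},
]
test_results = {"passed": 96, "failed": 2, "blocked": 2}

def defect_metrics(incidents):
    """Monitor defects: count incidents per status for status reporting."""
    return Counter(i["status"] for i in incidents)

def exit_criteria_met(results, min_pass_rate=0.95, open_incidents=0):
    """Monitor exit criteria: e.g. pass rate reached and no open incidents."""
    total = sum(results.values())
    return results["passed"] / total >= min_pass_rate and open_incidents == 0

metrics = defect_metrics(incidents)
print(metrics)  # Counter({'open': 2, 'closed': 1})
print(exit_criteria_met(test_results, open_incidents=metrics["open"]))  # False
```

Here the pass rate (96%) satisfies the threshold, but the two open incidents keep the exit criterion unmet, which is exactly the kind of signal the Analyse issues and Take corrective action practices would then act on.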
As a final note with respect to the survey results, we emphasise that the subjects were not provided with any information about dependencies amongst TMMi practices. Besides this, we were aware that the inclusion of practices not mostly ranked as mandatory might have created new broken dependencies. Despite this, the analysis of the final set of mandatory practices shows that all dependencies are resolved.

VI. VALIDITY THREATS

This section describes some issues that may threaten the validity of our results. Despite this, the study limitations did not prevent the achievement of significant results with respect to software testing process definition, based on the opinion of software testing professionals.

A first limitation concerns the questionnaire design. The questions were based on the TMMi structure, and so were the help notes provided together with the questions. Even though the intent of the help notes was to facilitate the subjects' understanding of the questions, they might not have been enough to allow for correct comprehension. Although the TMMi structure is very detailed, aiming to facilitate its implementation, this structure can become confusing for readers, who cannot comprehend the difference between some activities. For instance, in this survey it was clear that the practices related to functional and non-functional testing were not understood as distinct activities, since they were ranked as mandatory only in the Execution and Evaluation phase.

Another threat regards the scale of values used in the first questionnaire. The answer scale was composed of four values. This represented a limitation for the statistical analysis, since the responses were mostly concentrated in values 3 and 4. If a wider scale had been used, e.g. from 1 to 10, this could have yielded a better distribution of answers, thus enabling us to apply a more adequate interpretation model.

The sample size was also a limitation of the study. In practice, although the sample includes only software testing professionals, its size is small relative to the real population. Perhaps the way the participation call was announced and the time it was available may have limited the sample.

VII. CONCLUSIONS AND FUTURE WORK

This paper described a survey that was conducted in two stages and investigated whether there is a subset of TMMi practices that can be considered essential for a generic testing process. The survey was applied amongst professionals who work with software testing. The analysis led us to conclude that, from the set of 81 TMMi practices distributed by Höhn [9] across the phases of a generic testing process, 33 are considered essential for maintaining consistency when such a process is defined. This represents a reduction of around 60% in the number of TMMi practices. Note that the other TMMi practices are not disposable; however, when the goal is to implement a streamlined process, or even when the company does not have the necessary know-how to implement its own testing process, it can use this reduced set of practices to do so. Thus, the results reported in this paper represent a simplified way to create or improve testing processes, which is based on a recognised reference model.

The practices highlighted in Figures 8-12 can also indicate the priority of implementation for a company that is using TMMi as a reference for its testing process. This model does not indicate what can be implemented first, or the possible dependencies amongst the process areas. Nonetheless, the results of this study point out a set of activities that can be implemented as a priority. At a later stage, the company may decide to continue to deploy the remaining practices required by the model in order to obtain TMMi certification.

TMMi is fine-grained in terms of practices and their distribution across the specific goals and process areas. Even though this may ease the implementation of practices, it makes the model complex and difficult to understand. Once a company is willing to build a testing process based on a reference model, this process must be in accordance with its reality. Not all TMMi practices are feasible for all sizes of companies and teams. Thus, it is important to be aware of a basic set of practices that, if not performed, may compromise the quality of the process, and hence the quality of the product under test. In this context, we hope the results of this work can support small and medium companies that wish to implement a new testing process, or even improve their current processes.

ACKNOWLEDGEMENTS

We thank CAPES and CNPq for their financial support.

REFERENCES

[1] P. Cao, Z. Dong, and K. Liu, "An Optimal Release Policy for Software Testing Process," in 29th Chinese Control Conference, 2010, pp. 6037-6042.
[2] A. Rodrigues, P. R. Pinheiro, and A. Albuquerque, "The definition of a testing process to small-sized companies: The Brazilian scenario," in QUATIC'10. IEEE, 2010, pp. 298-303.
[3] J. Andersin, "TPI - a model for test process improvement," University of Helsinki, Helsinki, Finland, Seminar, 2004.
[4] TMMi Foundation, "Test Maturity Model integration (TMMi) (Version 3.1)," pp. 1-181, 2010.
[5] SEI, "Capability Maturity Model Integration Version 1.2 (CMMI-SE/SW, V1.2 - Continuous Representation)," Carnegie Mellon University, Tech. Report CMU/SEI-2006-TR-001, 2006.
[6] C. B. Purper, "Transcribing Process Model Standards into Meta-Processes," in EWSPT'00. London, UK: Springer-Verlag, 2000, pp. 55-68.
[7] A. N. Crespo, M. Jino, M. Argollo, P. M. S. Bueno, and C. P. Barros, "Generic process model for software testing," Online, 2010, http://www.softwarepublico.gov.br/5cqualibr/xowiki/Teste-item13 - accessed on 16/04/2013 (in Portuguese).
[8] A. M. J. Hass, "Testing processes," in ICSTW'08. IEEE, 2008, pp. 321-327.
[9] E. N. Höhn, "KITest: A framework of knowledge and improvement of testing process," Ph.D. dissertation, University of São Paulo, São Carlos, SP, Brazil, Jun. 2011 (in Portuguese).
[10] "LimeSurvey," http://www.limesurvey.org/, Apr. 2011.
[11] Softex, Improvement of Brazilian Software Process - General Guide (in Portuguese), Online, Softex - Association for Promoting Excellency in Brazilian Software, 2011.
[12] E. Whitley and J. Ball, "Statistics review 6: Nonparametric methods," Critical Care, vol. 6, no. 6, p. 509, Sep. 2002.
[13] J. Miller, "Statistical significance testing - a panacea for software technology experiments?" Journal of Systems and Software, vol. 73, no. 2, pp. 183-192, 2004.
[14] V. Basili and R. W. Reiter, "A controlled experiment quantitatively comparing software development approaches," IEEE Trans. Soft. Engineering, vol. SE-7, no. 3, pp. 299-320, 1981.
[15] "IEEE standard for software and system test documentation," IEEE Std 829-2008, pp. 1-118, 2008.