
Evaluating Testing Strategies Dams and Rajadorai

Evaluating Testing Strategies and Processes in Software Development and
Porting Software Applications
Gabriel Lazarus Dams (1) and Kesava Pillai Rajadorai (2)
(1) Department of Mathematical Sciences, Tafawa Balewa Way, Kaduna State
University – Kaduna, Nigeria
Email: damsgabe@gmail.com or damsgabe@kasu.edu.ng
(2) Department of Computer Science and Software Engineering, Asia Pacific
University of Technology & Innovation (APU), Technology Park Malaysia, Kuala
Lumpur, Malaysia
Email: kesava95@gmail.com or kesava@apu.edu.ng

ABSTRACT
One of the important components of software quality assurance in software
development is testing. This is because testing software consumes over 50%
of the development effort. To minimize development time, porting software
applications has become a necessity, although this has not really improved
testing activities. Software porting is the process of making the necessary
changes to an existing application running on one operating system and
hardware architecture so that the application can run on another operating
system and hardware architecture.
The testing activities carried out during software development or porting are
ad hoc, without well-established guidelines. As such, several testing models
have been established by researchers and bodies such as IEEE and ISO to
guide and ease testing activities.
This paper reviews seven different testing models and approaches used
during software development or porting. The strengths and weaknesses of
these testing approaches are enumerated and evaluated. After evaluation, the
limitations of each testing technique are pointed out, with the aim of
overcoming these limitations in future work.

Keywords: Ported Software Applications, Porting, Software Testing Strategy,
Testing Standards

1- INTRODUCTION
Software testing encompasses a wide spectrum of diverse activities, ranging
from unit testing to acceptance testing. Its position cannot be
overemphasized: it is the most important component of software quality
assurance, which, according to [1], consumes more than 50% of the
development effort. This makes it a significant part of software engineering [2,
3]. Despite the resources that testing consumes, [4] observed that the test
selection performed in testing activities is largely ad hoc and expensive.
This makes the testing performed unpredictably effective, which could be due
Int. J. of Software Engineering, IJSE Vol.9 No.2 July 2016

to reusing the same testing technique and crafting ad hoc testing methods
instead of applying good engineering practice [5]. The use of good
engineering practice would aid the selection of testing methods appropriate
for the particular software or product to be tested.
According to [6], the most important part of testing is having a testing
strategy that uses the most effective and efficient testing methods. However,
there are many different testing methods, which makes choosing a technique
complex for practitioners in the absence of well-established guidelines. This
complexity makes practitioners avoid properly testing software or applications
because of its time consumption [7], especially when it comes to testing
software applications for smart devices such as mobile devices, which have
additional requirements not commonly found in traditional ones [8].
For example, over 260 Android devices were released in 2011, each with
different form factors and sets of features [9], which makes the testing
technique or strategy for mobile applications different from traditional,
PC-based ones. When these mobile applications have been ported, they need
to be tested properly, without assuming that because an application worked
well on one platform, it will work on another. Testing activities need to be
strategic so that testing effort can be reduced without forgoing the quality
and reliability of the application, as stated by [1], which is of crucial
importance today. One of the ways in which testing effort has been reduced is
through a process called software porting.
Software porting can be seen as the process of taking an existing application
that has been running successfully on one operating system and hardware
architecture and making the changes necessary to enable the application to
run on another operating system and hardware architecture.
The increase in the variety of smart devices and software on the market,
aimed at effectively and efficiently meeting market demands, has made
software porting a necessity. Software porting focuses on source code
modification. These modifications are performed in such a way that the
intended system's functionality and design are not affected [10]. The need for
porting generally sprang from technical and business constraints, where
different devices are released within a short period of time to meet the needs
of various customer profiles. Developers are therefore required to make
available multiple versions of software applications optimized for every
device. Porting significantly reduces a software application's time to market
(via code reusability) and reduces the work of creating software from scratch
[11, 12].
To ascertain the quality of a ported software application, its reliability
estimation and its verification or validation status, testing is needed [13].
Verification is the process of checking and ensuring that a product or
software application is built correctly and meets its specified requirements.
Validation is the process of checking and ensuring that the software fulfills
the user's actual needs and behaves correctly when executed.
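The distinction between verification and validation can be sketched in a few lines of code. The invoice scenario and both function names below are hypothetical, chosen purely for illustration: one check compares the output against a written specification (verification), the other against what the user actually expects (validation).

```python
# Hypothetical sketch: verification checks an output against the written
# specification, while validation checks it against the user's actual need.

def spec_requires_rounding_to_cents(total: float) -> bool:
    """Verification: the (assumed) spec says totals are rounded to 2 decimals."""
    return round(total, 2) == total

def user_accepts_invoice(total: float, expected: float) -> bool:
    """Validation: the user confirms the invoice matches what they owe."""
    return abs(total - expected) < 0.005

invoice_total = round(19.99 * 3, 2)
print(spec_requires_rounding_to_cents(invoice_total))  # verification passes
print(user_accepts_invoice(invoice_total, 59.97))      # validation passes
```

A product can pass one check and fail the other, which is why testing a ported application needs both.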


Since testing is given major consideration in any software design process, by
verifying the correctness of software applications, it cannot be overlooked for
ported applications [14, 15].
In this research, various testing strategies are evaluated from a testing
process perspective in order to enumerate the strengths and weaknesses of
these strategies with respect to porting software applications.

2- REVIEWING SOFTWARE TESTING STRATEGIES AND PROCESSES
According to [16], a testing strategy is any group of activities in a test
process that describes the steps to be taken, with consideration given to the
amount of effort and resources required to achieve good software
construction.
[5] mentioned that "strategy gives a road map to testing". Also, a testing
strategy shouldn't be rigid; rather, it should be considerably flexible so as to
promote a testing approach that is customizable yet correct.
Testing activities are directly affected by the components of the testing
strategy, as mentioned by [17], who defined the components that make up a
testing strategy in an organization (shown in Fig. 1 below), stating that the
testing strategy is the core of the test process and defines the overall
framework for testing software applications in an organization.
[18] stressed that organizations need to establish a test strategy because it
aids in identifying and reducing business risks. As such, test strategies should
be defined from the software requirement specifications (SRS) together with
the user requirements (UR) in order to elaborate plans for validation.
The concept of a testing strategy has been defined in several certification
models and industry standards. For example, according to [19], the V-model
emphasizes the importance of testing and suggests that testing activities run
in parallel with software development. The IEEE 829 [20] testing strategy
emphasizes the production of a document for every stage of the testing
phase. The ISO/IEC 29119 [21] testing strategy involves the whole
organization, and its model splits the test process into three layers. The
top/first layer is the test organization level, where the test policy and strategy
of the whole organization are defined. The second layer is the test
management level, where the testing activities in the project are defined. The
bottom/last layer is the level where the actual testing is performed.
Some of these certification models and industry standards are discussed
further below in order to give a better understanding of the testing practices
and strategies they suggest.


Fig. 1 – Components of Test Strategy adapted from Kasurinen (2011)

2-1 THE V-MODEL TESTING STRATEGY


The V-model was born from the waterfall model. However, the V-model
differs from the waterfall model in that the waterfall suggests testing be
performed predominantly towards the end of a software development phase,
while the V-model emphasizes the importance of testing throughout.
Therefore, the V-model suggests that testing be performed in parallel with
software development. The V-model describes how verification and validation
activities are carried out hand-in-hand with the software development cycle,
and how testing activities are linked to all development stages: reviewing user
requirements, defining the scope of the system by describing test plans,
component testing, integration testing and, lastly, user acceptance testing
[19].
The V-model shown in Fig. 2 proceeds from top to bottom in a linear form
and then from bottom to top, forming a V shape. This depicts the basic
sequence of software development and testing activities. The model
highlights the different levels of testing on its right side and shows how each
testing level relates to the corresponding development phase on its left side,
portraying a horizontal relationship between the two sides. For example, the
requirements gathering stage gathers, verifies and validates the requirements
so as to justify the project; the business requirements gathered at this stage
also guide the user acceptance testing (UAT). The model thus illustrates how
the work done in each development phase should subsequently be verified
and validated by the corresponding test phase, and how the work done during
development is used to guide the different testing phases [22-24].


Fig. 2 – The V-Model Testing Strategy

2-2 TESTING STRATEGY BY IEEE 829 STANDARD


The IEEE 829 [20] testing standard emphasizes, as a form of strategy, the
production of sets of documents throughout the defined stages of software
testing. Each stage potentially produces its own separate document. The
standard specifies the format of the documents to be produced, but does not
include any criteria regarding adequate content for these documents. The
testing strategy in IEEE 829 defines the activities which should take place in
each process. A summary of the processes and activities is shown in Table 1
below.
Table 1 – IEEE 829 - 2008 Test Process Strategy adapted from IEEE 829 (2008)

Process: Management
  Activity: Test management
  Comment: Involves preparing plans for the test process execution, and
  monitoring and evaluating all test results. Example activities: Master Test
  Plan (MTP) generation, management review of the test effort, and so on.

Process: Acquisition
  Activity: Acquisition support
  Comment: Covers the activities involved in initiating the testing project.
  Example activities: preparing a Request for Proposal (RFP), assessing
  system requirements, and so on.

Process: Supply
  Activity: Test planning
  Comment: Initiates the development of the entire testing plan in the
  project. Example activities: defining the scope of testing, the integrity
  level of the application, and so on.

Process: Development
  Activities: Concept; Requirement; Design; Implementation; Test;
  Installation/Checkout
  Comment: Covers the development activities (requirements analysis, design,
  integration of components, testing, installation and acceptance) by
  organizing the testing life cycle activities to verify the development
  activities.

Process: Operation
  Activity: Operational test
  Comment: Covers the application's operation and operation support for
  users. Activities such as executing the operational test, evaluating
  operational test results, and so on are performed here.

Process: Maintenance
  Activity: Maintenance test
  Comment: Activated when there is any modification to either the
  documentation or the application. Example activities: revising all affected
  test documentation, performing anomaly evaluation, and so on.

The IEEE 829 standard further states that once each task has been identified,
the needed inputs and resultant outputs can be identified; these inputs and
outputs determine the test documentation that is needed. For example, Fig. 3
shows an overview of the types of test documentation described by the
standard using four levels of test, listing each type of document with its
primary focus. As shown in Fig. 3, there are two types of test planning
document: the Master Test Plan (MTP) and the Level Test Plan (LTP). The
LTPs, for example, comprise the acceptance test plan, system test plan,
component integration test plan and component test plan. The MTP identifies
the number of levels of test required, while there can be one or more LTPs
for each level identified in the MTP. Each LTP can result in a more detailed
series of documents. Test executions are performed from the Level Test Case
(LTC) and Level Test Procedure (LTPr) documents, which provide sufficiently
detailed information. During execution of the actual testing, an interim Level
Test Status Report may be produced multiple times. Four types of report
document can be produced subsequent to execution of the actual test,
namely the Level Test Log (LTL), the Anomaly Report (AR), one Level Test
Report (LTR) for each level of test, and one Master Test Report (MTR), which
finalizes and concludes all the levels of test.
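The fan-out of documents just described can be sketched as a small data model. The dataclass layout below is an assumption for illustration only, not part of the standard; it only captures the containment relationships: one MTP identifies the levels, each level gets an LTP, and each LTP collects its more detailed case and procedure documents.

```python
# A sketch of the IEEE 829 document hierarchy: one Master Test Plan (MTP)
# identifies the test levels, each level has a Level Test Plan (LTP), and
# each LTP fans out into detailed case (LTC) and procedure (LTPr) documents.

from dataclasses import dataclass, field

@dataclass
class LevelTestPlan:
    level: str                                       # e.g. "system"
    cases: list = field(default_factory=list)        # LTC documents
    procedures: list = field(default_factory=list)   # LTPr documents

@dataclass
class MasterTestPlan:
    levels: list = field(default_factory=list)       # one or more LTPs

mtp = MasterTestPlan(levels=[
    LevelTestPlan("component"),
    LevelTestPlan("component integration"),
    LevelTestPlan("system"),
    LevelTestPlan("acceptance"),
])
print(len(mtp.levels))  # the four levels of test shown in Fig. 3
```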


Fig. 3 – Testing Strategy via Test Documentation by IEEE 829-2008

2-3 TESTING STRATEGY BY ISO/IEC 29119 SOFTWARE TESTING STANDARD
The testing strategy stated in the working draft of the ISO/IEC 29119
software testing standard, as mentioned by [17], takes a different approach to
the software testing process. The ISO/IEC 29119 testing strategies include
both validation and verification activities for the item being tested. The
strategy takes the whole organization into consideration: the organization is
responsible for developing and maintaining organizational test specifications
such as the testing policies and testing strategies. The organization's policies
and strategies steer the testing work at the project level, where test plans are
created at the project management level. Monitoring and controlling of the
testing activities are performed at the fundamental level. The results obtained
at the fundamental level are used to create a test completion report, which is
used along with the feedback to develop testing at both the project and
organizational levels. Fig. 4 shows the standard test process in a nutshell,
while Fig. 5 gives a broader illustration of the different test levels and
structures.
Comparing the ISO/IEC 29119 standard with the earlier mentioned IEEE 829
standard, it can be stated that while IEEE 829 emphasizes separate test
documentation for each test process, ISO/IEC 29119 combines the IEEE 829
concepts and more specific standards into one process model which
encapsulates the software organization entirely and produces four main
documents (test policy, test strategy, test plan and test completion report) for
the test process, defining how an organization and individual projects should
conduct testing.
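The three-layer structure described above can be sketched as a small table mapping each layer to the documents produced there. The document names follow the description in the text; the data structure itself is purely illustrative.

```python
# A sketch of the three ISO/IEC 29119 layers and the main documents each
# produces: organizational specifications steer project-level planning, and
# the fundamental level is where actual testing is designed and executed.

LAYERS = [
    ("organizational", ["test policy", "test strategy"]),
    ("test management", ["test plan", "test completion report"]),
    ("fundamental", ["test design", "test execution", "test reporting"]),
]

def documents_at(layer_name: str) -> list:
    """Return the documents owned by a given process layer."""
    for name, docs in LAYERS:
        if name == layer_name:
            return docs
    raise KeyError(layer_name)

print(documents_at("organizational"))  # ['test policy', 'test strategy']
```

The feedback loop in the standard runs bottom-up: results produced at the fundamental level feed the test completion report, which in turn informs the organizational documents.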

Fig 4 – ISO/IEC 29119 Test Process Model Overview

Fig. 5 – ISO/IEC 29119 Test Process Test Levels and Structure


2-4 TESTING STRATEGY IN MOBILE APPLICATIONS DOMAIN


According to [25], mobile application development is in many respects similar
to other application development, such as PC-based applications. For
example, integration with other hardware devices, reliability, performance,
storage and security issues, which are considered in traditional software
development, are also considered in mobile application development.
However, [26] mentioned that mobile application domains are not like
PC-based application domains, as they do not simply imitate the desktop
environment. Mobile applications have their own kinds of user interface (UI)
requirements, infrastructure dependencies and business process flows.
Although the computing resources of mobile devices are limited, mobile
applications are often designed to be as agile and reliable as PC-based
applications, and thus the testing strategies applied to mobile applications
should take a separate dimension.
Despite the fact that the mobile application domain is different from the
PC-based domain, the testing strategies for the two domains have not been
different. Most mobile development companies are aware of the differences
between the mobile and PC-based application domains but do not have a
specific process for testing their mobile applications. In other words, the
testing performed for mobile applications by these organizations is not
strategic or executed systematically. One of the reasons for this is insufficient
formal research around their software engineering processes, despite an
existing body of knowledge with many guidelines [25, 27].
For instance, [28] developed a testing flowchart as a strategy, based on
V-model software development, to test a mobile application under
development, stating that testing must go along with the development
process. The test flowchart (shown in Fig. 6) was divided into four phases,
namely test development, test preparation, test execution and test summary.
In the test development phase, the test plan (TP), software requirement
features (SRF) and features to be tested (FTT) would be developed, where
the TP would be established by the product manager and the SRF would be
prepared by the test engineer. The results of this phase would be used to
update the features to be tested within the mobile application.
In the test preparation phase, the TP would have been prepared, defining the
verification and validation activities. Test cases (TCs) are prepared based on
the SRF, and the features of the mobile application to be tested are also
prepared. The testing equipment and the testers are made ready for
execution of the actual testing activities.
In the test execution phase, the actual testing is performed by the testers
following the TP and the prepared TCs. Errors/bugs found are reported, and
the flow of error reports goes to top management as progress reports (see
Fig. 6).
The test summary phase produces the overall testing report for all the
testing activities performed, which concludes the strategy.


Fig. 6 – Testing Flowchart in Cao et al (2012) Testing Strategy

2-5 REVIEWING TESTING STRATEGIES FOR PORTED SOFTWARE APPLICATIONS
Porting an application, according to [29], is the process of taking an
application running on one operating system and hardware architecture and
making the necessary changes, recompiling it to enable it to run on another
operating system and hardware architecture. The porting of mobile
applications, as stated earlier, stems from a combination of technical and
business constraints: manufacturers release different devices within a short
period of time, targeted at meeting different customers' needs. As such,
developers are forced to provide multiple versions of an application and to
optimize each version for a specific device [11].
[30] stated that when porting is properly planned, prepared and analyzed, it is
cost-effective and less time-consuming, because developers do not have to
go through the work of redeveloping the application. [31] further mentioned
that porting specifications are clear, since the application already exists on a
particular platform; however, this shouldn't be taken for granted, as improper
planning, especially in the conduct of testing, could be more costly and more
time-consuming. It can therefore be stated that careful planning and a proper
testing strategy are necessary when porting mobile applications. Assuming
that, because an application worked on a particular platform, it will work
across other platforms could be costly [9].
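At the source-code level, porting often comes down to isolating platform differences behind a single seam. The sketch below is not from any of the cited works; it uses Python's standard `platform` module, and the directory conventions shown are common defaults assumed for illustration.

```python
# A minimal porting tactic: confine the platform-specific difference to one
# function so the rest of the application is identical on every target.

import platform

def config_dir(app: str) -> str:
    """Return a per-platform configuration directory for the application."""
    if platform.system() == "Windows":
        return "%APPDATA%\\" + app
    return "~/.config/" + app  # Linux, macOS and other Unix-likes

print(config_dir("myapp"))
```

Precisely because such seams behave differently on each target, the text's warning applies: the ported build must be re-tested on the new platform rather than assumed correct.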


In an effort to properly evaluate the available testing strategies for porting
software applications, three different testing strategies, by [32], [33] and
[34, 35], are considered.

2-5.1 Testing Strategy by Jerome et al


In the porting of a web application to mobile devices performed by Jerome et
al. [32], a parallel testing strategy was used. This means that the testing
phase "was performed in parallel with the implementation of the various
interfaces" during porting of the application. The porting process was divided
into five phases, namely analysis of the existing system, functional analysis,
interface sketches (mock-ups), implementation and tests (see Fig. 7).
The functional analysis, interface sketches (mock-ups) and implementation
phases were validated by users with architectural knowledge of the native
application, to ensure that the application to be ported took all the users'
needs into consideration. The users with architectural background knowledge
of the native application acted as testers, validating the phases prior to the
implementation phase, while another group of users with knowledge of the
initial application performed the actual testing during, and in parallel with,
the implementation phase. The users performing the actual testing were
concerned with reporting missing functionality and suggesting possible
improvements.

Fig. 7 – Testing Strategy of Jerome et al (2010) During Porting

2-5.2 Testing Strategy by Ralph and Gerard


[33] ported a legacy application to a newer platform using the Test-Driven
Development technique, which they tagged "Test-Driven Porting". The
process is illustrated diagrammatically in Fig. 8. The analysis phase analyzed
the features, functions, databases and requirements of the legacy system to
acquire the porting specifications. These specifications determined what was
to be added or dropped. The next phase was the design phase, which
involved the design of a regression test-suite for the legacy system by the
development team (mainly by the domain expert); this test-suite was also to
be used for the target application.


The implementation phase involved the actual porting activities performed by
developers, where the functionalities were ported incrementally alongside the
first phase of regression tests performed by the developers.
After the relevant tests had passed, the ported functionality was considered
complete by the developers and then passed to the testers to perform
comparative functional testing using the same generated test-suite used by
the developers. This activity in the implementation phase was performed
iteratively throughout the project and released in a series of increments.
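The core idea of test-driven porting, as described above, is that one regression suite is run against both the legacy and the ported implementation, and an increment is complete only when both agree. The two discount functions below are hypothetical stand-ins for a legacy feature and its port.

```python
# Test-driven porting sketch: the same regression suite exercises the legacy
# and the ported implementation; agreement on every case completes the port.

def legacy_discount(price: float) -> float:
    return price * 0.9  # behaviour on the legacy platform

def ported_discount(price: float) -> float:
    return price * 0.9  # the ported code must reproduce it exactly

REGRESSION_SUITE = [0.0, 10.0, 99.99]

def port_is_complete() -> bool:
    """A ported functionality passes when every regression case agrees."""
    return all(legacy_discount(p) == ported_discount(p)
               for p in REGRESSION_SUITE)

print(port_is_complete())  # True: the increment can go to the testers
```

The comparative functional testing step then reuses the same suite, which is what makes the suite's design (by the domain expert) so central to this strategy.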

Fig. 8 – Testing Strategy Illustration of Ralph and Gerard (2005) During Porting

2-5.3 Testing Strategy by RapidSoft


The porting process presented by [34] places the testing phase after the port
development phase (see Fig. 9), and the testing strategy employed in [35]
comprised unit testing, integration testing, functional testing, system testing
and acceptance testing. According to the authors, each port is tested and
verified on a live mobile device to ensure that it functions as intended, like
the original mobile application.


Fig. 9 – Testing Strategy by RapidSoft_Systems (2008 and 2009)

3- SUMMARY OF TEST PROCESSES AND STRATEGIES IDENTIFIED
Table 2 below presents a summary of the testing process components
identified from the several testing processes used for software application
development or porting. From these testing processes, two testing standards
(IEEE 829 and ISO/IEC 29119), one model (the V-Model) and four testing
strategies from different sources (Cao et al., Jerome et al., Ralph and
Gerard, and RapidSoft Systems) have been identified. The 'strategy concept'
entry highlights the technique used in each testing process, while the
'process components and activities' entry lists the components and activities
carried out in each process, in order.


Table 2 – Summary of Test process components and activities from identified testing
strategies

STANDARDS AND MODELS (PC-BASED)

• V-Model
  Strategy concept: verification and validation technique; parallel strategy.
  Process components and activities:
  - VERIFICATION: requirement analysis; design; implementation.
  - VALIDATION: unit test; integration test; system test; acceptance test.

• IEEE 829
  Strategy concept: test documentation technique.
  Process components and activities: management; acquisition; supply;
  development (concept, requirement, design, implementation, test,
  installation/checkout); operation; maintenance.

• ISO/IEC 29119
  Strategy concept: hierarchy-based technique.
  Process components and activities:
  - ORGANIZATIONAL LEVEL: test policy; test strategy.
  - TEST MANAGEMENT LEVEL: test plan; test completion report.
  - FUNDAMENTAL LEVEL: test design and implementation; test execution;
    test reporting; testing environment.

MOBILE DEVELOPMENT

• Cao et al.
  Strategy concept: concept extracted from the V-Model.
  Process components and activities: development; testing preparation;
  testing execution; testing summary.

PORTING TO MOBILE

• Jerome et al.
  Strategy concept: verification and validation technique.
  Process components and activities: analysis of existing system; functional
  analysis (validated by users); interface sketching (validated by users);
  implementation (validated by users); tests.

• Ralph and Gerard
  Strategy concept: test-driven porting technique.
  Process components and activities: analysis of existing system; porting
  requirements and specifications; test suite design; implementation.

• RapidSoft Systems
  Strategy concept: concept extracted from the V-Model.
  Process components and activities: port design (verification); device
  appraisal (verification); port development (review); port test phase (unit
  test, integration test, system test, functional test, acceptance test).


4- EVALUATION AND GAP ANALYSIS OF THE TESTING STRATEGIES REVIEWED
It can be observed from the testing processes reviewed that testing standards
and testing models have given birth to the various ad hoc testing techniques
used by different researchers, as highlighted in Table 2 above. In other
words, the ad hoc testing techniques usually used are based either on one of
the two established testing standards (IEEE 829 or ISO/IEC 29119) or on the
V-Model.
These testing standards and models, however, do not cover other aspects of
the testing process, such as testing within the mobile environment and
testing when porting software applications.
For example, the V-model requires testing activities to run in parallel with
development activities. But the requirements gathering procedure in the
V-model should be different from the requirements gathering procedure when
porting a software application. The V-model approaches testing activities as
if the application were being built from scratch, but in the case of porting,
part of the requirements already exists; the existing application merely needs
to be ported to another environment. The V-model does not address this
situation.
The IEEE 829 testing strategy emphasizes documentation of each phase, but
the mobile application environment, for instance, is agile-based, which
implies less documentation. Moreover, given the rate at which users demand
mobile applications nowadays, it would not be feasible to follow the
documentation recommendations of IEEE 829.
ISO/IEC 29119, on the other hand, incorporated IEEE 829 into its model (with
less documentation) and emphasized that the testing strategy should be set
at the organizational level. However, how the test strategy should be carried
out was not described, which might be tedious for new organizations that
want to adopt this test process for mobile applications or for porting software
applications. Moreover, given the diversity of mobile applications made for
different platforms, organizations cannot afford to take the traditional
strategy of testing and apply it to the mobile application environment,
considering that the mobile environment is different from the desktop or
PC-based environment.
[32] ported a web application to mobile devices and described how the
porting and testing activities were done. However, the authors used the
principle of parallel testing, which is still essentially the concept of the
V-Model (the verification and validation process).
[33] used a concept they called "Test-Driven Porting", but the focus was on
the programmers. The programmers performed the porting, developed the
test cases and did part of the testing. The testers then verified what the
developers had already tested, using the test cases prepared by the
developers. This could cause delay and rework, because the testers do not
appear to be in the full picture of the porting lifecycle: they need to
understand the test cases prepared by the developers, and if any problem
occurs during the course of the testing activities, the developers need to be
contacted for clarification before proceeding. This slows down testing
activities.
Finally, the testing strategy by [34] clearly shows that the validation concept
of the V-model was used in the testing process.


5- CONTRIBUTION OF THIS RESEARCH PAPER


[17] mentioned that a testing process should be efficient when the right
information is provided about 'what' is to be done, 'when' the appropriate
time is and 'how' it is to be performed. As such, this paper has highlighted
the available testing process components from different sources and pointed
out the limitations of these testing processes in the light of testing strategies
and porting software applications. Based on the evaluation, three levels of
groups responsible for influencing testing activities have been identified,
namely the organizational level, the programmers' level and the testing
teams' level.

5-1 ORGANIZATIONAL LEVEL


The emphasis of the ISO/IEC 29119 testing strategy on the organizational
level led to the identification of this group.

5-2 PROGRAMMERS AND TESTING TEAMS LEVEL


The programmers and the testing teams were identified from Ralph and
Gerard's testing strategy, called "Test-Driven Porting" [33]. In this strategy,
the programmers prepared test cases and passed them to the testers for
verification.
To overcome the limitations identified (which will be part of the future work),
a testing model will be designed to encapsulate all three group levels
identified within the software development life cycle. The model should be
able to define each stakeholder's role from the beginning of the testing
activities to the end. It should also be able to incorporate all the testing
techniques which can be used for testing ported applications.

6- SUMMARY AND CONCLUSION


This paper discussed software testing in general, enumerating its significance
in the software development life cycle (SDLC) and the need for a testing
strategy in this agile world. It then gave an overview of the testing
strategies of standards and models such as the V-model, IEEE 829 and ISO/IEC
29119. It was established that software application domains such as the mobile
application environment adopted their testing techniques from the V-model;
however, this was done ad hoc owing to the lack of a technique specific to
mobile applications.
Afterwards, the porting of software applications (mobile and PC-based) was
discussed, with emphasis on the testing strategies employed by the authors
mentioned. The limitations or gaps identified in these strategies were
discussed, together with the contribution of this research towards overcoming
them.

7- FUTURE WORK
In the future, the gaps identified will be filled by first gathering
information from experts to identify and discuss the issues affecting testing
activities in software development. The experts will be drawn from all three
(3) group levels identified, and the information will be gathered through
questionnaires and interviews. The results obtained (after analysis) should
lead to the development of a model (represented diagrammatically) that serves
as a testing strategy for helping software testing organizations achieve a
better degree of quality in their applications (built from scratch or ported).
The testing strategy will also be designed in such a way that testing
organizations can tailor the model to suit their project needs.

Int. J. of Software Engineering, IJSE Vol.9 No.2 July 2016

REFERENCES
[1] Elberzhager, F., A. Rosbach, J. Münch, and R. Eschbach, "Reducing test
effort: A Systematic Mapping Study on Existing Approaches," Information
and Software Technology, 54(10), pp.1092-1106, 2012.
[2] Lu, L., "Software Testing Techniques," Institute for Software Research
International, Carnegie Mellon University, Pittsburgh, PA, pp. 1-19, 2001.
[3] Jovanović, I., "Software Testing Methods and Techniques," IPSI BgD
Journals, pp. 30 - 41, 2009.
[4] Bertolino, A., "Software Testing Research: Achievements, Challenges,
Dreams," Future of Software Engineering, FOSE '07, 2007.
[5] Sawant, A.A., P.H. Bari, and P.M. Chawan, "Software Testing Techniques
and Strategies," International Journal of Engineering Research and
Applications, Vol. 2 (3), pp. 980-986, 2012.
[6] Farooq, U., "Evaluating Effectiveness of Software Testing Techniques with
Emphasis on Enhancing Software Reliability," Department of Computer
Sciences, University of Kashmir, University of Kashmir: Srinagar. pp. 132,
2012.
[7] Keynote, "Testing Strategies and Tactics for Mobile Applications," The
Mobile & Internet Performance Authority, 2012.
[8] Muccini, H., A. Di Francesco, and P. Esposito, "Software Testing of Mobile
Applications: Challenges and Future Research Directions," 7th
International Workshop on Automation of Software Test (AST), 2012.
[9] Perfecto Mobile, "Practical Guide for Choosing the Right Mobile Testing
Solution for Your Enterprise," 2011.
[10] Cho, D. and D. Bae., "Case Study on Installing a Porting Process for
Embedded Operating System in a Small Team," IEEE 5th International
Conference on Secure Software Integration & Reliability Improvement
Companion (SSIRI-C), 2011.
[11] Alves, V., I. Cardim, H. Vital, P. Sampaio, A. Damasceno, P. Borba, and G.
Ramalho, "Comparative analysis of porting strategies in J2ME games,"
Proceedings of the 21st IEEE International Conference on Software
Maintenance, ICSM'05, 2005.
[12] Bigrigg, M.W. and J.G. Slember, "Testing the Portability of Desktop
Applications to a Networked Embedded System," Workshop on Reliable
Embedded Systems, in conjunction with the 20th IEEE Symposium on
Reliable Distributed Systems, 2001.
[13] Khan, M.E., "Different Forms of Software Testing Techniques for Finding
Errors," International Journal of Computer Science, Vol. 7(3), pp. 11-16,
2010.

[14] Zachariah, B., "Analysis of Software Testing Strategies Through Attained
Failure Size," IEEE Transactions on Reliability, pp. 569-579, 2012.
[15] Anuradha, M. and M. Sanjay, "Test Strategies in Distributed Software
Development Environments," Computers in Industry, pp. 1-9, 2013.
[16] Méndez, E., M. Pérez, and L.E. Mendoza, "Improving Software Test
Strategy with a Method to Specify Test Cases (MSTC)," International
Conference on Enterprise Information Systems, ICEIS Lisbon, Portugal,
2008.
[17] Kasurinen, J., "Software Test Process Development," Department of
Information Technology, Lappeenranta University of Technology: Finland,
pp. 13-102, 2011.
[18] Shi, M., "Software Functional Testing from the Perspective of Business
Practice," Computer and Information Science, Vol. 3(4), pp. 49-52, 2010.
[19] Graham, D., E.V. Veenendaal, I. Evans, and R. Black, "Foundations of
Software Testing," London: Cengage Learning Inc, 2008.
[20] IEEE-829, "IEEE Standard for Software and System Test Documentation,"
IEEE Std 829-2008, pp. 1-118, 2008.
[21] ISO/IEC-29119, "Test Process ISO/IEC 29119 Software Testing: The New
International Software Testing Standard," 2012.
[22] Qian, H.-m. and C. Zheng, "An Embedded Software Testing Process
Model," IEEE International Conference on Computational Intelligence and
Software Engineering (CiSE 2009), 2009.
[23] Mathur, S. and S. Malik, "Advancements in the V-Model," International
Journal of Computer Applications, Vol. 1(12), pp. 29-34, 2010.
[24] Eldh, S., "Test Design," Mälardalen University, Sweden: Sweden. pp. 438,
2011.
[25] Wasserman, A.I., "Software Engineering Issues for Mobile Application
Development," Proceedings of the FSE/SDP Workshop on Future of
Software Engineering Research, ACM: Santa Fe, New Mexico, USA,
pp. 397-400, 2010.
[26] Kualitatem, "Star Mobile Testing Methodology," Mobile Application
Testing, pp. 1-12, 2012.
[27] Dantas, V.L.L., F.G. Marinho, A.L.d. Costa, and R.M.C. Andrade, "Testing
Requirements for Mobile Applications," IEEE 24th International
Symposium on Computer and Information Sciences (ISCIS), 2009.
[28] Cao, G., J. Yang, Q. Zhou, and W. Chen, "Software Testing Strategy for
Mobile Phone," Advances and Applications in Mobile Computing, 2012.
[29] Mendoza, A., C. Skawratananond, and A. Walker, "UNIX to Linux Porting:
A Comprehensive Reference, Porting Project Considerations," Prentice
Hall, pp. 1-27, 2006.
[30] Henrik, C., "Embedded Software Porting for Automotive Applications,"
Department of Computer and Information Science, Linköping University,
ESLAB - Embedded Systems Laboratory: LiU Electronic Press, pp. 1-37,
2010.
[31] Mindfire Solutions, "Testing a Software Port," 2001.
[32] Jerome, C., K. Sylvain, S. Lou, and G. Annie, "Portage of a Web
Application to Mobile Devices," IHM’10, pp. 9-10, 2010.
[33] Ralph, B. and M. Gerard, "Test-Driven Porting," Proceedings of Agile
Conference, IEEE Computer Society, 2005.

[34] RapidSoft Systems, "Porting Mobile Applications - Overcoming the
Hurdles," pp. 1-6, 2008.
[35] RapidSoft Systems, "Mobile Software Development and Mobile Porting –
Technology and Cost Issues," pp. 1-10, 2009.
