picking up the method that is intensively used in a specific program. Although a mobile application may be functionally perfect, it is useless if it does not guarantee real-time performance. Issues related to timing, synchronization, and speed are important factors in mobile applications, so this paper focuses on time consumption, such as timing and process reaction time, as the performance measurement factors to be considered in performance testing. To that end, we present methods of performance testing in TDD that consider non-functional measurement factors, advancing performance testing, which is usually carried out at the end of development, to the unit test level, and we introduce a testing tool that assists performance testing during the software development phase.

This paper is organized as follows. Section 2 introduces TDD and Test-First Performance (TFP), which are the basis of our methods of performance testing, together with the MOPAD project now in progress. Section 3 explains the methods of performance unit testing for mobile applications. Section 4 introduces a performance unit testing tool. Finally, Section 5 concludes the paper and outlines future work.

2. RELATED WORK

2.1 Test Driven Development (TDD)
Test-driven development (TDD) is a software development technique that uses short development iterations based on pre-written test cases that define desired improvements or new functions. It was introduced by Kent Beck and is also known as Test-First Development. TDD requires developers to write automated unit tests that define code requirements before writing the code itself. The tests contain assertions that are either true or false; running the tests rapidly confirms correct behavior as developers evolve and refactor the code. This process yields progressive growth of the design, progressively completed code, and optimized unit tests [8].

and find faults) and Master Test Suite Execution (find performance faults by performing all test cases).

2.3 MOPAD PROJECT
This section describes the MOPAD (Mobile Platform based Application Developer) project, an ongoing project whose goal is to develop techniques for a mobile application SW development environment. We are participating in the research on mobile performance testing methods as part of this project. In general, mobile applications are developed on the J2ME and WIPI mobile platforms; that is, an application is downloaded to a mobile phone and runs in a VM (Virtual Machine) environment.

Mobile application SW is differentiated from general software in the following aspects. First, its life cycle is short: mobile application SW is developed over a short period to meet competitive market demand. Meeting that demand is given higher priority than safety, which requires long-term design and testing, and as a result SW of low quality and reliability has been reaching the market. Second, accurate tests are required before installation because, unlike PC software, continuous updates are impossible for mobile applications due to technical limitations, and there are few additional updates once the SW is installed.

We are participating in the part of the project focused on developing an architecture-based mobile application SW development environment, which provides shorter design and implementation times than existing development methods and dynamically reconstructs functions in response to changes in the execution environment while a mobile application is running. We concentrate on mobile application SW unit tests and are researching unit test support tools for the RAD-oriented mobile application development framework. Our methodology of performance unit testing is described in the next section.
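As an illustration of the test-first style described in Section 2.1, a requirement is first captured as an assertion, and the code is then written to make it pass. The `Account` class below is a hypothetical example of ours, not taken from the paper:

```java
// Minimal sketch of the TDD cycle: the test below is written first,
// fails while Account is unimplemented, and passes once the code
// satisfies the asserted requirement. Account is a hypothetical example.
public class TddSketch {
    // Production code, written only after the test defined its behavior.
    static class Account {
        private int balance;
        void deposit(int amount) {
            if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
            balance += amount;
        }
        int balance() { return balance; }
    }

    // Pre-written "unit test": an assertion that is either true or false.
    static boolean testDepositIncreasesBalance() {
        Account a = new Account();
        a.deposit(100);
        a.deposit(50);
        return a.balance() == 150;
    }

    public static void main(String[] args) {
        System.out.println(testDepositIncreasesBalance() ? "PASS" : "FAIL");
    }
}
```

Running the test before `deposit` exists (or while it is wrong) reports FAIL; the red-to-green transition drives the design, as TDD prescribes.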
The test phase is divided into the test basics level, the prototype test phase, the software test phase, and the production test phase. The test environments corresponding to these steps are the document-based, simulator-based, emulator-based, and real-target-based test environments.

This paper focuses on the emulator-based test environment. Developers generally build mobile applications under the assumption that test results in the emulator and on the real target are the same, but most results measured in the emulator actually differ from those measured on the real mobile phone. For this reason, applications that are tested in the emulator and work without problems during development often fail to work properly, or show different performance results, once they are loaded on the real target, the mobile phone.

Therefore, we collect information with a BMT (Bench Mark Test) in the real-target-based test environment and build a Mobile Performance DB, so that reliable performance unit tests can be performed by supplying the results to be expected in the emulator-based test environment without loading and testing the application on many mobile phones. Detailed test steps and performance revision methods are described in the next section.

which method, and which part of the method, causes the delay, and keep performing the test until the result is a pass.

3.2 Normalization of Performance
To obtain reliable performance results, the performance times of the emulator and the mobile phone must be reconciled. The important work in normalization of performance is to establish a normalized mobile performance DB and to develop a revision algorithm that gives accurate predictions using that DB.

3.2.1 Mobile Performance DB
The mobile performance DB is a core element of the performance unit test. Figure 3 shows the method used to establish the mobile performance DB with a benchmark test program: the DB is built by loading and running a BMT program per API and organizing the results. Information about the mobile phone other than performance results, such as CPU, heap memory size, and color depth, is also used during performance unit testing in the emulator.
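The revision step can be sketched as a simple scaling model. This is our own illustrative assumption, not the paper's actual revision algorithm: per-API benchmark times measured once in the emulator and once on each real phone give a ratio, and an emulator measurement is multiplied by that ratio to predict on-device time. The API name and all numbers below are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of performance normalization using a mobile
// performance DB. The scaling model and the sample BMT numbers are
// assumptions for illustration, not the paper's actual algorithm.
public class PerformanceNormalization {
    // BMT results: API name -> measured time in ms.
    static final Map<String, Double> emulatorBmt = new HashMap<>();
    static final Map<String, Double> phoneBmt = new HashMap<>();   // one real phone model

    static {
        emulatorBmt.put("Graphics.drawImage", 2.0);  // hypothetical BMT numbers
        phoneBmt.put("Graphics.drawImage", 7.0);
    }

    // Predict on-device time from an emulator measurement using the
    // per-API ratio derived from the performance DB.
    static double predictDeviceTime(String api, double emulatorMs) {
        double ratio = phoneBmt.get(api) / emulatorBmt.get(api);
        return emulatorMs * ratio;
    }

    public static void main(String[] args) {
        // A unit test measured 10 ms in the emulator; the ratio is 7/2 = 3.5.
        System.out.println(predictDeviceTime("Graphics.drawImage", 10.0)); // 35.0
    }
}
```

In practice a revision algorithm would be richer than a single ratio (the paper mentions using CPU, heap memory size, and color depth), but the lookup-and-scale structure conveys the idea.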
4. PERFORMANCE UNIT TESTING TOOL FOR MOBILE APPLICATIONS
Tools that support writing and running performance unit tests on the J2ME/WIPI mobile platforms are needed to carry out performance testing during the software development stage. Such frameworks measure performance-related features of an application program, such as response time, execution time, resource consumption, reliability of operation, processing limits, and efficiency, and then perform the testing. Time-related features, that is, the process reaction speed per unit time, are supported first. A mobile program is event-driven software, so measuring time per event matters, and confirming time limits is a must. Finding the time cost of methods by measuring the time taken by the events of each unit test helps to improve program performance: intensively called methods and bottlenecks can be identified, and unnecessary code or object creation can be eliminated. We have therefore developed a tool that supports automated creation of unit tests and performance measurements, together with research on methods for testing the performance of mobile applications in a unit test environment.

project to test in the run page of the project properties dialogue box.

4.2 Architecture
Figure 4 shows the architecture of the performance unit testing tool. The tool consists of four modules.
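The per-event timing check described in Section 4 can be sketched as follows. The event handler and the 50 ms limit are illustrative assumptions of ours, not part of the tool:

```java
// Sketch of a performance unit test that measures the reaction time of
// an event-handling method and checks it against a time limit.
// handleEvent and the 50 ms limit are hypothetical examples.
public class EventTimingTest {
    static void handleEvent() {
        // Stand-in for real event processing (e.g., key-press handling).
        long sum = 0;
        for (int i = 0; i < 100_000; i++) sum += i;
        if (sum < 0) throw new IllegalStateException();  // never true; keeps the loop live
    }

    // Returns elapsed wall-clock time in milliseconds for one invocation.
    static double measureMillis(Runnable r) {
        long start = System.nanoTime();
        r.run();
        return (System.nanoTime() - start) / 1_000_000.0;
    }

    public static void main(String[] args) {
        double limitMs = 50.0;                      // assumed time limit for this event
        double elapsed = measureMillis(EventTimingTest::handleEvent);
        System.out.println(elapsed <= limitMs ? "PASS" : "FAIL");
    }
}
```

A test written this way fails whenever the event handler exceeds its time budget, which is exactly the pass/fail signal the tool's time-limit checks provide.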
methods and classes to see whether the test meets the time limit and to trace where an exception occurs. Hence, JMUnit is not a good choice for the tests we want to perform: by its nature it cannot perform reflection or stack tracing and has no UI to print detailed information, so measuring execution time and performance-unit-testing each unit method is impossible. A unit test engine that, like JMUnit, runs in the mobile environment is still required, so this study developed PJUnit, a unit test engine for Java mobile performance.

The hierarchy of the tests executed through the test performance module is shown in Figure 6.

Test results of performance unit testing are shown in Figure 7. The difference between the predicted time and the actual time is shown in a graph, and the test prediction results are visualized per phone in a list.
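Because CLDC-era mobile platforms lack reflection, a mobile unit test engine of this kind must have its tests registered explicitly rather than discovered automatically. The following is a minimal sketch of that pattern in standard Java, not PJUnit's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a reflection-free unit test engine in the spirit of a
// mobile test runner: tests are registered explicitly because CLDC-era
// platforms offer no reflection for test discovery. Not PJUnit's real API.
public class MiniTestEngine {
    interface TestCase { boolean run(); }

    static class NamedTest {
        final String name; final TestCase body;
        NamedTest(String name, TestCase body) { this.name = name; this.body = body; }
    }

    private final List<NamedTest> tests = new ArrayList<>();

    void add(String name, TestCase t) { tests.add(new NamedTest(name, t)); }

    // Runs every registered test, printing per-test results; returns failure count.
    int runAll() {
        int failures = 0;
        for (NamedTest t : tests) {
            boolean ok;
            try { ok = t.body.run(); } catch (RuntimeException e) { ok = false; }
            System.out.println(t.name + ": " + (ok ? "PASS" : "FAIL"));
            if (!ok) failures++;
        }
        return failures;
    }

    public static void main(String[] args) {
        MiniTestEngine engine = new MiniTestEngine();
        engine.add("addition", () -> 1 + 1 == 2);
        engine.add("stringLength", () -> "abc".length() == 3);
        System.out.println("failures=" + engine.runAll());
    }
}
```

A real mobile engine would add per-test timing and the phone-list prediction reporting described above; this sketch shows only the explicit-registration core.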
applications quality and performance testing by helping to detect software faults. Future research should expand the performance unit testing method so that it further reflects the characteristics of mobile applications. More concrete research is also required on the test prediction module, which was developed as a prototype. Reliable and useful performance information provided at the development level promotes fast development and quality improvement.

6. ACKNOWLEDGMENTS
This work was supported by the IT R&D program of MKE/IITA [2007-S032-02, Development of mobile application software development environment technology supporting multi-platform]. Thanks to XCE Co., Ltd. for building the mobile performance DB.

7. REFERENCES
[1] K. Beck, Test-Driven Development: By Example, 2003.
[2] JUnit: http://www.junit.org
[3] R. Hamlet, "Unit Testing for Software Assurance," Computer Assurance, 1989.
[4] M. Glinz, "On Non-Functional Requirements," Proc. 15th IEEE Int'l Requirements Eng. Conf. (RE'07), IEEE CS Press, pp. 21-26.
[5] E.J. Weyuker and F.I. Vokolos, "Experience with Performance Testing of Software Systems: Issues, an Approach, and Case Study," IEEE Trans. Software Eng., vol. 26, no. 12, Dec. 2000, pp. 1147-1156.
[6] J2ME: http://java.sun.com/j2me/
[7] WIPI: http://www.wipi.or.kr/
[8] L. Williams, E.M. Maximilien, and M.A. Vouk, "Test-Driven Development as a Defect Reduction Practice," Proc. 14th Int'l Symp. Software Reliability Eng., IEEE CS Press, 2003, pp. 34-35.
[9] JUnitPerf: http://www.clarkware.com/software/JUnitPerf.htm
[10] JMeter: http://www.jmeter.org
[11] M.J. Johnson, C.-W. Ho, E.M. Maximilien, and L. Williams, "Incorporating Performance Testing in Test-Driven Development," IEEE Software, vol. 24, no. 3, May-June 2007, pp. 67-73.
[12] JMUnit: http://sourceforge.net/projects/jmunit/