
Huang, T.-H., Liu, Y.-C., & Chang, H.-C. (2012). Learning Achievement in Solving Word-Based Mathematical Questions through a Computer-Assisted Learning System. Educational Technology & Society, 15(1), 248–259.

Learning Achievement in Solving Word-Based Mathematical Questions through a Computer-Assisted Learning System
Tzu-Hua Huang, Yuan-Chen Liu1 and Hsiu-Chen Chang2
Graduate School of Curriculum and Instruction, National Taipei University of Education, No.134, Sec. 2, Heping
E. Rd., Da-an District, Taipei City 106, Taiwan (R.O.C.) // 1Graduate School of Communication and Technology,
National Taipei University of Education, No.134, Sec. 2, Heping E. Rd., Da-an District, Taipei City 106, Taiwan
(R.O.C.) // 2Longtan Elementary School, No.71, Yulong Rd., Jiaoxi Township, Yilan County 262, Taiwan
(R.O.C.) // anteater1029@mail2000.com.tw // liu@tea.ntue.edu.tw // jane@ilc.edu.tw

ABSTRACT
This study developed a computer-assisted mathematical problem-solving system, in the form of a network instruction website, to help low-achieving second and third graders in Taiwan with word-based addition and subtraction questions in mathematics. Following Polya’s problem-solving model, the system is designed to guide these low achievers through the parts of the problem-solving process that they often ignore. The situations described in the verbal questions are visualised to walk students through the course of thinking so that they can solve each question with a proper understanding of its meaning. We found that the mathematical problem-solving ability of experimental group students was significantly superior to that of control group students. Most of the participants were able to continue the practice of solving word-based mathematical questions, and their willingness to use the system was high. This indicates that the computer-assisted mathematical problem-solving system can serve effectively as a tool for teachers engaged in remedial education.

Keywords
Low-achiever, Problem-Based Learning (PBL), Polya’s Problem-Solving Process, Word-Based Mathematical
Questions

Introduction
In traditional teaching, assessment of whether students had understood a mathematical problem was based on whether they could describe the correct arithmetic procedure. However, evaluating students’ mathematical concepts and problem-solving abilities merely from their written work is not enough; oral interpretation and explanation should also be considered from a multiple-assessment point of view (Hwang, Chen, & Hsu, 2006). Several studies have also pointed out that good problem-solving skills are the key to reaching a successful solution in learning mathematics (Gagne, 1985; Mayer, 1992). The cognitive thinking process, as well as abstract mathematical concepts and representations, is often extremely difficult for middle-school and elementary students. Many efforts have been made to explore alternative ways of teaching mathematics by creating curricula and didactic material that incorporate new tools, pedagogical approaches, and models or methods to engage learners in a more pleasant mathematical learning process (Szendrei, 1996).

Fleischner and Marzola (1988) indicated that about 6% of all middle-school and elementary students have severely deficient mathematics learning achievement. In one of his speeches, Polya (1962) expounded on learning, teaching, and learning to teach. He stated that only 1% of students would need to study mathematics, 29% would need to use mathematics in the future, and 70% would never need knowledge of mathematics beyond the elementary level in their daily life (Mayer, 1992). Obviously, if a student could successfully solve a math problem by arithmetic calculation, that did not mean the student really understood it. Many students and teachers have found the learning and teaching of solving verbal questions to be the most challenging part of mathematics education (De Corte & Verschaffel, 1993). Cardelle-Elawar (1992) discovered that, in order to help these low-achievers sharpen their problem-solving skills, their verbal ability must be improved first to enable them to identify the core of the problem.

Problem-Based Learning (PBL)


Problem-based learning (PBL) is considered one of the most appropriate approaches for increasing students’ learning motivation and developing practical skills. It is a student-centered instructional strategy in which students solve problems collaboratively and reflect on their experiences, which helps them develop skills and acquire knowledge (Bruer, 1993; Williams, 1993).
Polya’s Mathematical Problem-Solving Process is one of the PBL instructional strategies. Lee, Shen, and Tsai (2008) used PBL instructional design to enhance students’ learning, and they also noted that teachers can help students regulate their learning.

Polya’s Mathematical Problem-Solving Process

The mathematician Polya was the first to introduce the concept of a problem-solving model. Believing that mathematics is not all about the result, he argued that the essence of mathematics education lies in the thinking and creativity employed in the problem-solving process. In his book “How to Solve It” (Polya, 1945), he proposed the following four steps of problem solving:

Understanding the problem

Problem solvers must understand the meaning of a sentence; identify the known, the unknown and the
relationship between them; and know what previously learned concepts are available for solving the problem.

Devising a plan

Problem solvers must clarify the relations between conditions in a question, utilise personal knowledge to
develop ideas for solving a problem, and devise a plan.

Carrying out the plan

Following the planned path, problem solvers carry out various calculations and other required operations.

Looking back

Problem solvers examine the answer and carefully review the course that they went through in an attempt to see if
this experience helps solve other problems or if other problem-solving paths exist.

There is no strict boundary between these four steps. If it is determined that the plan cannot be carried out, or if the examination reveals errors, the solver must review whether every hint has been correctly understood and then devise a new plan for implementation. This emphasises that in problem solving the process is more important than the outcome.

Examples of Computer-Assisted Mathematics Problem Solving Tools

Most students view computer-assisted learning environments as aids in learning mathematics and as motivational tools (López-Morteo & López, 2007). However, previous computer-assisted problem-solving systems have incorporated all the problem-solving steps within a single stage, making it difficult to diagnose the stages at which errors occur when a student encounters difficulties, and imposing too high a cognitive load on students during problem solving (Chang, Sung, & Lin, 2006). Because these systems have not been investigated empirically, the effectiveness of applying them to actual teaching is unknown. With technological advancement and the arrival of the multimedia computer instruction era, more and more studies have focused on real-time, interactive teaching methods and media delivered through multimedia computers. Using computers to implement findings from qualitative research on problem-solving teaching strategies can furnish more effective learning opportunities for learners.

Hour and Chen (1999) investigated effects of using personalised context examples in CAI (Computer-
Assisted Instruction) for fourth graders when solving mathematical word problems. Students' attitude toward
the mathematical CAI was also examined in this study. The main finding was that students had favourable
attitudes towards mathematical CAI. Data from personal interviews indicated that students were motivated

by personalised context examples in the CAI sessions. Generally speaking, students of middle and low problem-solving ability held positive mathematical beliefs after their exposure to the personalised-context-example CAI.

Frith, Jaftha and Prince (2004) found that when students used interactive spreadsheet-based computer tutorials in a mathematical literacy course, it brought to the foreground theories relating to the role of computer technology as a mediator for the learning of mathematics. Outcomes showed that simple definitions of disadvantage were inadequate to account for the poor performance of students in the lower quartile. Olkun, Altun and Smith (2005) investigated the possible impacts of computers on Turkish fourth-grade students' geometry scores and further geometric learning. The study used a pretest–intervention–posttest experimental design. Results showed that students who did not have computers at home initially had lower geometry scores. The results suggested that, in schools, it seemed more effective to integrate mathematical content and technology in a manner that enabled students to make playful mathematical discoveries.

Ashton, Beevers, Korabinski and Youngson (2006) indicated that in a mathematical examination on paper, partial
credit was normally awarded for an answer that was not correct but nevertheless contained some of the correct
working. Assessments on computers normally marked an incorrect answer wrong and awarded no marks. This
can lead to discrepancies between marks awarded for the same examination given in the two different media. In
light of the findings, developments to the assessment engine have been made and some questions redesigned for
use in real automated examinations. The results were obtained as part of the Project for Assessments in Scotland
using Information Technology (PASS-IT), a major collaborative programme involving the leading educational
agencies in Scotland.

Based on Polya’s four problem-solving steps (understanding the problem, devising a plan, carrying out the plan
and looking back), Ma and Wu (2000) designed a set of interesting, active learning materials for teaching.
Research outcomes indicated both students’ learning interest and achievement had improved. Chang (2004)
incorporated strategies such as key-point marking, diagram illustration and answer review in the problem-solving
process and developed a process-oriented, computer-aided mathematics problem solving system. The system was
applied to mathematical questions (mainly elementary-level arithmetic computation), with fifth graders as the
subjects of the empirical study. Results showed that the system was effective in enhancing low-achievers’
problem-solving ability.

Chang et al. (2006) proposed a computer-assisted system named MathCAL, the design of which was based on the
following four problem-solving stages: (1) understanding the problem, (2) making a plan, (3) executing the plan
and (4) reviewing the solution. A sample of 130 fifth-grade students (average 11 years old) completed a range of
elementary school mathematical problems. The result showed that MathCAL was effective in improving the
performance of students with lower problem-solving ability. This assistance improved students’ problem-solving skills at each stage.

The studies discussed above indicate that computer-assisted mathematics-problem-solving systems have a positive impact on children’s problem-solving ability.

System Design and Framework

The system design of this study is based on the mathematical problem-solving process proposed by Polya (1945). It guides children to think through and solve mathematical questions by means of guiding questions. The graphical representation strategy for addition and subtraction proposed by Fuson and Willis (1989) for lower-achieving children was taken as the main reference for the design of the problem-solving strategy guide. Fuson and Willis found that, in addition and subtraction situations, when the semantic structure of a problem conflicts with the solving strategy needed to obtain the unknown value, students may find the answer hard to reach. They developed schematic drawings to help lower-achieving students solve single-step addition and subtraction problems. Based on these schematic drawings, the questions are divided into four types: (1) PUT-TOGETHER, (2) CHANGE-GET-MORE, (3) CHANGE-GET-LESS, and (4) COMPARE (see Figure 1).

Figure 1. Schematic drawings filled in for exemplars of each word problem type
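To make the classification concrete, the sketch below represents the four schematic question types and their part-whole slots in Python. The class and field names are illustrative assumptions, since the paper does not publish the system’s source code.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class QuestionType(Enum):
    """The four schematic word-problem types of Fuson and Willis (1989)."""
    PUT_TOGETHER = auto()     # two parts are combined into a whole
    CHANGE_GET_MORE = auto()  # a start quantity increases by a change amount
    CHANGE_GET_LESS = auto()  # a start quantity decreases by a change amount
    COMPARE = auto()          # the difference between two quantities is sought


@dataclass
class Schematic:
    """Part-part-whole slots of a single-step addition/subtraction problem.

    Exactly one slot is None: that is the unknown the student must find.
    """
    qtype: QuestionType
    part_a: Optional[int]  # first part / start quantity / smaller quantity
    part_b: Optional[int]  # second part / change amount / difference
    whole: Optional[int]   # total / result quantity / larger quantity

    def solve(self) -> int:
        """Fill the empty slot: the whole equals the sum of the two parts."""
        if self.whole is None:
            return self.part_a + self.part_b  # unknown whole -> addition
        if self.part_a is None:
            return self.whole - self.part_b   # unknown part -> subtraction
        return self.whole - self.part_a


# Example: "Tom has 12 marbles. He gets 7 more. How many marbles does he have now?"
problem = Schematic(QuestionType.CHANGE_GET_MORE, part_a=12, part_b=7, whole=None)
print(problem.solve())  # 19
```

Note how the semantic type (e.g., CHANGE-GET-LESS) and the arithmetic operation are decoupled: as Fuson and Willis observed, the required operation depends on which slot is unknown, not on the surface wording of the story.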

Figure 2. Students’ Problem-Solving Guidance Flowchart


Students’ Problem-Solving Guidance Framework and Design

Students’ problem-solving guidance process is shown in Figure 2. A fixed-format interface was employed to guide students through the problem-solving process and to reduce the system-operation learning load on users. Students can follow the problem-solving guidance provided by the system to solve word-based mathematical questions.
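The stages in Figure 2 correspond to the three student-facing steps named later in the questionnaire (“I know the meaning of the question”, “I know how to list the expression and calculate”, “I will check again”). As an illustrative sketch only, not the published implementation, the guidance loop could be modelled as a small state machine:

```python
from enum import Enum, auto


class Stage(Enum):
    UNDERSTAND = auto()          # "I know the meaning of the question"
    PLAN_AND_CALCULATE = auto()  # "I know how to list the expression and calculate"
    CHECK = auto()               # "I will check again"
    DONE = auto()


ORDER = [Stage.UNDERSTAND, Stage.PLAN_AND_CALCULATE, Stage.CHECK, Stage.DONE]


def next_stage(current: Stage, succeeded: bool) -> Stage:
    """Advance through the fixed-format guidance flow."""
    if current is Stage.DONE:
        return Stage.DONE
    if succeeded:
        return ORDER[ORDER.index(current) + 1]
    # On failure the system shows animation guidance; a failed final check
    # sends the student back to re-read the question, echoing Polya's point
    # that the steps are not strictly separated.
    return Stage.UNDERSTAND if current is Stage.CHECK else current
```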

Teacher and Manager’s Online Real-Time Query and Management

The system allowed teachers and managers to make online real-time inquiries about students’ account numbers and passwords, as well as students’ learning progress, as shown in Figure 3.

Figure 3. Teacher and Manager’s Online Real-Time Query and Management Framework
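As a minimal sketch of the kind of record such a real-time query might return (the paper describes this feature only at the level of Figure 3, so the field names below are hypothetical):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class StudentProgress:
    account: str              # student account number
    password: str             # retrievable by the teacher, per the system description
    questions_completed: int  # progress through the word-problem practice
    calculation_errors: int
    problem_solving_errors: int


def query_progress(records: List[StudentProgress], account: str) -> StudentProgress:
    """Return the current progress record for one student account."""
    return next(r for r in records if r.account == account)
```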

Screenshots of the Problem-Solving Process

Screenshots of students’ mathematics-problem-solving process are shown in Figures 4-8.

Figure 4. The screenshot of understanding the question

Figure 5. The screenshot of understanding the question by animation guidance

Figure 6. The screenshot of listing the equation

Figure 7. The screenshot of listing the equation by animation guidance

Figure 8. The screenshot of double checking

Research Method
The participants in this study were selected from six second- and third-grade classes in two elementary schools in Yilan County, Taiwan. The selection criteria were as follows: (1) children with low mathematics achievement were identified by the class advisor as having basic learning ability, needing remedial instruction, and scoring in the bottom 25% of their class in mathematics on the first monthly exam; (2) the class advisor consulted the parents to identify the children who were willing to take part in the project as the experimental group.

Seventeen low-achieving second and third graders selected by the above two criteria formed the experimental group and were trained on the computer-assisted mathematical problem-solving system after school. Eleven low-achieving students in mathematics whose parents could not drop them off and pick them up served as the control group.

To test the discrimination and difficulty of the test questions, a pilot study was conducted between the first and second monthly exams of the first semester of the academic year with students from nine second- and third-grade classes in two elementary schools in Yilan County, Taiwan. The test problems were divided into two examination papers, A and B, designed for second and third graders respectively. The difference between the two papers was the magnitude of the numbers: most numbers in the third-grade paper were three-digit, whereas those in the second-grade paper were two-digit. After the test, we combined the second- and third-grade papers to analyse split-half reliability, discrimination, and difficulty. Since the aim of this study was remedial instruction for children with low mathematics achievement, the final test questions were based on the second-grade paper (i.e., two-digit numbers).
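The paper does not report the formulas behind this item analysis; the sketch below, under common psychometric conventions and with hypothetical function names, shows how split-half reliability (with the Spearman-Brown correction), item difficulty, and item discrimination could be computed from a students × items matrix of 0/1 scores.

```python
import numpy as np


def split_half_reliability(scores: np.ndarray) -> float:
    """Spearman-Brown corrected correlation between odd- and even-item half scores."""
    odd = scores[:, 0::2].sum(axis=1)
    even = scores[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)


def item_difficulty(scores: np.ndarray) -> np.ndarray:
    """Proportion of students answering each item correctly (higher = easier)."""
    return scores.mean(axis=0)


def item_discrimination(scores: np.ndarray, top_frac: float = 0.27) -> np.ndarray:
    """Difference in item difficulty between the top and bottom scoring groups."""
    totals = scores.sum(axis=1)
    n = max(1, int(len(totals) * top_frac))
    order = np.argsort(totals)              # ascending by total score
    low, high = scores[order[:n]], scores[order[-n:]]
    return high.mean(axis=0) - low.mean(axis=0)
```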

This study employed independent-sample dual-factor analysis of covariance. The dependent variable was the posttest score, the independent variables were group and grade, and the covariate was the pretest score. The analysis first examined whether an interaction was present between group and grade, and then whether significant differences were present between the levels of each factor in terms of the enhancement of mathematical problem-solving ability after the experiment. Through this analysis, the study explored the impact of each factor on learning achievement with the computer-assisted mathematical problem-solving system and determined whether a significant difference in learning achievement was present before and after the experiment, in order to clarify whether the system could serve as an effective tool for remedial education. Answers given by students to the “Computer-Assisted Mathematic-Problem-Solving System Questionnaire” were analysed in terms of percentages to understand students’ opinions of the computer-assisted mathematical problem-solving system.
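The paper does not state which statistical software was used; as a sketch only, with a hypothetical data frame of one row per student, this kind of dual-factor covariance analysis can be expressed as a linear model with the pretest score entered as the covariate (here using Python’s statsmodels):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical layout: one row per student (illustrative values only).
df = pd.DataFrame({
    "posttest": [15, 13, 14, 12, 11, 12, 16, 13],
    "pretest":  [11, 12, 11, 12, 12, 15, 10, 11],
    "group":    ["exp", "exp", "exp", "exp", "ctrl", "ctrl", "exp", "ctrl"],
    "grade":    [2, 3, 2, 3, 2, 3, 2, 3],
})

# Dual-factor ANCOVA: pretest as covariate, group and grade as factors,
# with their interaction tested first.
model = smf.ols("posttest ~ pretest + C(group) * C(grade)", data=df).fit()
print(anova_lm(model, typ=3))  # Type III sums of squares, as in Tables 2 and 3
```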

Results and Discussion
System Achievement Analysis

Descriptive statistics were first used to summarise the average pretest and posttest scores of participants in each group and grade, as shown in Table 1.

Table 1. Number of Pretest and Posttest Participants in Each Group and Pretest and Posttest Average Scores
Group               Grade   Total   Pretest   Posttest
Experimental Group  2       17      11.27     15.00
                    3               11.33     13.83
Control Group       2       11      12.00     11.67
                    3               15.20     14.80

An examination of the homogeneity of within-group regression coefficients confirmed the assumption of homogeneous covariate regression slopes, so the dual-factor covariance analysis was conducted; a summary is shown in Table 2. The group × grade interaction was not significant (F = 3.664; p = .068), and the effect of grade was not significant either (F = 0.162; p = .691); a significant effect was found only for group. Therefore, the experiment did not result in a significant discrepancy between the low-achieving second graders and third graders.

Table 2. Summary of the Dual-Factor Covariance Analysis


Variance Source                Type III SS   df   MS        F        Sig.
Explainable Variance           84.384        4    21.096    5.667    .003
Interaction                    104.711       1    104.711   28.130   .000
Independent Variable (grade)   0.603         1    0.603     0.162    .691
Dependent Variable (group)     25.042        1    25.042    6.727    .016
Covariance                     37.351        1    37.351    10.034   .004
Main Effect                    13.640        1    13.640    3.664    .068
Error                          85.616        23   3.722
Adjusted Total                 5658.000      23
R Square = .496 (adjusted R square = .409)

Table 3 shows the post-experiment learning achievements of experimental-group and control-group students through independent-sample single-factor covariance analysis. The outcome indicated F = 9.455**, p = .005. The adjusted average of the experimental group was 14.987 with a standard deviation of .495; the adjusted average of the control group was 12.474 with a standard deviation of .623.

The significantly higher scores of experimental-group students compared to control-group students (Pairwise
Comparisons, Table 4) indicate that the computer-assisted mathematics problem solving system developed for
this study can help teachers with remedial education and help students with basic word-based arithmetic
questions.

Table 3. Summary of the Single-Factor Covariance Analysis


Variance Source        Type III SS   df   MS       F
Explainable Variance   70.738        2    35.369   8.908
Interaction            99.844        1    99.844   25.146
Covariance             55.765        1    55.765   14.045
Main Effect            37.541        1    37.541   9.455
Error                  99.262        25   3.970
Adjusted Total         170.000       27
R Square = .416 (adjusted R square = .396)
Table 4. Summary of Pairwise Comparisons
Group        Group     Mean Difference   Standard Deviation   Sig.
Experiment   Control   2.513             .817                 .005**
** Indicates p < 0.01

Analysis of Students’ Problem-Solving Process

Problem-solving changes at each stage of the problem-solving process were analysed and are presented in Table
5.

Table 5. Analysis of Students’ Problem-Solving Process


No.   Content                                                      Total         Mean or Percentage
1     No. of students completing the practice                      14            82.4
2     Average number of questions practiced per person             1104          64.9
3     Average problem-solving time per question                    65.1 (min.)   3.83 (min.)
4     No. of calculation errors                                    256           15.06
5     No. of problem-solving errors                                168           9.88
6     No. of requests for question understanding assistance        129           7.6
7     No. of requests for expression or calculation assistance     938           55.2

Fourteen experimental-group students completed the practice, and the other three each finished more than 40 questions. The average number of questions completed reached 64.9, the average problem-solving time per question was 3.83 minutes, and the average number of practices was 7. This shows that students' willingness to use the tool for learning problem solving was very high.

The average number of calculation errors per person detected by the system was 15.06 and the average number of completed questions was 64.9, indicating that the calculation-error ratio was 23.2%. The data indicate that the rate at which the system detected students' calculation errors was not high, yet the researcher's observation revealed that the proportion of students who would examine or verify their calculations on their own initiative was not very high.
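The reported ratio follows directly from the two averages in Table 5:

\[
\frac{15.06\ \text{calculation errors per person}}{64.9\ \text{questions per person}} \approx 0.232 = 23.2\%
\]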

The average number of problem-solving errors per person detected by the system was 9.88, the average number of completed questions was 64.9, and the average number of requests for question-understanding assistance was 7.6. Yet the average number of requests for expression or calculation assistance was as high as 55.2, indicating that most of the students were still more inclined to look for the answer directly.

System Questionnaire Analysis

To collect information concerning users' attitudes toward and opinions of the computer-assisted mathematical problem-solving system, this study designed the “Computer-Assisted Mathematic-Problem-Solving System Questionnaire.” The outcomes are presented in Table 6 and Table 7.

Table 6. Computer-Assisted Mathematic-Problem-Solving System Questionnaire Analysis A


No.   Content                                                                                                Percentage of students who agree
1     You feel that the operational instruction given by the system you are using is very clear.             100
2     After the explanation, I know how to work on each stage of the practice.                               100
3     When there is an error, I understand the error hint given by the computer and I know where the problem is.   82.4
4     I feel it is easier for me to reach the answer through the computer than through the test sheet.       100
5     I feel practicing applied math questions through the computer is quite interesting.                    100
6     I enjoy solving applied math questions through the computer.                                           100
7     I feel computer-assisted practice helps me develop a better ability to solve applied math questions.   100
8     In the stage of “I know the meaning of the question”, I feel reading the instruction given by the computer helps me understand the question better.   100
9     In the stage of “I know the meaning of the question”, I feel the questions asked by the computer help me understand the key points of the question better.   70.6
10    In the stage of “I know how to list the expression and calculate”, I feel the illustrated explanations given by the computer help me figure out the way to solve the question.   100
11    In the stage of “I know how to list the expression and calculate”, I feel the calculation explanations given by the computer help me solve calculation problems.   94.1
12    In the stage of “I will check again”, I feel this step encourages me to go back to examine my calculation again.   82.4
13    In the stage of “I will check again”, I feel this step encourages me to go back to examine my expressions again.   100
14    The method provided by the computer inspires me to think if there are other approaches.                94.1
15    I know completion of an applied math question must go through the following 3 steps: (please list them in actual problem-solving order) (1) I know how to list the expression and calculate (2) I will check again (3) I know the meaning of the question   82.4

Table 7. Computer-Assisted Mathematic-Problem-Solving System Questionnaire Analysis B


No.   Content                I know the meaning of the question   I know how to list the expression and calculate   I will check again   All very simple   All very difficult
16    The simplest step is   5.9                                  23.5                                              5.9                  47.1              17.6
17    The hardest step is    11.8                                 17.6                                              17.6                 47.1              5.9
18    Favorite step          23.5                                 29.4                                              41.2                 5.9               0
19    Least desired step     0                                    35.3                                              11.8                 52.9              0

System use attitude (questions 1~3)

Table 6 shows the majority of users (82.4-100%) thought that system operation and instructions could be easily
understood, and they had no problem grasping the error feedback.

System assistance cognition (questions 4~7)

Table 6 shows all users agreed that solving math questions through the computer was easier than through the test sheet. Users were unanimous that solving word-based math questions through the computer is both interesting and enjoyable.

Mathematical problem-solving process cognition (question 15)

Table 6 shows most of the users (82.4%) became familiar with the required stages for solving math questions,
because they were able to list the problem-solving steps in correct order.

Cognition of each stage of the problem-solving process (questions 8~14)

Table 6 shows all participants agreed that the motion pictures provided by the system to aid question understanding helped them grasp the meaning of the questions. Most of the participants (70.6%) agreed that the guiding questions designed for the system helped them identify the key points of the math questions. At least 94.1% of the participants indicated that the expression- and calculation-guiding motion pictures furnished by the system helped them find the right problem-solving direction and complete the calculation, and at least 82.4% said the double-check stage of the system helped them verify their expressions and calculations. A great number of the participants (94.1%) agreed that the method provided by the computer inspired them to think about whether other problem-solving approaches exist.

Preference of each stage of the problem-solving process (questions 16~19)

Table 7 shows that about half of the users (47.1%) felt all the stages were very simple, whereas 17.6% considered all the stages very difficult. About a quarter of the users (23.5%) regarded expression listing and calculation as the easiest step, whereas 5.9% each considered the question understanding step and the double-check step the easiest. When asked which step was the most difficult, 5.9% of the users still said all the steps were very difficult, whereas 17.6% each considered the expression listing and calculation step and the double-check step the hardest, and 11.8% found the question understanding step most challenging. Comparison of the favourite steps reveals that 41.2% of the users preferred the double-check step and 29.4% enjoyed the expression listing and calculation step the most; from this we can see that users acquire the greatest sense of achievement from successfully completing a question. The results also revealed that 5.9% of the users enjoyed all three steps very much and 23.5% indicated their favourite part was the question understanding stage.

Conclusion
The ability to solve basic word-based addition and subtraction questions is the foundation for developing the ability to tackle more complicated questions. Taiwanese students who experience mathematical learning difficulties become more and more frustrated when they reach the middle grades, because the difficulty and range of mathematics courses intensify and expand. Hwang, Chen, Dung and Yang (2007) also recommended that teachers design problem-solving activities to improve students' mathematical representation skills. The main purpose of this study, therefore, was to develop an Internet-based computer-assisted mathematics learning system to help low-achieving elementary students improve their ability to solve basic word-based addition and subtraction questions and to enhance their willingness to continue learning. Outcomes indicated that the computer-assisted mathematical learning system developed for this study can serve as a supplementary tool that helps teachers with remedial instruction and enhances the problem-solving ability of low achievers.

References
Ashton, H. S., Beevers, C. E., Korabinski, A. A. & Youngson, M. A. (2006). Incorporating partial credit in computer-aided
assessment of Mathematics in secondary education. British Journal of Educational Technology, 37(1), 93-119.
Bruer, J. T. (1993). Schools for Thought: A Science of Learning in the Classroom. Cambridge, MA: MIT Press.
Cardelle-Elawar, M. (1992). Effects of teaching metacognitive skills to students with low mathematics ability. Teaching and Teacher Education, 22, 266-280.
Chang, H. E. (2004). Computer-Aided Learning System for Mathematic Problem Solving. Education Study, 152, 51-64.
Chang, K. E., Sung, Y. T., & Lin, S. F. (2006). Computer-assisted learning for mathematical problem solving. Computers &
Education, 46(2), 140-151.
De Corte, E., Verschaffel, L., & De Win, L. (1993). Influence of rewording verbal problems on children’s problem representations and solutions. Journal of Educational Psychology, 77, 460-470.
Fleischner, J. E., & Marzola, E. S. (1988). Arithmetic. In K. A. Kavale, S. R. Forness & M. B. Bender (Eds.), Handbook of learning disabilities: Methods and interventions, 11, 89-110.
Frith, V., Jaftha, J., & Prince, R. (2004). Evaluating the effectiveness of interactive computer tutorials for an undergraduate mathematical literacy course. British Journal of Educational Technology, 35(2), 159-171.

Fuson, K. C., & Willis, G. B. (1989). Second graders’ use of schematic drawings in solving addition and subtraction word problems. Journal of Educational Psychology, 81, 514-520.
Gagne, E. D. (1985). The Cognitive Psychology of School Learning. Boston: Little, Brown and Company.
Hour, F. C., & Chen, L. C. (1999). Effects of Using Personalized Context Examples in CAI for Elementary Students on
Solving Mathematical Word Problems. Chinese Journal of Science Education, 7(3), 233-254.
Hwang, W. Y., Chen, N. S., & Hsu, R. L. (2006). Development and evaluation of multimedia whiteboard system for
improving mathematical problem solving. Computers & Education, 46(2), 105-121.
Hwang, W. Y., Chen, N. S., Dung, J. J., & Yang, Y. L. (2007). Multiple Representation Skills and Creativity Effects on Mathematical Problem Solving using a Multimedia Whiteboard System. Educational Technology & Society, 10(2), 191-212.
Lee, T. H., Shen, P.D., & Tsai, C.W. (2008). Applying Web-Enabled Problem-based Learning and Self-Regulated Learning to
Add Value to Computing Education in Taiwan’s Vocational Schools. Educational Technology & Society, 11 (3), 13-25.
López-Morteo, G., & López, G. (2007). Computer support for learning mathematics: A learning environment based on recreational learning objects. Computers & Education, 48(4), 618-641.
Ma, H. L., & Wu, B. D. (2000). Development of Multimedia Computer-Aided Teaching System that Help Intensify
Elementary Students’ Mathematic Problem Solving Ability. Education Forum Symposium: Mathematic and Science
Education Unit, 87, 1465-1500.
Mayer, R. E. (1992). Thinking, Problem Solving, Cognition. New York: W. H. Freeman and Company.
Olkun, S., Altun, A., & Smith, G. (2005). Computers and 2D geometric learning of Turkish fourth and fifth graders. British Journal of Educational Technology, 36(2), 317-326.
Polya, G. (1945). How to solve it. Princeton, New Jersey: Princeton University Press.
Polya, G. (1962). Mathematical discovery: On understanding, learning and teaching problem solving (Vol I). New York:
Wiley Press.
Szendrei, J. (1996). Concrete materials in the classroom. In A. J. Bishop, K. Clements, C. Keitel, J. Kilpatrick, & C. Laborde
(Eds.), International handbook of mathematics education. Part 1 (pp. 411-434). Dordrecht: Kluwer Academic Publishers.
Williams, S. M. (1993). Putting case-based learning into context: Examples from legal, business, and medical education. Journal of the Learning Sciences, 2(4), 367-427.

