
International Education Studies; Vol. 12, No. 8; 2019
ISSN 1913-9020 E-ISSN 1913-9039
Published by Canadian Center of Science and Education

Scoring Rubric of Problem-Solving on Computing Science Learning


Chacharin Lertyosbordin1, Sorakrich Maneewan1, Sakesun Yampinij1 & Kuntida Thamwipat1
1 Faculty of Industrial Education and Technology, King Mongkut’s University of Technology Thonburi, Bangkok, Thailand
Correspondence: Chacharin Lertyosbordin, Faculty of Industrial Education and Technology, King Mongkut’s University of Technology Thonburi, Bangkok, Thailand.

Received: March 14, 2019 Accepted: April 30, 2019 Online Published: July 29, 2019
doi:10.5539/ies.v12n8p26 URL: https://doi.org/10.5539/ies.v12n8p26

Abstract
The Office of the Basic Education Commission of Thailand first declared indicators for computing science for students in primary and secondary education in 2018. The aim of computing science is to develop learners who can solve computing science problems by using technology correctly. To achieve effective learning management in computing science, the method of evaluation is very important. This research aimed to create a scoring rubric of problem-solving on computing science learning for teachers to use in schools, to apply in computing science courses, and to improve learning in the future. The evaluation of the validity of the rubric found that it gained an Index of Item-Objective Congruence (IOC) of 1.00, and its reliability showed harmonization at the level of “much” (RAI = 0.94).
Keywords: scoring rubric of problem-solving, assessment of problem-solving, test of problem-solving in computing science learning, computational science
1. Background of the Study
The Institute for the Promotion of Teaching Science and Technology (IPST) specified the learning standards and indicators of computing science for primary and secondary education in the science strand of the core curriculum of basic education, revised edition 2017. The curriculum aims to develop learners who gain thinking skills, computing skills, analytical thinking, and systematic problem-solving skills, and who can apply computing science, information technology, and communication (IPST, 2018). The key skill in computing science is the problem-solving skill of learners in the 21st century, which is one of the 7Cs (Partnership for 21st Century Skills, 2017). Problem-solving skill is a basic skill underlying other skills such as critical thinking, creative thinking, and innovation skills (Canter, 2004).
The ability of problem-solving refers to the qualification of learners to apply a thinking process, using knowledge and experience, to reach a goal or objective by collecting information, connecting functions, and using facts successfully. The Institute for the Promotion of Teaching Science and Technology (IPST, 2014) specified that problem-solving behavior comprises 1) understanding the problem, 2) planning to solve the problem, 3) solving the problem and evaluating, and 4) checking the problem-solving and applying it. Performance assessment of the working process is used to measure problem-solving ability, with an instrument for scoring its characteristics in the form of rubrics (Phuvipadawat, 2001). Rubrics are instruments that reflect the ability of learners in operating work; with them, teachers can assess the ability of learners more easily and gain efficiency in learning management.
As mentioned above, the researcher studied the components of the ability to solve problems and designed an instrument for problem-solving skill in the form of rubrics, for teachers to apply in other courses and to use as an example in computing science learning management.
1.1 Research Objectives
To develop the scoring rubric of problem-solving on computing science learning
1.2 Research Limitations
a) Five experts validated the rubric. The experts were at least senior professional teachers or assistant professors with experience in learning management for thinking skills or computing science, derived from purposive sampling.


b) Twenty pre-cadet students of the Armed Forces Academies Preparatory School who were members of the Mechatronics Club in 2018.
c) Five raters applied the rubric to the empirical work of the students. The raters had at least 5 years of experience in computing science and were derived from purposive sampling.
2. Operating Definitions
a) Computing science referred to the specific course on computing with arithmetic and scientific methods to gain an answer or result, or the application of computers to process something (Cambridge Dictionary, 2018; Oxford Dictionary, 2018).
b) Computing science learning management referred to the learning process of computing science that develops learners to meet the learning standards and indicators in the core curriculum of the Thailand Basic Education Commission 2008, revised edition 2017, science strand (IPST, 2018).
c) The ability of problem-solving in computing science referred to the qualification of learners who use a thinking process, with knowledge and experience, to reach the learning objectives by collecting information, connecting functions, and using facts successfully, composed of 4 steps: 1) analyzing and specifying the problem description, 2) planning for solving the problem, 3) solving the problem, and 4) inspecting and evaluating the results (Bloom, 1956; Guilford & Hoepfner, 1971; Khammani, 2017; Weir, 1974; Jonassen, 2011).
d) The learning process of computing science using engineering design and online simulation to enhance the ability of problem-solving referred to the process of learning management in which learners received a problem as a question for writing a program to control a micro:bit board using engineering design, which contained 5 steps as follows: 1) identify the problem, 2) collect information to solve the problem, 3) design, plan, and draft the flowchart of the solution, 4) act on the problem-solving, and 5) evaluate the operating work for problem-solving. After the learners received the problem question, they had to send the project work back to the teacher as follows: 1) a mind map reflecting (1.1) the objective of the written program, (1.2) the equipment in the process, and (1.3) the planning of the task; 2) a precise flowchart for immediate use; 3) source code as the flowchart designed; 4) the result of the program; and 5) a record of inspecting the result and corrections. A sketch of the kind of program involved appears below.
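
As a minimal illustration, not taken from the study, the following MicroPython sketch shows the kind of program a learner might submit to control the micro:bit board; the specific behavior (reacting to button presses) is a hypothetical task.

```python
# Hypothetical example of a learner's micro:bit program (MicroPython).
# The study does not publish the students' source code; this sketch only
# illustrates the deliverable named in item 3 above.
from microbit import *

while True:
    if button_a.was_pressed():     # input identified in the mind map
        display.show(Image.HAPPY)  # behavior planned in the program flowchart
    elif button_b.was_pressed():
        display.clear()            # reset path, checked during inspection
    sleep(50)                      # short pause between polls
```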
3. Research Methodology
The research process was divided into the following steps.
Step 1: Reviewed the literature to analyze and synthesize the components for developing the test of the ability to solve problems, using the principle of Rowley and Slack (2004).
Step 2: Designed and created the rubric of problem-solving for computing science.
Step 3: Validated the rubric of problem-solving for learning management in computing science. Five experts evaluated the validity of the rubric using the Index of Item-Objective Congruence (IOC) analysis (Wadeecharoen, 2017), and the rubric was improved following the experts’ suggestions (see the sketch after these steps).
Step 4: Managed the learning process of computing science using engineering design and online simulation to enhance the problem-solving ability of the pre-cadet students.
Step 5: Inspected the reliability of the rubric of problem-solving for learning management in computing science. Five raters applied the rubric to the empirical work of the pre-cadet students, randomly selecting only 4 empirical works from the 20 students, and analyzed the results with the Rater Agreement Index (RAI) (Burry-Stock et al., 1996).
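
As a minimal sketch of the IOC computation used in Step 3, assuming the conventional rating scheme in which each expert marks an item +1 (congruent), 0 (unsure), or -1 (not congruent), and an item is commonly accepted when IOC >= 0.50:

```python
# Minimal IOC sketch; the ratings below are illustrative, not the raw
# study data (Table 1 reports a rating of 1 from every expert on every item).
def ioc(ratings):
    """Index of Item-Objective Congruence for one rubric item.

    ratings: one judgment per expert, each in {-1, 0, +1}.
    """
    return sum(ratings) / len(ratings)

item_ratings = [1, 1, 1, 1, 1]  # five experts all judge the item congruent
print(ioc(item_ratings))         # -> 1.0, the value reported in Table 1
```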
4. Research Findings
The test of the ability of problem-solving for computing science was designed and created as a scoring rubric with 4 levels. The teacher observed the evidence of learning, the empirical works, and the behavior reflecting the practical work of the students. The evaluation was operated in any loop of learning on problem-solving, and the criteria for evaluating the ability of problem-solving composed of 4 components:
1) Analysis and specification of the problem description – this step was to understand the description and the limitations of the problem.
2) Planning for solving the problem – this step was to think through the process of problem-solving step by step to gain the result.
3) Practice of problem-solving – this step was to carry out the problem-solving in action.
4) Inspecting and evaluating the results – this step was parallel with the step of problem-solving. If the problem was not solved, the students had to go back and repeat the step of problem-solving until reaching the complete result.
After the validation, the evaluation of the quality of the rubric by the 5 experts is shown in Table 1, and the tryout of the reliability by the 5 raters on the 4 empirical works is shown in Table 2.

Table 1. The result of evaluating the validity of the rubric of problem-solving on computing science learning

| The ability of problem-solving approach | Expert 1 | Expert 2 | Expert 3 | Expert 4 | Expert 5 | Total | IOC | Meaning |
|---|---|---|---|---|---|---|---|---|
| 1. Analysis and specification of the problem description: understanding the conditions and limitations of the problem | 1 | 1 | 1 | 1 | 1 | 5 | 1.00 | Gains validity and is harmonized to the objectives |
| 2. Planning for problem-solving: finding the problem-solving process from start to finish and gaining the result | 1 | 1 | 1 | 1 | 1 | 5 | 1.00 | Gains validity and is harmonized to the objectives |
| 3. The practice of problem-solving: applying the process in practice | 1 | 1 | 1 | 1 | 1 | 5 | 1.00 | Gains validity and is harmonized to the objectives |
| 4. The inspection and evaluation of the result: the parallel step of practical problem-solving; if the result was not complete, the learner goes back and practices along the process again until reaching a complete result | 1 | 1 | 1 | 1 | 1 | 5 | 1.00 | Gains validity and is harmonized to the objectives |

From Table 1, the scoring rubric as evaluated by the 5 experts has IOC = 1.00, meaning that it is valid and harmonized to the objectives.

Table 2. The result of evaluating the reliability of the rubric score of problem-solving on computing science learning

Each cell lists the rater’s scores for empirical works no. 1, 2, 3, and 4, in order.

| Item of evaluation on the ability of problem-solving | Expert 1 | Expert 2 | Expert 3 | Expert 4 | Expert 5 |
|---|---|---|---|---|---|
| 1. Analysis and identification of problem description | 2, 3, 4, 2 | 2, 3, 4, 2 | 2, 3, 4, 2 | 2, 4, 4, 2 | 2, 3, 4, – |
| 2. Planning for problem-solving | 2, 3, 3, 3 | 2, 3, 4, 3 | 2, 3, 4, 3 | 2, 3, 4, 3 | 2, 3, 4, 3 |
| 3. Practice in problem-solving | 3, 3, 4, 3 | 3, 3, 4, 3 | 3, 3, 4, 3 | 3, 3, 4, 3 | 3, 3, 4, 4 |
| 4. Inspection and evaluation | 2, 2, 4, 4 | 2, 3, 4, 4 | 2, 2, 4, 4 | 2, 3, 4, 4 | 2, 3, 4, 4 |

From Table 2, the scoring rubric as evaluated by the 5 raters has RAI = 0.94, meaning harmonization at the level of “much”.
5. Discussion
Measurement of problem-solving ability (a higher-order thinking skill) should judge the skills of the students by measuring the effect of what the students act out, in other words, “authentic assessment” (Xu et al., 2013), in which it is difficult for the instructor to assess and judge the workpiece precisely. It is therefore necessary to have scoring quality standards (Gao & Grisham-Brown, 2011). Accordingly, after the scoring rubric of problem-solving on computing science was designed and created, its validity had to be verified; the Index of Item-Objective Congruence (IOC) was found to be 1.00, which can be interpreted as accepted validity and implies that this scoring rubric has set the measurement issues correctly and appropriately, so it can be used practically. According to Rovinelli and Hambleton (1976), the construction of any measure should be checked before being used. In addition, this scoring rubric of problem-solving on computing science was evaluated for the rater agreement index, which was 0.94, showing very high consistency among the assessors. This indicates that the scoring rubric is reliable in actual applications. Müller et al. (2005) explained that evaluating measuring instruments using multiple evaluators can tell the accuracy of the instrument, and the accuracy of the measurement is very much needed to measure the ability of learners with authentic assessment (Burry-Stock et al., 1996). This is consistent with Segal et al. (2003), who used many evaluators to create learning behavior measurement tools that can provide reliable results in the measurement and evaluation of learners’ reactions as well.
From this process of inspection, it appears that this scoring rubric of problem-solving on computing science has the validity and reliability of a sound instrument. The researcher can explain that this was because the researcher had synthesized the definition of computing science and its scope in the core curriculum of the Thailand Basic Education Commission 2008, revised edition 2017, science strand, and had studied the definition and components of problem-solving skill, collecting the steps of problem-solving and matching them with the indicators of computing science (IPST, 2018), by applying the method of creating scoring rubrics for authentic assessment (Phuvipadawat, 2001), which composed of these steps: 1) identifying the good results of empirical works by pros and cons, 2) discussing the results or empirical works of the learners and summarizing them as criteria, and 3) setting qualification levels by description of the empirical works: best, average, and poor. In addition, the rubric that the researcher created can be applied to evaluate computational thinking (Supaluk, Khlaisang, & Songkram, 2018), whose composition is classified into the following parts: 1) Decomposition: breaking down data, processes, or complex problems into smaller parts; 2) Pattern Recognition: observing patterns, identifying similar problems, and regularities in data; 3) Abstraction: identifying the general principles that generate these patterns, focusing only on the details that are important whilst irrelevant information is ignored; and 4) Algorithm Design: developing step-by-step instructions to solve the problem. Moreover, this tool should be used as a part of computing science project-based learning, which also enhances cognitive and critical thinking in engineering problem-solving among students (Rahman et al., 2009) and is expected to show more efficiency in this kind of learning management.
Acknowledgments
This research article was supported by the “Petch Pra Jom Klao” Master’s Degree Research Scholarship from King Mongkut’s University of Technology Thonburi.
References
Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals. Cognitive
domain.
Burry-Stock, J. A., Shaw, D. G., Laurie, C., & Chissom, B. S. (1996). Rater agreement indexes for performance
assessment. Educational and Psychological Measurement, 56(2), 251-262.
https://doi.org/10.1177/0013164496056002006
Cambridge Dictionary. (2018). Meaning of “Computational” in the English Dictionary. Retrieved from https://dictionary.cambridge.org/dictionary/english/computational
Cambridge Dictionary. (2018). Meaning of “Science” in the English Dictionary. Retrieved from
https://dictionary.cambridge.org/dictionary/english/science?q=Science
Canter, A. (2004). A problem-solving model for improving student achievement. Principal Leadership Magazine, 5(4), 11-15. Retrieved from https://pdfs.semanticscholar.org/72f1/322102abfe42c3c8f40d7139a0c88100415b.pdf
Gao, X., & Grisham-Brown, J. (2011). The Use of Authentic Assessment to Report Accountability Data on
Young Children’s Language, Literacy and Pre-Math Competency. International Education Studies, 4(2),
41-50. https://doi.org/10.5539/ies.v4n2p41
Guilford, J. P., & Hoepfner, R. (1971). The analysis of intelligence. McGraw-Hill Companies.


Jonassen, D. H. (2011). Learning to solve problems: A handbook for designing problem-solving learning
environments (pp. 138-151). Routledge, NY. https://doi.org/10.4324/9780203847527
Khammani, T. (2017). Teaching Styles: A Variety of Options. Chulalongkorn University Printing House,
Bangkok.
Müller, C. M., de Vos, R. A., Maurage, C. A., Thal, D. R., Tolnay, M., & Braak, H. (2005). Staging of sporadic Parkinson disease-related α-synuclein pathology: inter- and intra-rater reliability. Journal of Neuropathology & Experimental Neurology, 64(7), 623-628. https://doi.org/10.1097/01.jnen.0000171652.40083.15
Oxford Dictionary. (2018). Definition of Computational in English. Retrieved from
https://en.oxforddictionaries.com/definition/computational
Oxford Dictionary. (2018). Definition of Science in English. Retrieved from
https://en.oxforddictionaries.com/definition/science
Partnership for 21st Century Skills. (2017). Assessment: A 21st Century Skills Implementation Guide. Retrieved
from http://www.p21.org/storage/documents/p21-stateimp_assessment.pdf
Phuvipadawat, S. (2001). Learner centeredness & authentic assessment (pp. 137-141). The Knowledge Center.
Chiangmai.
Rahman, M. B. H. A., Daud, K. A. M., Jusoff, K., & Ghani, N. A. A. (2009). Project Based Learning (PjBL)
Practices at Politeknik Kota Bharu, Malaysia. International Education Studies, 2(4), 140-148.
https://doi.org/10.5539/ies.v2n4p140
Rovinelli, R. J., & Hambleton, R. K. (1976). On the use of content specialists in the assessment of criterion-referenced test item validity. Retrieved from https://eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=ED121845
Rowley, J., & Slack, F. (2004). Conducting a Literature Review. Management Research News, 27(6), 31.
https://doi.org/10.1108/01409170410784185
Segal, D., Chen, P. Y., Gordon, D. A., Kacir, C. D., & Gylys, J. (2003). Development and evaluation of a
parenting intervention program: Integration of scientific and practical approaches. International Journal of
Human-Computer Interaction, 15(3), 453-467. https://doi.org/10.1207/s15327590ijhc1503_09
Supaluk, S., Khlaisang, J., & Songkram, N. (2018). A Proposed Model of a Cloud Based Learning System Using
P2P Reverse Engineering Approach To Enhance Computational Thinking of undergraduate students.
Veridian E-Journal, Silpakorn University (Humanities, Social Sciences and arts), 11(5), 533-553.
The Institute for the Promotion of Teaching Science and Technology (IPST). (2018). Course description technology (computational science). Retrieved from http://oho.ipst.ac.th/download/mediaBook/IPST-CS-course-description-secondary-school.pdf
The Institute for the Promotion of Teaching Science and Technology (IPST). (2018). Summary of curriculum and indicators: Information Technology and Communication Curriculum 2008 and Technology (Computational Science) Curriculum Improvement 2018. Retrieved from http://oho.ipst.ac.th/download/mediaBook/cs-ict.pdf
The Institute for the Promotion of Teaching Science and Technology (IPST). (2018). Online teacher training program “Learning Computational Sciences”. Retrieved from http://oho.ipst.ac.th/wp-content/uploads/sites/3/2018/04/TrainingOnlineCS61-Detail.pdf
The Institute for the Promotion of Teaching Science and Technology (IPST). (2014). Science evaluation guide (p. 13). Standard Assessment Branch, Institute for the Promotion of Teaching Science and Technology (IPST).
Wadeecharoen, W. (2017). Research methods from theoretical concepts to practice (p. 243). SE-ED, Bangkok.
Weir, J. J. (1974). Problem solving is everybody’s problem. The Science Teacher, 41(4), 16-18.
Wells, C. S., & Wollack, J. A. (2003). An instructor’s guide to understanding test reliability. Testing &
Evaluation Services. University of Wisconsin.
Xu, Q., Heller, K., Hsu, L., & Aryal, B. (2013, January). Authentic assessment of students’ problem solving. In AIP Conference Proceedings (Vol. 1513, No. 1, pp. 434-437). AIP. https://doi.org/10.1063/1.4789745


Appendix A
Table A1. Indicators of criteria for the ability of problem-solving

| Criterion | 1 point | 2 points | 3 points | 4 points |
|---|---|---|---|---|
| Analyze and specify the problem description | a. The learner was not able to set the objective for problem-solving. b. The learner was not able to identify the limitations of the problem. | a. The learner was able to set the objective for problem-solving. b. The learner was not able to identify the limitations of the problem. | a. The learner was able to set the objective for problem-solving. b. The learner was able to identify the limitations of the problem. | a. The learner was able to set the objective for problem-solving. b. The learner was able to identify the limitations of the problem. c. The learner was able to identify knowledge or information related to the problem. |
| Planning for problem-solving | a. The learner was not able to create a system flowchart. b. The learner was not able to create a program flowchart. c. The learner was not able to create the algorithm of the program. | a. The learner was able to create a system flowchart. b. The learner was able to create a program flowchart. c. The learner was able to create the algorithm of the program, but the algorithm was ambiguous. d. The learner was able to use the symbols of flowchart writing at less than or equal to 50% of international criteria. | a. The learner was able to create a system flowchart. b. The learner was able to create a program flowchart. c. The learner was able to create the algorithm of the program, and the algorithm was not ambiguous. d. The learner was able to use the symbols of flowchart writing at over 50% of international criteria. | a. The learner was able to create a system flowchart. b. The learner was able to create a program flowchart. c. The learner was able to create the algorithm of the program, and the algorithm was not ambiguous. d. The learner was able to use the symbols of flowchart writing at over or equal to 80% of international criteria. |
| Practice of problem-solving | a. The learner was not able to operate the work as the system flowchart. b. The learner was not able to write the program as the program flowchart. | a. The learner was able to operate the work as the system flowchart at less than 50%. b. The learner was able to write the program as the program flowchart at less than or equal to 50%. | a. The learner was able to operate the work as the system flowchart at over 50% but less than 80%. b. The learner was able to write the program as the program flowchart at over 50% but less than 80%. | a. The learner was able to operate the work as the system flowchart at over or equal to 80%. b. The learner was able to write the program as the program flowchart at over or equal to 80%. |
| Inspection and evaluation | a. The learner was not able to record the inspection and evaluation of the working. | a. The learner was able to record the inspection and evaluation of the working more than one time. | a. The learner was occasionally able to record the inspection and evaluation of the working. b. The learner was able to evaluate the cause of error, but incorrectly. | a. The learner records the inspection and evaluation of the working more than one time and inspects the result occasionally. b. The learner was able to evaluate the cause of error correctly. |
* The full score is 16 points: lower than 6 points means the work should be improved; 6-9 points means fair; 10-13 points means good; 14 points or more means very good.
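
As a minimal sketch (an illustration, not part of the published rubric), the bands in the note above can be applied to a total score as follows:

```python
# Hypothetical helper mapping a total rubric score (4 criteria x 4 points,
# maximum 16) to the bands stated in the note above.
def interpret_total(score):
    if score >= 14:
        return "very good"
    if score >= 10:
        return "good"
    if score >= 6:
        return "fair"
    return "should be improved"

print(interpret_total(2 + 2 + 3 + 2))  # a work totaling 9 points -> "fair"
```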


Appendix B
The empirical works of the students used for the reliability test were brought from the project works of the pre-cadet students, produced via computing science learning management using engineering design and online simulation to enhance the ability of problem-solving. The sample consisted of 20 students, but the researcher randomly selected 4 empirical works for the reliability test by the Rater Agreement Index (RAI).
Formula for Rater Agreement Index (RAI):
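Consistent with the variable definitions below and with Burry-Stock et al. (1996), the index takes the form:

$$\mathrm{RAI} = 1 - \frac{\sum_{m=1}^{M} \sum_{n=1}^{N} \sum_{k=1}^{K} \left| R_{mnk} - \bar{R}_{nk} \right|}{M \, N \, K \, (I - 1)}$$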

where:
Rmnk represents the score of expert m on empirical work n for evaluation item k;
R̄nk represents the mean score of the empirical work of student n on evaluation item k;
K represents the number of all items for evaluation;
N represents the number of empirical works of all students;
M represents the number of experts (assessors);
I represents the number of possible scores.
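
As a minimal sketch of this computation (the demo scores are illustrative, not the study’s data):

```python
# Rater Agreement Index (RAI), implementing the formula above.
# scores[m][n][k]: rating by expert m of empirical work n on item k.
def rater_agreement_index(scores, possible_scores=4):
    M, N, K = len(scores), len(scores[0]), len(scores[0][0])
    total_deviation = 0.0
    for n in range(N):
        for k in range(K):
            mean_nk = sum(scores[m][n][k] for m in range(M)) / M
            total_deviation += sum(abs(scores[m][n][k] - mean_nk)
                                   for m in range(M))
    return 1 - total_deviation / (M * N * K * (possible_scores - 1))

# Illustrative data: 2 raters, 2 works, 2 items, scored on a 1-4 scale.
demo = [[[2, 3], [4, 2]],
        [[2, 3], [4, 3]]]
print(round(rater_agreement_index(demo), 2))  # -> 0.96
```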

Copyrights
Copyright for this article is retained by the author(s), with first publication rights granted to the journal.
This is an open-access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).
