Copyright © 2020 ACADEMIA INDUSTRY NETWORKS. All rights reserved
1. Introduction
Interest in competency assessment in vocational education and training, and in its role in meeting employers' and learners' needs and expectations, has been growing since the mid-1980s (Clayton, Blom, Meyers, & Bateman, 2003). It is therefore important for the education sector to supply a quality workforce that meets employers' demands and expectations, and to integrate employability skills into the vocational curriculum in secondary technical schools. In 2002, the Malaysian Ministry of Education introduced 22 vocational subjects in selected secondary schools. Multimedia Production (MP) is one of these subjects; it is based on learning ICT technologies and multimedia authoring software to develop multimedia products. Students' achievements are based on their assessed skills and are measured academically in national examinations. However, the current assessment method for the subject focuses more on skills than on competencies and lacks 21st-century learning skills such as communication, problem solving and collaboration. A workforce equipped with technical skills and well-acquired soft-skill competencies is in demand: "competencies such as creative thinking, problem-solving, and analytical skills are much needed by the employer in the industry to meet the challenges faced by businesses" (Ibrahim, Noorazlina, & Rahim, 2020, p. 59). To address these shortcomings, this study proposes the development of an authentic-based competency assessment rubric that complements the current assessment method by integrating group-based and authentic tasks.
2. Literature Review
2.1 Authentic-based Competency Assessment in Vocational Subject
Authentic learning and assessment are two components that cannot be separated in achieving educational goals (Karim, Abduh, Manda, & Yunus, 2018, p. 496). Authentic assessment requires students to practise similar skills, or similar attitudes, skills and knowledge together, in an environment that resembles the actual job (Ariev, 2005; Gulikers, Bastiaens, & Kirschner, 2004). For a vocational subject, the authentic task can also resemble a real workplace problem. The question should be open-ended, intellectually challenging and easily accessible to learners (Chin & Chia, 2004). A given authentic problem should neither be limited by a pre-determined project outcome nor be overly broad, as either would demotivate students from engaging in problem solving (Cook, Buck & Rogers, 2012). Project-based learning (PBL) is the main learning feature in technical and vocational education, and one of its key characteristics for guiding students is the "driving question" (Paris & Turner, 1994). PBL courses are multi-disciplinary and require complex tasks, based on challenging questions or problems, that involve students in planning, problem solving, decision making and inquiry. As part of the project-based learning process, students need to build a scientific understanding of the concepts and principles of the subject or area studied (Fallik, Eylon, & Rosenfeld, 2008). Therefore, a holistic assessment mechanism is needed to address and evaluate students' competency in an authentic environment.
These conclusions show that a rubric must be developed systematically, be subject-specific and include a rating method. A rubric alone does not guarantee sound performance-assessment decisions; its validity relies on validation by expert raters. Nevertheless, rubrics are seen as an effective assessment method for encouraging effective learning and creating meaningful instruction, and most educators and researchers accept that rubrics improve the value of assessment (Jonsson & Svingby, 2007). This assessment method was chosen to assess secondary school students' skills, knowledge and attitudes and to ensure that the quality of curriculum design, delivery and assessment is properly aligned at the modular level. It also reports students' performance and their achievement of learning outcomes throughout their period of study. Rubric assessment is seen by today's researchers as an emerging alternative assessment method. The rubric provides a score on every dimension, and the scores are summed to give a total.
This has given an insight for developing an authentic-based competency rubric to assess students' performance based on three learning-outcome domains: Attitude (A), Skills (S) and Knowledge (K). The three domains are based on Lorin Anderson's (2001) suggestions on the design of learning domains (Ramalingam, Kasilingam & Chinnavan, 2014):
• the cognitive domain, which refers to mental skills or knowledge (K);
• the affective domain, which refers to growth in emotions or attitude (A); and
• the psychomotor domain, also known as physical skills (S).
The criteria for the skills and knowledge domains were designed based on Carver, Lehrer, Connell and Erickson's (1992) categories of design skills, which were observed as students worked on a series of multimedia products (Winter, 2010, p. 5):
• Allocating resources and time to different segments of a project;
• Searching for information;
• Analysing and interpreting information;
• Developing representations of information;
• Developing a structure for a presentation;
• Catching and maintaining audience interest.
The criteria above show the processes involved in designing and developing a multimedia product. The study's authentic-based competency assessment rubric was therefore developed from the current competency-assessment capacity, with four domains: teamwork, skills, knowledge and presentation. The two additional domains, teamwork and presentation, were included to address students' efforts in communicating and collaborating with their team members and in presenting their ideas to their peers in a respectful manner. The teamwork domain assesses students' interaction with their peers in the course of completing the given projects. The presentation domain assesses students' ability to pitch their end-product in front of their clients and fellow students.
3. Problem Statement
It is important to keep various types of documents, as they show the improvement of learning objects and learning standards, which advance swiftly and reflect current thought. It is therefore important to keep updating teaching and learning documents, in the form of the syllabus and its content, to suit 21st-century learning. Analysis of the MP curriculum documents shows that the current assessment module has neither authentic nor group-based tasks and lacks project-based learning criteria. There is thus a need to integrate these elements into the current assessment method, complemented by the use of a rubric. This study therefore developed an authentic-based competency assessment rubric to complement the current skill-based assessment method for the Multimedia Production subject in secondary school. The study proposes the stages involved in rubric development and in the validation process by subject-matter experts using the Lawshe (1975) method.
Presentation (5 items): Assess students' end-product pitching and presentation styles.
P1: Use different types of presentation methods to present ideas and the product.
P2: Present the end-product with clarity and confidence in front of peers.
P3: Efficiently document information, resources and storyboards during the process of designing and developing the end-product, using any method of electronic presentation.
P4: Relate the final product to the client's needs.
P5: Show a sense of belonging and responsibility towards the client's needs and product.
on the means of all of the CVR items. Both CVR and CVI are quantitative measures that validate the study's evaluation items and the overall rubric. An item is considered crucial when the panel of experts agrees on it with a CVR value greater than zero (0). In this study, the rubric's content validation was done in two stages:
a) First stage: calculating the CVR for each item, to eliminate items with a CVR value below 0; and
b) Second stage: calculating the CVI to determine the validity of the whole set of items after the deletion of items in the first stage.
First Stage: Calculating Content Validity Ratio (CVR) for Each of the Items
Wilson, Pan and Schumsky (2012) stated that the CVR is an internationally recognised method for establishing the content validity of statistical items or instruments, determining whether items are accepted or rejected. A rubric item is retained when it is deemed "essential" by the experts and rejected when it is "not necessary". The CVR formula is CVR = (Ne − N/2)/(N/2) (Lawshe, 1975), where Ne is the number of experts indicating "Essential" and N is the total number of experts. The CVR value for each item was calculated in a single column of an electronic spreadsheet. Table 3 shows the CVR value for each item, calculated from the total number of experts who selected "Essential" (Ne) for that item.
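The spreadsheet calculation described above can be sketched as a short Python function (a minimal sketch: the function name `cvr` and the example counts, taken from the Teamwork rows of Table 3, are our own labelling):

```python
def cvr(n_essential: int, n_experts: int) -> float:
    """Lawshe's content validity ratio: CVR = (Ne - N/2) / (N/2)."""
    half = n_experts / 2
    return (n_essential - half) / half

# Number of experts (out of N = 6) who rated each item "Essential" (Table 3).
essential_counts = {"T1": 6, "T2": 5, "T3": 4, "T4": 6, "T5": 3}

for item, ne in essential_counts.items():
    print(f"{item}: CVR = {cvr(ne, 6):.2f}")
# T1: CVR = 1.00
# T2: CVR = 0.67
# T3: CVR = 0.33
# T4: CVR = 1.00
# T5: CVR = 0.00
```

With six experts, full agreement (Ne = 6) gives CVR = 1.00 and exactly half agreement (Ne = 3) gives CVR = 0, matching the values reported in Table 3.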
Table 3: CVR Calculation for Each Item and CVI Value (n = 20 items; panel of N = 6 experts)

Item   Total "Essential" (Ne)   CVR
T1     6                        1.00
T2     5                        0.67
T3     4                        0.33
T4     6                        1.00
T5     3                        0.00
S1     6                        1.00
S2     6                        1.00
S3     6                        1.00
S4     4                        0.33
S5     6                        1.00
K1     5                        0.67
K2     5                        0.67
K3     3                        0.00
K4     4                        0.33
K5     4                        0.33
P1     5                        0.67
P2     6                        1.00
P3     6                        1.00
P4     6                        1.00
P5     6                        1.00

CVR(critical) for a panel size (N) of 6 is 1.00 (Ayre & Scally, 2014; Lawshe, 1975); CVI = 0.70.
Note. CVR = content validity ratio; CVI = content validity index (the overall mean of the CVR values).
Content-evaluation agreement by six experts requires a minimum critical CVR value of 0.99 to 1 to satisfy the five per cent significance level (Ayre & Scally, 2014; Lawshe, 1975). Items with a CVR value below 0.99 were therefore considered for deletion. Table 4 lists the ten items with a CVR value lower than 0.99.
Table 4: List of Items with CVR Value Less Than 0.99 (n = 10 items)

Item   Total "Essential" (Ne)   CVR
T2     5                        0.67
T3     4                        0.33
T5     3                        0.00
S4     4                        0.33
K1     5                        0.67
K2     5                        0.67
K3     3                        0.00
K4     4                        0.33
K5     4                        0.33
P1     5                        0.67
Deleting such items would normally be the next step of instrument validation. In this study, however, all items were retained, because the mean of all CVR values without deletion is 0.70. This value, which represents the CVI, is acceptable for the whole set of items (Tilden, Nelson & May, 1990). If the items in Table 4 were deleted, the CVI would increase to 1.00.
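The two-stage calculation can be verified numerically (a sketch only: the CVR values are copied from Table 3, and the variable names are our own):

```python
# CVR values for the 20 rubric items (Table 3), grouped by domain:
# Teamwork (T), Skills (S), Knowledge (K), Presentation (P).
cvr_values = {
    "T1": 1.00, "T2": 0.67, "T3": 0.33, "T4": 1.00, "T5": 0.00,
    "S1": 1.00, "S2": 1.00, "S3": 1.00, "S4": 0.33, "S5": 1.00,
    "K1": 0.67, "K2": 0.67, "K3": 0.00, "K4": 0.33, "K5": 0.33,
    "P1": 0.67, "P2": 1.00, "P3": 1.00, "P4": 1.00, "P5": 1.00,
}

# CVI is the mean of the CVR values over the whole item set.
cvi_all = sum(cvr_values.values()) / len(cvr_values)

# Second stage: drop items below the critical CVR (0.99 for N = 6 experts).
retained = {k: v for k, v in cvr_values.items() if v >= 0.99}
cvi_retained = sum(retained.values()) / len(retained)

print(f"CVI (all 20 items):      {cvi_all:.2f}")       # 0.70
print(f"CVI (10 retained items): {cvi_retained:.2f}")  # 1.00
```

Running the sketch reproduces both figures reported above: a CVI of 0.70 for the full set, and 1.00 if the ten low-CVR items of Table 4 were removed.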
5. Conclusion
The rubric was developed for a vocational subject to complement the current assessment method. However, it must be accompanied by authentic tasks and group-based activities so that it can serve as an evaluation form. Meeting the minimum requirement from the content-validation analysis is adequate before implementation; pilot testing can be done to further establish the validity of the measuring instrument. The rubric-development stages also serve as a guideline for assessors and researchers, especially in the social sciences and in education, for developing and validating a self-made instrument to assess students' product-based evidence. The study further provides a cross-reference for researchers in education who wish to adapt content-validation measures such as the CVR and CVI from the Lawshe method, which is widely used in medical practice for developing skill-based assessment instruments. Researchers with similar interests or sample features may use or adapt this dataset for further comparison and for evaluating its feasibility in different learning environments.
6. Acknowledgement
We would like to extend our gratitude to all the subject-matter experts, students, assessors and research supervisors who were involved in this research. This research was funded by a university grant.
References
Ariev, P.R. (2005). A theoretical model for the authentic assessment of teaching. Practical Assessment,
Research & Evaluation, 10(2), 1-11.
Ayre, C., & Scally, A. J. (2014). Critical values for Lawshe’s content validity ratio: Revisiting the original
methods of calculation. Measurement and Evaluation in Counseling and Development, 47(1), 79–86.
https://doi.org/10.1177/0748175613513808
Carver, S. M., Lehrer, R., Connell, T., & Erickson, J. (1992). Learning by hypermedia design: Issues of
assessment and implementation. Educational Psychologist, 27(3), 385-404.
Chin, C., & Chia, L.-G. (2004). Implementing Project Work in Biology through Problem-Based Learning.
Journal of Biological Education, 38(2), 69-75.
Clayton, B., Blom, K., Meyers, D., & Bateman, A. (2003). Assessing and certifying generic skills. National
Centre for Vocational Education Research, 252.
Cook, K., Buck, G., & Park Rogers, M. (2012). Preparing Biology Teachers to Teach Evolution in a Project-
Based Approach. Science Educator, 21(2), 18-30.
Fallik, O., Eylon, B. S., & Rosenfeld, S. (2008). Motivating teachers to enact free-choice project-based
learning in science and technology (PBLSAT): Effects of a professional development model. Journal
of Science Teacher Education, 19(6), 565-591.
Gilbert, G. E., & Prion, S. (2016). Making Sense of Methods and Measurement: Lawshe’s Content Validity
Index. Clinical Simulation in Nursing, 12(12), 530–531. https://doi.org/10.1016/j.ecns.2016.08.002
Gulikers, J. T., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic
assessment. Educational technology research and development, 52(3), 67.
Habibi, A., Yusop, F. D., & Razak, R. A. (2020). The dataset for validation of factors affecting pre-service
teachers’ use of ICT during teaching practices: Indonesian context. Data in Brief, 28.
https://doi.org/10.1016/j.dib.2019.104875
Ibrahim, N. H., Noorazlina, I., & Rahim, A. (2020). Student perception toward mooc in study skills.
International Journal of Education and Pedagogy, 2(2), 58–65.
Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational
consequences. Educational research review, 2(2), 130-144.
Karim, A. A., Abduh, A., Manda, D., & Yunus, M. (2018). The effectivity of authentic assessment based
character education evaluation model. TEM Journal, 7(3), 495–500. https://doi.org/ 10.18421/TEM73-
04
Krajcik, J., McNeill, K. L., & Reiser, B. J. (2008). Learning‐goals‐driven design model: Developing
curriculum materials that align with national standards and incorporate project‐based pedagogy.
Science Education, 92(1), 1-32.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28(4), 563–575.
https://doi.org/10.1111/j.1744-6570.1975.tb01393.x
Le Brun, M. J., & Johnstone, R. (1994). The quiet (r) evolution: Improving student learning in law. Law
Book Co.
Markkanen, H., Ponta, D. & Donzellini, G. (2001). NetPro: methodologies and tools for project-based
learning in internet, in Montgomerie, C. and Viteli, J. (Eds.): Proceedings of World Conference on
Educational Multimedia, Hypermedia and Telecommunications 2001, pp.1230–1235, AACE,
Chesapeake, VA.
Mulder, M., Weigel, T., & Collins, K. (2007). The concept of competence in the development of vocational
education and training in selected EU member states: a critical analysis. Journal of Vocational
Education & Training, 59(1), 67-88.
Paris, S. G., & Turner, J. C. (1994). Situated motivation. Student motivation, cognition, and learning:
Essays in honor of Wilbert J. McKeachie, 213-237.
Polit, D. F., & Beck, C. T. (2006). The content validity index: are you sure you know what's being reported?
Critique and recommendations. Research in nursing & health, 29(5), 489-497.
Preuss, D. A. (2002). Creating a project-based curriculum, Tech Directions, 62(3), 16– 19
Ramalingam, M., Kasilingam, G., & Chinnavan, E. (2014). Assessment of learning domains to improve
student's learning in higher education. Journal of Young Pharmacists, 6(1), 27.
Retnawati, H., Hadi, S., & Nugraha, A. C. (2016). Vocational High School Teachers' Difficulties in
Implementing the Assessment in Curriculum 2013 in Yogyakarta Province of Indonesia. International
Journal of Instruction, 9(1), 33-48.
Son, J. (2018). Back translation as a documentation tool. Translation and Interpreting, 10(2), 89–100.
https://doi.org/10.12807/ti.110202.2018.a07
Tilden, V. P., Nelson, C. A., & May, B. A. (1990). Use of qualitative methods to enhance content validity.
Nursing Research, 39, 172–175.
Thomas, J. W. (2000). A review of research on project-based learning. Buck Institute of Education. Retrieved from http://www.newtechnetwork.org.590elmp01.blackmesh.com/sites/default/files/dr/pblresearch2.pdf
Wilson, F. R., Pan, W., & Schumsky, D. A. (2012). Recalculation of the critical values for Lawshe’s content
validity ratio. Measurement and evaluation in counseling and development, 45(3), 197-210.
Whittaker, C. R., Salend, S. J., & Duhaney, D. (2001). Creating instructional rubrics for inclusive
classrooms. Teaching Exceptional Children, 34(2), 8-13.
Winter, J. (2010). Educative assessment for/of teacher competency. Assessment in Education, 17. Retrieved from http://proquest.umi.com/pqdweb?did=1979837691&Fmt=7&clientId=11263&RQT=309&VName=PQD
Yuvienco, J. (2010). Evaluating peer assessment within project-based learning in second/foreign language
education. In Global Learn (pp. 441-447). Association for the Advancement of Computing in
Education (AACE).