
Chemistry Education Research and Practice

PAPER

Investigating the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate chemistry†

Reyne Pullen,* Stuart C. Thickett* and Alex C. Bissember*

School of Physical Sciences – Chemistry, University of Tasmania, Hobart, Tasmania, 7001, Australia.
E-mail: Reyne.Pullen@utas.edu.au, Stuart.Thickett@utas.edu.au, Alex.Bissember@utas.edu.au
† Electronic supplementary information (ESI) available. See DOI: 10.1039/c7rp00249a

Cite this: DOI: 10.1039/c7rp00249a
Received 15th December 2017, Accepted 12th March 2018
rsc.li/cerp

In chemistry curricula, both the role of the laboratory program and the method of assessment used are subject to scrutiny and debate. The ability to identify clearly defined competencies for the chemistry laboratory program is crucial, given the numerous other disciplines that rely on foundation-level chemistry knowledge and practical skills. In this report, we describe the design, implementation, results, and feedback obtained on a competency-based assessment model recently introduced into the first-year laboratory program at an Australian university. Previously, this laboratory program was assessed via a quantitative, criterion-referenced assessment model. At the core of this new model was a set of competency criteria relating to skills acquisition, chemical knowledge and application of principles, safety in the laboratory, as well as professionalism and teamwork. By design, these criteria were aligned with the learning outcomes of the course and the degree itself, as well as local accrediting bodies. Qualitative and quantitative feedback from students (and staff) obtained before and after the implementation of this new model suggested this approach provided an enhanced learning experience enabling a greater focus on the acquisition of fundamental laboratory skills and techniques.

Introduction

Chemistry is an experimental science and, as a result, the laboratory occupies a central place in chemistry curricula in universities worldwide (Lagowski, 2000; DeMeo, 2001). The undergraduate laboratory serves a number of purposes, including the development of practical skills and techniques, in addition to reinforcing and extending theoretical concepts presented in the classroom (Hofstein and Lunetta, 1982, 2004; Hegarty-Hazel, 1990; Bennett and O'Neale, 1998; Moore, 2006). The role of the undergraduate laboratory and the nature of the skills developed in this core learning environment have long been the subject of scrutiny and discussion (Garnet and Garnet, 1995; Reid and Shah, 2007).

An ongoing concern regarding the training of students pertains to the lack of "competent" graduates in science, technology, engineering and mathematics (STEM) disciplines with the necessary skills for future employment (National Academy of Science, 1996; Baranyai and Prinsley, 2015; Sarkar et al., 2016). Reid and Shah (2007) propose several reasons for this phenomenon, which include an ongoing reduction of laboratory hours (due to various pressures) and the relevance of the skills acquired by modern graduates during their degree program compared with the skills required by prospective employers. They suggest that the solution lies in identifying and incorporating innovative and alternative means for developing the core skills and competencies emphasised in laboratories. However, effecting meaningful change in the teaching laboratory is a challenging exercise, given the number of variables that can be investigated. To this end, Hofstein and Lunetta (2004) have outlined a range of features of laboratory programs that could be modified, such as modes of learning and teaching styles, student attitudes towards chemistry laboratory work, students' perceptions of the laboratory learning environment, and the use of alternative modes of assessment.

Bretz (2012) provides an elegant breakdown of assessment in its current form, stating that it can be reduced to two major questions: "What do we want our students to know?" and "How will we know that they know it?" A number of reports have considered various aspects of these themes. For example, these include studies concerning the development of detailed rubrics for both general and experiment-specific purposes (McInerney and McInerney, 2010; Pullen, 2016), peer- and self-assessment tools (Wenzel, 2007; Seery et al., 2017), the use of concept maps (Ghani et al., 2017), and the use of video responses (Erdmann and March, 2014; Tierney et al., 2014). Before laboratory assessment itself can be considered, the skills or competencies that are sought as outcomes must first be stated.


Some approaches consider the laboratory an opportunity for expansive learning, as demonstrated by Kirton et al. (2014), who developed Structured Chemistry Examinations (SChemEs) as a means to develop students in basic techniques, numeracy, apparatus assembly and handling, interpretive exercises, and information management. Hagen and Gragson (2010) approached laboratory assessment through the lens of developing learning tasks to improve and refine technical writing skills. Another strategy is the use of "badging" or micro-credentialing to demonstrate the attainment of various skills or competencies both in and outside of the laboratory (Towns et al., 2015, 2016; Seery et al., 2017).

Assessment in the first-year undergraduate Chemistry laboratory program at the University of Tasmania, Australia has traditionally featured a quantitative model employing a criterion-referenced assessment rubric. When placed in the context of developing student competency and related generic skills, it became apparent that there was a growing disconnect between what was being assessed – the end product – and the skills developed during the laboratory. More specifically, there was concern that the focus of the laboratory program had shifted from developing student competency to concentrating much more on quantitatively assessing it. Furthermore, our observations and interactions with undergraduates indicated that the existing form of assessment often led to students focusing solely on obtaining a good grade, to the extent that the acquisition of fundamental skills was a much lower priority and the techniques and concepts presented were often overshadowed and/or quickly forgotten. Similar observations have been noted previously (Bailey and Garner, 2010; Black and Wiliam, 2010; DeKorver and Towns, 2015). We were concerned that the above-mentioned issues were hindering the development of skills required in second-year chemistry units (and other non-chemistry units comprising the BSc or other degree programs offered at the University of Tasmania).

After considering a range of alternative assessment strategies, a competency-based assessment model was explored. This aligned with the use of criterion-based assessment for learning outcomes advocated by both Biggs and Tang (2011) and the Guidelines for Good Assessment Practices (University of Tasmania, 2011). The latter states, "Criteria should be clearly based on the learning outcomes in a unit outline." We aimed to design a first-year chemistry laboratory program that provided an enhanced student experience, a greater focus on developing student competency, and an improved alignment of the laboratory program with both the Australian Council of Deans Teaching and Learning Centre's National Chemistry Threshold Learning Outcomes (TLOs) for undergraduate university-level chemistry (Australian Learning and Teaching Council, 2011) and unit-specific intended learning outcomes (ILOs). Consequently, it was postulated that the use of a competency-based assessment model may be best suited to addressing this challenge. Employing competency-based assessment in Chemistry has been reported previously (Foukaridis and McFarlane, 1988).

This report describes the implementation and evaluation of a new competency-based assessment model introduced into the first-year Chemistry laboratory program at the University of Tasmania in 2017. Pre- and post-implementation data collection and analysis have enabled the comparison of the perceptions and experiences of first-year students who completed the laboratory program that featured either traditional quantitative, criterion-referenced assessment (2016) or the new assessment model (2017). More specifically, this work sought to investigate the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate chemistry.

This pilot study primarily aimed to answer the following research questions:
(1) Will a competency-centric assessment model enhance the development and consolidation of fundamental skills in the first-year chemistry laboratory in the short term?
(2) Can this assessment model shift student focus away from assessment to acquiring key laboratory techniques and reinforcing theory?
(3) Will the provision of a more transparent assessment model enable students to more clearly identify the intended learning outcomes?

Research methodology

Background

KRA114 – Chemistry 1B represents one of the two first-year Chemistry units that form the necessary prerequisites for students intending to major in Chemistry at the University of Tasmania (UTAS) and for those intending to proceed to any second-year UTAS chemistry unit. This unit builds on the year 12 chemistry syllabus offered to secondary school students and provides a foundation to four of the sub-disciplines of chemistry: organic, inorganic, physical, and analytical chemistry. Emphasis is placed on fostering an understanding of fundamental chemical concepts, development of problem-solving ability, and acquiring the skills necessary to carry out experiments safely and efficiently in the laboratory. The KRA114 intended learning outcomes (ILOs) are provided in Table 1.

Table 1 KRA114 intended learning outcomes
(1) Demonstrate an understanding of fundamental chemical principles and theories.
(2) Apply chemical principles to explain the chemical and physical properties of substances, their structure and the interactions that take place between them.
(3) Apply chemical concepts and theoretical principles to solve problems.
(4) Test and investigate theoretical chemical principles by successfully using fundamental experimental techniques in the laboratory.
(5) Demonstrate a capacity for self-directed learning, working responsibly and safely, and recognise the critical role chemistry plays in society.

Each year at UTAS, approximately 260 students enrol in KRA114. Consistent with their respective degree requirements, students enrolled in Pharmacy, Medical Research, Biotechnology, and Agricultural Science degrees must all successfully complete KRA114 in order to progress through their identified degree pathways. Allowing for minor annual variations in the percentages of first-year Chemistry students enrolled in different degree programs, the profiles of the cohort are similar from year to year.


Typically, only approximately 20% of these first-year Chemistry students choose to enrol in second-year Chemistry units. Consequently, it is imperative that students who are not progressing to further chemistry study also learn and demonstrate competency in fundamental chemistry skills and techniques consistent with the requirements and expectations of other disciplines.

Throughout the course of the laboratory program, students complete eight experiments on a variety of topics, each designed to facilitate the development of core skills or techniques and reinforce key concepts. A laboratory session is composed of up to 48 students overseen by three instructors (a ratio limit of 16 students per instructor), with support provided by a laboratory technician. Traditionally, for each experiment, students are awarded a summative grade based on two components. A pre-laboratory multiple-choice quiz (30 marks) is undertaken on arrival at the laboratory to measure a student's preparedness for the experiment. As such, the quiz contains a small number of questions targeting key techniques, calculations, and safety procedures students will utilise during the experiment. The remainder of the summative grade (100 marks) is assessed by the allocated laboratory instructors, with the aid of a criterion-referenced assessment rubric as a guide. This component encapsulates pre-laboratory requirements and safety within the laboratory, use of correct techniques and calculations, and understanding of the key concepts and principles. The KRA114 assessment rubric employed in 2016 is included in Appendix 1 (ESI†).

Development of a new competency-based qualitative laboratory assessment model

The overarching aim of this project was to test the viability of a competency-based, qualitative laboratory assessment model in first-year undergraduate Chemistry at UTAS. We were specifically interested in investigating the effect that a departure from quantitative assessment in the KRA114 laboratory program to a qualitative, competency-based assessment model would have on KRA114 learning outcomes and the student experience. Feedback from KRA114 students, laboratory instructors, technical staff involved with the unit, and UTAS Chemistry academic staff was used to inform and guide the development of a new competency-based assessment model underpinned by a criterion-referenced assessment (CRA) framework. By design, this more skills-focused assessment model was specifically structured with a view to enhancing the first-year UTAS Chemistry laboratory program, providing improved alignment between the KRA114 unit ILOs and the Australian Council of Deans Teaching and Learning Centre's National Chemistry Threshold Learning Outcomes (TLOs) for undergraduate university-level chemistry (Australian Learning and Teaching Council, 2011), and complying with the Royal Australian Chemical Institute's (RACI) degree accreditation requirements. The National Chemistry TLOs comprise four categories: understanding the culture of chemistry; inquiry, problem solving and critical thinking; communication; and personal and social responsibility. The expanded TLOs can be found in Appendix 2 (ESI†).

A key consideration in the development of a new KRA114 laboratory assessment model was determining the fundamental competency-based criteria that would serve as the assessment framework. By design, assessment would be based not only on the results submitted at the end of each practical session, but also on student performance in the laboratory in areas including: preparation, demonstrating competency in particular skills and techniques, bookkeeping, experimental accuracy, adherence to safe work practices, understanding of fundamental chemical principles, teamwork, and professionalism. With this in mind, eleven competency-based assessment criteria were identified (Table 2). A full breakdown of each criterion is provided in Appendix 3 (ESI†). Each experiment in the laboratory manual outlined the criteria under investigation and clearly stated the requirements for achieving the various skills-based competencies assessed in each of the experiments. A representative 2017 KRA114 laboratory experiment is included in Appendix 4 (ESI†).

By design, the KRA114 competency-based laboratory assessment criteria were aligned with both the KRA114 unit ILOs and the chemistry TLOs. Biggs (2002) articulates the need for constructive alignment and its importance in the development of a learning environment that features learning activities, with the ultimate aim of students meeting the ILOs and TLOs. Consequently, each criterion was aligned with at least one chemistry TLO and one unit-specific ILO. Through this alignment it was intended that the ILOs would be more evident to students and therefore more achievable. Table 2 provides a summary of the alignment of each criterion with both the KRA114 ILOs and the chemistry TLOs.

Table 2 Links between the eleven KRA114 competency-based laboratory assessment criteria and KRA114 unit ILOs (see Appendix 3 (ESI) for more
details) and TLOs for undergraduate university-level chemistry in Australia (see Appendix 2 (ESI) for more details)

KRA114 competency-based criteria KRA114 ILOs Chemistry TLOs


C1: Proficiency in Using Analytical Glassware 4 2
C2: Proficiency in Using Chemical Glassware 4 2
C3: Experimental Accuracy 1, 2, 4 2, 3
C4: Recording Observations 1, 2, 3, 4 1, 2, 3, 4
C5: Mastering Chemical Calculations and Equations 1, 2, 4 1, 2, 3
C6: Understanding and Applying Chemical Principles 1, 2, 3, 4, 5 1, 2, 3, 4
C7: Heating, Cooling and Isolating Materials 1, 2, 4 2
C8: Safety Awareness in a Chemical Laboratory 1, 4, 5 1, 4
C9: Efficiency and Time Management 5 4
C10: Professionalism and Preparation 5 4
C11: Collaboration and Teamwork 1, 5 4


In the new assessment structure, the laboratory program represented a hurdle requirement. Specifically, in order to be eligible to pass KRA114, students were required to attend and complete a minimum of seven of the eight laboratory classes and successfully demonstrate competency in at least ten of the eleven listed competency-based assessment criteria. Each of the eight experiments that comprise the laboratory program assesses the ability of a student across multiple criteria. Each criterion was assessed at least twice during the semester, and students were required to demonstrate proficiency in criterion 8 (safety awareness in a chemical laboratory) at all times (Table 3). It was anticipated that the development of student competency could be achieved via several approaches. From the outset of the laboratory session, the instructor would lead a brief discussion to provide an overview of the experiment and expectations. This would feature detailed instructions for any safety procedures or key experiment-specific techniques, which might include providing a physical demonstration if necessary. Throughout the laboratory session each instructor was expected to actively engage with individuals and groups to provide formative feedback on their progress. The instructor would lead group discussions where appropriate to ensure consistency of method development. Furthermore, the instructor would engage with each student individually throughout the laboratory session to discuss their performance and to identify areas for growth or improvement. This would include providing written feedback within the laboratory manual in response to the student's recorded data and conclusions.

Participants

In 2016 and 2017, 257 and 235 students completed the KRA114 laboratory program, respectively. Each year, at the end of semester, a paper-based questionnaire was used to obtain student feedback on the KRA114 laboratory program (Table 4), in addition to the general UTAS online unit evaluation survey. It was not compulsory for students to provide feedback via these two mechanisms, and all feedback was provided anonymously. The questionnaire was designed primarily as a tool to aid the evaluation of this new assessment model. The paper-based questionnaire employed a mixed-methods approach, incorporating both Likert response items (quantitative) and two free-text response items (qualitative). Anecdotal feedback was also obtained from KRA114 laboratory instructors, technical staff involved with the unit, and UTAS Chemistry academic staff in 2016 and 2017. Furthermore, unsolicited student and staff feedback was also obtained during this period. Prior to the commencement of this study, ethics approval for this research was granted by the UTAS Social Sciences Human Research Ethics Committee (Ethics Reference Number: H0016079).

Prior to the implementation of this new laboratory assessment structure, the academic staff responsible for the KRA114 laboratory program focused on mentoring and training the laboratory instructors. These instructors were either PhD students or postdoctoral research fellows working in UTAS Chemistry. Importantly, the majority of these people had prior experience as KRA114 laboratory instructors and, thus, were familiar with all eight KRA114 experiments. A training and mentoring program was developed to ensure that the laboratory instructors understood and could clearly communicate the features and nuances of this new assessment model to students, ensuring that they were suitably equipped to lead and facilitate laboratory classes.
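To make the hurdle requirement described above concrete, the sketch below expresses the pass-eligibility logic in code. It is a minimal illustration only, with hypothetical data structures and function names (labs_completed, criteria_met, safety_met_throughout); it is not the assessment system used in KRA114.

```python
# Minimal sketch of the KRA114 laboratory hurdle requirement described above:
# attend and complete at least 7 of the 8 laboratory classes, demonstrate
# competency in at least 10 of the 11 criteria, and maintain criterion 8
# (safety awareness) at all times. Data structures here are hypothetical.

TOTAL_EXPERIMENTS = 8
TOTAL_CRITERIA = 11
SAFETY_CRITERION = "C8"

def meets_laboratory_hurdle(labs_completed: int,
                            criteria_met: set,
                            safety_met_throughout: bool) -> bool:
    """Return True if a student satisfies the laboratory hurdle requirement."""
    enough_labs = labs_completed >= TOTAL_EXPERIMENTS - 1       # at least 7 of 8
    enough_criteria = len(criteria_met) >= TOTAL_CRITERIA - 1   # at least 10 of 11
    # Criterion 8 must be demonstrated at all times, so it is checked separately.
    safety_ok = safety_met_throughout and SAFETY_CRITERION in criteria_met
    return enough_labs and enough_criteria and safety_ok

# Example: a student who completed 7 labs and met every criterion except C9.
criteria = {f"C{i}" for i in range(1, 12)} - {"C9"}
print(meets_laboratory_hurdle(7, criteria, safety_met_throughout=True))  # True
```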

Table 3 Links between the eight KRA114 laboratory experiments and KRA114 competency-based laboratory assessment criteria

Competency-based assessment criteria


KRA114 laboratory experiment 1 2 3 4 5 6 7 8 9 10 11
(1) The Determination of the Kf of the Iron Thiocyanate Complex by a Spectrometric Method | | | | | | | |
(2) Ionic Equilibria: Acids, Bases and Buffers | | | | | | |
(3) Solvent Extraction | | | | | | | | |
(4) Synthesis of Aspirin | | | | | | |
(5) The Hydrolysis of Methyl Salicylate | | | | | | | |
(6) Preparation of Potassium Trisoxalatoferrate(III) | | | | | |
(7) Coordination Complexes: The Study of Metal Ions in Solution | | | | | | |
(8) The Determination of the Rate of Hydrolysis of Crystal Violet | | | | | |

Table 4 KRA114 questionnaire used to obtain KRA114 student feedback in 2016 and 2017

Q1. I found the laboratory component to be interesting?
Q2. It was clear to me what I was supposed to learn from completing the laboratory component?
Q3. Based on the feedback that I have been provided during the laboratory component, I feel that I understand the experiments and have developed my laboratory skills during the semester?
Q4. It was clear to me how the laboratory experiments would be assessed?
Q5. The way in which the laboratory component is assessed helps me learn and improve my understanding of chemistry?
Q6. I enjoyed the chemistry laboratory component this semester?
(Response options for Q1–Q6: Strongly Disagree; Disagree; Neutral; Agree; Strongly Agree)
Q7. Overall, as a learning experience, I would rate the laboratory component as. . .
(Response options for Q7: Very Poor; Poor; Average; Good; Excellent)
Q8. What aspects of the laboratory did you find most enjoyable and interesting? (Free-text response)
Q9. What skills have you acquired or improved upon completing the chemistry laboratory component? (Free-text response)


Results and discussion

Outcomes of competency-based, qualitative laboratory assessment model

The results were collected in two forms: Likert-style responses to a series of student perception questions, and two text-based questions in which students could elaborate on their thoughts on the laboratory experience. Fig. 1 provides a comparison of student feedback for those who completed the KRA114 laboratory program under the existing quantitative assessment model (2016) and those who undertook the laboratory program under the revised qualitative assessment model (2017). For simplicity, the five-point Likert scale responses have been condensed into three groups: negative (Strongly Disagree, Disagree; red), neutral (grey) and positive (Strongly Agree, Agree; green). Inspection of the five-point breakdown of the results indicated that the majority of positive and negative responses lay within the more conservative categories (Agree and Disagree, respectively). The uncondensed data can be found in Appendix 5 (ESI†). In 2016, 182 students (approximately 71% of the cohort) and, in 2017, 121 students (approximately 51%) provided feedback via the aforementioned questionnaire. No specific reasons have been identified for the 20% decrease in response rate, as the same approach to disseminating these surveys was followed in both years.

Although no formal analysis was undertaken for these responses, these data generally indicated that the implementation of the new assessment model did not negatively affect or compromise the student experience (Table 4 and Fig. 1). More specifically, these data suggested that across six of the seven questions posed, there was a noticeable improvement in the student experience (on the order of 5–15% in these cases). However, responses to question 3, relating to feedback in the laboratory, suggest that there was a slight decrease in student satisfaction as a result of the change (approximately 3%). The positive impact of the new (more transparent) competency-based assessment model on improving student satisfaction is reflected in the responses to questions 4 and 5. Furthermore, a greater percentage of students appear to have enjoyed themselves in the laboratory and described it as a better learning experience (Q1, 6 and 7).

The final two questions (Q8 and 9, Table 4) from the questionnaire arguably offer greater and more detailed insight into students' experiences and their understanding of the skills that the first-year laboratory program aimed to develop. The qualitative data were analysed using thematic analysis methodology (Miles and Huberman, 1994). Individual responses were grouped by theme and descriptive analyses were performed on these data. In this way, a broad range of themes was identified in each set of qualitative responses. The major themes (minimum of 10 responses) are depicted in Fig. 2 and 3. A more detailed description of the themes identified is provided in Appendix 6 (ESI†).

Fig. 1 Respective 2016 and 2017 KRA114 questionnaire responses to questions 1–7 (see Table 4 for questions). These data are based on responses from 182 KRA114 students (2016; top row) and 121 KRA114 students (2017; bottom row). The responses have been divided into negative (Strongly Disagree/Disagree; red), neutral (grey), and positive (Agree/Strongly Agree; green) for questions 1–6, and into negative (Very Poor/Poor; red), neutral (Average; grey), and positive (Good/Excellent; green) for question 7. All feedback was provided voluntarily and anonymously.

Fig. 2 2016 and 2017 KRA114 questionnaire responses to question 8 (see Table 4 for questions): common major themes analysis. A major theme is defined as having a minimum of 10 responses.
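As an illustration of the data handling described above (condensing five-point Likert responses into negative, neutral and positive groups and reporting each group as a percentage of respondents), a brief sketch follows. The response counts used here are invented placeholders, not the study data.

```python
# Sketch of collapsing five-point Likert responses (Q1-Q6) into the three
# groups used in Fig. 1 and expressing each group as a percentage of respondents.
# The counts below are illustrative placeholders, not the published results.
from collections import Counter

CONDENSE = {
    "Strongly Disagree": "negative", "Disagree": "negative",
    "Neutral": "neutral",
    "Agree": "positive", "Strongly Agree": "positive",
}

def condensed_percentages(responses: list) -> dict:
    """Collapse raw Likert responses and express each group as a percentage."""
    groups = Counter(CONDENSE[r] for r in responses)
    total = len(responses)
    return {g: round(100 * groups.get(g, 0) / total, 1)
            for g in ("negative", "neutral", "positive")}

# Hypothetical Q2 responses for the two cohort sizes reported above.
q2_2016 = ["Agree"] * 90 + ["Neutral"] * 50 + ["Disagree"] * 42                    # 182 respondents
q2_2017 = ["Strongly Agree"] * 20 + ["Agree"] * 70 + ["Neutral"] * 21 + ["Disagree"] * 10  # 121 respondents

for year, data in (("2016", q2_2016), ("2017", q2_2017)):
    print(year, condensed_percentages(data))
```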


A number of common major themes were shared in the respective 2016 and 2017 question 8 (Q8) data (Fig. 2). Notably, in both years, the most commonly discussed themes were visual outcomes, lab skills/hands-on, instructor, experiment-specific themes, and links to the lecture. There was, however, a relative change in the identification of these major themes from 2016 to 2017 – for example, "experiment specific" increased greatly (from approximately 10% to 25%), while "visual outcomes" as a response decreased (from approximately 20% to 10%). However, more significant differences were observed in the minor themes.

Although the manner in which question 8 was posed has a positive bias, we wanted to ensure that students commented on elements of the laboratory program that they enjoyed and found interesting (Table 4). In the past, student feedback was (understandably) directed to the negative aspects of the laboratory program. Consequently, staff had a particularly limited understanding of what was working well and which features were positively received by the student cohort.

The minor themes for the 2016 Q8 data included (in order of number of responses, greatest to least): variety in the laboratories, inquiry/exploration, getting the correct answer, extending beyond the curriculum, pre-laboratory quizzes, and information provided. The minor themes for the 2017 Q8 data included (in order of number of responses, greatest to least): working with others, understanding the content, preference for the normal grading model, variety in the laboratories, a relaxed and reduced-stress environment, the new competency assessment model, overall enjoyment, real-world laboratory experiments, the range of equipment used, continuity throughout the laboratory course, safe to make mistakes, and student ownership. Although most of these themes were identified in both 2016 and 2017, some were unique to the new competency assessment model, which suggested a change in the perceived benefits of the laboratory course. Specifically, these were the minor themes relating to a relaxed and reduced-stress environment, continuity throughout the laboratory course, safe to make mistakes, and student ownership.

A number of common major themes were shared in the respective 2016 and 2017 question 9 (Q9) data. Notably, in both years, the most commonly discussed theme was the use of techniques and/or equipment (Fig. 3). Despite techniques and equipment being the most frequently identified skill acquired, there was little specification in the 2016 Q9 data. In contrast, the 2017 Q9 data illustrated that students were much better able to identify and name the specific techniques and experience that they acquired (Fig. 3). This included skills such as recrystallization, titration, distillation, filtration and spectroscopic methods. Furthermore, safety emerged as a new major theme, having previously been a minor theme in 2016. This was often discussed with reference to specific techniques and practices.

The minor themes for the 2017 Q9 data varied greatly, with a high number of one-time responses. However, among these minor themes some notable inclusions were: time management and efficiency, accuracy and precision, planning and organization, and communication. In a recent study by Sarkar et al. (2016), 20 areas of knowledge and skills were found to be important for employability. When compared with our findings from the 2017 cohort, six of these were recognised in student responses from the first-year unit KRA114. These include: content knowledge, application of knowledge and skills, research skills, mathematical skills, team-working skills, and time management and organisational skills.

These snapshots paint a picture of an environment where the focus shifted from achieving the highest possible grades under high pressure to one where skills taught in one experiment are recognised as being utilised at later stages in the laboratory course, with reduced stress and pressure. Moreover, the ability of students to learn concepts and techniques is not constrained by the strictures of quantitative assessment, which can be inconsistent and also "stressful" when applied in an "intense" 3 hour period. This is borne out when comparing the combined responses from the laboratory questionnaire and the end-of-unit responses of the respective 2016 and 2017 KRA114 student feedback (Table 5).

Student perceptions, reflected in the end-of-unit responses, changed considerably from 2016 to 2017. The 2016 data indicated that students found the laboratory useful for developing an understanding of the chemistry content and linking this to lecture materials. Analysis of the perceived negative comments suggested that many students found the laboratory to be a high-stress environment and quantitative assessment to be detrimental to the experience. Comments also indicated that some students were frustrated by the need to rush to complete the assessable content, and they identified this as having a negative impact on their learning. In contrast, the 2017 data suggest that, although some student feedback indicated a preference to return to the traditional assessment model, a much higher number of responses praised the relaxed nature of the laboratory environment in this new assessment paradigm.

Fig. 3 2016 and 2017 KRA114 questionnaire responses to question 9 (see Table 4 for questions): common major themes analysis. A major theme is defined as having a minimum of 10 responses.
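The theme frequencies underlying Fig. 2 and 3, and the distinction drawn above between major themes (at least 10 responses) and minor themes, amount to a simple tally over the coded free-text responses. The sketch below illustrates this under stated assumptions; the theme labels and counts are invented and this is not the authors' analysis script.

```python
# Sketch of tallying coded free-text responses (Q8/Q9) and splitting themes
# into "major" (>= 10 responses) and "minor", as defined above.
# Theme labels and counts here are illustrative only.
from collections import Counter

MAJOR_THEME_THRESHOLD = 10

def split_themes(coded_responses):
    """Each response is coded as a list of theme labels; return (major, minor) tallies."""
    counts = Counter(theme for response in coded_responses for theme in response)
    major = {t: n for t, n in counts.items() if n >= MAJOR_THEME_THRESHOLD}
    minor = {t: n for t, n in counts.items() if n < MAJOR_THEME_THRESHOLD}
    return major, minor

# Hypothetical coded Q9 responses.
coded = ([["techniques/equipment", "safety"]] * 12
         + [["techniques/equipment"]] * 25
         + [["time management"]] * 4)
major, minor = split_themes(coded)
print("major:", major)   # {'techniques/equipment': 37, 'safety': 12}
print("minor:", minor)   # {'time management': 4}
```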


Table 5 Comparison of all specific KRA114 feedback from individual students regarding laboratory experiences: quantitative laboratory assessment
(2016); competency-based laboratory assessment (2017). All feedback was provided voluntarily and anonymously

2016 KRA114 Student Feedback Regarding Laboratory Experiences


2016 Positive Comments
– ‘‘I found the practicals really good – all the staff involved in the practical classes were extremely helpful and approachable and they were more
than happy to help out with any questions about the content.’’
– ‘‘The labs were very helpful for understanding some of the content.’’
– ‘‘Practical classes were definitely the most helpful aspects, tying in lecture content to practical applications of knowledge.’’
– ‘‘One of the highlights of my week was filling out the prelab, getting the lab quiz done and then enjoying the lab.’’
– ‘‘Labs, although stressful in nature (no escaping that) were very well planned.’’
2016 Negative Comments
– ‘‘The only issue I found was that in many of the lab sessions, the amount of lab work required (both practical and written) was often far too much.
For this reason, I felt that it was quite difficult to learn what was intended to be learnt from the labs since most of my lab time was spent rushing
to get everything done in time.’’
– ‘‘The practical element is a very poor assessment, yes I agree that it should be part of the criteria but it is impossible by some weird marking
scheme for anyone to get 100%. Regardless of how well you do the Prac you will only be told oh, the lab techs can do it better and thus you can’t
have all the marks. there is also great inconsistencies between the marking from each demonstrator.’’
– ‘‘I found that the labs were more of a test than a learning exercise. I found the marking in the labs inconsistent. Last semester my marker was
quite brutal compared to some of my friends who got similar marks to me in college and vice versa this semester.’’
– ‘‘pracs can be made less stressful.’’
– ‘‘The laboratories were very stressful with too much weight focused on the test, if you did poorly on the test it caused the rest of the lab to be
stressful.’’
– ‘‘Chemistry laboratories are quite intimidating, particularly for people with anxiety. I think they could be improved by making laboratory groups
smaller, and taking more time to go through the practical prior to the experiment. Additionally, laboratories could be further improved by taking
time to show us how to do calculations.’’
– ‘‘The pre-lab 5 minute quizzes are highly stressful and not very good for accurately assessing what people understand. I can’t even read the
questions properly in 5 minutes, let alone do the maths and answer them.’’
– ‘‘The laboratory sessions are quite stressful, especially the short quiz at the beginning, I found it unfair that the short quiz at the beginning of the
lab would count towards so much of your lab mark.’’

2017 KRA114 Student Feedback Regarding Laboratory Experiences


2017 Positive Comments
– ‘‘The order of the labs allowed me to carry on and practice skills I had learned in the laboratory environment.’’
– ‘‘I could actively observe and apply the knowledge acquired in lectures to understand what was occurring in the experiments.’’
– ‘‘I enjoyed not having major assessments so it was relaxed and I could get familiar with the lab.’’
– ‘‘I enjoy the fact that the practicals are not counted in the overall marks of the semester. I feel less pressure and at the same time enjoy and learn
from my mistakes.’’
– ‘‘The experiments are not assessed, therefore I’m not as stressed when doing the experiment.’’
– ‘‘I liked that the 2nd semester labs built on our knowledge from semester 1 labs.’’
– ‘‘Very relaxed environment – stress free.’’
– ‘‘Not really serious – more relaxed and it’s okay to make mistakes.’’
– ‘‘Practice of techniques without pressure of major assessment.’’
– ‘‘The criterion based assessment for the laboratories is certainly appreciated (rather than having percentage grades).’’
– ‘‘I found the lab expectations of this course to be very beneficial and has made me feel more comfortable in the labs going into the future of my
degree.’’
– ‘‘The labs where great.’’
– ‘‘Good competency based assessment of the laboratory component, which was interesting and helped understanding by applying concepts in a
tactile way.’’
– ‘‘The labs are related to the lecture content and all the marking and assessment is fair and not subjective.’’
2017 Negative Comments
– ‘‘Some of the experiments had very vague instructions and were hard to understand and follow.’’
– ‘‘Maybe labs ought to contribute to your mark like they did prior to this year.’’
– ‘‘I found all the labs interesting, however, I would rather that the labs were graded as they have been in previous years.’’
– ‘‘I did not feel as though the practicals enhanced my knowledge of lecture content. I would have much preferred a review session instead.’’
– ‘‘Relevant questions in pre labs that are examinable.’’

Furthermore, students identified an increase in skills focus and the ongoing development and refinement of the skills and techniques obtained through the laboratory course. However, this does not mean that improvements are not required; several responses from the 2017 cohort indicated that greater clarity in the instructions and the design of relevant pre-laboratory questions could enhance the laboratory experience.

The observations of the KRA114 laboratory instructors provide additional weight to the responses received from the student cohort. Without exception, all instructors, including the technical staff present in the laboratory, provided anecdotal feedback to the authors noting the improved "feeling" within the laboratory, with students being at ease with the experience and showing a genuine interest in extending or linking the skills and techniques to other experiences. Representative feedback from laboratory instructors is provided in Table 6. These comments indicated that, as a result of this new approach, laboratory instructors were able to concentrate their efforts more effectively on developing student competency rather than being preoccupied with quantitatively assessing it. Notably, instructors were concerned that students were reducing their efforts to excel within the laboratory.


Table 6 Representative Feedback from KRA114 laboratory instructors regarding the changes to KRA114 laboratory assessment

Representative Feedback from KRA114 Laboratory Instructors Regarding New Laboratory Experience
Positive Comments
– ‘‘The criteria are simple and clear. If a student has failed to meet a certain criterion it is very clear for them why, and what they must do to remedy
that in the future. This also seems to have alleviated a great deal of stress attached to the numbers in marking. Despite explaining to students
multiple times that each lab represents an incredibly tiny fraction of their overall mark, so they needn’t split hairs, students would often get visibly
distressed between earning an 83 or an 85. Have not heard the lament of ‘‘that’s not what my partner got’’ since this shift has been made.’’
Negative Comments
– ‘‘While the majority have come to lab classes keen to learn, some of the less inspired students have adopted an attitude of passing is a given and
that the criteria are seen as a minimum required therefore the maximum effort necessary.’’
– ‘‘The criteria do no set the bar particularly high and as such there is little separating of the wheat from the chaff and whether or not that is the
intent of the marking in the labs, it is disheartening to some students to have little reward in way of excelling among their peers. There is little
incentive to do anything other than reach the minimum and some students take advantage.’’
Overall Comments
– ‘‘There are a few small issues with certain students, but that will likely always be the case. I think it is a fantastic change. Moving to skills-based
model has definitely made the laboratories a more jovial and less stressful environment, which I believe is more conducive to students learning.
The relationship between instructor and student (in my opinion) has undergone a shift from assessor and nervous assesse to that of an educator
interacting students in a group learning session. . .’’

However, when placed in the context of the entire unit, numerous opportunities remain for diligent students to distinguish themselves from their peers (e.g. quizzes, tests, final exam).

Summary

As part of this study, data have been collected and analysed in order to investigate the aforementioned three core research questions. The student response data suggest that there were no significant negative changes and that, as a result of moving to a competency assessment model, there has been a shift in students' perceptions of the skills obtained through the laboratory course. With respect to question 1, the student response data and anecdotal evidence indicate that students are both identifying and acquiring these skills and techniques in the short term. Future studies will focus on determining the longer-term impact on student performance in second- and third-year chemistry units. Regarding question 2, there was a definitive change in students recognising the specific skills and techniques, coupled with the recognition that a more relaxed assessment model allows this focus. Furthermore, this feeds into question 3, as students are better able to identify the skills and techniques they have acquired, and their capacity to link these skills through multiple experiences demonstrates an improved understanding of the unit ILOs. Importantly, students found the assessment model more transparent.

Conclusions

In conclusion, the above-mentioned study aimed to address a number of issues concerning the skills and practices being developed (or not developed) by first-year chemistry students in the laboratory. The data collected and analysed have provided insight into the success of implementing a competency-based assessment model and illustrate a shift in the perceptions of students towards recognising these skills and techniques. In the future, we will continue to monitor and fine-tune this assessment model, and the longer-term effects of this study will be recorded at second- and third-year levels to determine the longevity of these outcomes for continuing students.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

The authors gratefully acknowledge Dr Luke Hunter (University of New South Wales) for helpful preliminary discussions.

References

Australian Learning and Teaching Council, (2011), Learning and Teaching Academic Standards Project (Science), retrieved 2015, from http://www.acds-tlcc.edu.au/wp-content/uploads/sites/14/2015/02/altc_standards_SCIENCE_240811_v3_final.pdf.
Bailey R. and Garner M., (2010), Is the feedback in higher education assessment worth the paper it is written on? Teachers' reflections on their practices, Teach. High. Educ., 15(2), 187–198.
Baranyai and Prinsley, (2015), STEM skills in the workforce: What do employers want? Occasional Paper Series, Issue 9, Office of the Chief Scientist, Australian Government, Canberra.
Bennett S. W. and O'Neale K., (1998), Skills development and practical work in chemistry, Univ. Chem. Educ., 2, 58–62.
Biggs J. B., (2002), 'Aligning the curriculum to promote good learning', constructive alignment in action: Imaginative curriculum symposium, LTSN Generic Centre.
Biggs J. B. and Tang C., (2011), Teaching for quality learning at university, retrieved from http://www.eblib.com.
Black D. P. and Wiliam D., (2010), "Kappan classic": inside the black box: raising standards through classroom assessment, Phi Delta Kappan, 92(1), 81–90.
Bretz S. L., (2012), Navigating the Landscape of Assessment, J. Chem. Educ., 89(6), 689–691.
DeKorver B. K. and Towns M. H., (2015), General Chemistry Students' Goals for Chemistry Laboratory Coursework, J. Chem. Educ., 92(12), 2031–2037.
DeMeo S., (2001), Teaching chemical technique, a review of the literature, J. Chem. Educ., 78, 373–379.


Erdmann M. A. and March J. L., (2014), Video reports as a novel alternative assessment in the undergraduate chemistry laboratory, Chem. Educ. Res. Pract., 15, 650–657.
Foukaridis G. N. and McFarlane L. R., (1988), Competency-based training for chemists, J. Chem. Educ., 65(12), 1057–1059.
Garnet P. J. and Garnet P. J., (1995), Refocusing the chemistry lab: a case for laboratory-based investigations, Aust. Sci. Teach. J., 41(2), 26–39.
Ghani I. B. A., Ibrahim N. H., Yahaya N. A. and Surif J., (2017), Enhancing students' HOTS in laboratory educational activity by using concept map as an alternative assessment tool, Chem. Educ. Res. Pract., 18, 849–874.
Hagen J. P. and Gragson D. E., (2010), Developing technical writing skills in the physical chemistry laboratory: a progressive approach employing peer review, J. Chem. Educ., 87(1), 62–65.
Hegarty-Hazel E., (1990), The student laboratory and the science curriculum, London: Routledge.
Hofstein A. and Lunetta V. N., (1982), The laboratory in science teaching: neglected aspects of research, Rev. Educ. Res., 52, 201–217.
Hofstein A. and Lunetta V. N., (2004), The laboratory in science education: foundation for the 21st century, Sci. Educ., 88, 28–54.
Kirton S. B., Al-Ahmad A. and Fergus S., (2014), Using Structured Chemistry Examinations (SChemEs) as an assessment method to improve undergraduate students' generic, practical and laboratory-based skills, J. Chem. Educ., 91, 648–654.
Lagowski J. J., (2000), Lessons for the 21st century: 1999 James Flack Norris Award, sponsored by ACS Northeast Section, J. Chem. Educ., 77, 818–823.
McInerney D. M. and McInerney V., (2010), Educational psychology: constructing learning, 5th edn, Frenchs Forest, NSW: Pearson Australia.
Miles M. B. and Huberman A. M., (1994), Qualitative data analysis: an expanded sourcebook, 2nd edn, Thousand Oaks, CA: Sage.
Moore J. W., (2006), Let's go for an A in lab, J. Chem. Educ., 83(4), 519.
National Academy of Science, (1996), From analysis to action: Undergraduate education in science, mathematics, engineering and technology, Washington, DC: The National Academy Press, pp. 1–10.
Pullen R., (2016), An evaluation and redevelopment of current laboratory practices: an in-depth study into the differences between learning and teaching styles, (PhD thesis), retrieved from UTAS Open Repository at http://eprints.utas.edu.au/23475/.
Reid N. and Shah I., (2007), The role of laboratory work in university chemistry, Chem. Educ. Res. Pract., 8(2), 172–185.
Sarkar M., Overton T., Thompson C. and Rayner G., (2016), Graduate employability: views of recent science graduates and employers, Int. J. Innov. Sci. Math. Educ., 24(3), 31–48.
Seery M. K., Agustian H. Y., Doidge E. D., Kucharski M. M., O'Connor H. M. and Price A., (2017), Developing laboratory skills by incorporating peer-review and digital badges, Chem. Educ. Res. Pract., 18, 403–419.
Tierney J., Bodek M., Fredricks S., Dudkini E. and Kistler K., (2014), Using web-based video as an assessment tool for student performance in organic chemistry, J. Chem. Educ., 91(7), 982–986.
Towns M., Harwood C. J., Robertshaw M. B., Fish J. and O'Shea K., (2015), The digital pipetting badge: a method to improve student hands-on laboratory skills, J. Chem. Educ., 92(12), 2038–2044.
Towns M., Hensiek S., DeKorver B. K., Harwood C. J., Fish J. and O'Shea K., (2016), Improving and assessing student hands-on laboratory skills through digital badging, J. Chem. Educ., 93(11), 1847–1854.
University of Tasmania, (2011), Guidelines for good assessment practice, Rev. edn, Hobart, Australia, retrieved from www.teaching-learning.utas.edu.au/_data/assets/pdf_file/0004/158674/GAG_v16_webversion.pdf.
Wenzel T. J., (2007), Evaluation tools to guide students' peer-assessment and self-assessment in group activities for the lab and classroom, J. Chem. Educ., 84(1), 182–186.
