PII: S1871-1871(16)30159-6
DOI: http://dx.doi.org/10.1016/j.tsc.2017.05.005
Reference: TSC 440
Please cite this article as: Cargas, Sarita, Williams, Sheri, & Rosenberg, Martina, An Approach to Teaching Critical Thinking Across Disciplines Using Performance Tasks with a Common Rubric. Thinking Skills and Creativity. http://dx.doi.org/10.1016/j.tsc.2017.05.005
This is a PDF file of an unedited manuscript that has been accepted for publication.
As a service to our customers we are providing this early version of the manuscript.
The manuscript will undergo copyediting, typesetting, and review of the resulting proof
before it is published in its final form. Please note that during the production process
errors may be discovered which could affect the content, and all legal disclaimers that
apply to the journal pertain.
An Approach to Teaching Critical Thinking Across Disciplines Using Performance Tasks with a Common Rubric
First Author
Sarita Cargas1, D.Phil., Assistant Professor
Affiliation: 1Honors College, University of New Mexico
Address: Honors College 2E, MSC 06 3890, 1 University of New Mexico, Albuquerque,
NM 87131-0001, USA
cargas@unm.edu
FAX: 505-277-4271
Phone: 505-277-7405
Third Author
Martina Rosenberg3, Ph.D., Assistant Professor
Affiliation: 3Health Sciences Center, Department of Biochemistry & Molecular Biology,
University of New Mexico
Address: BRF 223 J, MSC 08 4670, 1 University of New Mexico, Albuquerque, NM
87131-0001, USA
MRosenberg@salud.unm.edu
FAX: 505-272-6587
Phone: 505-272-6778
Highlights
- An approach to teaching critical thinking across disciplines using performance tasks with a common rubric is proposed.
- Performance tasks can be designed and explicitly taught using a common rubric with corrective feedback, aiding both the development and assessment of critical thinking.
- Performance tasks with a common rubric raise student and instructor awareness of the tools and practices involved in critical thinking.
Abstract
While there is a large body of literature discussing critical thinking in higher education, there is a less substantial body of scholarship exploring methods for teaching it. Several tests are used nationally to assess critical thinking. Rather than just assessing critical thinking, we explored the use of performance tasks with a common rubric as a way of raising student and instructor awareness of the tools and practices involved in critical thinking. In this exploratory study, faculty in three different fields, Teacher Education, Social Sciences, and Life Sciences, designed performance tasks in a problem-based learning environment aligned to the skills of critical thinking. Although the tasks differed for each cohort, they were structured similarly and explicitly taught using a common rubric with corrective feedback, aiding both the development and assessment of critical thinking. Students in some cohorts showed improvement. Qualitative results tended to confirm that using performance tasks with corrective feedback on a common rubric may be useful in many fields. We further suggest that regular use of performance tasks in a problem-based learning environment can contribute to the transferability of critical thinking skills and dispositions.
1. Introduction
Of the prevailing views about the teaching of transferable skills today, one reflects positively on colleges and universities and another negatively. On the up side of the scale, faculty and administrators agree that a central function of pedagogy is to teach critical thinking (Arum & Roksa, 2010; Bailin & Siegel, 2003; Bok, 2006). Tilting the scale down, data show higher education is in general not successfully teaching critical thinking at the tertiary level (Gabennesch, 2006; Halx & Reybold, 2005; Pascarella & Terenzini, 2005; Snyder & Snyder, 2008; Stedman & Adams, 2012; Willingham, 2008). Other studies indicate faculty members are not sure what critical thinking is, nor are they able to define it, complicating attempts to investigate and ameliorate the discrepancy (Halx & Reybold, 2005; Stedman & Adams, 2012).
There are several instruments measuring critical thinking currently in national use, among them the Critical Thinking Assessment Test and the Collegiate Learning
Assessment. We wanted to know if it was possible to use one of the tests as a model for
designing learning experiences that would teach students critical thinking in almost any
discipline, and at the same time help faculty become more skilled at facilitating critical
thinking. We chose the Critical Thinking Assessment Test (CAT) for several reasons.
First, the CAT is validated for assessing critical thinking and creativity generically, i.e., independent of a particular discipline. Second, the CAT does not rely on self-report and thus avoids the reliability problems with self-reported data (Boud & Falchikov, 1989; Ross, 2006). Lastly, the CAT itself models a
performance task such that the assessment of defined skill sets could be integrated
without being perceived by the learner or the instructor as unrelated to the course. We
conjectured that if a formulaic way of creating performance tasks could be developed that
would teach students the language and skills of critical thinking, then it also could be
useful for promoting the disposition of critical thinking among students in the higher
education setting.
In the remainder of this paper, we concur with the urgent need to improve the pedagogy of critical thinking,
present a discussion of how it is defined, and review the data about selected instructional
practices that do and do not work. This includes an examination of the arguments for and
against teaching critical thinking in disciplinary contexts; that is, there is a debate about
whether critical thinking should be taught in stand-alone courses or if the skills should be embedded within disciplinary courses. We then present examples of performance tasks we created and suggest a generic design that can be used
in many fields, thereby demonstrating that teaching critical thinking can be grounded in disciplinary content while using a common rubric, at the same time illuminating the deeper
structure and fostering the transferability of the skills. In the study section of our paper,
we describe how we designed our common rubric and intervention for critical thinking
and share the results of our research. Finally, we present recommendations for further
exploration.
Daniel Kahneman, who won a 2002 Nobel Prize for his work on decision making
and judgment, provides evidence that critical thinking by any definition is not something that comes naturally; our default thought processes are driven more by the qualities of speed and cognitive laziness. We
rely on “biases and heuristics” for the sake of fast decisions, which provide shortcuts in
thinking (p. 8). Common biases, which Kahneman emphasizes no one can escape,
include: “the halo effect” as when we like or dislike everything about an idea or person
based on limited information; “the mere exposure effect” when we like or believe
something because we have been exposed to it frequently; “the confirmation bias” or the
tendency to only recognize information that we agree with rather than seeking out
additional information; and the “affect heuristic,” which is relying on emotions rather than evidence when making a decision. These are only a very few of the stumbling blocks to complex thinking.
To counteract this trap evolution set up for us, Kahneman argues we must train
ourselves to recognize a situation that is likely to create implicit bias, intentionally slow
down and engage our deliberative system. Translated into critical thinking pedagogy,
students must be taught the habits of detecting and preventing these biases by learning to recognize them and becoming conscious of their own thinking. This conclusion is supported by
others who maintain that critical thinking is not a skill that is developed as a byproduct of
merely maturing (Walsh & Paul, 1986), that it is challenging and difficult to do (Dweck,
2002; Kuhn, 2000; Marin & Halpern, 2010) and that all around us there “are powerful
messages that confound efforts to think critically” (Marin & Halpern, 2010, p. 3).
Examples of the latter point include advertising that appeals to our emotions and sound bites that substitute for evidence.
It is not surprising the research also reveals that just being educated at any level
does not necessarily impart the ability to think critically. Scholars have speculated that
unless critical thinking is being taught explicitly it is not being taught effectively, even
though faculty may assume it is. Marin and Halpern (2010) found this to be true for
primary and secondary education, i.e., critical thinking was not being taught explicitly in those classrooms. In several fields, such as nursing, Staib (2003) finds little evidence that it is actually being taught in nursing programs. These findings are consistent with those suggesting that many faculty cannot articulate what critical thinking is (Stedman & Adams, 2012, p. 9) and that “few are prepared to teach critical thinking.”
Various scholars and stakeholders use the word urgency when calling for more
teaching of critical thinking: “Educators are not alone in their concern about the urgency” of improving it (Abrami, Bernard, Borokhovski, Wade, Surkes, Tamim, & Zhang, 2008, p. 1103). Abrami et al. cite Tsui’s (2002) finding that in the United States, “a national survey of employers, policymakers, and educators found consensus that the dispositional as well as the skills dimension of critical thinking should” be taught. Attention to critical thinking is especially warranted at a time when the quality of university education is being closely scrutinized (Arum & Roksa, 2010; Fethke & Policano, 2012). All this
points to the growing national recognition of the urgency of teaching critical thinking in
academia.
Given these findings on a national scale, we argue there is an urgent need to develop a manageable method for
faculty to teach critical thinking. We will demonstrate that it is possible to use problems
germane to one’s discipline to promote students’ critical thinking. Although we did not
assess faculty improvement in critical thinking, it is worth noting that our method
requires instructors to become more familiar with their own understanding and
dispositions toward critical thinking. Again, research demonstrates that faculty are not
sure what critical thinking is and are not teaching it explicitly, which means they may not
be teaching it at all.
1.1.3. Definitions of critical thinking
Teaching critical thinking requires that we first understand its characteristics. We find that scholars who study critical thinking utilize a
range of definitions to describe it. Most contain overlapping elements and a few add
unique ones. Some descriptions focus on the skills involved in engaging in critical
thinking, some describe what having a disposition toward critical thinking consists of,
whereas others contain both. For example, Brookfield (2012) defines critical thinking in
terms of these skills: “identifying assumptions, checking out the validity of them,
examining different perspectives, and then making informed decisions” (p. 1). Halpern
(1998) also focuses on skills when she describes the activity as involving “solving
problems, formulating inferences, calculating likelihoods, and making decisions” (p. 450). In addition to problem solving, Bok (2006) includes the element of reflectiveness that is also found in
many definitions. Paul claims critical thinking is “thinking about your thinking, while
you’re thinking, in order to make your thinking better” (in Nosich, 2009, p. 2). Ennis defines it as “reasonable, reflective thinking that is focused on what to believe or do” (1989, p. 4).
Less often in the definitions we find the requirement to see “both sides of an issue” and
“being open to new evidence that disconfirms your ideas” (Willingham, 2008, p. 21).
One educator of social justice goes so far as to say critical thinking must include a focus on issues of justice (p. 106).
None of these definitions is confined to a particular discipline. All of them
assert that critical thinking is a general skill and/or disposition. However, some
disciplines adapt these definitions. Staib (2003) states, “some might think the scientific
method sums up critical thinking for the sciences” (p. 499). Likewise, Marin and
Halpern (2010) write that the goals of scientific literacy as stated by the National
Academy of Sciences; i.e., “observational and interpretive skills and making and
evaluating evidence” are similar to definitions of critical thinking (p. 12). Staib (2003)
also explains that an international committee of nurse experts took the American Philosophical Association (APA, 1990) consensus definition and did its own study to determine whether they agreed with it. They accepted most of the APA criteria, but amended the disposition aspect to include “creativity and intuition” (p. 499).
1.1.4. Arguments for and against teaching critical thinking in disciplinary contexts
Although definitions of critical thinking are not unique to a single discipline, there
is some debate about whether or not critical thinking can be taught in a general course
focused only on critical thinking and independent of the discipline. Although countless
undergraduate critical thinking classes are taught every year, a preponderance of scholars
argue against such stand-alone courses because they find critical thinking is more
successfully taught within the context of discipline specific knowledge (Bailin & Siegel,
2003; Hatton & Smith, 1995; Pithers & Soden, 2000). However, Royalty (1991) and Sa,
Stanovich, and West (1999) defend the need for specific critical thinking courses where
learning critical thinking is the main outcome. In their meta-analysis, Abrami et al. use four categories of instructional approach: general, infusion, immersion, and mixed. In the general course, critical thinking skills and dispositions are learning objectives without specific subject matter content; in both the infusion and immersion approaches, critical thinking is taught within subject matter instruction, but its general principles are made explicit in the infusion course and not in the immersion course. In the mixed approach, critical thinking is taught as an independent track within a subject-specific course.
Based on their analysis of over 150 studies, the authors found that the mixed approach of
teaching specific skills of critical thinking, along with applications to problems and
concepts of practice, had the largest effect on student learning. This lends support to our own mixed approach.
Before we present our approach for teaching critical thinking, we review the
practices for teaching critical thinking that have been studied in the literature. In 2012,
Shim and Walczak reported that there was not a great deal of research on “specific
instructor-driven instructional practices that affect students’ critical thinking” (p. 16).
They report on two practices as effective, namely asking “challenging questions” in class
and requiring that students “integrate” ideas either through assignments that compare and
contrast or examine multiple perspectives (p. 24). Tsui (1999) also finds something
similar in that writing involving analysis, more than description, facilitates critical
thinking. And Browne and Freeman (2000) concur that asking frequent evaluative
questions in class aids critical thinking. Engagement and active learning are also cited in
the literature (Browne & Freeman, 2000; Snyder & Snyder, 2008; Gottesman & Hoskins,
2013). The latter present evidence that these practices are more effective than lectures
and memorization. The more active pedagogies are supported in an extensive meta-analysis by Abrami, Bernard, Borokhovski, Waddington, Wade, and Persson (2015). The authors present data on 341 effect sizes
from studies “that used standardized measures of critical thinking as outcome variables”
(p. 275). The studies they included were published between 1930 and 2009, with the
preponderance appearing post 2000, “i.e., 1990–1999, 26.6%; 2000–2009, 44.6%, for a
combination of more than 70%” (p. 292). They conclude that critical thinking disposition
and skills are facilitated by three kinds of pedagogy: various kinds of dialogue, using
authentic problems and examples, and mentoring. In discussing dialogue, the authors
include critical dialogue, debates, whole class and small group discussions. Authentic
problems and examples are defined as students addressing problems, including role-playing; mentoring includes one-to-one dyads and internships. Other scholars also find moderate to high effect sizes for feedback practices associated with improvement of critical thinking (Hattie, 2007; Huba & Freed, 2000; Marzano, Pickering
& Pollock, 2001). Black and Wiliam cite studies that show “firm evidence that
innovations designed to strengthen the frequent feedback that students receive about their
learning yield substantial learning gains” (1998, p. 1). In addition, Shiel reports on the practice of developing assessment-capable learners, which has an effect size of 1.44 when the average effect size across interventions is 0.40.
Although content knowledge is discipline dependent, we concur with van Gelder (2005) that critical thinking skills are
transferable; that is, if students can develop a disposition for critical thinking, the
disposition will facilitate transfers across subject domains when students develop an
orientation of applying critical thinking to their work. Also, if performance tasks based on a common rubric are used regularly, such interventions might indeed help students comprehend the underlying structure of
critical thinking and promote more frequent and deliberate use. This has been shown to
be essential for transferability (Bailin, Case, Coombs, & Daniels, 1999; Lang, 2016; van
Gelder, 2005).
Researchers have likewise urged educators to “promote higher-order thinking abilities within specific contexts” (2007, p. 37). Thus, we emphasize that our intervention, the creation of performance tasks with feedback, is grounded in this literature. Because it has been established that there is a paucity of critical thinking being taught, providing faculty a useful and manageable method for teaching the skill has the potential to make a real difference.
Two essential questions motivated this exploratory study: (1) can we create a problem-based learning environment using performance task interventions based on a common rubric that would improve students’ critical thinking skills and dispositions? And (2) can a common intervention approach focused on critical thinking skills and dispositions be used across disciplines?
To investigate the research questions in our exploratory study, both quantitative
and qualitative measures were used. Data were collected using the CAT instrument at the beginning and end of the semester. A questionnaire was administered near the completion of the semester to assess student perceptions of their
mastery and use of critical thinking skills as a result of their classwork. Students’ written
reflections were analyzed to determine student perceptions of the quality of their own
thinking skills and strategies for solving the performance tasks. Instructors were queried
on their reflections of the use of performance tasks and feedback to improve critical thinking.
The study was conducted at a university in the American Southwest in spring 2015. Classes represented three different scholarly
fields with their own disciplinary culture. Participants included 28 students in two
Teacher Education classes, 28 students in two Social Sciences classes, and seven students
in one Life Sciences class. Data collection for this study was primarily conducted during
class time, with the exception that Life Sciences students were asked for responses to the
questionnaire and reflection on their critical thinking skills outside of class. This
additional student effort may explain the lower participation rate in Life Sciences.
Demographics were collected from all participants in the study. Most were
juniors and seniors (66.6%). The majority were female (78.1%) and 18-20 years of age
(54.7%). A majority reported their race as White (78.1%) and 39.1% identified their
primary language. The sample was atypical of the student population at the institution;
i.e., women and White students were overrepresented in the study. The study sample
was 78.1% female while the institution was 55% female, and the study sample was
78.1% White while the institution was 46.4% White. This disproportionality may be
partly explained by the lack of diverse candidates in the Social Sciences and Teacher
Education cohorts.
Two undergraduate Teacher Education classes at the 300- and 400-level were
included in the study. Both classes were part of the core curriculum for teacher licensure.
The classes met face-to-face in seminar rooms where instructors facilitated active learning. One class addressed issues of literacy development across the core content areas in the secondary school. The other class addressed the study of oral and written forms of language.
One class was a 200-level Social Science general education course based in the
Honors College. The classes were small and taught seminar style and employed many
active learning strategies. The focus of this particular class was on defining the
interrelationship of globalization and human rights, and then using world hunger as a case
study involving both topics. The assessments included low- and high-stakes writing
assignments. The other was a 300-level class focused on an examination of critical issues.
This upper-level Biochemistry class was the second part of a two-semester series taken by students, most of whom planned to attend medical or other professional schools in the health sciences. The course structure relied on a student-centered format that included:
pre-class reading and low-stakes assignments, in-class peer learning, and application-level problems.
2.3. Procedures
The procedures included performance tasks for use across the three disciplines in a problem-based learning environment (2.3.1), measures of students’ use of critical thinking skills (2.3.4), and student and instructor reflections (2.3.5). We
first adapted a common rubric based on elements of critical thinking and then developed
performance tasks for each discipline. Thus, all faculty used the common rubric while the
The common rubric employed in this study, as shown in Appendix A, was based on the work of Mitchell, Anderson, Sensibaugh, & Osgood (2011) and informed by the work of Andrade (2000). Our
adaptation of the rubric guided instructors in the design of performance tasks, provided
explicit expectations for students, and allowed instructors to apply the rubric to student
work across the disciplines. The criteria in the rubric were made explicit to the students
and served several important pedagogical purposes, namely to articulate the expected
learning outcomes of the performance tasks, to spell out the criteria and indicators of quality in accord with the rating labels, and to help instructors identify student weaknesses and strengths. Discussing the rubric with students before they worked on the performance task, and sharing the results of applying the rubric to their answers, was an essential part of the creation of the learning
environment. It is in these discussions that the elements of critical thinking were made
explicit, a condition necessary for learning to take place. The discussions also provided group
reflection on any issues that arose in the utilization of the skills. The problem that was the
topic of each performance task was part of the content of the discussions as well.
Each performance task included source documents and a set of higher-order questions for problem resolution, aligned with the rubric. Students worked toward a solution, created a written product, or presented their results for peer and instructor review. From this perspective, performance tasks differed from routine lessons in that they were extended, authentic activities appropriate across the disciplines (Marzano, Pickering & McTighe, 1993). As McTighe and Ferrara (1998) explain, extended performance tasks serve as “an assessment activity… that elicits one or more responses to a question or problem.”
The goal of the performance tasks was to provide students with (1) practice in
using the skills of critical thinking; i.e., making claims or hypothesizing, analyzing,
evaluating, and integrating or synthesizing, and (2) assessment of these skills scored on a
common rubric to provide actionable feedback to students. We chose these skills because
they are common to the definitions of critical thinking in the literature and recognized as
essential to problem solving. Thus, they are the success criteria in our common rubric.
Students also reflected on their work, a practice long associated with developing critical thinking skills and dispositions (Dewey, 1933; Hatton & Smith, 1995; Schön, 1991).
One task for the elementary teacher education class asked students to explore a controversial issue regarding arguments for and against grade-level retention. Sources included research articles, editorials, and commentaries. Similar to the problem-based learning process defined
by Wood (2003), students met in small groups to identify conflicting values in the
documents and to create new claims in response to the arguments. Students produced a written response to the issue. The task for the secondary teacher education class involved an extensive critical analysis of sources culminating in a defensible and informed annotated bibliography. Students were expected to redefine or refine their views of literacy learning, and make connections to the larger context of effective literacy practice. Students sorted relevant data from irrelevant distractors, and then selected the most relevant information to
make a claim about gaps in current teaching practices. Students constructed new
approaches utilizing evidence from multiple texts to support their reasoning and submitted their work for instructor and peer review. Another task for the elementary teacher education class
was designed using the same genres of sources but the topic was identifying evidence-
based best practices in teaching language acquisition in the elementary grades and
incorporating the information into a personal knowledge base for effective instruction.
Feedback was provided to the students on the extent to which they used critical thinking skills.
We designed the first performance task for Social Sciences on the safety of genetically modified organisms (GMOs). Sources included a peer-reviewed journal article, an article from a popular magazine, an article from a self-proclaimed specialist on the Internet, and excerpts from several book chapters which
reviewed the same experiment found in the Internet article but with different conclusions.
Thus, with a total of less than twenty pages of text, students were exposed to different
genres of sources, with differing qualities of evidence, and with stronger and weaker
arguments. In writing, students were first asked to state each author’s main claims,
then analyze the evidence provided for those claims, and then evaluate the conclusion.
Students were asked to synthesize the findings in the combined readings and draw a
conclusion about the safety of GMOs. We then used the common rubric to assess and
provide feedback to the students on their work. Finally, students were asked to reflect
about their use of critical thinking after completing the performance task.
The second performance task for the Social Science classes was designed using
the same genres of sources but the topic was the health claims of the Paleo Diet.
The performance tasks for students in the Life Sciences class used the same rubric
for assessing student responses but the topic of the performance task, while complex, was
not a controversial issue. Instead, students reviewed a clinical case study, created hypotheses, integrated the findings, and reflected on the skills/thought processes needed to complete
the case. The task used in the Life Sciences class was most closely modeled to the work
of Mitchell, Anderson, Sensibaugh, & Osgood (2011), who emulated procedural skills
and the scientific method. In an interrupted case-study format, students were asked to design an experiment (identifying dependent and independent variables, constant factors, and controls) for a given hypothesis, evaluate given results in the form of quantitative data, and integrate that information to give the most likely explanation of the scenario. The context of the
first task was related to the efficacy of plant-derived medicines; the second task used a different context.
Critical thinking skills were measured using the Critical Thinking Assessment
Test (Stein, Haynes, & Redding, 2016) as a pre- and post-test. The secure test consisted
of 15 questions, mostly short essay responses to topics unrelated to the course content,
which took students about one hour to complete. As mentioned in the introduction, we
chose the CAT notably for validity, reliability, cultural fairness, and authenticity. The
instrument showed no floor or ceiling effect and included a range of critical thinking skills.
An all-day scoring workshop was conducted with ten faculty members at the
conclusion of the study. Two members of the research team were trained in scoring by
the CAT proprietors; they in turn trained the volunteer-faculty graders from the three
fields of study. Each question was scored by a minimum of two scorers, and a check of inter-rater reliability found that none of the scored questions had error rates that could invalidate the results. Test-retest reliability of the CAT 4.0 version, which we used, was > 0.80. Refinements in the test and the scoring guide have yielded scoring reliability of 0.92 between the first and second scoring.
For the data analysis procedure, we compared student pre-post tests using a paired
t-test. Rank correlation was used to determine if there was a relationship between
questions on the CAT instrument and the participants’ discipline, scores on college admissions tests, and grade-level standing. To determine whether variables in the study were predictive of other variables, the following comparisons were
analyzed: a) all 15 CAT pre-post test questions to each other, b) the 15 CAT pre-post test
questions grouped by discipline, and c) the total CAT pre-post test results by standardized
tests widely used for college admissions in the United States (ACT and SAT), and grade-level standing.
We used two tests to perform the data analysis: ANOVA for group comparisons
and Spearman’s rho for correlations among ordinal variables. Spearman’s rho was chosen because it is a nonparametric measure of rank correlation; as such, it was deemed appropriate for assessing the relationship between ordinal variables in the study. Variables were reduced using stepwise selection, and pairwise comparisons were made for the variables that remained.
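As an illustration only (this is not the authors’ code, and all data values here are hypothetical), the two core comparisons described above, a paired t-test on pre/post scores and Spearman’s rho between ordinal question scores, can be sketched in Python with SciPy:

```python
# Illustrative sketch with hypothetical data: a paired t-test on pre/post
# totals for the same students, and Spearman's rho between two ordinal
# question scores, as described in the analysis above.
from scipy import stats

pre_totals = [18, 22, 15, 25, 20, 17, 23, 19]   # hypothetical pre-test totals
post_totals = [21, 24, 16, 27, 22, 19, 26, 20]  # same students, post-test

# Paired t-test: did scores change from pre to post for the same students?
t_stat, p_value = stats.ttest_rel(pre_totals, post_totals)

# Spearman's rho: rank correlation between two ordinal question scores
q13 = [2, 3, 1, 4, 3, 2, 4, 3]  # hypothetical evaluation-question scores
q14 = [1, 3, 2, 4, 3, 2, 4, 2]
rho, rho_p = stats.spearmanr(q13, q14)

print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"Spearman rho = {rho:.2f}, p = {rho_p:.3f}")
```

Spearman’s rho is appropriate here because it operates on ranks rather than raw values, matching the ordinal nature of rubric-scored questions.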
A questionnaire was used to gather student perceptions on how critical thinking was understood and used during the semester. At
the end of the semester, participants were asked to assess their level of confidence in
using the skills of critical thinking to solve problems (Subset 1), their mastery and use of
critical thinking skills (Subset 2), and the degree to which instructor feedback was helpful
in improving their critical thinking skills (Subset 3). Students rated the items on a 5-point
Likert rating scale from strongly disagree to strongly agree. We chose these values to
yield the greatest consistency from respondents (Bartram & Yielding, 1973). Responses
were analyzed using simple descriptive statistics for counts of ordered categorical data; numeric values were assigned to the categories in order to compute means. The questionnaire was administered in
spring 2015 with all participants in the study on paper in class, or online in Life Sciences.
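A minimal sketch (with hypothetical responses, not study data) of the descriptive treatment described above, tallying 5-point Likert categories and assigning numeric codes to compute an item mean:

```python
# Illustrative sketch with hypothetical data: descriptive statistics for
# 5-point Likert responses, with numeric codes assigned to compute a mean.
from collections import Counter
from statistics import mean

SCALE = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
         "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "neutral", "agree", "agree",
             "disagree", "strongly agree"]  # one questionnaire item

counts = Counter(responses)            # counts of ordered categories
coded = [SCALE[r] for r in responses]  # numeric coding of the categories
item_mean = mean(coded)

print(counts)
print(f"item mean = {item_mean:.2f}")
```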
Table 1 displays the questions for each of the indicators of the disposition toward critical thinking.
A reflective assignment was used to explore the degree to which students were disposed toward the critical thinking skills of analysis, evaluation, and synthesis. On the reflective assignment, students were asked to
critically examine their skill level in completing the performance tasks and to identify
areas for growth. Students reflected on and evaluated the quality of their own thinking
skills and strategies for solving the problems in the performance tasks.
The instructors who accepted our invitation to use their classes for the study
implemented the performance tasks and provided feedback to students in the five classes
across the disciplines. We worked with the instructors to create the performance tasks,
conducting open-ended conversations with the instructors lasting about an hour each. We
looked for general patterns in how the instructors perceived the value of the performance
tasks and scoring rubric. We used open coding to create categories that inductively
emerged from our informal observations (Creswell, 2007). Knowledge from the review
of the literature allowed us to see patterns in the observations, situated in the context of prior research.
3. Results
We present the results by taking each research question in turn, i.e., (1) can we create a problem-based learning
environment using performance task interventions based on a common rubric that would
improve students’ critical thinking skills and dispositions? (3.1.). And (2) can a common
intervention approach focused on critical thinking skills and dispositions be used across
disciplines? (3.2.).
3.1. Research question 1
CAT questions involving the evaluation of information (Q13 and Q14) accounted for most of the variability in the total CAT
scores across disciplines, indicating tasks that prompt interpretation and evaluation-level
thinking are powerful methods for improving the critical thinking skills of students in many disciplines. Correlations were found between the responses on evaluation-level thinking in the CAT pre-post test questions. A significant relationship was found between the pre-test questions on evaluative thinking (Q13 and Q14, where rs [63] = 0.67, p < .05) and between the post-test questions (Q13 and Q14, where rs [63] = 0.70, p < .05). These pairings in the dataset were
found to be similar to the clusters posited by the designers of the CAT for strengths in evaluation.
When grouped by discipline, there was no evidence to suggest overall significant differences. However, a statistically significant relationship was found among the Social Sciences students on the evaluation questions (rs [28] = 0.8, p < .05) and on the total questions (rs [28] = .75, p < .05). In the Life Sciences cohort, significance was shown on the evaluation questions (Q13 and Q14) and by all questions (rs [7] = .87, p < .05). In the Teacher
Education cohort, Q13 on evaluation in the post-test showed significance (rs [29] = 0.89,
p < .05).
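For readers who wish to run this kind of analysis on their own data, the rs statistic reported above (Spearman's rank correlation) can be computed in a few lines of standard Python. The scores below are invented for illustration only; they are not the study's data, and the function is a minimal sketch rather than a substitute for a statistics package:

```python
from statistics import mean

def ranks(xs):
    # Assign 1-based ranks, averaging the rank over tied values.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average position of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho is the Pearson correlation of the rank vectors.
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical pre- and post-test item scores (not the study's data)
pre_scores = [3, 5, 2, 4, 6, 1, 5]
post_scores = [4, 6, 2, 5, 6, 2, 5]
print(round(spearman(pre_scores, post_scores), 3))
```

In practice, `scipy.stats.spearmanr` also returns the p-value alongside the coefficient; the hand-rolled version above is shown only to make the rank-then-correlate computation explicit.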
When CAT test results were compared to recognized college admissions tests, namely
the ACT and SAT, the scores correlated with each other; however, the SAT had a stronger
correlation to the pre-test questions (rs [63] = 0.839, p < .05) and no correlation to
the post-test questions. The designers of the CAT instrument report similar results;
i.e., ACT and SAT scores are normally distributed, and pre-college SAT contributes
minimally to post-test performance. The development of critical thinking among
upper-class students could have been influenced by several factors unrelated to the
study, such as grade-level standing, maturity, or the courses taken.
3.2. Research question 2
To address the second research question, we examined students’ perceived mastery of
critical thinking skills. Following the example documented by Koon and Murray (1995),
we used an end-of-study questionnaire to capture students’ perceptions of their
understanding and use of critical thinking skills. Our justification for using the
single questionnaire at the end of the study was supported in part by the work of
O’Connell and Dickinson (1993), who found student perceptions added value to the
assessment of learning beyond conventional pre- and post-test results.
The questionnaire addressed indicators of students’ use of critical thinking skills,
mastery, and feedback. Results indicate the data points tended to be very close to the
mean, with few differences and little spread. Despite posting the lowest performance on
the CAT instrument, students in one cohort appeared more confident in their mastery of
critical thinking skills and in the value of instructor feedback than students in other
disciplines. In contrast, despite higher performance on the CAT instrument, the small
sample of Life Sciences students appeared to be somewhat less confident in the use of
critical thinking skills in their course. This may be partially explained by the lack
of explicit mention of critical thinking during the course; although students used
critical thinking in their assignments, this was a tacit rather than an explicit
objective.
Student reflections indicated they were disposed toward critical thinking. Overall, the
performance task interventions appeared to enhance students’ perceptions of their own
critical thinking skills, such as analysis, evaluation, and synthesis. The tasks also
provided instructors with an authentic form of assessment. In creating their products,
students put their critical thinking skills into action, applying what they were
learning about analysis, evaluation, and synthesis. Table 3 captures the key themes
from the student reflections
and demonstrates that students, regardless of discipline, indicated they would be able to
use the skills of critical thinking in their future learning. A typical comment came from
one student who said, the instructor “provided us with an abundance of feedback that
many professors lack; this written and verbal feedback has helped me to identify
strengths and areas for growth as a critical thinker to my future learning.” Testimonials
such as this suggest the importance that feedback on a common rubric plays in students’
development as critical thinkers. When asked whether their opinions changed as a
result of new information, 75% reported their opinion changed, 21% reported their
opinion did not change, and 4% reported their opinion changed somewhat. When asked
how they could improve as critical thinkers, students offered specific strategies to
improve their motivation and success as critical thinkers. They said they would attend to
“close reading,” strive for “clarity of written thought,” make explanations “more relatable
to future work,” and “not let emotions cloud thinking.” One student said, “I went from
being unsure of what exactly critical thinking was all about to making it my initial
response to many questions I face every day.” This student’s experience was consistent
with Fletcher’s (2015) premise that explicit instruction moves students from doubt to
defensible action. Reflections from the cohort in Life Sciences indicated that other factors
might influence mastery of critical thinking, such as fixed mindsets and prior knowledge.
We found unexpected benefits for instructors who participated in the study, both
in terms of influencing their own beliefs about whether critical thinking can be taught and
in the quality of student work on the performance tasks. While our protocol did not
include formal interviews with the instructors, we were able to develop some general
impressions from our informal observations in the classrooms, using the open coding
method described by Creswell (2007).
What we discovered from our informal observations was that instructors were
generally inclined to express gratification with the way their students explored more
than one side of an issue. Instructors also noted their awareness of the use of
performance tasks in generating multiple arguments based on the assignments and the
explicit expectations in the rubric. Some registered surprise
about how well students were able to marshal quality support for their claims, regardless
of the particular position taken in the source documents. Some instructors intervened
directly when students were engaged in the learning environment by addressing student
misconceptions as they arose. Instructors also reported that the use of complex
performance tasks helped slow down the students’ typical rush to judgment, and with
experience they were able to create performance tasks with greater ease. Designing
rigorous performance tasks
aligned to the rubric had been time intensive for some instructors, requiring more
effort than anticipated. Nevertheless, through participation in the study, instructors
noted the design work provided them with a means
to enhance their pedagogy. In our investigation, we had the opportunity to learn from
interdisciplinary faculty how they used performance tasks and feedback on a common
rubric to deepen students’ critical thinking. Instructors agreed that the tasks were initially
challenging for students, but with repeated practice and regular feedback, students were
able to go beyond what was expected. Similar to the findings of Costa and Kallick
(2008), instructors noticed that students’ products and performances on the performance
tasks created evidence of critical thinking that moved well beyond the bubble of
paper-and-pencil tests.
4. Discussion
This study explored how critical thinking can be taught in college classes,
provided an approach to respond to the existing gap in the teaching of critical thinking,
and argued that skills connected to critical thinking are not dependent on a specific major.
We aimed to introduce a pedagogical method that most faculty could adapt into their
existing teaching strategies, thereby ensuring that students are explicitly asked to
apply critical thinking skills. Our first research question asked whether we could
create a problem-based learning environment with performance task interventions that
would improve students’ critical
thinking skills and dispositions by using multiple measures. We can report gains on
individual questions and a trend towards improvement of the CAT scores, but we cannot
confirm critical thinking gain based on the overall aggregate CAT results for most of the
classes. However, students in one class did have a statistically significant gain between
the pre- and post-test. The Social Science class on globalization and human rights had an
extended focus on the question of how to approach world hunger. Although there was no
additional time spent on the specific performance tasks compared to the other classes in
this study, it is possible that the length of time focusing on one topic area as a class
contributed to the detected improvement in critical thinking. This finding justifies
the continued use of extended, topic-focused performance tasks. We found no meaningful
performance differences between students identifying themselves as either Hispanic or
White. This aligns with findings of
the CAT designers who noted “once the effects of entering SAT score(s)… were taken
into account, neither gender, race, nor ethnic background were significant predictors”
(Stein, Haynes, & Redding, 2016, p. 4). The instrument was designed to eliminate
cultural bias, and we took care not to introduce such effects involuntarily when we
administered it.
While the CAT results showed some limited improvements in critical thinking,
the qualitative data provided information that enhanced interpretation of findings which
could not be captured by the CAT instrument alone. The qualitative data demonstrated a
positive shift in students’ perceptions of their critical thinking and showed overall
positive attitudes towards deliberate practice. The performance tasks we created
engendered lively discussions. As Abrami et al. (2008) demonstrate, discussion plays a
central role in the development of critical thinking.
Contrary to claims made by Huber and Kuncel (2016), our study suggests critical
thinking improves when the skills are explicitly taught and embedded in the content of
the discipline. Critical thinking is a set of skills that must be practiced repeatedly,
and reaching a sound conclusion can best be accomplished if people have a disposition
and attitudinal readiness
that ensures they will make themselves conscious of the need to apply the skills. If
students and faculty start developing a disposition toward critical thinking, they should
expect to see gains in critical thinking over the course of a semester-long study in the
disciplines. Overall, the evidence presented here leaves us hopeful that further study
is warranted. For our second research question, we designed a common intervention
approach that focused on critical thinking skills and dispositions and investigated its
usability across disciplines. Because the initial pre-post assessment
data suggest, but do not strongly support, the teachability of critical thinking, we can only
interpret our findings with caution. In our exploratory study, gains in critical thinking
reached significance level on evaluative thinking across the disciplines, but the aggregate
CAT results did not confirm overall critical thinking gain for most of the classes. We
therefore make relative statements about the power of performance task interventions
with cautious optimism that our goal to increase the use of performance tasks with
corrective feedback on a common rubric is warranted. Support for the notion that we
could design an intervention suitable to many disciplines comes from the qualitative
data gathered across the disciplines. We found that a common rubric can be used to
explicitly teach students the
skills required for critical thinking. For at least a semester, our subjects became
aware that the practices of hypothesizing, making claims, analyzing, evaluating, and
synthesizing results were applicable to their understanding and use of critical thinking.
Overall, our results also contribute to the research that supports particular methods for
teaching critical thinking. The three main techniques for promoting the skills, as
presented in Section 1.1.5, include fostering critical dialogue and discussion, using
authentic problems in the course, and mentoring. Our problem-based environment using
performance tasks achieves all three techniques because authentic problems from the
disciplines were used in the tasks, instructors mentored the students through corrective
feedback on student work, and class discussion of the students’ analysis, evaluation,
and synthesis fostered critical dialogue.
4.1. Recommendations
Instructors can use this method to bring critical thinking into their classrooms
with little to no additional professional development required. They can adopt our
rubric and design performance tasks around problems important to their disciplines.
Tasks can be done within the time constraints of
a course or as an extended homework assignment. What does need to take place within
class time is a presentation of the rubric that will be used to assess student work so the
skills of critical thinking are made explicit. Also, discussion of the student work needs to
occur during and after each task so students are able to use the feedback to reflect on their
work before they undertake a subsequent performance task. Through discussion faculty
provide mentoring, which contributes to learning skills that can be applied to other
contexts and courses. Discussion of student results allows the class to further deepen
their comprehension and critique of the content. Researchers argue that any course can
be turned into one that facilitates critical thinking in which
students are given high-rigor tasks that elicit the knowledge and skills needed to solve
complex problems in their disciplines (Stylianides & Stylianides, 2008). Explicit and
intentional tasks with high-quality, relevant content and a clear purpose predict
better outcomes. Thus, following our method does not take away time from learning course
content. We contend instructors who feel anxious about having enough time in the
semester to add yet another item to their curriculum will find that our approach
maximizes time for learning. Finally, instructors wishing to collect more information
about their students’ thinking may choose to use a pre-test to determine the influence of
students’ prior experience in the understanding and use of critical thinking in solving
problems. Limitations of this study include the small number of participants among the
three cohorts and the interpretive nature of qualitative research
(Atieno, 2009). Steps were taken to reduce these possible limitations through open coding
(Creswell, 2007) and triangulation of data, which included pre-post assessment on the
CAT instrument, student questionnaires and reflections, and informal interviews with
instructors. Even so, we acknowledge the findings from the
study are not generalizable. With the disproportionately low number of participants in
Life Sciences, the students’ dispositions toward critical thinking may not have been
fully represented. Further, although student self-assessment can yield consistent
results across tasks and short time periods, it does not always agree with other
assessments (Boud & Falchikov, 1989; Ross, 2006). Finally, while we did not propose to
validate the questionnaire we employed, it appears that more redundancy in the survey
questions along with open-ended comment fields could aid in the interpretation of
participant responses.
Future work holds promise for deepening the language and skills of critical thinking
across disciplinary boundaries and for improving performance on similar tasks (City,
Elmore, Fiarman, & Teitel, 2009). As Lang reasons,
“frequent and deliberate practice” enhances learning (2016, p. 21); thus it would be
informative to conduct studies that use performance tasks more frequently in a course,
and better yet, use them in a series of courses or a full program of studies. Affirming that
repeated practice of critical thinking skills correlates with a greater ability and disposition
for critical thinking would strengthen the argument that faculty must explicitly use the
language of critical thinking and students must repeatedly practice the skills.
Our study suggested two additional avenues for research. First, while much is
known about the powerful effects on critical thinking of generating and testing claims,
problem-solving, and receiving corrective feedback (Abrami et al, 2015; Hattie, 2015;
Marzano, Pickering, & Pollock, 2001), it would be useful to determine the effect of
performance tasks that span the length of a course on the transferability of critical
thinking skills. A study built around one extended topical problem in a discipline
could provide convincing evidence on this question. Second, we see a need for research
about increasing faculty knowledge of and facility in teaching critical thinking.
Further research is necessary if we are going to strengthen the argument for the
routine use of performance tasks in the disciplines.
5. Conclusions
This study set out with the purpose of exploring the teaching and learning of critical
thinking by using generic performance tasks, a common rubric, and an explicit focus on
critical thinking skills and dispositions. One cohort showed a statistically
significant gain, but there was no discernable improvement among other cohorts. While
the overall statistical results were
less than anticipated, we found these results were not the only factor informing instructor
decisions about using performance tasks to teach critical thinking. As Hattie observed,
learning is increased “to the degree that students and teachers set and communicate
appropriate,
specific, and challenging goals” (1999, p. 2). In our exploratory study the qualitative
results tended to confirm the value of student participation in rigorous and challenging
performance tasks. We also observed the practice of creating and scoring performance
tasks can aid critical thinking in the faculty, motivate their deliberate inclusion of critical
thinking in course designs, and prompt rich discussions about pedagogy, which can lead
to improved instruction. The approach may be useful in many fields. Regular use of
performance tasks could have a substantial
impact in higher education if both faculty and student understanding of critical thinking
skills and dispositions is improved and if the tasks given to students are intentionally
designed to make those skills explicit.
Acknowledgements
This study was approved by the University’s Institutional Review Board (protocol
662865-2). We are grateful to the university statistics consultant for his support in the
data analysis and to the instructors for permitting us to conduct our study in their
classrooms. We also thank our colleagues who provided time and expertise to score the
CAT responses. Funding: This research did not receive any specific grant from funding
agencies in the public, commercial, or not-for-profit sectors. Expenses were offset with
References
Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., &
Zhang, D. (2008). Instructional interventions affecting critical thinking skills and
dispositions: A stage 1 meta-analysis. Review of Educational Research, 78(4),
1102-1134.
Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., &
Persson, T. (2015). Strategies for teaching students to think critically: A
meta-analysis. Review of Educational Research, 85(2), 275-314.
document ED 315-423.
13-18.
Bailin, S., Case, R., Coombs, J. & Daniels, L. (1999). Common misconceptions of critical
Bailin, S. & Siegel, H. (2003). Critical thinking. In Nigel Blake, Paul Smeyers, Richard
Smith, and Paul Standish, (Eds.), The Blackwell Guide to the Philosophy of
Bartram, P., & Yielding, D. (1973). The development of an empirical method of selecting
Bok, D. (2006). Our underachieving colleges: A candid look at how much students learn
and why they should be learning more. Princeton, NJ: Princeton University Press.
Brookfield, S. (2012). Teaching for critical thinking: Tools and techniques to help
https://www.tntech.edu/assets/userfiles/resourcefiles/8783/1454517036_CAT%20
Technical%20Information%20V8.pdf
City, E. A., Elmore, R. F., Fiarman, S. E., & Teitel, L. (2009). Instructional rounds in
Costa, A. & Kallick, B. (2008). (Eds.) Learning and leading with habits of mind.
Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five
approaches (2nd ed.). Thousand Oaks, CA: Sage.
Dewey, J. (1933). How we think: a restatement of the relation of reflective thinking to the
Dweck, C. S. (2002). Beliefs that make smart people dumb. In R.J. Sternberg (Ed). Why
smart people can be so stupid. New Haven, CT: Yale University Press.
Ennis, R. (1989). Critical thinking and subject specificity: Clarification and needed
Fethke, G. C., & Policano, D. J. (2012). Public no more: A new path to excellence for
Gabannesch, H. (2006). Critical thinking...What is it good for? (In fact, what is it?).
Hackman, H. (2005). Five components for social justice education. Equity & Excellence
Halpern, D. (1998). Teaching critical thinking for transfer across domains: Dispositions,
53(4), 449-455.
thinking capacity in undergraduate students. The Journal of General Education,
54(4), 293-315.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational
Research, 77(1), 81-112.
Hatton, N. & Smith, D. (1995). Reflection in teacher education: Towards definition and
Shifting the Focus from Teaching to Learning. Boston: Allyn and Bacon, 2000.
Huber, C. R. & Kuncel, N. R. (2016). Does college teach critical thinking? A meta-
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Jossey-Bass.
Koon, J., & Murray, H. G. (1995). Using multiple outcomes to validate student ratings of
science of learning. San Francisco, CA: Jossey-Bass.
adolescents: Explicit instruction produces the greatest gains. Thinking Skills and
Creativity, 6, 1-13.
Marzano, R., Pickering, D., & Pollock, J. (2001). Classroom instruction that works:
ASCD.
McTighe, J., & Ferrara, S. (1998). Assessing learning in the classroom: Student
Distribution Center.
Mitchell, S. M., Anderson, W., Sensibaugh, C., & Osgood, M. (2011). What really
Newton, C., & Fisher, K. (2009). Take 8. Learning spaces: The transformation of
educational spaces for the 21st century. Victoria, Australia: The Australian
Nosich, G. (2009). Learning to think things through: A guide to critical thinking across
Development in Education, 27(1), 18-23.
Pascarella, E. T. & Terenzini, P. T. (2005). How college affects students: A third decade
Pithers, R. T., & Soden, R. (2000). Critical thinking in education: A review. Educational
Sa, W., Stanovich, K., & West, R. (1999). The domain specificity and generality of belief
Shiel, T. K. (2017). Designing and using performance tasks: Enhancing student learning
Shim, W. & Walczak, K. (2012). The impact of faculty teaching practices on the
Schön, D. A. (1991). The reflective turn: Case studies in and on educational practice.
Staib, S. (2003). Teaching and measuring critical thinking. Journal of Nursing Education,
Snyder, L. & Snyder, M. (2008). Teaching critical thinking and problem solving skills.
Stedman, N. & Adams B. (2012, June). Identifying faculty’s knowledge of critical
Stein, B., Haynes, A., & Redding, M. (2016). National dissemination of the CAT
Practice Symposium.
Tsui, L. (1999). Courses and instruction affecting critical thinking. Research in Higher
Tsui, L. (2002). Fostering critical thinking through effective pedagogy. The Journal of
van Gelder, T. (2005). Teaching critical thinking: Some lessons from cognitive science.
Walsh, D. & Paul, R. (1986). The goal of critical thinking: From educational ideal to
Wood, D. (2013). ABC of learning and teaching in medicine: Problem based learning.
Table 1 Student Questionnaire: Mastery and Use of Critical Thinking Skills
me.
from inferences.
irrelevant information.
to my future learning.
Source: Questionnaire created employing coding from the defined skill sets in the CAT
Table 2
Mean Ratings on Student Questionnaire by Discipline and Indicator (n=63), Spring 2015
(n=28)
(n=28)
(n=7)
(n=63)
Table 3
Key Themes from Student Reflections on Critical Thinking (CT) Skills
“This [activity] is a great way to show students how to back up their beliefs
Evaluation "I used critical thinking in evaluating each article's points against those
information."
I was able to “compare the two viewpoints and evaluate the validity of their
evidence.”
information.”
well, but almost became the norm a few months into the course. I went from
being unsure of what exactly critical thinking was all about to making it my
initial response to many questions I face every day.
Synthesis I learned “how perceptions can be biased.”
evidence.”
________________________________________________________________________
Appendix A: Common Rubric to Assess Quality of Response to a Performance Task
Each criterion is scored at three levels: Below Expectation (0-3), Meets Expectation
(4-6), and Meets or Exceeds Expectation (7-9).

Hypothesize and/or Make Claims
Below Expectation (0-3): Makes claims/hypotheses that are not specific to the
observations.
Meets Expectation (4-6): Generates one hypothesis or claim, but fails to provide
specific/underlying reasons that are likely to have produced or contributed to the
observations.
Meets or Exceeds Expectation (7-9): Given a set of observations, makes a claim and/or
generates hypotheses about 3 or more underlying reasons that are likely to have
produced or contributed to the observations.

Analyze, Investigate
Below Expectation (0-3): Rationale is not grounded in fact; contains fallacious and/or
deceptive reasoning.
Meets Expectation (4-6): Rationale has some legitimacy, but fails to be convincing.
Makes a claim but does not explain why it is controversial. Does not specify the
evidence needed to investigate the situation. Explains why some observations and/or
data are irrelevant.
Meets or Exceeds Expectation (7-9): Given a hypothesis or claim to be investigated,
presents a sound rationale that describes what is known and unknown about the
history/antecedents of the problem. Specifies the evidence needed to investigate the
situation. Separates relevant from irrelevant observations or data; separates factual
information from inferences.

Evaluate Results
Below Expectation (0-3): Overstates the results; does not address what is known from
the investigation.
Meets Expectation (4-6): States one result, but fails to specify important values being
observed.
Meets or Exceeds Expectation (7-9): States 2 or more results that emerged from the
evidence. Identifies important values (these may be expressed as magnitudes, sizes, or
amounts in graphs/figures).

Synthesize, Integrate
Below Expectation (0-3): Over-interprets the results; does not integrate information.
Meets Expectation (4-6): Cites relevant results and identifies some observations and/or
data that consistently support the hypothesis/claim, but fails to explain what the
results mean.
Meets or Exceeds Expectation (7-9): Interprets results; explains the link between the
original basis for the observations; describes the history and/or antecedents of the
problem. Specifies the evidence used to support the claim/hypothesis. Makes a new claim
or counter-claim that integrates and synthesizes information from the investigation.

Reflect
Below Expectation (0-3): Reflection is off-topic, aimless, and disorganized. Word
choice is confusing. Sentences are awkward and distract the reader.
Meets Expectation (4-6): Misses several components of the required reflection. Word
choice is appropriate with few errors.
Meets or Exceeds Expectation (7-9): Critically evaluates own performance on the task;
reflects on the skills/thought processes needed to complete the task. Specifies 2 or
more strategies or approaches used to complete the task. Identifies 1 or more areas of
growth.
Total Points: _____ of 45 possible (_____ %)
Adapted from: Mitchell, Anderson, Sensibaugh, & Osgood (2011) and informed by Andrade (2000).
Appendix B: Sample Released Test Item from CAT Instrument
“A scientist theorizes that a chemical in bread causes criminal behavior. To support
his theory, the scientist notes the following
evidence:
99.9% of the people who committed crimes consumed bread prior to committing
crimes.
Crime rates are extremely low in areas where bread is not consumed.
Do the data presented by the scientist strongly support the theory? Yes ___ No ___
Are there other explanations for the data besides the scientist’s theory? If so, describe.
What kind of additional information or evidence would support the scientist’s theory?”
The CAT is used by institutions for course, program, and general education assessment.
The Critical-thinking
Assessment Test (CAT) was developed with input from faculty across a wide range of
disciplines, together with experts in learning sciences and assessment, and with
support from the National Science Foundation (NSF).
NSF support also helped distribute the CAT and provide training, consultation, and
statistical support to users. Partial support for this work was provided by the National
Science Foundation’s TUES (formerly CCLI) Program under grants 0404911, 0717654,
and 1022789.