

Curriculum Assessment Practices That Incorporate Learning Outcomes in Higher Education: A Systematic Literature Review
R. Ebel1, S. Ahmed2, A. Thornton3, and C. Watt4
Montana State University Bozeman
Bozeman, MT

C. Dring5
University of British Columbia
Vancouver, BC

A. Sames6
University of Minnesota Twin Cities
Minneapolis, MN

1 Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. roland.ebel@gmx.com. 406-994-5640
2 Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. selena.ahmed@montana.edu. 406-994-5640
3 Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. alithrntn@gmail.com. 406-994-5640
4 Plant Sciences & Plant Pathology Department, Plant BioScience Building 232, MSU Campus, Bozeman, MT, 59717. charleswatt@montana.edu. 406-994-5147
5 Centre for Sustainable Food Systems, 2357 Main Mall, Vancouver, BC, V6T 1Z4. colind@mail.ubc.ca. 778-859-1148
6 Department of Fisheries, Wildlife and Conservation Biology, 135 Skok Hall, 2003 Upper Buford Circle, St. Paul, MN 55108. same0057@umn.edu. 612-624-3600

Abstract

As new higher education programs emerge to prepare students for being part of a well-equipped workforce that meets the challenges of our times, and established programs revise their curricula, clear program learning outcomes (PLOs) are required. The process of curriculum (or program) assessment serves to test whether programs are effective in facilitating their PLOs. The objective of our systematic review was to identify best practices in curriculum assessment. We examined peer-reviewed literature from 1995 to 2019. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol was applied to collect evidence, reduce bias, and synthesize the evidence. We screened 535 abstracts and conducted full-text screening of 43 publications, of which 20 were selected for final extraction. We found that the number of articles providing empirical evidence about effective curriculum assessment is low. We identified promising practices based on the frequency of their mention and their appraisal by the authors. These practices include an assessment plan; assessment implementation through a faculty committee; the use of both direct and indirect assessment indicators (for example, student performance in specific assignments and student exit or alumni interviews); and an ongoing reevaluation of PLOs. Our findings sustain the development of an adaptable curriculum assessment procedure for higher education.

Introduction

Institutions of higher education are responding to the multifaceted societal challenges of our times by training students in innovative and complex skills towards preparing a well-equipped workforce. For example, human activities are having unprecedented impacts on the Earth and its systems with implications for human and planetary health (Eichinger, 2019; Steffen et al., 2015; Tilman and Clark, 2014). These challenges require the development and implementation of social innovations to transform society (Olsson et al., 2017). As a result, new program offerings emerge in institutions of higher education while long-standing programs revise their curriculum to prepare a workforce to meet societal needs. Consequently, these programs need to articulate a clearly defined set of program learning outcomes (PLOs) for students to demonstrate with regards to the acquisition of certain knowledge, skills, and attitudes (Killen, 2000). A PLO describes what is essential for all students to successfully demonstrate in a measurable way at the end of their learning experience within a given
program (Knight, 2001). There is further need for curriculum assessment to ensure that new and existing programs are effective in meeting specific PLOs for preparing students as professionals to address the grand challenges of our times.

Outcome-based education (OBE) has gained momentum over the past five decades to prepare a professional workforce that meets societal needs. Curriculum assessment (often synonymous with "program assessment") is a core procedure in OBE. The identification of suitable PLOs and the assessment of how a curriculum meets these PLOs are core purposes of OBE (Barman et al., 2014).

The way we assess our curricula, and ultimately our students, is recognized as a game-changer for improving student outcomes for future success (Bryan and Clegg, 2019). There is thus a need to identify best practices for curriculum assessment suitable in different contexts, which can be applied towards the development of an adaptable curriculum assessment tool for higher education institutions. Apart from increasing the alignment of curricula and their PLOs, such a tool would allow for the communication of common educational goals, experiences, and methods across instructors, programs, and departments. To our knowledge, there is a lack of a published review identifying best practices for curriculum assessment as well as of a comprehensive adaptable curriculum assessment tool or protocol.

The objective of our paper is to address this best-practices and protocol gap and advance the field of OBE through conducting a systematic literature review on practices for curriculum assessment that incorporate PLOs. Specifically, this systematic review examines peer-reviewed literature from 1995 to 2019 that reports outcomes of concrete curriculum assessment interventions in higher education with an emphasis on assessment practices based on PLOs. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol (Moher et al., 2009) was applied to collect evidence on the closed-frame study question: In OBE higher education curricula, what are practices for curriculum assessment that incorporate PLOs? The Population, Intervention/Exposure, Comparator, Outcome (PICO) framework (Schardt et al., 2007) elements were applied to structure this systematic review, which examines curriculum assessment as the intervention/exposure with the assessment procedure as the outcome. The PRISMA and PICO protocols were applied for the review to reduce bias and synthesize all evidence in a replicable way. Findings from the review will inform the development of an adaptable curriculum assessment tool to evaluate the effectiveness of higher education programs.

Background on Outcome-based education (OBE)

Overview of OBE

A curriculum is an attempt to communicate the principles and features of an educational proposal to effectively translate it into practice (Stenhouse, 1985). The goal of OBE is to align the curriculum, including content and activities, to PLOs, which describe what is essential for all students to successfully demonstrate at the end of their learning experience (Spady, 1994). While curriculum development was historically relatively subjective and unstructured, during the second half of the 20th century, systematic curriculum development emerged on the agenda of academic discussion and educational administration (Knight, 2001). For the last fifty years, OBE has been the prevailing approach to systematic curriculum development (Barman et al., 2014).

The idea of OBE has its origins in behaviorist learning theories (Morcke et al., 2013). In the mid-20th century, Tyler [(1949) in (Morcke et al., 2013)] argued that curriculum design should be determined by explicit objectives. Tyler's student Bloom (Bloom, 1956) built on the approach of designing curriculum based on explicit objectives through the development of a taxonomy of educational goals, which he classified into knowledge, skills, and attitudes (Morcke et al., 2013). While Bloom's taxonomy was reviewed and updated to fit contemporary assessment standards (Krathwohl, 2002), the differentiation between knowledge, skills, and attitudes persists (Harden, 2001).

In the 1970s, constructivist approaches to curriculum design opposed OBE. Particularly Stenhouse (1985) emphasized curricula based on the students' learning process instead of the intended outcomes of this process. During the 1990s, OBE regained popularity, especially in designing curricula for pre-university education. Spady (1994), whose work is seen as a turning point in curriculum development, conserved the constructivist principle that the students' knowledge and skills are a consequence of learning experiences but subordinated this principle to measurable and observable PLOs. Specifically, Spady considered such learning experiences, which he calls learning goals, a requirement for obtaining PLOs. In contrast, he defines PLOs as an explicit demonstration of learning. Therefore, in modern OBE, PLOs serve as the reference point for the assessment and regulation of the students' proficiency (Morcke et al., 2013) and the implicit consequences for course and curriculum design (Barman et al., 2014). At the end of the 1990s, OBE approaches became increasingly frequent in the development of higher education curricula, particularly in medicine, nursing, and engineering programs that expect graduates to have specific demonstrable knowledge, skills, and attitudes (Keating, 2014; Knight, 2001). Today, OBE is almost omnipresent, from language (Barrette and Paesani, 2018) to nutrition (Joyner-Melito, 2016) programs.

The OBE framework systematically and deductively facilitates and structures the development of a curriculum by first defining the final product, the PLOs. Since OBE is performance-oriented, it relates curriculum design to the students' professional careers after graduation. Accordingly, OBE can increase graduates' adaptation to a new job (Harden, 2001; Hartel and Gardner, 2003; McGourty et al., 1998). Some of the pioneers in OBE emphasized that it also seeks to strengthen students' role in the broader society (Mitchell and Spady, 1978).

Figure 1. Core stakeholders and processes in outcome-based education, based on input from Harden (2002) and Koppang (2004).

The adoption of OBE has not been without resistance (Harden, 2007). The main criticism is that both the
conceptual background of OBE and the empirical evidence to prove its efficiency are relatively weak (Barman et al., 2014). Others criticize that, since most of the work is usually done by program coordinators or one specific scholar, those most affected by OBE, students and instructors, are not sufficiently involved (Keating, 2014). Another critique refers to the fact that personal and professional attributes of graduates are more complex than the skills expressed in PLOs (Barman et al., 2014). In the 1970s, Stenhouse's constructivist approach was the basis for the development of an alternative curriculum development approach, the process-based curriculum. The model emphasizes the pedagogy of a curriculum rather than which skills are facilitated (Knight, 2001). Contrary to OBE, process-based curriculum development involves flexible content but a determined pedagogy throughout the program (Table 1). The process-based model has never played a strong role in real curriculum development but has maintained influence on certain pedagogies, such as feminist teaching practices (Carillo, 2007) and transformational teaching (Slavich and Zimbardo, 2012).

Table 1. Core differences between outcome-based and process-based curriculum development (Knight, 2001).

| | Outcome-based education | Process-based curriculum development |
|---|---|---|
| Curriculum content | Defined and constantly updated | Flexible |
| Curriculum pedagogy | Commonly flexible | Largely defined |
| Tools | Curriculum mapping | No clear methodology |
| Position on PLOs | Key assessment parameters | Real PLOs are too complex to be objectively assessed |

Core concepts and methods of OBE: learning outcomes and curriculum mapping

Learning outcomes (rather than learning goals, learning objectives, or student competencies) are the basis of OBE. They refer to measurable skills and knowledge students master after completing their studies (Cumming et al., 2007). Learning goals describe attitudes, emotions, and values students are expected to develop as a result of their learning experiences. Since they are not measurable, learning goals are not included in the specification of curricula in OBE (Morcke et al., 2013; Spady, 1994). Learning objectives describe discrete but not necessarily quantifiable results of learning. Though they are sometimes developed at the program level, learning objectives usually refer to a course (Cumming et al., 2007). While a learning objective is characterized as an intention, the learning outcome is the projected realization of this intention (Burke, 1995). Different from PLOs, competencies are acquired by students or graduates, rather than the program and its instructors. Furthermore, competencies build on learning experiences gained before post-secondary education (Morcke et al., 2013). Although there are certain differences between outcome- and competency-based education (Morcke et al., 2013), when referring to the point of graduation, identical descriptors can be used for PLOs and competencies (Cumming et al., 2007).

Curriculum mapping is a standard tool for curriculum design and assessment in OBE (Harden, 2001). Following Plaza et al. (2007), it is most commonly used to visualize the intended curriculum (what the program officers intend to facilitate), which can be distinguished from the delivered
(what is effectively taught), hidden (what is communicated but not a part of the intended curriculum), the learned (what the students take out of the program), and the assessed curriculum (what is considered for grading). A curriculum map generates a graphic portrayal, such as a matrix or template (Cuevas et al., 2010), that presents the sequence of courses in the context of PLOs (Arafeh, 2016). Curriculum mapping serves as a navigational, data-based process tool to visualize curricular goals, including course scopes and sequences and their relation, emphasizing the nexus between PLOs and instructional activities as well as highlighting at which level of proficiency the diverse PLOs are addressed (Cuevas et al., 2010; Gluga et al., 2012; Harden, 2002). To operationalize OBE, curriculum mapping helps identify curriculum gaps and redundancies (Freeman et al., 2008), discern whether different curriculum components align (Kopera-Frye et al., 2008), and determine which adjustments can be made to increase this alignment (Harden, 2001).

The role of curriculum assessment in OBE

The concept of OBE is the foundation for an entire framework to create and improve curricula (Figure 1). The framework involves a series of tools to develop and improve curricula, one of which is curriculum assessment. Apart from program design and assessment, OBE supports the connectivity between academic learning activities (Harden, 2007; Tonni and Oliver, 2013), commonly labeled as curriculum alignment or "scaffolding".

Methods

PRISMA protocol

This systematic literature review (SLR) was carried out to collect and analyze published evidence on the following closed-frame study question: In outcomes-based education in higher education curricula, what are practices for curriculum assessment that incorporate learning outcomes? We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol (Moher et al., 2009). The procedure involves four essential steps: (1) search term definition and article search, (2) title and abstract screening, (3) full-text screening, and (4) data systematization and analysis.

In addition to the PRISMA protocol, the Population, Intervention/Exposure, Comparator, Outcome (PICO) framework (Schardt et al., 2007) was applied to structure our review. The measured outcome of the studies we reviewed was curriculum assessment criteria and procedures applied to higher education curricula. The components of the PICO framework were: (1) the population equivalent is higher education curricula; (2) the intervention/exposure is curriculum assessment; (3) the comparator is pre- and post-curriculum assessment based on PLOs; and (4) the outcomes are specific assessment criteria and procedures.

Identification of search terms

The overall research question of the SLR determined the search terms following the PRISMA protocol. A multidisciplinary team with expertise in food systems education helped determine the initial search terms (Table 2) and the final set of search terms (Table 3). Our goal was to identify all publications sharing outcomes of concrete curriculum assessment interventions in higher education with an emphasis on assessment experiences based on PLOs. The result of this discussion was to select all possible combinations of three search terms, where there were two options for the first term, as well as five options for the second and third term, respectively (Appendix 1).

Table 2. Initial search terms. Search term combinations were tested in the following databases: Web of Science, ScienceDirect, Wiley Online Library, ProQuest, ERIC, Educators Reference Complete, Explora Educator's Edition.

| Term 1 | Term 2 | Term 3 |
|---|---|---|
| Learning Outcome OR Program Learning Outcome | Curriculum Assessment OR Assessment OR Program Assessment OR Curriculum Evaluation OR Program Evaluation | Higher education OR University OR Undergraduate OR Graduate OR Major |

To specify our search, we tested all fifty resulting search term combinations of the terms listed in Table 2 in seven different publication databases (Web of Science, ScienceDirect, Wiley Online Library, ProQuest, ERIC, Educators Reference Complete, and Explora Educator's Edition). We selected these databases due to their relevance for identifying peer-reviewed publications in all scientific fields as well as being sources of high-quality publications in the field of education.

We revised the search terms through an iterative process. First, we ranked the search term combinations based on the number of publications associated with each combination, considering the median number of publications throughout the seven databases. Therewith, we identified the quartile of the least frequent combinations and discarded them. Second, we examined relevant articles resulting from the search process and refined the remaining search terms. Hence, we guaranteed a balanced and systematic review based on all significant search term combinations.

The search term testing of the initial 50 search term combinations in the seven selected databases showed that the term combinations including "Curriculum Evaluation" (as combined words) as the second search term resulted in the lowest number of associated articles (Table 4). Based on the results of the initial search term testing, we selected 24 word combinations for our final list of search terms (Table 3). Based on this list, we identified 525 articles for our review process. After removing 58 duplicates, we finally reviewed 467 publications.
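The combination-and-ranking procedure just described is mechanical: build each Boolean query as one entry of the cross product of the term lists, rank the combinations by their median hit count across the seven databases, and discard the least productive quartile. The following Python sketch illustrates that logic under stated assumptions: the term lists are those of Table 3 (below), while the hit counts are hypothetical stand-ins for the manual database searches.

```python
from itertools import product
from statistics import median

# Final search terms (Table 3); each query combines one option per column.
TERM_1 = ["Curriculum", "Program"]
TERM_2 = ["Assessment", "Evaluation"]
TERM_3 = ["Higher education", "University", "Undergraduate",
          "Graduate", "Major", "Food Systems"]

DATABASES = ["Web of Science", "ScienceDirect", "Wiley Online Library",
             "ProQuest", "ERIC", "Educators Reference Complete",
             "Explora Educator's Edition"]

def build_queries():
    """Cross product of the term lists -> 2 x 2 x 6 = 24 Boolean queries."""
    return [f'"{t1}" AND "{t2}" AND "{t3}"'
            for t1, t2, t3 in product(TERM_1, TERM_2, TERM_3)]

def rank_and_prune(hits_per_db):
    """Rank queries by their median hit count across the databases and drop
    the bottom quartile, mirroring the first iteration of term refinement.

    hits_per_db: dict mapping query -> list of hit counts (one per database).
    """
    medians = {q: median(counts) for q, counts in hits_per_db.items()}
    ranked = sorted(medians, key=medians.get, reverse=True)
    keep = ranked[: len(ranked) - len(ranked) // 4]  # discard lowest quartile
    return keep, medians

if __name__ == "__main__":
    queries = build_queries()
    # Hypothetical hit counts standing in for manual searches in the 7 databases.
    fake_hits = {q: [len(q) % 7 + i for i in range(len(DATABASES))] for q in queries}
    kept, meds = rank_and_prune(fake_hits)
    print(f"{len(queries)} queries built, {len(kept)} kept after pruning")
```

With the initial term set from Table 2 (two, five, and five options per column), the same cross product yields the 50 combinations that were tested first.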


Table 3. Systematic review search terms. A combination of terms referring to curriculum assessment and evaluation in higher education (and five synonyms) was searched in the following databases: Web of Science, ScienceDirect, Wiley Online Library, ProQuest, ERIC, Educators Reference Complete, Explora Educator's Edition.

| Term 1 | Term 2 | Term 3 |
|---|---|---|
| Curriculum OR Program | Assessment OR Evaluation | Higher education OR University OR Undergraduate OR Graduate OR Major OR Food Systems |

The final set of search terms was processed in the same seven databases as used in the initial process of identifying search terms. We searched for the appearance of the final set of search terms in the title, abstract, or keywords of peer-reviewed articles in these databases. Additionally, in the more topic-specific databases (ERIC, Educators Reference Complete, Explora Educator's Edition), we expanded our search to the appearance of the respective search terms in the full text. We emphasized articles about case studies regarding implemented curriculum assessment processes, which we classified as primary data. During the initial process of identifying search terms, it became apparent that publications on such case studies are rare. Thus, we also included reviews, opinion papers, and other communications about curriculum assessment, which we classified as secondary data.

Table 4a. Appearance of search terms in articles (out of seven reviewed databases).

| Term 1 | Term 2 | Term 3 | Quantity of publications (median)* | Rank** |
|---|---|---|---|---|
| Learning Outcome | Curriculum Assessment | Higher education | 14 | 22 |
| Learning Outcome | Curriculum Assessment | University | 19 | 18 |
| Learning Outcome | Curriculum Assessment | Undergraduate | 13 | 24 |
| Learning Outcome | Curriculum Assessment | Graduate | 9 | 27 |
| Learning Outcome | Curriculum Assessment | College | 15 | 20 |
| Learning Outcome | Assessment | Higher education | 137 | 5 |
| Learning Outcome | Assessment | University | 202 | 1 |
| Learning Outcome | Assessment | Undergraduate | 173 | 3 |
| Learning Outcome | Assessment | Graduate | 173 | 3 |
| Learning Outcome | Assessment | College | 194 | 2 |
| Learning Outcome | Program Assessment | Higher education | 65 | 9 |
| Learning Outcome | Program Assessment | University | 63 | 10 |
| Learning Outcome | Program Assessment | Undergraduate | 51 | 14 |
| Learning Outcome | Program Assessment | Graduate | 54 | 13 |
| Learning Outcome | Program Assessment | College | 60 | 12 |
| Learning Outcome | Curriculum Evaluation | Higher education | 20 | 16 |
| Learning Outcome | Curriculum Evaluation | University | 31 | 15 |
| Learning Outcome | Curriculum Evaluation | Undergraduate | 20 | 16 |
| Learning Outcome | Curriculum Evaluation | Graduate | 15 | 20 |
| Learning Outcome | Curriculum Evaluation | College | 17 | 19 |
| Learning Outcome | Program Evaluation | Higher education | 83 | 7 |
| Learning Outcome | Program Evaluation | University | 106 | 6 |
| Learning Outcome | Program Evaluation | Undergraduate | 62 | 11 |
| Learning Outcome | Program Evaluation | Graduate | 68 | 8 |
| Learning Outcome | Program Evaluation | College | 14 | 22 |
Table 4b. Appearance of search terms in articles (out of seven reviewed databases), continued.

| Term 1 | Term 2 | Term 3 | Quantity of publications (median)* | Rank** |
|---|---|---|---|---|
| Program Learning Outcome | Curriculum Assessment | Higher education | 0 | 40 |
| Program Learning Outcome | Curriculum Assessment | University | 0 | 40 |
| Program Learning Outcome | Curriculum Assessment | Undergraduate | 0 | 40 |
| Program Learning Outcome | Curriculum Assessment | Graduate | 0 | 40 |
| Program Learning Outcome | Curriculum Assessment | College | 0 | 40 |
| Program Learning Outcome | Assessment | Higher education | 11 | 25 |
| Program Learning Outcome | Assessment | University | 11 | 25 |
| Program Learning Outcome | Assessment | Undergraduate | 4 | 30 |
| Program Learning Outcome | Assessment | Graduate | 8 | 28 |
| Program Learning Outcome | Assessment | College | 8 | 28 |
| Program Learning Outcome | Program Assessment | Higher education | 3 | 32 |
| Program Learning Outcome | Program Assessment | University | 4 | 30 |
| Program Learning Outcome | Program Assessment | Undergraduate | 3 | 32 |
| Program Learning Outcome | Program Assessment | Graduate | 3 | 32 |
| Program Learning Outcome | Program Assessment | College | 3 | 32 |
| Program Learning Outcome | Curriculum Evaluation | Higher education | 0 | 40 |
| Program Learning Outcome | Curriculum Evaluation | University | 0 | 40 |
| Program Learning Outcome | Curriculum Evaluation | Undergraduate | 0 | 40 |
| Program Learning Outcome | Curriculum Evaluation | Graduate | 0 | 40 |
| Program Learning Outcome | Curriculum Evaluation | College | 0 | 40 |
| Program Learning Outcome | Program Evaluation | Higher education | 2 | 36 |
| Program Learning Outcome | Program Evaluation | University | 2 | 36 |
| Program Learning Outcome | Program Evaluation | Undergraduate | 0 | 40 |
| Program Learning Outcome | Program Evaluation | Graduate | 2 | 36 |
| Program Learning Outcome | Program Evaluation | College | 2 | 36 |

* Median of identified publications among publication databases.
** Based on median, descending.

Inclusion criteria and initial article screening

For the abstract review process, the a priori inclusion criteria were: (i) peer-reviewed publications between 1995 and 2019; (ii) articles published in the English language; (iii) the title, abstract, or keywords include one of our search terms; and (iv) the abstract addresses the research question directly. The exclusion criteria include (i) books, book chapters, reports, maps, newspaper articles, and non-print media; (ii) articles without an abstract; (iii) abstracts written in poor English or in a language other than English; (iv) articles published before 1995; (v) articles about curriculum assessment in countries other than the USA, the UK, Japan, Canada, Switzerland, France, Denmark, Israel, Germany, and Sweden (the ten leading countries in the Center for World University Rankings (CWUR) index (CWUR, 2019)); (vi) articles that do not discuss curriculum assessment criteria and procedure; (vii) articles which are not about higher education (for example, elementary, middle, or secondary education); (viii) articles about student assessment (for example, exams or assignments); (ix) articles about instructor or course assessment; (x) articles about self-assessment; and (xi) articles about program development without an assessment component.

The title and abstract of each article were screened for their relevance to addressing the study question. Two review panel members screened each article to minimize reviewer bias. Discrepancies on whether to exclude or include a specific article were discussed by at least three reviewers to resolve conflicts. All articles that met the inclusion criteria were included in the full-text screening; all other articles were excluded.
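The dual-reviewer rule above reduces single-reviewer bias with a simple decision procedure; a minimal sketch of that logic follows, in which the `Vote` type, the function name, and the majority rule for the panel discussion are illustrative assumptions rather than part of the published protocol.

```python
from enum import Enum

class Vote(Enum):
    INCLUDE = "include"
    EXCLUDE = "exclude"

def screening_decision(vote_a: Vote, vote_b: Vote,
                       panel_votes: list[Vote] | None = None) -> Vote:
    """Two reviewers screen each article; on disagreement, at least three
    reviewers discuss, here modeled as a simple majority of their votes."""
    if vote_a == vote_b:
        return vote_a
    if not panel_votes or len(panel_votes) < 3:
        raise ValueError("conflicts require discussion by at least 3 reviewers")
    includes = sum(v == Vote.INCLUDE for v in panel_votes)
    return Vote.INCLUDE if includes > len(panel_votes) / 2 else Vote.EXCLUDE

# Example: the two reviewers disagree; a three-person discussion resolves it.
print(screening_decision(Vote.INCLUDE, Vote.EXCLUDE,
                         [Vote.INCLUDE, Vote.INCLUDE, Vote.EXCLUDE]))
```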
The identified articles from the initial screening that met the inclusion criteria were critically appraised by the review panel to ensure they provided evidence to address the study question and met the a priori inclusion criteria. Two panel members screened each full text. Again, discrepancies were discussed and resolved by at least three reviewers. After excluding inappropriate articles, the final set of publications was included in the quantitative synthesis of our systematic review.

The specific parameters and evidence extracted from each article during the synthesis included: (i) article title, (ii) article author(s) and publication year, (iii) journal, (iv) observed program(s), (v) definition (understanding) of


curriculum assessment, (vi) assessment subjects (what is assessed), (vii) levels of assessment, (viii) frequency of assessment, (ix) components of assessment, (x) assessment follow-up, (xi) advantages of assessment, (xii) assessment limitations, and (xiii) ways to improve curriculum assessment. For case studies, we additionally extracted the respective PLOs, assessment coordination and execution, the assessment procedure, and the assessment timeline. Finally, we examined primary and secondary data for reasons for executing curriculum assessment, preconditions and planning for assessment, assessment methods and tools, assessment follow-up, assessment frequency, as well as advantages and challenges of curriculum assessment in OBE.

Using the software NVivo 11 (Leech and Onwuegbuzie, 2011), one assigned panel member per article extracted the parameters and evidence from each text to systematically synthesize all remaining articles. Afterward, articles were split into two groups according to the type of data used in the study: (1) articles using primary data (case studies) and (2) articles using secondary data (reviews or methodological discussions). We decided to discuss certain contents only based on primary data, and others based on both primary and secondary data.

Abstract & title screening and full-text screening

Using the systematic review platform Covidence (Götz, 2018) and applying inclusion criteria to article titles and abstracts, 43 out of 525 articles were identified as appropriate for full-text screening. The review panel conducted a full-text screening of the 43 articles, of which 23 articles were determined to not meet the inclusion criteria and were removed. This resulted in a total of 20 articles that met the inclusion criteria and addressed the study question. Figure 2 outlines the PRISMA flow diagram of the screening process used in the systematic review (Moher et al., 2009). Categorization of the articles based on data type (primary versus secondary data) resulted in seven articles that used primary data and 13 articles based on secondary data. The seven primary data papers dealt with 13 different case studies, each of which is about the assessment of a specific undergraduate program. Of these 13 majors, six regarded Science, Technology, Engineering, and Mathematics (STEM) programs, and seven were about programs in humanities, social sciences, or art (Table 5).

Figure 2. PRISMA flow diagram (Moher et al., 2009) generated by the software Covidence, summarizing the literature screening process from the number of articles identified through the database search to the final number of studies included in the qualitative data extraction.
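As a consistency check on the counts reported above, the screening funnel can be expressed as simple arithmetic; this minimal sketch restates the numbers from the text (525 identified, 58 duplicates, 43 full texts, 23 full-text exclusions) and derives the intermediate steps.

```python
# PRISMA screening funnel, using the counts reported in the text.
identified = 525          # records found with the final search terms
duplicates = 58           # removed before screening
screened = identified - duplicates              # 467 titles/abstracts screened
full_text = 43            # deemed appropriate for full-text screening
excluded_title_abstract = screened - full_text  # 424 excluded at this stage
excluded_full_text = 23   # did not meet inclusion criteria on full read
included = full_text - excluded_full_text       # 20 articles synthesized

assert screened == 467 and included == 20
print(f"{screened} screened -> {full_text} full texts -> {included} included")
```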
Table 5. Publications reviewed and synthesized in the systematic literature review that address the study question, "In outcomes-based education in higher education curricula, what are practices for curriculum assessment that incorporate learning outcomes".

| Reference | Article title | Primary (P) or secondary (S) data | Study site (only P) | Assessed programs (only P) |
|---|---|---|---|---|
| Barrette and Paesani (2018) | Conceptualizing cultural literacy through student learning outcomes assessment | P | Wayne State University | Foreign Language |
| Beasley et al. (2018) | Systematic plan of evaluation part I: Assessment of end-of-program student learning outcomes | S | | |
| Benson and Dresdow (1998) | Systemic decision application: linking learning outcome assessment to organizational learning | S | | |
| Dunning (2014) | Developing a Competency-Based Assessment Approach for Student Learning | P | Troy University | Public Administration |
| Felder and Brent (2003) | Designing and Teaching Courses to Satisfy the ABET Engineering Criteria | S | | |
| Feldhaus (2002) | Assessment of Undergraduate Risk Management and Insurance Programs | S | | |
| Gosselin et al. (2013) | Integration of sustainability in higher education: three case studies of curricular implementation | P | Northern AZ University, University of Nebraska-Lincoln, Kean University | Environmental Studies; Earth Sciences and Environmental Sustainability; Sustainability Science |
| Hartel and Gardner (2003) | Making the Transition to a Food Science Curriculum Based on Assessment of Learning Outcomes | S | | |
| Hartel and Iwaoka (2016) | A Report from the Higher Education Review Board (HERB): Assessment of Undergraduate Student Learning Outcomes in Food Science | S | | |
| Houston (2005) | Outcomes Assessment for Beginning and Intermediate Spanish: One Program's Process and Results | P | Saint Louis University | Spanish |
| Joyner-Melito (2016) | Curriculum Mapping: A Method to Assess and Refine Undergraduate Degree Programs | P | University of ID | Food Science |
| Mathews and Hansen (2004) | Ongoing Assessment of a University Foreign Language Program | S | | |
| McGourty et al. (1998) | Developing a Comprehensive Assessment Program for Engineering Education | S | | |
| McGourty (1999) | Four Strategies to Integrate Assessment into the Engineering Educational Environment | S | | |
| Metzler et al. (2017) | The Aspirational Curriculum Map: A Diagnostic Model for Action-Oriented Program Review | S | | |
| Myers et al. (2009) | Assessing an Economics Programme: Hansen Proficiencies, ePortfolio, and Undergraduate Research | S | | |
| Swarat et al. (2017) | How Disciplinary Differences Shape Student Learning Outcome Assessment: A Case Study | P | California State University | Physics; history; civil engineering; child and adolescent studies |
| Thompson and Koys (2010) | The Management Curriculum and Assessment Journey: Use of Baldrige Criteria and the Occupational Network Database | S | | |
| Tonni and Oliver (2013) | A Delphi approach to define learning outcomes and assessment | S | | |
| Wright et al. (2017) | Assessment from an educational development perspective | P | University of Iowa, University of WI Madison, University of Michigan, Duke University | Gender and Sexuality Studies; Business; Art |
Figure 3. Frequency of use of assessed sources for curriculum assessment in 13 reviewed case studies.

Results and Discussion

A. Case study (primary data) results

Following the reviewed literature, individual faculty are most commonly in charge of the administration and coordination of curriculum assessment, followed by assessment committees comprising core faculty members, university administrators, and independent consultants. Further assessment coordinators include deans and department heads, whose primary role is to appoint the assessment coordinator or committee.

In all reviewed case studies, the curriculum assessment is based on specific documents, which we term the assessed source. Six out of 13 case studies use one single document as the assessed source, three case studies use two or three documents, three base their assessment on more than three documents, and in one case, the assessed sources are not specified. Figure 3, which illustrates the assessed sources and their frequency of occurrence across the thirteen reviewed case studies, shows that the most common sources are student assignments, followed by student surveys and interviews. Table 6 presents specifications regarding the diverse sources for assessment. Accredited survey tools, developed and assessed by external certifiers, are common when curricula aim for accreditation by third-party institutions. Although the reviewed studies indicate which sources were used for assessment, they do not discuss their effectiveness and adequacy in the actual assessment process.

Table 6. Documents and data (assessed sources) used as the basis for curriculum assessment.

| Assessed sources | Specification |
|---|---|
| Student assignments | Student scores of selected course assignments, exams (or specific exam questions), narratives (not further specified), oral presentations, poster presentations, research proposals, or diverse writings (Dunning, 2014; Joyner-Melito, 2016; Swarat et al., 2017; Wright et al., 2017), in some cases limited to the evaluation of senior students (Dunning, 2014). Case analyses emphasizing the application of analytical skills and knowledge gained from curriculum courses (Dunning, 2014). Assessment of capstone course projects and oral presentations (Gosselin et al., 2013). Student scores of assignments purposely designed for curriculum assessment (Swarat et al., 2017). |
| Student surveys and interviews | Exit interviews, not specified (Gosselin et al., 2013; Houston, 2005; Swarat et al., 2017). |
| Accredited survey tool | Adaptation of an external exam instrument to program needs (Dunning, 2014; Swarat et al., 2017). Certified placement exam results (Houston, 2005). Application of an external exam in a capstone course (Gosselin et al., 2013). |
| Student portfolios | Purposeful collection of student work (essays, presentations, projects, creative products) during their time as students; may include videotaped presentations and audio-recorded interviews; assessment different from course grading (Barrette and Paesani, 2018; Houston, 2005). |
| Course / instructor evaluations | Student satisfaction survey, usually course- and instructor-specific (Houston, 2005; Wright et al., 2017). |
| Course syllabi | Congruence of course syllabi and course PLOs with program PLOs (Barrette and Paesani, 2018; Joyner-Melito, 2016). |
| Registrar data | Student performance (average grading) in junior and senior level courses (Gosselin et al., 2013). Student experience and demographics, both not specified (Wright et al., 2017). |

Joyner-Melito (2016) provides insight into curriculum assessment based on student assignments, one of the most common procedures: using a curriculum map, the assessment coordinator identifies a course which addresses a particular PLO; then, the course instructor reviews their course syllabus and identifies assignments, readings, or other activities addressing this PLO; the instructor reports the students' performance in this assignment, usually the average grading, to the assessment coordinator; the coordinator then contrasts the performance with certain threshold values for each PLO, indicating its successful or unsuccessful facilitation through the curriculum.
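A minimal sketch of the threshold comparison at the end of this procedure is shown below; the PLO names, threshold values, and reported averages are hypothetical, since the reviewed studies do not publish their actual figures.

```python
# Hypothetical PLO thresholds (minimum average score, as a fraction of 1.0).
PLO_THRESHOLDS = {
    "PLO 1: systems thinking": 0.75,
    "PLO 2: written communication": 0.70,
}

# Average grading reported by instructors for the assignment mapped to each PLO.
reported_averages = {
    "PLO 1: systems thinking": 0.81,
    "PLO 2: written communication": 0.64,
}

def facilitation_report(averages, thresholds):
    """Contrast each PLO's reported average with its threshold, as the
    assessment coordinator does in the procedure described above."""
    return {plo: ("successfully facilitated" if avg >= thresholds[plo]
                  else "not successfully facilitated")
            for plo, avg in averages.items()}

for plo, verdict in facilitation_report(reported_averages, PLO_THRESHOLDS).items():
    print(f"{plo}: {verdict}")
```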
The assessment criteria used in each of the case studies depend on the program and level of assessment. Of the seven case studies that specify their assessment criteria, six state that their goal is to measure how efficiently programs addressed their PLOs.

Table 7. Example of a calibrated assessment rubric: Troy University, Master of Public Administration, PLO 3 ("Evaluate the primary sources of revenue at all levels of government based on the principles of taxation"), applied to an assignment discussing the potential for implementing transaction-based taxes in a given case. Summarized and modified from Dunning (2014).

| Does not meet expectations | Meets expectations | Exceeds expectations |
|---|---|---|
| The discussion does not fully identify the primary sources of revenues at all levels of government or evaluate their effectiveness. | The discussion fully identifies the primary sources of revenues at all levels of government and evaluates their effectiveness. | The discussion provides a complete description of the primary sources of revenues at all levels of government |

While assessment execution varies greatly among the case studies, most were systematic in their assessment procedures. For example, Swarat et al. (2017) note that the assessment of PLOs in a physics B.S. program was


administered by reviewing the students' performance in specific assignments (selected based on their alignment to PLOs), such as capstone presentations and exam scores. Assessment results were then consulted to modify and improve the overall program.

In most case studies, curriculum mapping allows the assessment administrators to adequately select the appropriate assessment tools. One case study highlights how the assessment administrator went so far as to complete a gap/redundancy analysis from the curriculum map to illustrate if any PLOs were not being addressed, and/or if they were becoming redundant by being introduced in too many courses (Joyner-Melito, 2016).
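Treating the curriculum map as a course-by-PLO matrix makes this gap/redundancy analysis mechanical; the sketch below illustrates the idea with a hypothetical map and an arbitrary redundancy cutoff, neither of which comes from the reviewed studies.

```python
# Hypothetical curriculum map: course -> set of PLOs it addresses.
CURRICULUM_MAP = {
    "Intro course": {"PLO 1"},
    "Methods course": {"PLO 1", "PLO 2"},
    "Capstone": {"PLO 1", "PLO 2"},
}
ALL_PLOS = {"PLO 1", "PLO 2", "PLO 3"}
REDUNDANCY_CUTOFF = 2  # arbitrary: flag PLOs introduced in more courses than this

def gap_redundancy_analysis(curriculum_map, all_plos, cutoff):
    """Find PLOs no course addresses (gaps) and PLOs addressed by more
    courses than the cutoff (redundancies)."""
    coverage = {plo: sum(plo in plos for plos in curriculum_map.values())
                for plo in all_plos}
    gaps = sorted(plo for plo, n in coverage.items() if n == 0)
    redundancies = sorted(plo for plo, n in coverage.items() if n > cutoff)
    return gaps, redundancies

gaps, redundancies = gap_redundancy_analysis(CURRICULUM_MAP, ALL_PLOS,
                                             REDUNDANCY_CUTOFF)
print("Gaps:", gaps)               # -> ['PLO 3'] (addressed nowhere)
print("Redundant:", redundancies)  # -> ['PLO 1'] (in 3 courses, cutoff 2)
```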
To ensure reliability among those who were assessing the source documents evaluated during curriculum assessment, four out of 13 reviewed studies utilize calibrated assessment rubrics to decrease the variability among assessment results. These rubrics resemble rubrics used for the assessment of student performance in exams or assignments (see Table 7 for an example). In none of the 13 reviewed case studies was data about an evidenced measurement of the quality and functioning of the assessment process identified.

B. Primary and secondary data results

Regarding motives for implementing curriculum assessment, six of twenty reviewed articles point to an assessment process mandated and directed by external accrediting agencies like the Accreditation Board for Engineering and Technology (ABET), an institution which certifies natural science, computing, and engineering programs, or the Commission on Collegiate Nursing Education (CCNE), which certifies nursing programs in response to industry demand (ABET, 2020; CCNE, 2019; McGourty, 1999).

Three of 20 publications mention explicit preconditions for initiating an assessment process, including (i) agreement on PLOs, (ii) development of a coordinated intended curriculum based on meeting these PLOs, (iii) mapping of the intended curriculum, (iv) development of a standardized syllabus template, (v) implementation of the curriculum, (vi) assessment training for faculty and staff (the training may continue during assessment), and (vii) development of a coordinated assessment plan.

A quarter of the reviewed papers suggest developing an assessment plan before initiating a curriculum assessment process (Beasley et al., 2018; Felder and Brent, 2003; Joyner-Melito, 2016; McGourty, 1999; Myers et al., 2009; Wright et al., 2017). Following these sources, a systematic assessment plan should involve the assessment frequency, ways to integrate faculty and staff, assessment methods, preparation of an assessment toolbox (templates and forms for all assessors), expected levels of achievement, a data analysis strategy, and rules about the decision making based on the data.

Assessment implementations commonly begin with developing clear learning goals or outcomes and then selecting the best assessment tools to gather information about student achievement of the goals (Hartel and Gardner, 2003). A few reviewed articles emphasize that the appropriate selection of assessment methods (direct or indirect assessment indicators, or a mix of both) depends on the nature of the assessed program. Direct indicators of learning evaluate how the curriculum impacts student learning; they measure the students' mastering of the PLOs (Benson and Dresdow, 1998). In contrast, indirect indicators serve to continually improve the curriculum by measuring the professional performance of graduated students (based on their self-assessment or the opinion of external stakeholders such as employers). Indirect indicators may also include the perception of instructors, as in the case of syllabus analysis (Feldhaus, 2002; Hartel and Gardner, 2003; Joyner-Melito, 2016). Direct indicators are more frequently applied in curriculum assessment than indirect ones (Table 8).

Table 8. Direct and indirect tools for curriculum assessment highlighted in the reviewed articles combining primary and secondary data (n=20).

| Assessment tool | Number of reviewed articles (n=20) recommending the assessment tool | Direct (D) or indirect (I) assessment indicators (Hartel and Gardner, 2003) |
|---|---|---|
| Courses with embedded additional assessment | 3 | D |
| Tests and evaluations | 3 | D |
| Student portfolio of issue papers | 2 | D |
| Capstone course evaluation (or only final capstone presentations) | 2 | D |
| Thesis evaluation | 1 | D |
| Exit competency exam | 1 | D |
| Employer surveys | 2 | I |
| Alumni surveys (after 5 years) | 2 | I |
| Salary surveys | 1 | I |
| Internship evaluations | 1 | I |
| Student/graduate surveys | 2 | I |
| External reviewers | 1 | I |
| Curriculum / syllabus analysis | 1 | I |

Two articles (Felder and Brent, 2003; Hartel and Gardner, 2003) suggest that once the assessment process concludes, the following documents and processes should be developed as part of the follow-up process:

• Program outcome assessment matrix, which indicates which courses might be modified,
• Course assessment matrix for each of those courses, suggesting areas that require strengthening,
• Teaching plan suggesting pedagogical strategies to increase alignment with PLOs,
• Updated curriculum map,
• Identification, review, and (if necessary) modification of key institutional practices to ensure alignment with educational outcomes and the assessment process,
• Scheduling of a process to re-evaluate PLOs, and
• Documentation of all decision making based on the use of the assessment data.

In terms of frequency of assessment, two publications recommend initial baseline studies right after the creation of a new program (Gosselin et al., 2013; Houston, 2005). Of all reviewed publications, about a third (seven out of 20) give explicit advice about the ideal interval between assessment processes: three suggest curriculum assessments every second or third year, two recommend yearly assessment, and two recommend assessment in a five-year interval.

Seven out of 20 papers discuss the advantages of curriculum assessment employing PLOs. These advantages arise in three main areas: clarification of expectations, faculty and staff engagement, and program improvement. They can be categorized into advantages for three distinct audiences: students, faculty, and the broader institution. One paper identifies clarity of expectations and self-assessment of the learning process as an advantage for student learning. Six out of twenty articles identified advantages for faculty, which include advantages at the program level. These advantages included:

• Facilitation of evidence-based teaching (two of 20 publications reviewed),
• Improvement in designing and applying an appropriate pedagogy (2/20),
• Facilitation of student assessment and grading linked to PLOs (2/20),
• Constant update of course syllabi based on PLOs (1/20),
• Consistency between program mission (educational objectives) and teaching (2/20),
• Development and examination of the optimum sequence of courses within a curriculum (2/20),
• Identification of gaps between the intended PLOs and actual student learning (2/20),
• Identification of gaps between the intended PLOs and student learning opportunities (2/20),
• Increased collaboration and alignment between faculty around courses (1/20), and
• Improvement in program development and curriculum in general (4/20).

Six out of 20 articles refer to advantages for the program and the institution, especially thanks to an explicit link between PLOs and the assessment process. These advantages include:

• Coordination of curricular activities (2/6),
• Enhanced dialogue among faculty groups teaching the same course (1/6),
• Communication of learning goals (1/6),
• Improvement of relations with broader stakeholders, students, alumni, and employers (1/6), and
• Generally improved faculty and staff engagement (1/6).

None of the reviewed articles provides data about the effectiveness of curriculum assessment in general or the functionality of their proposed approaches. However, twenty percent of the reviewed publications discuss the challenges and disadvantages of curriculum assessment in OBE based on subjective impressions of the authors. In this context, two publications (Barrette and Paesani, 2018; Hartel and Gardner, 2003) report that implementing PLO-based assessment can be difficult, particularly if the commitment of the faculty is weak. Accordingly, few instructors respond well to being told what to do in their classes. Hence, top-down approaches to curriculum assessment may create more resistance than support among the faculty needed to carry out and respond to the assessment. This is especially true when the value of the assessment process is not clear to faculty and it is instead perceived as a burden on time and energy (Barrette and Paesani, 2018). Resistance is spurred by fears that PLO-based assessment may lead to unwanted change or even retraining requirements when faculty feel their current courses and programs are working well enough (Hartel and Gardner, 2003).

Occasionally, PLOs lack specificity and prove difficult to measure in practice (Beasley et al., 2018; Dunning, 2014), especially when PLOs originate at the institution level but are implemented at the program level without sufficient attention to adapting the PLOs to the discipline being assessed (Barrette and Paesani, 2018).

Synthesis of findings

Our review highlights that the literature about program assessment in higher education is scarce and that existing publications lack evidence about the effectiveness of applied assessment practices. Furthermore, we identified notable variations in the applied practices themselves. Both the scope and the applied practices at the 13 different institutions we covered in our review are widely inconsistent. As similar programs sometimes apply fundamentally different assessment practices, the explanation of this variation cannot be limited to the diversity of the programs.


The nature of a specific assessment procedure is most strongly defined by which assessment indicators (direct versus indirect) are used. Direct indicators evaluate how the curriculum impacts student performance as related to program LOs, whereas indirect indicators evaluate how the students' learning process impacts external stakeholders and the students' perception of learning achievements.

There is no explicit correlation between the assessed majors and the recommended assessment practice. However, we observed the tendency that STEM and economics programs rely more strongly on direct assessment indicators (e.g., average student scores from assignments), while language programs favor indirect indicators (e.g., student surveys and interviews). Mixed (combined direct and indirect) indicators are common in social sciences programs (yet we also identified STEM and language programs using mixed indicators). Swarat et al. (2017) suggest that applied assessment practices in a program differ based on the epistemological norms and culture in the respective science. This circumstance may explain the preference of STEM and economics programs for direct assessment indicators, as the analysis methods (such as analysis of variance) used to process assessment data are common in these sciences (Jenner et al., 2004). On the other hand, indirect indicators such as interviews require qualitative methods, which are prevalent in linguistics (Abbuhl et al., 2014).

Among the direct indicators, average student scores in selected assignments are most frequent. Numerous institutions emphasize assignments that are part of regular course activities, while other institutions design assignments exclusively for curriculum assessment or use assignments developed by external certifying entities.

The reviewed case studies highlight the importance of having a designated group of individuals, commonly faculty or program coordinators, to administer and coordinate curriculum assessment. Occasionally, external stakeholders and alumni are involved in curriculum assessment.

The core assessment criteria in most case studies are PLOs. A few case studies note specific protocols to ensure the reliability of the assessment procedure, including assessment rubrics similar to those used for grading student assignments. Yet, the provided evidence is not strong enough to ensure that this is the most appropriate format.

Several reviewed articles recommend an assessment plan before as well as follow-up activities after assessment. Follow-up activities include a teaching plan suggesting pedagogical strategies to increase the alignment of all courses with the PLOs. Additionally, a constant PLO reevaluation is recommended. According to several publications, assessment should be carried out in an interval of two or three years.

Proposal for a curriculum assessment procedure

As of 2020, the Sustainable Food and Bioenergy Systems Major (SFBS) at Montana State University will be implementing an extensive curriculum assessment process based on the findings of the present review. Especially, we considered the diversity of assessment tools that were applied in those of the reviewed articles which provided the most holistic approach to curriculum assessment. Consequently, a 360 Degree Assessment process was developed, considering:

• how students master the skills related to the eight adaptable PLOs;
• how senior students perceive their learning process regarding each PLO;
• how instructors perceive the students' learning achievements regarding each PLO;
• how SFBS alumni assess the utility of the SFBS curriculum for their professional careers; and
• how employers of SFBS alumni assess what their employees learned in the SFBS program.

Our preliminary assessment plan includes different procedures for each of the five components (Table 9). The program assessment will be guided by a committee including faculty and students, directed by the program coordinator. We plan to assess the SFBS program in an interval of two to three years and based on a regular assessment of the functionality of our PLOs.

Table 9. Set of assessment tools applied in the assessment of the Sustainable Food and Bioenergy Systems Major, Montana State University.

| Target group | Method | Specification |
|---|---|---|
| Students (skill mastering) | Assignments | For each PLO, we will develop an assignment to prove whether the students master the PLO at an advanced level of proficiency. These (ungraded) assignments will be implemented in junior and senior courses. |
| Students (perception) | Survey | Students will take the survey in their capstone course. They will be asked how they assess the facilitation of each of the PLOs throughout the SFBS curriculum. |
| Instructors | Semi-structured interviews | After a faculty meeting, we will interview each SFBS instructor who teaches a course that addresses a PLO at an advanced level of proficiency about how they perceive the SFBS curriculum in terms of facilitating the PLOs. |
| Alumni | Focus groups | During an online conference or alumni meeting, we will have focus groups discussing the effectiveness of the PLOs (and how they were taught) for their application in a professional workspace. |
| Employers | Survey | Selected employers will take online surveys regarding the preparedness of SFBS alumni for their jobs. |
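The five-component plan in Table 9 can also be captured as a small data structure, which makes it straightforward to schedule and track each instrument; the sketch below is one possible encoding, with the two-to-three-year cadence from the text and otherwise hypothetical field names.

```python
from dataclasses import dataclass

@dataclass
class AssessmentComponent:
    target_group: str
    method: str
    setting: str  # where or when the instrument is administered

# 360 Degree Assessment plan for the SFBS major (Table 9), encoded as data.
SFBS_PLAN = [
    AssessmentComponent("Students (skill mastering)", "ungraded PLO assignments",
                        "junior and senior courses"),
    AssessmentComponent("Students (perception)", "survey", "capstone course"),
    AssessmentComponent("Instructors", "semi-structured interviews",
                        "after a faculty meeting"),
    AssessmentComponent("Alumni", "focus groups",
                        "online conference or alumni meeting"),
    AssessmentComponent("Employers", "online survey",
                        "sent to selected employers"),
]

ASSESSMENT_INTERVAL_YEARS = (2, 3)  # cadence stated in the text

for c in SFBS_PLAN:
    print(f"{c.target_group}: {c.method} ({c.setting})")
```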
Summary and Conclusions

Institutions of higher education prepare their students for becoming impactful members of their communities and of a well-prepared workforce capable of facing the complex and global social, economic, and environmental challenges of our times. To increase the impact, accuracy, and objectivity of their educational efforts, an increasing number of academic programs have defined explicit and specific program learning outcomes (PLOs) for their curricula (Caspersen et al., 2017). The PLOs ensure that all stakeholders involved in the teaching of a determined program (students, instructors, researchers, coordinators, administrators, and external collaborators) contribute to the achievement of the same outcomes in student learning, while simultaneously maintaining a diversity of teaching styles and pedagogical approaches (Harden, 2007; Knight, 2001).

Only defining outcomes is not enough; there is a need to ensure that new and existing programs are effective in meeting their PLOs. Outcome-based education (OBE) is a framework that systematically serves to align a program's curriculum, mainly the course offer and sequence as well as core pedagogical activities, to achieve its PLOs (Spady, 1994). The framework offers guidelines to develop curricula of new programs and to analyze existing ones. For programs already being implemented and where clear PLOs are defined, OBE provides the process of curriculum assessment to measure the effectiveness of the design, delivery, and direction of an educational program based on its PLOs (McGourty et al., 1998).

Curriculum assessment (sometimes also labeled program assessment) serves to verify whether the intended curriculum of a program is congruent with what students are learning, the learned curriculum (Plaza et al., 2007). In other words, it puts the rule to the test: Is the curriculum

facilitating all the PLOs it intends to generate? Consequently, numerous programs have made curriculum assessment an essential component of their efforts to improve their curricula. Critics of OBE claim that both the conceptual background of OBE and the evidence to prove its efficiency are weak and insufficient for assessing contemporary programs with an increasing emphasis on global and holistic thinking (Barman et al., 2014; Bryan and Clegg, 2019). Through our systematic literature review, we can confirm that the literature on curriculum assessment is not as abundant and specific as, for example, the literature about curriculum mapping, another OBE process (Cuevas et al., 2010).

The objective of our systematic review was to identify the best practices for curriculum assessment in OBE based on evidence of successful program assessment. We could not find such evidence in the existing literature, as even case studies do not provide an analysis of the effectiveness of the applied assessment processes. However, we identified commonly used assessment strategies, and we also considered the subjective appraisal of these strategies by the diverse authors. In this regard, we identified the following basic recommendations for curriculum assessment:

• Existence of an assessment plan (including assessment frequency, faculty and staff integration, and assessment methods, among other components) before initiating the curriculum assessment process (Myers et al., 2009);
• Assessment implementation through a committee of involved individuals, usually faculty, to guarantee commitment and active participation (Feldhaus, 2002), and inclusion of students in the assessment process (Keating, 2014);
• Prior curriculum mapping to select the appropriate assessment tools for each course (Metzler et al., 2017);
• A mix of direct and indirect assessment indicators, including student scores of selected course assignments (preferably purposely designed for curriculum assessment) as well as exit and alumni interviews (Gosselin et al., 2013; Swarat et al., 2017);
• Calibrated assessment rubrics to decrease the variability among assessment results (Dunning, 2014);
• Follow-up activities, including a teaching plan with pedagogical strategies to increase program alignment with PLOs and a process of reevaluation of PLOs (Felder and Brent, 2003; Hartel and Gardner, 2003);
• Curriculum assessment every second or third year (Swarat et al., 2017).
Curriculum assessment helps to avoid arbitrariness and subjectivity in curricula. Therefore, we consider it an essential contribution to the continuous improvement of higher education curricula. However, we are aware that curriculum assessment has historically been pushed by the neoliberal "assessment movement," which sought to examine the effectiveness of funds invested in higher education institutions by assessing students' learning upon completion of their degrees (Tam, 2014). We do not advocate for program assessment to limit funding but to improve education.
The outcome of curriculum assessment depends on the PLOs used, the lens under which assessment occurs. Although we consider curriculum assessment an inherently neutral process, we also know that the influence of faculty on defining PLOs has decreased while the influence of external stakeholders such as industry leaders has increased (Millar, 2016). In contrast, we favor curriculum assessment based on PLOs that address far-reaching societal needs instead of short-term job market demands.
The process of curriculum assessment deserves to be supported by coherent and evidence-based guidelines. We hope that our literature review has prepared the ground for comprehensive and adaptable curriculum assessment.
Acknowledgments

We are grateful to the students, colleagues, staff, and education stakeholders who influenced our understanding of curriculum assessment. Funding for this project is provided by the United States Department of Agriculture Higher Education Challenge Grant, National Institute of Food and Agriculture (Award Number: 2018-70003-27649).

Literature Cited

Abbuhl, R., S. Gass and A. Mackey. 2014. Experimental research design. In: Podesva, R.J. and D. Sharma (eds). Research methods in linguistics. Cambridge, UK: Cambridge University Press.

ABET (Accreditation Board for Engineering and Technology). 2020. About ABET. https://www.abet.org/about-abet/. ABET. April 10, 2020.

Arafeh, S. 2016. Curriculum mapping in higher education: a case study and proposed content scope and sequence mapping tool. Journal of Further and Higher Education 40: 585-611.

Barman, L., C. Silen and K. Laksov. 2014. Outcome based education enacted: teachers' tensions in balancing between student learning and bureaucracy. Advances in Health Sciences Education 19: 629-643.

Barrette, C.M. and K. Paesani. 2018. Conceptualizing Cultural Literacy through Student Learning Outcomes Assessment. Foreign Language Annals 51: 331.

Beasley, S.F., S. Farmer, N. Ard and K. Nunn-Ellison. 2018. Systematic Plan of Evaluation Part I: Assessment of End-of-Program Student Learning Outcomes. Teaching and Learning in Nursing 13: 3-8.

Benson, J. and S. Dresdow. 1998. Systematic Decision Application: Linking Learning Outcome Assessment to Organizational Learning. Journal of Workplace Learning 10: 301.

Bloom, B.S. 1956. Taxonomy of educational objectives (Vol. 1: Cognitive domain). New York City, NY: McKay.

Bryan, C. and K. Clegg. 2019. Innovative Assessment in Higher Education: A Handbook for Academic Practitioners. London, UK: Routledge.

Burke, J.W. 1995. Outcomes, learning, and the curriculum: implications for NVQs, GNVQs, and other qualifications. London, UK: Psychology Press.

Carillo, E.C. 2007. "Feminist" teaching/teaching "feminism". Feminist Teacher 18: 28-40.

Caspersen, J., N. Frølich and J. Muller. 2017. Higher education learning outcomes – Ambiguity and change in higher education. European Journal of Education 52: 8-19.

CCNE (Commission on Collegiate Nursing Education). 2019. CCNE Accreditation. https://www.aacnnursing.org/CCNE. CCNE. April 10, 2020.

CWUR (Center for World University Rankings). 2019. World's Top Universities. https://cwur.org/2019-2020.php. CWUR. November 7, 2019.

Cuevas, N.M., A.G. Matveev and K.O. Miller. 2010. Mapping general education outcomes in the major: Intentionality and transparency. Peer Review 12: 10.

Cumming, A., A. Cumming and M. Ross. 2007. The Tuning Project for Medicine – learning outcomes for undergraduate medical education in Europe. Medical Teacher 29: 636-641.

Dunning, P.T. 2014. Developing a Competency-Based Assessment Approach for Student Learning. Teaching Public Administration 32: 55.

Eichinger, M. 2019. Transformational change in the Anthropocene epoch. The Lancet Planetary Health 3: 116-117.

Felder, R.M. and R. Brent. 2003. Designing and teaching courses to satisfy the ABET engineering criteria. Journal of Engineering Education 92: 7-25.
Feldhaus, W.R. 2002. Assessment of undergraduate risk management and insurance programs. Risk Management and Insurance Review 5: 155-171.

Freeman, M., P. Hancock, L. Simpson, C. Sykes, P. Petocz, I. Densten and K. Gibson. 2008. Business as usual: A collaborative and inclusive investigation of existing resources, strengths, gaps and challenges to be addressed for sustainability in teaching and learning in Australian university business faculties. ABDC Scoping Report 1: 1-54.

Gluga, R., J. Kay and T. Lever. 2012. Foundations for modeling university curricula in terms of multiple learning goal sets. IEEE Transactions on Learning Technologies 6: 25-37.

Gosselin, D., R. Parnell, N.J. Smith-Sebasto and S. Vincent. 2013. Integration of sustainability in higher education: three case studies of curricular implementation. Journal of Environmental Studies and Sciences 3: 316-330.

Götz, S. 2018. Supporting systematic literature reviews in computer science: the systematic literature review toolkit. In: Proceedings of the 21st ACM/IEEE International Conference on Model Driven Engineering Languages and Systems. Copenhagen, Denmark: 14-19 Oct.

Harden, R.M. 2001a. AMEE Guide No. 21: Curriculum mapping: a tool for transparent and authentic teaching and learning. Medical Teacher 23: 123-137.

Harden, R.M. 2002. Developments in outcome-based education. Medical Teacher 24: 117-120.

Harden, R.M. 2007. Outcome-based education: the future is today. Medical Teacher 29: 625-629.

Hartel, R. and D. Gardner. 2003. Making the transition to a food science curriculum based on assessment of learning outcomes. Journal of Food Science Education 2: 32-39.

Hartel, R. and W. Iwaoka. 2016. A report from the Higher Education Review Board (HERB): Assessment of undergraduate student learning outcomes in food science. Journal of Food Science Education 15: 56-62.

Houston, T. 2005. Outcomes Assessment for Beginning and Intermediate Spanish: One Program's Process and Results. Foreign Language Annals 38: 366.

Jenner, B., U. Flick, E. von Kardoff and I. Steinke. 2004. A companion to qualitative research. Newbury Park, CA: Sage.

Joyner-Melito, H.S. 2016. Curriculum Mapping: A Method to Assess and Refine Undergraduate Degree Programs. Journal of Food Science Education 15: 83.

Keating, S.B. 2014. Curriculum development and evaluation in nursing. New York, NY: Springer.

Killen, R. 2000. Outcomes-based education: Principles and possibilities. Unpublished manuscript. Newcastle, UK: University of Newcastle, Faculty of Education.

Knight, P.T. 2001. Complexity and curriculum: a process approach to curriculum-making. Teaching in Higher Education 6: 369-381.

Kopera-Frye, K., J. Mahaffy and G.M. Svare. 2008. The Map to Curriculum Alignment and Improvement. Collected Essays on Learning and Teaching 1: 8-14.

Koppang, A. 2004. Curriculum mapping: Building collaboration and communication. Intervention in School and Clinic 39: 154-161.

Krathwohl, D.R. 2002. A Revision of Bloom's Taxonomy: An Overview. Theory into Practice 41: 212.

Leech, N.L. and A.J. Onwuegbuzie. 2011. Beyond constant comparison qualitative data analysis: Using NVivo. School Psychology Quarterly 26: 70.

Mathews, T.J. and C.M. Hansen. 2004. Ongoing Assessment of a University Foreign Language Program. Foreign Language Annals 37: 630.

McGourty, J. 1999. Four strategies to integrate assessment into the engineering educational environment. Journal of Engineering Education 88: 391-395.

McGourty, J., C. Sebastian and W. Swart. 1998. Developing a comprehensive assessment program for engineering education. Journal of Engineering Education 87: 355-361.

Metzler, E., G. Rehrey, L. Kurz and J. Middendorf. 2017. The Aspirational Curriculum Map: A Diagnostic Model for Action-Oriented Program Review. To Improve the Academy 36: 156-167.

Millar, V. 2016. Interdisciplinary curriculum reform in the changing university. Teaching in Higher Education 21: 471-483.

Mitchell, D.E. and W.G. Spady. 1978. Organizational contexts for implementing outcome based education. Educational Researcher 7: 9-17.

Moher, D., A. Liberati, J. Tetzlaff and D.G. Altman. 2009. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Annals of Internal Medicine 151: 264-269.

Morcke, A.M., T. Dornan and B. Eika. 2013. Outcome (competency) based education: an exploration of its origins, theoretical basis, and empirical evidence. Advances in Health Sciences Education 18: 851-863.
Myers, S.C., M.A. Nelson and R.W. Stratton. 2009. Assessing an Economics Programme: Hansen Proficiencies, ePortfolio, and Undergraduate Research. International Review of Economics Education 8: 87-105.

Olsson, P., M.-L. Moore, F.R. Westley and D.D.P. McCarthy. 2017. The concept of the Anthropocene as a game-changer: a new context for social innovation and transformations to sustainability. Ecology and Society 22: 31.

Plaza, C.M., J.R. Draugalis, M.K. Slack, G.H. Skrepnek and K.A. Sauer. 2007. Curriculum mapping in program assessment and evaluation. American Journal of Pharmaceutical Education 71: 1-8.

Schardt, C., M.B. Adams, T. Owens, S. Keitz and P. Fontelo. 2007. Utilization of the PICO framework to improve searching PubMed for clinical questions. BMC Medical Informatics and Decision Making 7: 1-6.

Slavich, G.M. and P.G. Zimbardo. 2012. Transformational teaching: Theoretical underpinnings, basic principles, and core methods. Educational Psychology Review 24: 569-608.

Spady, W.G. 1994. Outcome-Based Education: Critical Issues and Answers. Arlington, VA: American Association of School Administrators.

Steffen, W., K. Richardson, J. Rockström, S.E. Cornell, I. Fetzer, E.M. Bennett, R. Biggs, S.R. Carpenter, W. De Vries and C.A. De Wit. 2015. Planetary boundaries: Guiding human development on a changing planet. Science 347: 736.

Stenhouse, L. 1985. Research as a basis for teaching: readings from the work of Lawrence Stenhouse. Portsmouth, NH: Heinemann Educational Books.

Swarat, S., P.H. Oliver, L. Tran, J.G. Childers, B. Tiwari and J. Babcock. 2017. How Disciplinary Differences Shape Student Learning Outcome Assessment. AERA Open 3: 1-12.

Tam, M. 2014. Outcomes-based approach to quality assessment and curriculum improvement in higher education. Quality Assurance in Education 22: 158-168.

Thompson, K.R. and D.J. Koys. 2010. The management curriculum and assessment journey: Use of Baldrige criteria and the occupational network database. Journal of Leadership & Organizational Studies 17: 156-166.

Tilman, D. and M. Clark. 2014. Global diets link environmental sustainability and human health. Nature 515: 518.

Tonni, I. and R. Oliver. 2013. A Delphi approach to define learning outcomes and assessment. European Journal of Dental Education 17: e173-e180.

Tyler, R.W. 1949. Basic Principles of Curriculum and Instruction. Chicago, IL: The University of Chicago Press.

Wright, M.C., M. Goldwasser, W. Jacobson and C. Dakes. 2017. Assessment from an educational development perspective. To Improve the Academy 36: 39-49.