C. Dring, University of British Columbia, Vancouver, BC
A. Sames, University of Minnesota Twin Cities, Minneapolis, MN
Abstract
As new higher education programs emerge to prepare students to be part of a well-equipped workforce to meet the challenges of our times, and established programs revise their curricula, clear program learning outcomes (PLOs) are required. The process of curriculum (or program) assessment serves to test whether programs are effective in facilitating their PLOs. The objective of our systematic review was to identify best practices in curriculum assessment. We examined peer-reviewed literature from 1995 to 2019. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol was applied to collect evidence, reduce bias, and synthesize evidence. We screened 535 abstracts and conducted full-text screening of 43 publications, of which 20 were selected for final extraction. We found that the number of articles providing empirical evidence about effective curriculum assessment is low. We identified promising practices based on the frequency of their mention and their assessment by the authors. These practices include an assessment plan; assessment implementation through a faculty committee; using both direct and indirect assessment indicators (for example, student performance in specific assignments and student exit or alumni interviews); and an ongoing reevaluation of PLOs. Our findings sustain the development of an adaptable curriculum assessment procedure for higher education.

Introduction

Institutions of higher education are responding to the multifaceted societal challenges of our times by training students in innovative and complex skills towards preparing a well-equipped workforce. For example, human activities are having unprecedented impacts on the Earth and its systems, with implications for human and planetary health (Eichinger, 2019; Steffen et al., 2015; Tilman and Clark, 2014). These challenges require the development and implementation of social innovations to transform society (Olsson et al., 2017). As a result, new program offerings emerge in institutions of higher education while long-standing programs revise their curricula to prepare a workforce to meet societal needs. Consequently, these programs need to articulate a clearly defined set of program learning outcomes (PLOs) for students to demonstrate with regard to the acquisition of certain knowledge, skills, and attitudes (Killen, 2000). A PLO describes what is essential for all students to successfully demonstrate in a measurable way at the end of their learning experience within a given
1. Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. roland.ebel@gmx.com. 406-994-5640
2. Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. selena.ahmed@montana.edu. 406-994-5640
3. Department of Health & Human Development, Reid Hall 345, MSU Campus, Bozeman, MT, 59717. alithrntn@gmail.com. 406-994-5640
4. Plant Sciences & Plant Pathology Department, Plant BioScience Building 232, MSU Campus, Bozeman, MT, 59717. charleswatt@montana.edu. 406-994-5147
5. Centre for Sustainable Food Systems, 2357 Main Mall, Vancouver, BC, V6T 1Z4. colind@mail.ubc.ca. 778-859-1148
6. Department of Fisheries, Wildlife and Conservation Biology, 135 Skok Hall, 2003 Upper Buford Circle, St. Paul, MN 55108. same0057@umn.edu. 612-624-3600
238 NACTA Journal • Nov 2019 - Oct 2020
program (Knight, 2001). There is further need for curriculum assessment to ensure that new and existing programs are effective in meeting specific PLOs for preparing students as professionals to address the grand challenges of our times.

Outcome-based education (OBE) has gained momentum over the past five decades to prepare a professional workforce that meets societal needs. Curriculum assessment (often synonymous with 'program assessment') is a core procedure in OBE. The identification of suitable PLOs and the assessment of how a curriculum meets these PLOs are among the core purposes of OBE (Barman et al., 2014).

The way we assess our curricula, and ultimately our students, is recognized as a game-changer to improve student outcomes for future success (Bryan and Clegg, 2019). There is thus a need to identify best practices for curriculum assessment suitable in different contexts, which can be applied towards the development of an adaptable curriculum assessment tool for higher education institutions. Apart from increasing the alignment of curricula and their PLOs, such a tool would allow for the communication of common educational goals, experiences, and methods across instructors, programs, and departments. To our knowledge, no published review identifies best practices for curriculum assessment, and no comprehensive, adaptable curriculum assessment tool or protocol exists.

The objective of our paper is to address this best-practices and protocol gap and advance the field of OBE by conducting a systematic literature review on practices for curriculum assessment that incorporate PLOs. Specifically, this systematic review examines peer-reviewed literature from 1995 to 2019 that reports outcomes of concrete curriculum assessment interventions in higher education, with an emphasis on assessment practices based on PLOs. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol (Moher et al., 2009) was applied to collect evidence on the closed-frame study question: In OBE higher education curricula, what are practices for curriculum assessment that incorporate PLOs? The Population, Intervention/Exposure, Comparator, Outcome (PICO) framework (Schardt et al., 2007) elements were applied to structure this systematic review, which examines curriculum assessment as the intervention/exposure with the assessment procedure as the outcome. The PRISMA and PICO protocols were applied to reduce bias and synthesize all evidence in a replicable way. Findings from the review will inform the development of an adaptable curriculum assessment tool to evaluate the effectiveness of higher education programs.

Background on Outcome-based Education (OBE)

Overview of OBE

A curriculum is an attempt to communicate the principles and features of an educational proposal so that it can be effectively translated into practice (Stenhouse, 1985). The goal of OBE is to align curriculum, including content and activities, to PLOs, which describe what is essential for all students to successfully demonstrate at the end of their learning experience (Spady, 1994). While curriculum development was historically relatively subjective and unstructured, during the second half of the 20th century systematic curriculum development emerged on the agenda of academic discussion and educational administration (Knight, 2001). For the last fifty years, OBE has been the prevailing approach to systematic curriculum development (Barman et al., 2014).

The idea of OBE has its origins in behaviorist learning theories (Morcke et al., 2013). In the mid-20th century, Tyler [(1949) in (Morcke et al., 2013)] argued that curriculum design should be determined by explicit objectives. Tyler's student Bloom (1956) built on the approach of designing curriculum based on explicit objectives through the development of a taxonomy of educational goals, which he classified into knowledge, skills, and attitudes (Morcke et al., 2013). While Bloom's taxonomy was reviewed and updated to fit contemporary assessment standards (Krathwohl, 2002), the differentiation between knowledge, skills, and attitudes persists (Harden, 2001).

In the 1970s, constructivist approaches to curriculum design opposed OBE. In particular, Stenhouse (1985) emphasized curricula based on the students' learning process instead of the intended outcomes of this process. During the 1990s, OBE regained popularity, especially in designing curricula for pre-university education. Spady (1994), whose work is seen as a turning point in curriculum development, conserved the constructivist principle that the students' knowledge and skills are a consequence of learning experiences but subordinated this principle to measurable and observable PLOs. Specifically, Spady considered such learning experiences, which he calls learning goals, a requirement for obtaining PLOs. In contrast, he defines PLOs as an explicit demonstration of learning. Therefore, in modern OBE, PLOs serve as the reference point for the assessment and regulation of the students' proficiency (Morcke et al., 2013), with implicit consequences for course and curriculum design (Barman et al., 2014). At the end of the 1990s, OBE approaches became increasingly frequent in the development of higher education curricula, particularly in medicine, nursing, and engineering programs that expect graduates to have specific demonstrable knowledge, skills, and attitudes (Keating, 2014; Knight, 2001). Today, OBE is almost omnipresent, from language (Barrette and Paesani, 2018) to nutrition (Joyner-Melito, 2016) programs.

The OBE framework systematically and deductively facilitates and structures the development of a curriculum by first defining the final product, the PLOs. Since OBE is performance-oriented, it relates curriculum design to the students' professional careers after graduation. Accordingly, OBE can increase graduates' adaptation to a new job (Harden, 2001; Hartel and Gardner, 2003; McGourty et al., 1998). Some of the pioneers in OBE emphasized that it also seeks to strengthen students' role in the broader society (Mitchell and Spady, 1978).

The adoption of OBE has not been without resistance (Harden, 2007). The main criticism is that both the
Figure 1. Core stakeholders and processes in outcome-based education, based on input from Harden (2002) and Koppang (2004).
conceptual background of OBE and the empirical evidence to prove its efficiency are relatively weak (Barman et al., 2014). Others criticize that since most of the work is usually done by program coordinators or one specific scholar, those most affected by OBE, students and instructors, are not sufficiently involved (Keating, 2014). Another critique refers to the fact that personal and professional attributes of graduates are more complex than the skills expressed in PLOs (Barman et al., 2014). In the 1970s, Stenhouse's constructivist approach was the basis for the development of an alternative curriculum development approach, the process-based curriculum. The model emphasizes the pedagogy of a curriculum rather than which skills are facilitated (Knight, 2001). Contrary to OBE, process-based curriculum development involves flexible content but a determined pedagogy throughout the program (Table 1). The process-based model has never played a strong role in real curriculum development but has maintained influence on certain pedagogies, such as feminist teaching practices (Carillo, 2007) and transformational teaching (Slavich and Zimbardo, 2012).

Table 1. Core differences between outcome-based and process-based curriculum development (Knight, 2001).

                        Outcome-based education           Process-based curriculum development
Curriculum content      Defined and constantly updated    Flexible
Curriculum pedagogy     Commonly flexible                 Largely defined
Curriculum mapping      Tools                             No clear methodology
Position on PLOs        Key assessment parameters         Real PLOs are too complex to be objectively assessed.

Core concepts and methods of OBE: learning outcomes and curriculum mapping

Learning outcomes (rather than learning goals, learning objectives, or student competencies) are the basis of OBE. They refer to measurable skills and knowledge students master after completing their studies (Cumming et al., 2007). Learning goals describe attitudes, emotions, and values students are expected to develop as a result of their learning experiences. Since they are not measurable, learning goals are not included in the specification of curricula in OBE (Morcke et al., 2013; Spady, 1994). Learning objectives describe discrete but not necessarily quantifiable results of learning. Though they are sometimes developed at the program level, learning objectives usually refer to a course (Cumming et al., 2007). While a learning objective is characterized as an intention, the learning outcome is the projected realization of this intention (Burke, 1995). Different from PLOs, competencies are acquired by students or graduates, rather than the program and its instructors. Furthermore, competencies build on learning experiences made before post-secondary education (Morcke et al., 2013). Although there are specific differences between outcome- and competency-based education (Morcke et al., 2013), when referring to the point of graduation, identical descriptors can be used for PLOs and competencies (Cumming et al., 2007).

Curriculum mapping is a standard tool for curriculum design and assessment in OBE (Harden, 2001). Following Plaza et al. (2007), it is most commonly used to visualize the intended curriculum (what the program officers intend to facilitate), which can be distinguished from the delivered
Table 2. Initial search terms. Search term combinations were tested in the following databases: Web of Science, ScienceDirect, Wiley Online Library, ProQuest, ERIC, Educators Reference Complete, and Explora Educator's Edition.
curriculum assessment interventions in higher education with an emphasis on assessment experiences based on PLOs. The result of this discussion was to select all possible combinations of three search terms, where there were two options for the first term, as well as five options for the second and third term, respectively (Appendix 1).

To specify our search, we tested all fifty resulting search term combinations of terms listed in Table 2 in seven different publication databases (Web of Science, ScienceDirect, Wiley Online Library, ProQuest, ERIC, Educators Reference Complete, and Explora Educator's Edition). We selected these databases due to their relevance for identifying peer-reviewed publications in all scientific fields as well as being sources of high-quality publications in the field of education.

We revised the search terms through an iterative process. First, we ranked the search term combinations based on the number of publications associated with each combination, considering the median number of publications throughout the seven databases. Therewith, we identified the quartile of the least frequent combinations and discarded them. Second, we examined relevant articles resulting from the search process and refined the remaining search terms. Hence, we guaranteed a balanced and systematic review based on all significant search term combinations.

The search term testing of the initial 50 search term combinations in the seven selected databases showed that the term combinations including "Curriculum Evaluation" (as combined words) as the second search term resulted in the lowest number of associated articles (Table 4). Based on the results of the initial search term testing, we selected 24 word combinations for our final list of search terms (Table 3). Based on this list, we identified 525 articles for our review process. After removing 58 duplicates, we finally reviewed 467 publications.

The final set of search terms was processed in the same seven databases as used in the initial process of identifying search terms. We searched for the appearance of the final set of search terms in the title, abstract, or keywords of peer-reviewed articles in these databases. Additionally, in the more topic-specific databases (ERIC, Educators Reference Complete, Explora Educator's Edition), we expanded our search to the appearance of the respective search terms in the full text. We emphasized articles about case studies regarding implemented curriculum assessment processes, which we classified as primary data. During the initial process of identifying search terms, it became apparent that publications on such case studies are rare. Thus, we also included reviews, opinion papers, and other communications about curriculum assessment, which we classified as secondary data.
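The combinatorial search-term generation and quartile pruning described above can be sketched in a few lines of Python. This is an illustrative reconstruction only: the term lists and publication counts below are placeholders, not the study's actual search terms or database results.

```python
from itertools import product
from statistics import median

# Hypothetical term options: 2 options for the first slot, 5 each for the
# second and third, giving 2 * 5 * 5 = 50 combinations (as in the study).
first = ["higher education", "university"]
second = ["curriculum assessment", "curriculum evaluation", "program assessment",
          "program evaluation", "curriculum review"]
third = ["learning outcomes", "PLO", "outcome-based", "competencies", "program goals"]

combinations = [" AND ".join(c) for c in product(first, second, third)]
assert len(combinations) == 50

def prune_lowest_quartile(hits: dict[str, list[int]]) -> list[str]:
    """Rank combinations by the median publication count across the seven
    databases and discard the quartile with the fewest associated articles."""
    ranked = sorted(hits, key=lambda combo: median(hits[combo]))
    cutoff = len(ranked) // 4   # size of the lowest quartile
    return ranked[cutoff:]      # keep the remaining combinations
```

The median across databases (rather than the total) keeps one unusually large database from dominating the ranking, which matches the pruning rule described in the text.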
Table 4b. Appearance of search terms in articles (out of seven reviewed databases).
Inclusion criteria and initial article screening

For the abstract review process, the a-priori inclusion criteria were: (i) peer-reviewed publications between 1995-2019; (ii) articles published in the English language; (iii) the title, abstract, or keywords include one of our search terms; and (iv) the abstract addresses the research question directly. The exclusion criteria include (i) books, book chapters, reports, maps, newspaper articles, and non-print media; (ii) articles without an abstract; (iii) abstracts written in poor English or in a language other than English; (iv) articles published before 1995; (v) articles about curriculum assessment in countries other than the USA, the UK, Japan, Canada, Switzerland, France, Denmark, Israel, Germany, and Sweden (the ten leading countries in the Center for World University Rankings (CWUR) index (CWUR, 2019)); (vi) articles that do not discuss curriculum assessment criteria and procedure; (vii) articles which are not about higher education (for example, elementary, middle, or secondary education); (viii) articles about student assessment (for example, exams or assignments); (ix) articles about instructor or course assessment; (x) articles about self-assessment; and (xi) articles about program development without an assessment component.

The title and abstract of each article were screened for their relevance to addressing the study question. Two review panel members screened each article to minimize reviewer bias. Discrepancies on whether to exclude or include a specific article were discussed by at least three reviewers to resolve conflicts. All articles that met the inclusion criteria were included in the full-text screening; all other articles were excluded.

The identified articles from the initial screening that met the inclusion criteria were critically appraised by the review panel to ensure they provided evidence to address the study question and met the a-priori inclusion criteria. Two panel members screened each full text. Again, discrepancies were discussed and resolved by at least three reviewers. After excluding inappropriate articles, the final set of publications was included in the quantitative synthesis of our systematic review.

The specific parameters and evidence extracted from each article during the synthesis included: (i) article title, (ii) article author(s) and publication year, (iii) journal, (iv) observed program(s), (v) definition (understanding) of
Figure 2. PRISMA Flow Diagram (Moher et al., 2009) generated by the software Covidence, summarizing the literature screening process from the number of articles identified through the database searching to the final number of studies included in the qualitative data extraction.
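The automatable portion of the a-priori criteria can be expressed as a simple pre-filter, sketched below. The record fields and their names are assumptions for illustration; in the study, every record was additionally screened manually by two reviewers, with conflicts resolved by a third.

```python
# Minimal sketch of the a-priori screening criteria as an automated pre-filter.
# Field names (year, language, country, ...) are hypothetical.
ELIGIBLE_COUNTRIES = {"USA", "UK", "Japan", "Canada", "Switzerland",
                      "France", "Denmark", "Israel", "Germany", "Sweden"}
EXCLUDED_TYPES = {"book", "book chapter", "report", "map", "newspaper"}

def passes_screening(record: dict) -> bool:
    """Apply the inclusion window and the automatable exclusion criteria."""
    if not (1995 <= record.get("year", 0) <= 2019):
        return False   # inclusion (i) / exclusion (iv): 1995-2019 only
    if record.get("language") != "English":
        return False   # inclusion (ii): English-language articles
    if not record.get("abstract"):
        return False   # exclusion (ii): articles without an abstract
    if record.get("country") not in ELIGIBLE_COUNTRIES:
        return False   # exclusion (v): CWUR top-ten countries only
    if record.get("type") in EXCLUDED_TYPES:
        return False   # exclusion (i): non-article media
    return True
```

Criteria that require judgment, such as whether the abstract addresses the research question directly, cannot be automated and remain with the human reviewers.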
Table 5. Publications reviewed and synthesized in the systematic literature review that address the study question, “In outcomes-based education in higher education curricula, what are practices for curriculum assessment that incorporate learning outcomes?”
Table 7. Example of a calibrated assessment rubric. Troy University, Master of Public Administration, PLO 3 (“Evaluate the primary sources of revenue at all levels of government based on the principles of taxation”). Example for the assessment of an assignment discussing the potential for implementing transaction-based taxes in a given case. Summarized and modified from Dunning (2014).
Assessment tool                                                     Reviewed articles (n=20)    Direct (D) or indirect (I)
                                                                    recommending the tool       assessment indicator
                                                                                                (Hartel and Gardner, 2003)
Courses with embedded additional assessment                         3                           D
Tests and evaluations                                               3                           D
Student portfolio of issue papers                                   2                           D
Capstone course evaluation (or only final capstone presentations)   2                           D
Thesis evaluation                                                   1                           D
Exit competency exam                                                1                           D
Employer surveys                                                    2                           I
Alumni surveys (after 5 years)                                      2                           I
Salary surveys                                                      1                           I
Internship evaluations                                              1                           I
Student/graduate surveys                                            2                           I
External reviewers                                                  1                           I
Curriculum / syllabus analysis                                      1                           I
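A calibrated rubric such as the one in Table 7 fixes the criteria and level descriptors that every scorer must use, which is what reduces variability among raters. The sketch below is a generic illustration with invented criteria and levels, not Troy University's actual rubric.

```python
# Illustrative calibrated rubric: shared criteria and level descriptors
# mean all scorers apply the same scale. Criteria are placeholders only.
RUBRIC = {
    "identifies revenue sources":     {1: "none", 2: "partial", 3: "complete"},
    "applies principles of taxation": {1: "absent", 2: "implicit", 3: "explicit"},
}

def score_assignment(ratings: dict[str, int]) -> float:
    """Validate each rated level against the rubric, then average the
    per-criterion levels into a single score for the assignment."""
    for criterion, level in ratings.items():
        if level not in RUBRIC[criterion]:
            raise ValueError(f"invalid level {level} for {criterion!r}")
    return sum(ratings.values()) / len(ratings)
```

Because the level descriptors are written down and shared before scoring begins, two raters scoring the same assignment are pushed toward the same levels, which is the calibration effect the reviewed literature attributes to such rubrics.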
• Constant update of course syllabi based on PLOs (1/20),
• Consistency between program mission (educational objectives) and teaching (2/20),
• Development and examination of the optimum sequence of courses within a curriculum (2/20),
• Identification of gaps between the intended PLOs and actual student learning (2/20),
• Identification of gaps between the intended PLOs and student learning opportunities (2/20),
• Increased collaboration and alignment between faculty around courses (1/20), and
• Improvement in program development and curriculum in general (4/20).

Six out of 20 articles refer to advantages for the program and the institution, especially thanks to an explicit link between PLOs and the assessment process. These advantages include:

• Coordination of curricular activities (2/6),
• Enhanced dialogue among faculty groups teaching the same course (1/6),
• Communication of learning goals (1/6),
• Improvement of relations with broader stakeholders, students, alumni, and employers (1/6), and
• Generally improved faculty and staff engagement (1/6).

None of the reviewed articles provides data about the effectiveness of curriculum assessment in general or the functionality of their proposed approaches. However, twenty percent of the reviewed publications discuss the challenges and disadvantages of curriculum assessment in OBE based on subjective impressions of the authors. In this context, two publications (Barrette and Paesani, 2018; Hartel and Gardner, 2003) report that implementing PLO-based assessment can be difficult, particularly if the commitment of the faculty is weak. Accordingly, few instructors respond well to being told what to do in their classes. Hence, top-down approaches to curriculum assessment may create more resistance than support among the faculty needed to carry out and respond to the assessment. This is especially true when the value of the assessment process is not clear to faculty and it is instead perceived as a burden on time and energy (Barrette and Paesani, 2018). Resistance is spurred by fears that PLO-based assessment may lead to unwanted change or even retraining requirements when faculty feel their current courses and programs are working well enough (Hartel and Gardner, 2003).

Occasionally, PLOs lack specificity and prove difficult to measure in practice (Beasley et al., 2018; Dunning, 2014), especially when PLOs originate at the institution level but are implemented at the program level without sufficient attention to adapting the PLOs to the discipline being assessed (Barrette and Paesani, 2018).

Synthesis of findings

Our review highlights that the literature about program assessment in higher education is scarce and that existing publications lack evidence about the effectiveness of applied assessment practices. Furthermore, we identified notable variations in the applied practices themselves. Both the scope and the applied practices at the 13 different institutions we covered in our review are widely inconsistent. As similar programs sometimes apply fundamentally different assessment practices, the explanation of this variation cannot be limited to the diversity of the programs.
facilitating all the PLOs it intends to generate? Consequently, numerous programs have made curriculum assessment an essential component of their efforts to improve their curricula. Critics of OBE claim that both the conceptual background of OBE and the evidence to prove its efficiency are weak and insufficient for assessing contemporary programs with an increasing emphasis on global and holistic thinking (Barman et al., 2014; Bryan and Clegg, 2019). Through our systematic literature review, we can confirm that the literature on curriculum assessment is not as abundant and specific as, for example, the literature about curriculum mapping, another OBE process (Cuevas et al., 2010).

The objective of our systematic review was to identify the best practices for curriculum assessment in OBE based on evidence of successful program assessment. We could not find such evidence in the existing literature, as even case studies do not provide an analysis of the effectiveness of the applied assessment processes. However, we identified commonly used assessment strategies, and we also considered the subjective appraisal of these strategies by the diverse authors. In this regard, we identified the following basic recommendations for curriculum assessment:

• Existence of an assessment plan (including assessment frequency, faculty and staff integration, and assessment methods, among other components) before initiating the curriculum assessment process (Myers et al., 2009);
• Assessment implementation through a committee of involved individuals, usually faculty, to guarantee commitment and active participation (Feldhaus, 2002), and inclusion of students in the assessment process (Keating, 2014);
• Prior curriculum mapping to select the appropriate assessment tools for each course (Metzler et al., 2017);
• A mix of direct and indirect assessment indicators, including student scores on selected course assignments (preferably purposely designed for curriculum assessment) as well as exit and alumni interviews (Gosselin et al., 2013; Swarat et al., 2017);
• Calibrated assessment rubrics to decrease the variability among assessment results (Dunning, 2014);
• Follow-up activities, including a teaching plan with pedagogical strategies to increase program alignment with PLOs and a process of reevaluation of PLOs (Felder and Brent, 2003; Hartel and Gardner, 2003);
• Curriculum assessment every second or third year (Swarat et al., 2017).
Curriculum assessment helps to avoid arbitrariness and subjectivity in curricula. Therefore, we consider it an essential contribution to a constant improvement of higher education curricula. However, we are aware of the fact that curriculum assessment has been historically pushed by the neoliberal "assessment movement", which sought to examine the effectiveness of funds invested in higher education institutions by assessing learning upon students' completion of their degrees (Tam, 2014). We do not advocate for program assessment to limit funding but to improve education.

The outcome of curriculum assessment depends on the PLOs used, the lens under which assessment occurs. Although we consider curriculum assessment a per se neutral process, we also know that the influence of faculty on defining PLOs has decreased while the impact of external stakeholders such as industry leaders has increased (Millar, 2016). In contrast, we favor curriculum assessment based on PLOs that contemplate far-reaching societal needs instead of short-term job market demands.

The process of curriculum assessment is worth being sustained with coherent and evidence-based guidelines. We hope that with our literature review we have prepared the ground for comprehensive and adaptable curriculum assessment.

Acknowledgments

We are grateful to the students, colleagues, staff, and education stakeholders who influenced our understanding of curriculum assessment. Funding for this project is provided by the United States Department of Agriculture Higher Education Challenge Grant, National Institute of Food and Agriculture (Award Number: 2018-70003-27649).

Literature Cited

Abbuhl, R., S. Gass and A. Mackey. 2014. Experimental research design. In: Podesva, R.J. and D. Sharma (eds.). Research methods in linguistics. Cambridge, UK: Cambridge University Press.

ABET (Accreditation Board for Engineering and Technology). 2020. About ABET. https://www.abet.org/about-abet/. ABET. April 10, 2020.

Arafeh, S. 2016. Curriculum mapping in higher education: a case study and proposed content scope and sequence mapping tool. Journal of Further and Higher Education 40: 585-611.

Barman, L., C. Silen and K. Laksov. 2014. Outcome based education enacted: teachers' tensions in balancing between student learning and bureaucracy. Advances in Health Sciences Education 19: 629-643.

Barrette, C.M. and K. Paesani. 2018. Conceptualizing Cultural Literacy through Student Learning Outcomes Assessment. Foreign Language Annals 51: 331.

Beasley, S.F., S. Farmer, N. Ard and K. Nunn-Ellison. 2018. Systematic Plan of Evaluation Part I: Assessment of End-of-Program Student Learning Outcomes. Teaching and Learning in Nursing 13: 3-8.

Benson, J. and S. Dresdow. 1998. Systematic Decision Application: Linking Learning Outcome Assessment to Organizational Learning. Journal of Workplace Learning 10: 301.

Bloom, B.S. 1956. Taxonomy of educational objectives (Vol. 1: Cognitive domain). New York City, NY: McKay.

Bryan, C. and K. Clegg. 2019. Innovative Assessment in Higher Education: A Handbook for Academic Practitioners. London, UK: Routledge.

Burke, J.W. 1995. Outcomes, learning, and the curriculum: implications for NVQs, GNVQs, and other qualifications. London, UK: Psychology Press.

Carillo, E.C. 2007. "Feminist" teaching/teaching "feminism". Feminist Teacher 18: 28-40.

Caspersen, J., N. Frølich and J. Muller. 2017. Higher education learning outcomes - Ambiguity and change in higher education. European Journal of Education 52: 8-19.

CCNE (Commission on Collegiate Nursing Education). 2019. CCNE Accreditation. https://www.aacnnursing.org/CCNE. Commission on Collegiate Nursing Education. April 10, 2020.

CWUR (Center for World University Rankings). 2019. World's Top Universities. https://cwur.org/2019-2020.php. CWUR. November 7, 2019.

Cuevas, N.M., A.G. Matveev and K.O. Miller. 2010. Mapping general education outcomes in the major: Intentionality and transparency. Peer Review 12: 10.

Cumming, A., A. Cumming and M. Ross. 2007. The Tuning Project for Medicine - learning outcomes for undergraduate medical education in Europe. Medical Teacher 29: 636-641.

Dunning, P.T. 2014. Developing a Competency-Based Assessment Approach for Student Learning. Teaching Public Administration 32: 55.

Eichinger, M. 2019. Transformational change in the Anthropocene epoch. The Lancet Planetary Health 3: 116-117.

Felder, R.M. and R. Brent. 2003. Designing and teaching courses to satisfy the ABET engineering criteria. Journal of Engineering Education 92: 7-25.

Feldhaus, W.R. 2002. Assessment of undergraduate risk management and insurance programs. Risk Management and Insurance Review 5: 155-171.

Freeman, M., P. Hancock, L. Simpson, C. Sykes, P. Petocz, I. Densten and K. Gibson. 2008. Business as usual: A collaborative and inclusive investigation of existing resources, strengths, gaps and challenges to be addressed for sustainability in teaching and learning in Australian university business faculties. ABDC Scoping Report 1: 1-54.

Gluga, R., J. Kay and T. Lever. 2012. Foundations for modeling university curricula in terms of multiple learning goal sets. IEEE Transactions on Learning Technologies 6: 25-37.

Gosselin, D., R. Parnell, N.J. Smith-Sebasto and S. Vincent. 2013. Integration of sustainability in higher education: three case studies of curricular implementation. Journal of Environmental Studies and Sciences 3: 316-330.

Götz, S. 2018. Supporting systematic literature reviews in computer science: the systematic literature review toolkit. In: Proceedings of the 21st ACM/IEEE International Conference on Model Driven Engineering Languages and Systems. Copenhagen, Denmark: 14-19 Oct.

Harden, R.M. 2001. AMEE Guide No. 21: Curriculum mapping: a tool for transparent and authentic teaching and learning. Medical Teacher 23: 123-137.

Harden, R.M. 2002. Developments in outcome-based education. Medical Teacher 24: 117-120.

Harden, R.M. 2007. Outcome-based education: the future is today. Medical Teacher 29: 625-629.

Hartel, R. and D. Gardner. 2003. Making the transition to a food science curriculum based on assessment of learning outcomes. Journal of Food Science Education

Keating, S.B. 2014. Curriculum development and evaluation in nursing. New York, NY: Springer.

Killen, R. 2000. Outcomes-based education: Principles and possibilities. Unpublished manuscript. Newcastle, UK: University of Newcastle, Faculty of Education.

Knight, P.T. 2001. Complexity and curriculum: a process approach to curriculum-making. Teaching in Higher Education 6: 369-381.

Kopera-Frye, K., J. Mahaffy and G.M. Svare. 2008. The Map to Curriculum Alignment and Improvement. Collected Essays on Learning and Teaching 1: 8-14.

Koppang, A. 2004. Curriculum mapping: Building collaboration and communication. Intervention in School and Clinic 39: 154-161.

Krathwohl, D.R. 2002. A Revision of Bloom's Taxonomy: An Overview. Theory into Practice 41: 212.

Leech, N.L. and A.J. Onwuegbuzie. 2011. Beyond constant comparison qualitative data analysis: Using NVivo. School Psychology Quarterly 26: 70.

Mathews, T.J. and C.M. Hansen. 2004. Ongoing Assessment of a University Foreign Language Program. Foreign Language Annals 37: 630.

McGourty, J. 1999. Four strategies to integrate assessment into the engineering educational environment. Journal of Engineering Education 88: 391-395.

McGourty, J., C. Sebastian and W. Swart. 1998. Developing a comprehensive assessment program for engineering education. Journal of Engineering Education 87: 355-361.

Metzler, E., G. Rehrey, L. Kurz and J. Middendorf. 2017. The Aspirational Curriculum Map: A Diagnostic Model
2: 32-39. for Action-Oriented Program Review. To Improve the
Academy 36: 156-167.
Hartel, R. and W. Iwaoka. 2016. A report from the Higher
Education Review Board (HERB): Assessment of Millar, V. 2016. Interdisciplinary curriculum reform in the
undergraduate student learning outcomes in food changing university. Teaching in Higher Education 21:
science. Journal of Food Science Education 15: 56-62. 471-483.
Houston, T. 2005. Outcomes Assessment for Beginning Mitchell, D.E. and W.G. Spady. 1978. Organizational
and Intermediate Spanish: One Program's Process and contexts for implementing outcome based education.
Results. Foreign Language Annals 38: 366. Educational Researcher 7: 9-17.
Jenner, B., U. Flick, E. von Kardoff and I. Steinke. 2004. A Moher, D., A. Liberati, J. Tetzlaff and D.G. Altman. 2009.
companion to qualitative research. Newbury Park, Ca: Preferred reporting items for systematic reviews and
Sage. meta-analyses: the PRISMA statement. Annals of
internal medicine 151: 264-269.
Joyner-Melito, H.S. 2016. Curriculum Mapping: A Method to
Assess and Refine Undergraduate Degree Programs. Morcke, A.M., T. Dornan and B. Eika. 2013. Outcome
Journal of Food Science Education 15: 83. (competency) based education: an exploration of its
origins, theoretical basis, and empirical evidence.
NACTA Journal • Nov 2019 - Oct 2020 253
Advances in Health Sciences Education 18: 851-863. learning outcomes and assessment. Eur. J. Dent. Educ.
17: e173-e180.
Myers, S.C., M.A. Nelson and R.W. Stratton. 2009. Assessing
an Economics Programme: Hansen Proficiencies, Tyler, R.W. 1949. Tyler, Ralph W., Basic Principles of
ePortfolio, and Undergraduate Research. International Curriculum and Instruction. Chicago, IL: The University
Review of Economics Education 8: 87-105. of Chicago Press.
Olsson, P., M.-L. Moore, F.R. Westley and D.D.P. McCarthy. Wright, M.C., M. Goldwasser, W. Jacobson and C. Dakes.
2017. The concept of the Anthropocene as a game- 2017. Assessment from an educational development
changer a new context for social innovation and perspective. To Improve the Academy 36: 39-49.
transformations to sustainability. Ecology and Society
22:31.