
Assessment of Online Learners

Assessment of Online Learners offers essential foundations, insights, and
real-world examples for preservice teachers preparing to assess students
in today’s digitized classrooms. When aligned with intended curricula and
best practices, assessment not only informs but enhances both instruction
and student achievement, though the recent large-scale adaptation of
face-to-face learning to online platforms has yielded new challenges and
responsibilities for teachers. This book explores shifts in the research and
practice of assessment in online environments, the reconceptualization
of course content and assessment frameworks in teacher education, the
collection of fair and accurate assessment evidence reflecting students’
virtual learning, and more. Drawing from experienced Canadian instructors
who overcame the inherent technological obstacles, these chapters showcase
how unprecedented changes in schooling can lead to pedagogical renewal,
program reevaluation, and a broader understanding of instruction and
assessment practices.

Paolina Seitz is Associate Professor in the Faculty of Education at
St. Mary’s University, Canada.

S. Laurie Hill is Associate Professor in the Faculty of Education at
St. Mary’s University, Canada.

Assessment of Online Learners
Foundations and Applications for Teacher Education

EDITED BY
PAOLINA SEITZ AND S. LAURIE HILL
Designed cover image: © Getty Images
First published 2024
by Routledge
605 Third Avenue, New York, NY 10158
and by Routledge
4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2024 selection and editorial matter, Paolina Seitz and S. Laurie Hill; individual
chapters, the contributors
The right of Paolina Seitz and S. Laurie Hill to be identified as the authors of the
editorial material, and of the authors for their individual chapters, has been asserted in
accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised in
any form or by any electronic, mechanical, or other means, now known or hereafter
invented, including photocopying and recording, or in any information storage or
retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered
trademarks, and are used only for identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
Names: Seitz, Paolina, editor. | Hill, S. Laurie, editor.
Title: Assessment of online learners : foundations and applications for teacher
education / edited by Paolina Seitz and S. Laurie Hill.
Description: First edition. | New York : Routledge, 2024. | Includes bibliographical
references and index.
Identifiers: LCCN 2023033662 (print) | LCCN 2023033663 (ebook) |
ISBN 9781032376912 (hbk) | ISBN 9781032390123 (pbk) |
ISBN 9781003347972 (ebk)
Subjects: LCSH: Educational evaluation—Canada. | Educational tests and
measurements—Canada. | Academic achievement—Canada—Evaluation. |
Distance education—Canada—Evaluation. | Educational technology—Canada. |
Web-based instruction—Canada. | Student teachers—Training of. |
Teachers—Training of.
Classification: LCC LB2822.3.G7 A87 2024 (print) | LCC LB2822.3.G7 (ebook) |
DDC 379.1/580971—dc23/eng/20230925
LC record available at https://lccn.loc.gov/2023033662
LC ebook record available at https://lccn.loc.gov/2023033663
ISBN: 978-1-032-37691-2 (hbk)
ISBN: 978-1-032-39012-3 (pbk)
ISBN: 978-1-003-34797-2 (ebk)
DOI: 10.4324/9781003347972
Typeset in Dante & Avenir
by Apex CoVantage, LLC
Contents

About the Editors and Contributors ix

Acknowledgments xix

Introduction 1
S. Laurie Hill and Paolina Seitz

Part I
Assessment Shifts in Teacher Education 13

1 Exploring Assessment in a Digital Age: Preservice and
In-service Teachers’ Professional Learning Experiences 15
Nadia Delanoy, Barbara Brown, and Jodie Walz

2 Supporting Student Teacher Assessment Identity
Development on Online Practica: A Narrative Inquiry 34
Jenny Ge

3 Assessment as Partnership: The “Co” in
Co-Constructing . . . “Not Quite” 53
Allison Tucker and Marc Husband

4 Addressing Challenges of Online Learning through
Quality Assessment Principles 73
Caitlin Fox and Julia Rheaume

5 Assessing the Content Knowledge, Skills, and
Competencies of Teacher Candidates in an Online
Learning Environment: A Case Study 91
Renee Bourgoin

Part II
Reconceptualizing Assessment Frameworks for Preservice
Teachers 109

6 Decolonizing Assessment Practices in Teacher Education 111
Joshua Hill, Christy Thomas, and Allison Robb-Hagg

7 Meaningful Feedback in the Online Learning Environment 126
Maggie McDonnell

8 Analyzing Presence in Online Learning Environments
through Student Narratives: An Autoethnographic
Study 145
Tan Xuan Pham, Chi (Linh) Tran, Le Pham Hue Xuan,
and Giang Nguyen Hoang Le

9 The Unintended Influence of COVID-19: Optimizing
Student Learning by Advancing Assessment Practices
through Technology 164
Katrina Carbone, Michelle Searle, and Saad Chahine

Part III
Teacher Educators and Assessment in K–12 Contexts 185

10 Pedagogically Hacking the System: Developing a
Competency-based Digital Portfolio 187
Kathy Sanford, Hong Fu, Timothy Hopper, and Thiago Hinkel

11 Malfunction: Regressive and Reductive Online
Assessment Practices 206
Shannon D. M. Moore, Bruno de Oliveira Jayme, and Kathy Sanford

12 Leveraging the Relationship Between Assessment,
Learning, and Educational Technology 225
Katrina Carbone, Michelle Searle, and Lori Kirkpatrick

13 Isolation/Adaptation/Education: Moving Hands-on
Secondary Visual Art Classes to a Virtual Platform 244
Christina Yarmol

14 Using the SAMR Model to Design Online K–12
Assessments 271
Sheryl MacMath

15 Adapting Classroom Assessment Practices for Online
Learning: Lessons Learned From Secondary School
Teachers in the Early Days of COVID-19 293
Michael Holden, Christopher DeLuca, Stephen MacGregor,
and Amanda Cooper

16 Equity in Action: Virtual Learning Environment
Considerations 312
Sharlene McHolm

17 Ubuntu at a Distance: Online Assessment for Care,
Justice, and Community 330
Sarah Elizabeth Barrett

Index 347
About the Editors and Contributors

Editors

Paolina Seitz is an associate professor in the Faculty of Education at St.
Mary’s University, Canada. Dr. Seitz has over 30 years of experience in
K–12 and postsecondary teaching and administration and has held sev-
eral administrative roles, including principal, associate superintendent
of learning services, and Faculty of Education area chair. She brings
extensive experience in the areas of student assessment and math-
ematics education and leadership in administration. Her professional
field of study is student motivation, evaluation, and cognition with an
emphasis on student assessment. She is both a quantitative and a quali-
tative researcher.

S. Laurie Hill is an associate professor in the Faculty of Education at
St. Mary’s University, Canada. Her research interests include teacher edu-
cation and preservice teacher professional identity, especially the way in
which field experiences shape how preservice teachers understand their
emerging practice and sense of what it means to be (and to become) a
teacher. She is also interested in preservice teacher experiences related
to social justice and equity issues.

Contributors

Sarah Elizabeth Barrett is an associate professor and former asso-
ciate dean of academic programs in the Faculty of Education at
York University, Canada. She has also served as special advisor to the
dean on Black inclusion and anti-Black racism and served as affirma-
tive action, equity, and inclusivity officer for the university. Dr. Barrett
is a former high school science teacher, and her research focuses on
teachers’ experiences of professional identity, and the values, beliefs,
and understandings that teachers bring to their practice, especially with
respect to social justice issues.

Renee Bourgoin is an assistant professor at St. Thomas University in
New Brunswick, Canada. She teaches elementary literacy and social
studies methods courses and differentiated instruction. She is also an
honorary research associate at the Second Language Research Institute
of Canada, where she teaches French second language methods courses.

Barbara Brown is an associate dean of teaching and learning in the
Werklund School of Education at the University of Calgary, Canada. Her
research interests include research-practice partnerships, professional
learning, and instructional design in digital learning environments.

Katrina Carbone is a doctoral student at the Faculty of Education,
Queen’s University, Canada, and a member of the Ontario College
of Teachers certified to teach K–12. Her dissertation focuses on
transforming assessment practices in higher education to support
student learning. Through her years of teaching experience in both
elementary and higher education, in addition to research positions
and guest lectures, she brings a wealth of knowledge and expertise to
her work. Katrina has published in education and evaluation journals,
disseminated knowledge through practitioner articles, and presented at
academic conferences across North America.

Saad Chahine is an associate professor of assessment and measure-
ment at the Faculty of Education in Queen’s University, Canada, and
a member of the Queen’s Assessment and Evaluation Group. He is
a former high school math teacher and draws on validity theory to
examine big data and improve educational policy and practice. Saad has
published in various education, measurement, and health profession
education journals and has taught courses and workshops on research
design, educational measurement, assessment, and educational effect-
iveness to students and professionals. Additionally, Saad has supported
educational change initiatives in developing countries, including
Myanmar, Belize, Trinidad and Tobago, and Pakistan.

Amanda Cooper is an associate dean of research and strategic initiatives
in the Faculty of Education at Queen’s University, Canada, and asso-
ciate professor of educational leadership and policy. She founded the
Research Informing Policy, Practice, and Leadership in Education
(RIPPLE) program to increase the use of evidence in public service
sectors. Dr. Cooper’s research on knowledge mobilization includes four
areas of inquiry: research producers (funders and universities), research
users (practitioners and policymakers), research brokers, and meas-
uring research impact.

Bruno de Oliveira Jayme is a queer art educator from Brazil. He is
an assistant professor in curriculum and pedagogy in the Faculty of
Education at the University of Manitoba, Canada. Bruno works with
community museums and art galleries and has developed an inter-
national research agenda that includes critical pedagogy, transformative
art education, and arts-based methodologies.

Nadia Delanoy is an assistant professor (adj.), researcher, and pro-
ject lead in the Werklund School of Education at the University of
Calgary, Canada. Her research interests include evidence-based prac-
tice in assessment, leadership, and innovative pedagogies in technology
enhanced environments as well as big data and social media analytics to
support innovative business and technology practices.

Christopher DeLuca is an associate dean in the School of Graduate
Studies at Queen’s University, Canada, and professor of educational
assessment at the Faculty of Education. He leads the Classroom
Assessment Research Team and is the director of the Queen’s
Assessment and Evaluation Group. Dr. DeLuca’s research examines the
complex intersection of curriculum, pedagogy, and assessment as oper-
ating within the current context of school accountability and standards-
based education. His work largely focuses on supporting teachers in
negotiating these critical areas of practice to enhance student learning
experiences.

Caitlin Fox shares her passion for teaching and learning with pre-
service teachers in the Bachelor of Education program at Red Deer
Polytechnic, Canada. As a teacher, curriculum leader, instructional
leader, post-secondary course designer, teacher educator, and mentor,
Caitlin has dedicated her practice to inspiring teachers to consider the
power of quality learning and assessment design.

Hong Fu is a research associate and instructor in the Department
of Curriculum and Instruction, University of Victoria, Canada. Her
research interests and experience are in teacher identity, educational
assessment, digital portfolio and technology, and preparing teacher
candidates to teach English language learners. She also has experience
in English as a foreign language and in education leadership programs
for schoolteachers and administrators outside Canada.

Jenny Ge is an educational developer at Toronto Metropolitan University,
Canada, and a sessional lecturer in the University of Toronto’s Master
of Teaching program. Her PhD research examined student teacher
assessment identity development. Her other research interests include
program evaluation, technology for teaching and learning, and arts-
based methods.

Joshua Hill is an assistant professor in the Department of Education at
Mount Royal University, Canada. Through his teaching, service, and
scholarship Josh seeks to create the conditions to (re)story education as
a journey towards agency, wellness, wonder, and expansive awareness of
oneself in the world. He is currently exploring storytelling, Indigenous
land-based learning, and heterarchy in the contexts of learning,
teaching, and leadership in K–12 and postsecondary education.

Thiago Hinkel is a PhD candidate and sessional instructor with the
Department of Curriculum and Instruction at the University of Victoria,
Canada. His research interests include digital technologies and the liter-
acies they generate and how they can be explored in the field of teacher
education. Thiago has taught for nearly two decades, initially as
an EAL teacher and more recently as a postsecondary instructor.
Thiago has benefitted from using technology in his own learning trajec-
tory and is today enthusiastic about creative and critical uses of digital
technologies in education.

Michael Holden is a PhD candidate at Queen’s University, Canada, and
a sessional instructor with the Werklund School of Education at the
University of Calgary. His SSHRC-funded research examines principles
of classroom assessment for emergent learning, with a focus on collab-
orating with teachers as they work to provoke and support emergent
learning that stems from formative classroom assessment in com-
plex spaces. His professional interests include classroom assessment,
program evaluation, and developing stronger collaborative networks in
education.

Timothy Hopper has been at the University of Victoria, Canada, since
1998. He is a professor in the School of Exercise Science, Physical
and Health Education (EPHE), where he has teaching responsibil-
ities associated with physical education teacher education. His current
SSHRC research project focuses on using digital assessment systems
to promote student learning through competency-based, authentic,
personal, and context-based assessment. Dr. Hopper’s scholarly work
focuses on teacher education, physical education, and applications of
complexity theory in teaching and learning.

Marc Husband is an assistant professor in the Faculty of Education
at St. Francis Xavier University in Nova Scotia, Canada. He has over
20 years of teaching experience, including classroom teaching and sev-
eral leadership positions with the Toronto District School Board. His
classroom-based research investigates using student ideas as a resource
for learning mathematics in schools, teacher education, and profes-
sional learning settings.

Lori Kirkpatrick is an adjunct faculty member in the Faculty of
Education at Western University, Canada. Lori works in the areas of
educational psychology and special education, specifically in inclu-
sion, writing, and educational technology. Lori is also an experienced
program evaluator, and she has conducted program evaluations in the
areas of inclusion, educational technology, and youth mental health.

Giang Nguyen Hoang Le is affiliated with the Faculty of Social
Sciences and Humanities, Van Lang University, Vietnam. He is also a
lecturer in the Graduate Programs in Education, School of Education &
Social Work, at Thompson Rivers University, Canada. Le holds a PhD
in Educational Studies, Faculty of Education, from Brock University,
Canada. His scholarship is situated in social justice education, queer
studies, cultural studies, and international education with a focus on
voices of international students across contexts. Le has written exten-
sively about the body image of Vietnamese gay boys at schools and in
family life.

Stephen MacGregor is an assistant professor of educational leader-
ship, policy, and governance in the Werklund School of Education at
the University of Calgary, Canada. His research explores how multi-
stakeholder networks can mobilize research evidence to achieve soci-
etal impacts. He gives particular attention to how universities can
build their capacity in knowledge mobilization: a range of activities
to connect research producers, users, and mediators. Mixed methods
approaches are a recurring theme in his work, including the use of
social network analysis to analyze interaction patterns among diverse
research stakeholders.

Sheryl MacMath is a professor and department head in teacher edu-
cation at the University of the Fraser Valley, Canada. She specializes
in planning and assessment, elementary math methods, elementary
social studies methods, and research methods. She also provides regular
professional development for practising teachers. In addition to her
teaching areas, Dr. MacMath also researches EDID and teacher educa-
tion admissions.

Maggie McDonnell is the program coordinator for composition and
professional writing in the English Department at Concordia University
in Montreal, Canada, where she teaches composition, professional
writing and editing, and rhetoric. She also teaches interdisciplinary
teaching for the PERFORMA master teacher program at the Université
de Sherbrooke. Her academic research focuses on experiential learning,
authentic assessment, and effective feedback, as well as the develop-
ment of teacher identity in higher education.

Sharlene McHolm is a practising elementary principal and an adjunct
professor at Wilfrid Laurier University, Canada. She has been an admin-
istrator in both the elementary and secondary panels for the past 18 years and
an educator for 30. Her areas of specialty include mental health, cog-
nitive neuroscience, and inclusive education. She is working on her

seventh degree. During the pandemic, Dr. McHolm’s passion for educa-
tion led her to ask deep questions about how virtual learning environ-
ments serve all types of students.

Shannon D. M. Moore is an assistant professor of social studies educa-
tion in the Department of Curriculum, Teaching and Learning in the
Faculty of Education, at the University of Manitoba, Canada. Prior
to joining the faculty, she was a high school social studies teacher
in Vancouver. Her research interests include media and digital liter-
acies in the social studies classroom, the impacts of online learning
on teachers and teaching, and the impacts of neoliberalism on public
education.

Tan Xuan Pham is an early-career education researcher. He was a visiting
instructor at the HCMC University of Transport, where he conducted
research with students to seek approaches for academic development.
He has employed research methods such as photovoice, storytelling,
and writing as pedagogical approaches. His research interests include
social justice and power in education, curriculum development, and
second language acquisition.

Julia Rheaume is the chair of the Bachelor of Education program at
Red Deer Polytechnic, where she has shared her passion for teaching
and learning for the past 13 years. Julia primarily taught middle school
French immersion math and science for 15 years. Her experiences as a
preservice and graduate teacher educator, high school vice-principal,
postsecondary administrator, and classroom teacher have inspired her
interests in teacher education, leadership, assessment, middle level edu-
cation, and educational policy.

Allison Robb-Hagg is a practising elementary assistant principal
in Calgary, Canada. She has been an educator and leader for the
past 15 years. Allison has taught at the elementary level and as a ses-
sional instructor at Ambrose University. She has consulted with pri-
vate organizations to develop curriculum, design learning tasks, and
collaboratively establish a grant which supports student community
initiatives. Allison’s areas of research centre around building mathem-
atical confidence in second language programs, second language acqui-
sition, and professional development of teachers for the success and
well-being of all students.

Kathy Sanford is a professor in the Faculty of Education at the
University of Victoria, Canada. Her research interests include teacher
education, assessment, digital portfolios, nonformal and informal adult
education, gender pedagogy, and multiliteracies. Kathy is a feminist
social justice educator and teacher educator, utilizing intersectionality
as a framework to understand issues of equity as they relate to race,
gender, and class. She works with partnerships of schools, teacher edu-
cation programs, and the Ministry of Education to further 21st century
understandings of learning and learners’ needs.

Michelle Searle is an assistant professor of educational evaluation
at the Faculty of Education, Queen’s University, Canada, who brings
dynamic expertise in program evaluation, creative arts, mixed and
multiple methods as well as an emphasis on knowledge mobilization.
Michelle’s scholarship is centred on collaborative and interdiscip-
linary values that promote understanding by co-producing evidence to
advance social change in practice and policy. Through her community-
engaged projects, she has over 20 years of experience working with
faculty, community, and youth in national and international contexts.
Dissemination has included 32 publications, 13 chapters, 50+ reports,
90+ presentations, and multiple creative outputs including arts exhibits
and poetic contributions.

Christy Thomas is an assistant professor in the School of Education
at Ambrose University, Canada, and an adjunct assistant professor in
the Werklund School of Education at University of Calgary. Christy’s
research centres on leadership, professional learning, and collaboration
which is fuelled by her desire to build communities of practice that
support flourishing for all.

Chi (Linh) Tran is a PhD scholar in the School of Social Science, Media,
Film, and Education at Swinburne University of Technology, Australia.
Chi has been working for Vietnam’s Ministry of Natural Resources
and Environment. She is an environmental/sustainability education
researcher with 10 years of experience collaborating and working with
Vietnamese K–12 students and in-service/preservice teachers. Chi’s
research interests include climate change education, environmental edu-
cation, public engagement, and ecologically and multigenerationally
justice-oriented education towards new materialist, posthumanist, and
feminist thinking. She recently completed her doctoral thesis entitled
“Weathering with Climate in the Anthropocene: Framing East/West
Theoretical Entanglements.”

Allison Tucker is an assistant professor in the Faculty of Education at
St. Francis Xavier University, Canada. Her experience includes classroom
teaching and leadership and spans British Columbia, Alberta, Ontario,
Newfoundland and Labrador, and Nova Scotia. Situated in contexts
from public school to postsecondary classrooms, Allison’s research
embeds the Reggio Emilia-inspired belief that learners are capable and
competent protagonists of their own learning. This belief underpins
her teaching practice and her research in teacher identity, assessment,
and teacher/leadership professional learning.

Jodie Walz is an educational technology consultant with the Calgary
Catholic School District, Canada. Her research interests include
assessment practices with digital tools, assistive technology, and peda-
gogies to support technology use in the classroom. She has a Master of
Educational Technology from the University of Saskatchewan.

Le Pham Hue Xuan received her MEd degree in Educational Leadership
from National Chung Cheng University, Taiwan, where she is cur-
rently a PhD student in the Educational Leadership and Management
Development Program. Her research interests include multiculturalism
in education, students’ well-being, and equity in higher education. She
is developing dissertation research tentatively titled “An ethnographic
study on Vietnamese women’s voices in higher education leadership:
Experiences of female leaders returning from abroad.”

Christina Yarmol is department head of the arts at a Toronto, Canada,
high school. Years of teaching experience in the elementary and sec-
ondary panels have informed both her theoretical master’s and doctoral
research. For her Master of Education work Yarmol was awarded the
W.A. Bishop Townshend Gold Medal in Education (2013) and a merit
award from the Centre for Inclusive Education for her research about
the experiences of high school students labelled with multiple disabil-
ities. Her doctoral research at York University examined the practical,
philosophical, fiscal, and social policy realities of adults with cognitive
and intellectual disabilities who want to live as artists in the community.
Acknowledgments

We are grateful to the people at Routledge for providing us with the
opportunity to gather this collection of writing together for publication
and for affirming the value of this topic. We wish to express special thanks
to Daniel Schwartz, our editor, and Katherine Tsamparlis, our editorial
assistant. Both have provided advice and guidance through the process.
This work would not have come together without the copyediting
talent of Karen Crosby and Karen Lowry of Editarians; we are indebted
to them for their careful and professional assistance in helping to prepare
the manuscript.
Finally, we would like to thank the authors for contributing their work
to this collection and entrusting us with their ideas and writing. We
appreciated their ongoing commitment to the project, their patience in
responding to our deadlines, and their interest in this important topic in
teacher education.
Introduction
S. Laurie Hill and Paolina Seitz

The inspiration for this collection grew from our lived experience of
teaching during the early days of the pandemic when the way in which we
approached our teaching practices changed dramatically. The movement
to virtual learning necessitated an urgent shift to online and hybrid learning
environments for our preservice teachers. This shift invited a renewed
interest in assessment practices and a consideration of how we conduct
meaningful and purposeful assessment of learning in online courses. In
this edited book, we bring together contributors to share how the shift
to online teaching and learning impacted assessment practices and stu-
dent learning experiences in teacher education programs. The collection
also includes insights from research conducted on virtual learning spaces
by teacher educators in the kindergarten to Grade 12 (K–12) educational
context.
The importance of examining and understanding the pedagogical
shifts that occurred in virtual classroom contexts cannot be overstated.
“Independent of the move to online learning as a result of the COVID-19
pandemic, there is a clear upward trend to the prevalence and variety of
online learning opportunities available to students” (Caprara & Caprara,
2021, p. 3683). Prepandemic, the number of courses being offered online
was growing in postsecondary institutions; indeed, “in the 2016–2017 aca-
demic year, 18% of postsecondary students in Canada took at least one
fully online course” (Weleschuk et al., 2019, p. 4). During the pandemic,
this trend was accelerated as universities across Canada, and around
the world, quickly moved courses to an online learning environment.

DOI: 10.4324/9781003347972-1
Even with the return to face-to-face instruction, “data collected by the
Canadian Digital Learning Research Association (CDLRA) shows that
post-secondary education will be significantly changed as a result of the
pandemic” (Johnson, 2021, p. 2), with institutions offering more online
and hybrid courses and integrating more technology into in-person
teaching.
As the pandemic took hold, university faculty members suddenly
had to shift their teaching to an online space and re-envision what their
practice would look like. In professional programs such as education,
maintaining the practicum component of the program presented add-
itional challenges (Burns et al., 2020). Online learning became one way
of supporting the traditional practicum experiences that are a require-
ment for preservice teachers in order to become accredited by a provin-
cial professional body. Reorganizing this component of teacher education
programs required flexibility and ingenuity so that preservice teachers
had ways to demonstrate their emerging teaching practice and university
mentor instructors could assess their efforts in meaningful ways.
Educators were invested in understanding effective online instruc-
tion and in determining how they would assess their students’ learning.
The word assessment “comes from the Latin verb assidere meaning to
sit with” (Heritage, 2010, p. 7). This definition suggests that assessment
is something the teacher does with the learner in collaboration and not
to the learner in isolation. For example, observation and conversations
are two important strategies in the collection of evidence of student
learning. These strategies and other formative assessment approaches
had to be realized differently in an online context. The pandemic required
teachers to use their creativity to solve newfound problems such as these.
Educators in faculties of education spent countless hours in professional
development to strengthen skills to effectively deliver courses online
(Hartwell et al., 2021) to help meet specific student needs within tightly
structured professional programs.
In addition to pedagogical challenges, the shift to online teaching
brought other concerns to the forefront. Accessibility became an equity
issue as not all students had access to computers and reliable internet
connections (Tadesse & Muluye, 2020). These challenges were felt keenly
in the K–12 education environment but were also relevant for preservice
teachers in their own program learning. In their practicum experiences,
preservice teachers grappled with leading learning experiences and
carrying out assessment strategies with their practicum online classes. In
K–12 classrooms, the provision of technology tools was more prominent
in urban schools than in rural schools (Tadesse & Muluye, 2020), and
cramped spaces in homes were not ideal for teaching or learning (Metcalfe,
2021). “At the same time, there was a growing awareness of how the
pandemic was impacting Black people, Indigenous people, and people
of colour, who suffered disproportionately [from the negative impacts
of the pandemic]” (Danyluk et al., 2022, p. 3). Meaningful learning and
assessment of that learning cannot be fair when students do not have
the requisite devices, supplies, and support for the learning experiences
offered online.
There is a great deal of uncertainty among institutions about how
to define online learning. Terms such as “blended learning, hybrid
learning, and hyflex learning have emerged to describe the various ways
institutions can deliver learning experiences to students” (Johnson et al.,
2022, p. 92). Defining digital learning as a term that refers to all types of
learning supported by technology, Johnson (2021) developed a modes of
learning spectrum framework. She applied definitions used by various
institutions to the framework to help categorize common terms. In this
framework, Johnson defined online learning as a learning context in which
“all instruction and interaction is fully online; synchronous or asynchronous” (Johnson et al., 2022, p. 94). We have adopted this definition
in our collection.

Organization of the Book

This volume represents a contemporary moment in which to consider the educational significance of the shift from a face-to-face
context for teaching to an online learning environment in teacher edu-
cation programs. Drawing on different research methodologies and
approaches, the authors in this collection contribute to the research lit-
erature on effective assessment practices and strategies in online learning
environments. The authors are interested in student experiences and
assessment pedagogies, and they offer insights about the effectiveness
of current online approaches to assessment and the applicability of stu-
dent assessment practices in both teacher education programs and K–12
online learning environments.
The particular assessment lenses of our contributors are used to organize
the book’s themes into three parts: first, to highlight how assessment
was carried out in teacher education programs during the sudden shift
to online learning; second, to examine the manner in which assessment
4 S. Laurie Hill and Paolina Seitz

frameworks and strategies were reconceptualized in teacher education programs; and third, to share research teacher educators conducted on
how assessment strategies were reimagined and implemented in K–12
online learning contexts.

Part 1: Assessment Shifts in Teacher Education

Part 1 contains five chapters that explore the assessment practices of instructors and the experiences of preservice teachers in Bachelor of
Education programs as they shifted to an online learning environ-
ment with their students. In Chapter 1, “Exploring Assessment in a
Digital Age: Preservice and In-Service Teachers’ Professional Learning
Experiences,” authors Nadia Delanoy, Barbara Brown, and Jodie Walz
examine a professional learning series and research partnership project.
The framework for their study was informed by the technological, peda-
gogical, and content knowledge framework (Mishra & Koehler, 2006).
The professional learning series, offered online, connected theory and
current practice in assessment for preservice and in-service teachers that
supported them in developing digital assessment strategies for use with
online learning.
In the second chapter, “Supporting Student Teacher Assessment
Identity Development on Online Practica: A Narrative Inquiry,” Jenny
Ge presents a study that was grounded in teacher assessment identity
theories (e.g., Looney et al., 2018; Xu & Brown, 2016). Using a narrative
inquiry approach, she explored how preservice teachers developed and
applied their assessment philosophy and practices during their online
practicum experiences. She identified two types of tension experienced
by the preservice teachers and described four processes that supported
them in navigating these tensions and facilitating their assessment iden-
tity development.
In the next chapter, “Assessment as Partnership: The ‘Co’ in
Co-Constructing . . . ‘Not Quite’,” authors Allison Tucker and Marc
Husband investigate assessment practices created through a partnership
they built with preservice teachers in the context of a foundational course
in the elementary education stream. Their study was framed by Mason’s
(2002) theory of professional “noticing” and drew on online and in-person
learning contexts to examine the experiences of teacher educators and
preservice teachers as they navigated a cocreated assessment practices
partnership. Tucker and Husband highlight the challenges and rewards that implementing an assessment-as-partnership strategy can bring.
The fourth chapter, titled “Addressing Challenges of Online Assessment
Through Quality Assessment Principles,” offers insight into the assessment
experiences of preservice teachers while they were learning online. Authors
Caitlin Fox and Julia Rheaume found that preservice teachers often felt
overwhelmed, confused, and frustrated because the foundational principles
of quality assessment (Black & Wiliam, 2018), such as clear criteria, feed-
back, and evidence of learning, were not readily transferred from face-to-face
classrooms to online settings. Fox and Rheaume conclude that implementing
quality assessment principles is essential when teaching in online contexts in
order to acknowledge the merit of student work and learning.
In the final chapter in this section, “Assessing the Content Knowledge,
Skills, and Competencies of Teacher Candidates in an Online
Environment: A Case Study,” author Renee Bourgoin uses an intrinsic case
study approach to gain insights into teacher candidates’ experience of online learning. Bourgoin identifies specific assessment
practices that were particularly impactful and important to her teacher
candidates’ learning. Certain conditions allowed Bourgoin to uphold and
model effective assessment practices in her classes, including consistent
student attendance that facilitated active engagement and participation.

Part 2: Reconceptualizing Assessment Frameworks for Preservice Teachers

Part 2 offers four chapters in which the authors examine ways that
assessment strategies were reconceptualized for online teaching and
learning in teacher education programs. In Chapter 6, titled “Decolonizing
Assessment Practices in Teacher Education,” authors Joshua Hill, Christy
Thomas, and Allison Robb-Hagg present an account of their collabor-
ation in integrating decolonizing assessment practices in a fully online
teacher education course. Their work drew upon decolonizing principles
of storytelling and negotiation to support the integration of Indigenous
ways of knowing and doing in the course learning tasks, formative
assessment strategies, and determination of grades. The authors con-
clude with a discussion of the challenges they encountered and the peda-
gogical decisions they made on their journey to decolonize assessment
practices in their online classrooms.
Chapter 7, “Meaningful Feedback in the Online Learning Environment” by Maggie McDonnell, uses Nicol and Macfarlane-Dick’s (2006) seven
principles of effective feedback to provide an analytic literature review
for making feedback meaningful and effective in the online environ-
ment. Best practices in providing feedback through strategies such as
video and audio tools, single-point rubrics, peer and self-assessment,
scaffolded assessments, and reflective practice are discussed in this
chapter. McDonnell concludes that feedback, essential at every stage of
assessment, fosters a sense of ownership and accountability in students.
In Chapter 8, authors Tan Xuan Pham, Chi (Linh) Tran, Le Pham Hue
Xuan, and Giang Nguyen Hoang Le share personal narratives from their
experiences in various cross-national graduate education programs. The
notion of presence, a key component to online teaching, is explored in
their chapter, “Analyzing Presence in Online Learning Environments
Through Student Narratives: An Autoethnographic Study.” Insights
from their experiences of online learning in the context of the COVID-
19 pandemic are framed through the interrelationships of three
concepts: cognitive presence, social presence, and teaching presence.
In their conclusions, the authors discuss implications for teachers and
institutions related to establishing meaningful pedagogical presence in
online course delivery.
Chapter 9, entitled “The Unintended Influence of COVID-19:
Optimizing Student Learning by Advancing Assessment Practices
Through Technology,” offers insights into the possibilities and challenges
associated with assessment literacy in two preservice online courses.
Technology has been a consistent presence in classrooms for years; however, with the shift to online learning, it became a critical tool for online
teaching and learning. Authors Katrina Carbone, Michelle Searle, and Saad
Chahine used the technological, pedagogical, and content knowledge
framework (Mishra & Koehler, 2006) to deliver and enhance preservice
instruction and promote assessment literacy. Themes that emerged from
their case study suggest that preservice teachers gained a new awareness
of the possibilities and the challenges of assessment practices and the role
of technology in online teaching and assessment.

Part 3: Teacher Educators and Assessment in K–12 Contexts

In Part 3, authors share research conducted on shifting online assessment practices in K–12 classroom contexts. Beginning in Chapter 10,
“Pedagogically Hacking the System: Developing a Competency-Based Digital Portfolio,” authors Kathy Sanford, Hong Fu, Timothy Hopper, and
Thiago Hinkel investigate the emerging competency-based assessment
practices of four teachers in British Columbia. The redesigned provincial
curriculum requires teachers to develop a competency-based assessment
protocol. Teacher participants in the study developed alternative online
assessment approaches to measure and report student learning in response to this initiative. Results from their research clarify the conditions that
enabled participating teachers to shift to an innovative competency-based
approach to student assessment that fostered dialogue between them and
their students.
In Chapter 11, “Malfunction: Regressive and Reductive Online
Assessment Practices,” authors Shannon D. M. Moore, Bruno de Oliveira
Jayme, and Kathy Sanford report that assessment practices used by
teacher participants in their study did not align with the teachers’ own
understanding of sound assessment practices. The authors caution
that although educators may be invested in practices that support stu-
dent learning, the use of EdTech can encourage regressive pedagogy
and assessment practices. In their conclusions, they present alternative
approaches to thinking about and enacting online assessment practices
that are authentic and meaningful.
Chapter 12 continues the assessment discussion on teachers’
adjustments and adaptations of assessment strategies. Authors Katrina
Carbone, Michelle Searle, and Lori Kirkpatrick, in “Leveraging the
Relationship Between Assessment, Learning, and Educational Technology,”
explore teacher assessment practices with the use of technology. Using
Puentedura’s (2006) substitution, augmentation, modification, and redef-
inition (SAMR) framework and qualitative secondary analysis to analyze
data, the authors gained insights into how teachers leveraged technology
to support their assessment practices. The authors discuss their findings
and conclude that interweaving assessment practices with sound tech-
nology implementation can help teachers to effectively assess student
learning.
In Chapter 13, “Isolation/Adaptation/Education: Moving Hands-On
Secondary Visual Art Classes to a Virtual Platform,” author Christina
Yarmol discusses the challenges of shifting hands-on, in-person high school visual arts classes to online course delivery. Using a/r/tographic methodology, a theoretical framework that engages in both
self-inquiry and collective inquiry, Yarmol offers insights into how the
obstacles teachers faced during the pandemic, such as ensuring access
to art materials, synchronously teaching basic skills, and sharing online
resources, could be met with creativity. Yarmol also discusses how the
reimagining, teaching, and assessing of a visual arts curriculum in an
online environment resulted in a rich learning experience for her students.
In the next chapter, “Using the SAMR Model to Design Online K–12
Assessments,” author Sheryl MacMath describes her work with teacher
candidates when their practicum experiences were moved online.
Although many aspects of assessment strategies used in face-to-face
classrooms also worked in online classes, these strategies had to be
reimagined to be most effective in virtual spaces. MacMath introduced
Puentedura’s (2006) SAMR model into her course assessment frame-
work and modelled this approach for her teacher candidates to support
them in developing assessment literacy. Teacher candidates applied this
knowledge in carrying out online assessment of their students during
their practicum experiences. The chapter concludes with recommended
programs and apps that can be used when assessing student work.
In Chapter 15, “Adapting Classroom Assessment Practices for Online
Learning: Lessons Learned From Secondary School Teachers in the Early
Days of COVID-19,” Michael Holden, Christopher DeLuca, Stephen
MacGregor, and Amanda Cooper provide an overview of contemporary
research in formative assessment and implications for online learning.
They argue that disruptions created by the pandemic exacerbated
preexisting challenges with equitable access to education. The authors
interviewed 17 secondary school teachers to examine how formative
assessment strategies were used to meet students’ learning needs and
how systemic challenges were met during the shift to online learning.
Insights provided in this chapter will assist classroom teachers and pre-
service teachers in considering how to best adjust effective assessment
strategies to support online learning.
In the chapter “Equity in Action: Virtual Learning Environment
Considerations,” author Sharlene McHolm uses a narrative inquiry
approach to examine the conditions of learning for online learners and
shares experiences of school administrators leading remote learning
schools during a time of significant upheaval. The author questions
what constitutes an excellent teaching environment, in both face-to-face
and virtual contexts. She shares strategies that supported the learning
of exceptional students, finding that neurodiverse, medically fragile,
and marginalized students can thrive in a virtual learning environment.
Finally, McHolm discusses implications for future educators, noting the
opportunities and potential challenges the online environment offers to
educators and students.
In the final chapter of Part 3, the author of “Ubuntu at a Distance: Online Assessment for Care, Justice, and Community,” Sarah Elizabeth
Barrett, applies the sub-Saharan ethic of ubuntu as a conceptual lens to
examine the experiences of 50 K–12 teachers to better understand the
nature of assessment in online classrooms. Barrett recontextualized initial
themes from transcribed interviews into three aspects of ubuntu: justice,
community, and caring. Assessing students’ academic needs was related to
justice, accurate and equitable assessment to community, and assessment
of well-being to caring. The author concludes that community building
is essential to creating and maintaining a healthy classroom environment that supports the ethic of ubuntu in assessment practices.

Final Thoughts

The chapters in this collection offer a springboard for teacher educators to (re)assess the values that underpin Bachelor of Education programs and
to “critically examine what we [are] teaching and how we [are] teaching
it” (Danyluk et al., 2022, p. 334). This work is particularly relevant given
that “preservice teachers are demanding flexibility in how they take
their courses, including through blended and online delivery” (Danyluk
et al., 2022, p. 339). Each chapter takes up the challenge of more fully
understanding the best pedagogical practices associated with assessment
in online learning environments. The pandemic presented an opportunity
to renew and refocus our interest in this learning context. It also offered
us a chance to consider the ways in which assessment practices support
the learning of preservice teachers and, in turn, inspire them to use mean-
ingful assessment practices as future teachers with their own students in
K–12 teaching environments. Further, the themes and questions posed
in these chapters provide a starting point for teacher educators to con-
sider assessment practices as a vehicle for meaningful learning in online
learning contexts, and “to rethink educational ideals in light of changing
technologies and changing human literacies” (Tarc, 2020, p. 121).
Underlying this renewed attention towards assessment pedagogies is our
belief that the quality of preservice teachers’ student experience is para-
mount. The research on online learning assessment practices presented
in this collection reaffirms the value of a strong pedagogical relationship
between instructor and student. It also highlights the inherent challenges in
establishing these pedagogical connections and providing the meaningful
feedback about learning needed for student engagement and understanding.
Two components are key: (a) designing appropriate and meaningful assessment strategies that allow preservice teachers and students to demonstrate their understanding, and (b) creating a sense of community in
the online environment that allows all preservice teachers and students
to access and participate in the learning opportunities provided. Student-
student and teacher-student relationships matter in both online and face-
to-face learning environments. These relationship dynamics are especially
relevant for elementary and secondary students, but even in postsecondary
education, the connection between instructors and students “is vital” (Tarc, 2020, p. 123). As teacher educators, we want to continue prioritizing
the pedagogical relationships we build with our students and to foster in
them a deep connection to the knowledge that shapes their teaching.
The act of teaching reflects our values and beliefs in virtually all peda-
gogical decisions that we make. “Teaching is highly personal—an intensely
intimate encounter” (Ayers, 2001, p. 26). Ayers described the work of a
teacher as “becoming the student to your students . . . [and taking up the
task of creating] an environment for learning, a nurturing and challen-
ging space in which to travel” (p. 26). We hope this collection offers an
acknowledgement that this learning space is enriched when the manner
in which we measure student learning reflects not only the formal cur-
riculum, but also considers the less tangible but essential ways in which
students build meaning between their own experiences and knowledge
and the wider wilderness of the world around them (Grumet, 2006).

References

Ayers, W. (2001). To teach: The journey of a teacher (2nd ed.). Teachers College
Press.
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.
1080/0969594X.2018.1441807
Burns, A., Danyluk, P., Kapoyannis, T., & Kendrick, A. (2020). Leading the pan-
demic practicum: One teacher education response to the COVID-19 crisis.
International Journal of E-Learning & Distance Education, 35(2), 1–25. www.ijede.ca/
index.php/jde/article/view/1173/1836
Caprara, L., & Caprara, C. (2021). Effects of virtual learning environments:
A scoping review of literature. Education and Information Technologies, 27,
3683–3722. https://doi.org/10.1007/s10639-021-10768-w
Danyluk, P., Burns, A., Hill, S. L., & Crawford, K. (Eds.). (2022). Crisis and
opportunity: How Canadian Bachelor of Education programs responded to the pan-
demic (Canadian Research in Teacher Education series, Vol. 11). Canadian
Association for Teacher Education. https://doi.org/10.11575/PRISM/39534
Grumet, M. (2006). Where does the world go when schooling is about schooling?
Journal of Curriculum Theorizing, 22(3), 47–54.
Hartwell, A., Brown, B., & Hanlon, P. (2021). Designing for technology-enabled
learning environments. University of Calgary. http://hdl.handle.net/1880/
113710
Heritage, M. (2010). Formative assessment: Making it happen in the classroom.
Corwin.
Johnson, N. (2021). Evolving definitions in digital learning: A national framework
for categorizing commonly used terms. Canadian Digital Learning Research
Association. www.cdlra-acrfl.ca/2021-cdlra-definitions-report/
Johnson, N., Seaman, J., & Poulin, R. (2022). Defining different modes of
learning: Resolving confusion and contention through consensus. Online
Learning Journal, 26(3), 91–110.
Looney, A., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualizing
the role of teachers as assessors: Teacher assessment identity. Assessment in
Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.108
0/0969594X.2016.1268090
Mason, J. (2002). Researching your own practice: The discipline of noticing. Routledge.
Metcalfe, A. S. (2021). Visualizing the COVID-19 pandemic response in Canadian
higher education: An extended photo essay. Studies in Higher Education, 46(1),
5–18. https://doi.org/10.1080/03075079.2020.1843151
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge:
A framework for integrating technology in teachers’ knowledge. Teachers College
Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated
learning: A model and seven principles of good feedback practice. Studies in
Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
Puentedura, R. (2006, August 18). Transformation, technology, and education [Blog
post]. http://hippasus.com/resources/tte/
Tadesse, S., & Muluye, W. (2020). The impact of COVID-19 pandemic on educa-
tion system in developing countries: A review. Open Journal of Social Sciences,
8, 159–170. https://doi.org/10.4236/jss.2020.810011
Tarc, P. (2020). Education post-‘Covid-19’: Re-visioning the face-to-face class-
room. Current Issues in Comparative Education, 22(1), 121–124. https://files.
eric.ed.gov/fulltext/EJ1274311.pdf
Weleschuk, A., Dyjur, P., & Kelly, P. (2019). Online assessment in higher educa-
tion. Taylor Institute for Teaching and Learning Guide Series, University of
Calgary.
Xu, Y., & Brown, G. T. (2016). Teacher assessment literacy in practice:
A reconceptualization. Teaching and Teacher Education, 58, 149–162. https://
doi.org/10.1016/j.tate.2016.05.010
Part I
Assessment Shifts in
Teacher Education
1 Exploring Assessment in a Digital Age
Preservice and In-service Teachers’ Professional Learning Experiences
Nadia Delanoy, Barbara Brown, and Jodie Walz

Within this extraordinary time in educational history, given the COVID-19 pandemic and the expansion of technological advancements and digital
learning spaces, educators are trying to adapt to new learning contexts
while continuing to advance pedagogy in assessment (Bragg et al., 2021;
Crompton et al., 2021; Darling-Hammond & Hyler, 2020). The chasm
between theory and practice is widening as digital methods, tools, and
ways of knowing continue to advance rapidly (Bereiter, 2014; DeLuca
et al., 2015; Marzano & Pickering, 2011). Professional learning can provide
opportunities to engage preservice teachers, in-service teachers, graduate
students, and instructors to examine their learning and apply evidence-
based designs to connect theory to practice (Friesen & Brown, 2021).
The pandemic disrupted education systems worldwide (United Nations
Sustainable Development Group, 2020). System leaders and educators
made attempts to mitigate the disruptions and find ways to continue
learning in K–12 and in higher education contexts (Bozkurt et al., 2020).
For example, professional learning and training for educators often moved
to an online format (Hill et al., 2020; Quezada et al., 2020). In the fall of
2021, we offered a professional learning series called Assessment in a Digital
Age, comprising five synchronous sessions on topics related to assessment
in online teaching and learning. The participants included undergraduate
students in education (preservice teachers), education faculty, graduate

DOI: 10.4324/9781003347972-3
students in education, and practising teachers (in-service teachers) from a school district. Each session was 60 minutes and focused on proven
assessment practices informed by research and linked to competencies in
the Alberta Teaching Quality Standard (Alberta Education, 2020) related to
career-long learning and applying current practices for planning, instruc-
tion, and assessment. Following each session, we gathered data from
participants to learn about their experiences and determine how to con-
tinue to improve the series design.
In online professional learning, the ability of facilitators to bring to
life the intentions behind the content are crucial to how participants
perceive the success of the sessions (Darling-Hammond & Hyler, 2020).
In this study, we brought together our professional and research-based
experiences, as well as our pedagogical and field-based experiences, to
help inform the design and delivery of the online professional learning
series. The aim of our study was to examine how teachers experienced
a professional learning series designed to develop digital assessment
practices.

Literature on Assessment

Assessment has always been a challenging area for educators, and some-
times they perceive digital advancements as adding to the complexity
of assessment practices (Kurvinen et al., 2020). For innovative redesign
of assessment practices, teachers need to collectively reflect on the
challenges with current practices (Guskey, 2000; O’Connor, 2007). For
example, common challenges include integrating meaningful formative
assessment into learning and bringing students into the assessment and
grading process (O’Connor, 2007). Moreover, educators struggle with
balancing the demands of high-stakes or standardized tests and creative
means of assessing students (Wiliam, 2017). Many educators believe
they cannot keep up with the pace of advancements and consequently
feel uncertain about how to implement appropriate applications (Paulus
et al., 2020). As such, an examination of the relationship between theory
and practice can be the foundation for reflection and enhance educators’
level of comfort with digital applications, affirm utility of the tools in
the digital space, and aid in continuing to advance pedagogy and student
outcomes (Kurt, 2019; Mishra & Koehler, 2006).
As Boltz et al. (2021) and Scull et al. (2020) asserted, practitioners
must harness the social and constructivist nature of learning to continue
innovating pedagogically and in the digital space to increase assessment effectiveness. Professional learning should help reframe student-centred
approaches to realize a high level of personalization (Aguilar-Cruz &
Medina, 2021; Charania et al., 2021; Robinson, 2021). During the pan-
demic, the accelerated use of digital applications within online, hybrid,
and blended learning environments further precipitated the need for flex-
ible and adaptive professional learning offerings (Scull et al., 2020).
In a study focused on the Australian perspective of preservice teacher
development, many preservice teachers shared that their approach was
in many ways trial and error, with adjacent real-time supports for pro-
fessional development (Scull et al., 2020). Moreover, preservice teachers
needed to find professional learning to help with online pedagogies
and student engagement; emergent advancement was a theme across
many studies (Boltz et al., 2021; Robinson, 2021; Scull et al., 2020). In
another study that solicited preservice teacher feedback on teaching in
a digital arena, Aguilar-Cruz and Medina (2021) shared that many pre-
service teachers reported their problem-solving skills were honed more
readily online than in person because of how they navigated the digital
environment, student engagement, and opportunities for collaboration.
Additionally, communication skills and ways of relating to students were
areas where preservice teachers continued to advance as complexities in
the learning environment presented themselves (Walker & White, 2013).
Providing authentic opportunities to address different learning
scenarios, cultivating an openness to the climate, and reducing fears and
perceived challenges in the digital space should be at the forefront of not
only the design of professional learning but also the delivery (Koehler
et al., 2014). In one study on digital capacity and professional learning
during the pandemic, researchers highlighted aspects of equity in techno-
logical applications such as access and understanding of tools, as well as
the need for mindful pedagogical application even in a disrupted state
(Talib et al., 2021). This state of uncomfortable confidence can happen
only with guided professional learning in a state of disruption; opportun-
ities to apply digital methods must be created at all levels of education
(Cotton, 2021). The integration of opportunities to safely engage in trial
and error learning and to receive organic mentoring from those in the
field—whether educational technologists, in-service graduate students,
or experienced practitioners—within interactive professional learning
can provide a low-stakes opportunity to holistically develop instructional
acumen. Capacity building can be limited by assumptions about individ-
uals who grow up in the digital world and their level of comfort with
teaching and learning in online environments (Boltz et al., 2021; Kalloo et al., 2020; Talib et al., 2021). Helping educators to recognize the different
learning cadences of online and in-person environments and to execute
contingency planning when using digital tools is essential to competency
building (Charania et al., 2021).
Teacher capacity is a significant facet in the digital age; understanding
how applications and platforms can be used in student-centred and
personalized approaches can support fluent integration (Darling-
Hammond & Hyler, 2020). Elements such as pacing and social and intel-
lectual engagement, as Talib et al. (2021) stated, are integral in the design
of lessons and how they are delivered. Pedagogy, content, and knowledge
all coalesce within the digital space and interrelate with teacher comfort
and the need for opportunities in relation to professional learning; respon-
sive and iterative aspects within the professional learning design; and time
to deeply engage, play, and receive feedback (Mishra & Koehler, 2006).
In a systematic literature review on the impact of technology, Daoud
et al. (2020) reinforced the importance of considerations of equity to
support teacher implementation and use of digital tools to ensure access
to devices, infrastructure within districts in terms of network capacity,
and teacher competency (Alberta Education, 2013, 2020).
As districts and schools continue to embrace the digital age, policy
frameworks such as the Learning and Technology Policy Framework
(Alberta Education, 2013) will be essential, as elements of infrastructure
and pedagogy need to be considered to support student learning
(Selwyn et al., 2020). Daoud et al. (2020) argued that events such as the
pandemic, digital advancement, and the use of technology in education
need to be deeply considered, and an exploration of how these real-
ities are led, supported, and enacted can be critical to understanding
pedagogical change. Bridging theory and practice within professional
learning and embedding digital methods to move these changes for-
ward can provide a rich backdrop for learning (Anderson et al., 2021;
Friesen & Brown, 2021).

Theoretical Framework

The technological, pedagogical, and content knowledge (TPACK)
framework was introduced by Mishra and Koehler (2006), based on Shulman's
(1987) discussion of pedagogical content knowledge, and it is widely
used in teacher education. Content knowledge is the teacher’s knowledge
Exploring Assessment in a Digital Age 19

of the subject matter to be taught, such as outcomes from a developed
curriculum in math, science, or social studies. Pedagogical knowledge is
the teacher's knowledge of the methods used to deliver content to students;
it encompasses how students learn and is often referred to as the teacher's
toolbox or repertoire of practice. Technology knowledge is the teacher's
understanding of technology and how to apply it in meaningful ways
such as through creation and collaboration (Cotton, 2021). Each domain
plays an important role by itself, but Mishra and Koehler suggested the
interplay of these three domains creates constructive ways to effectively
integrate technology in the classroom while also considering content and
pedagogy.
Studies of technology integration have shown that the opportunity for
teachers to holistically integrate pedagogy, content, and technology for
deep understanding of the use of digital tools provides a foundation
for the design of the professional learning (Koehler et al., 2014). The
TPACK framework can enable preservice and in-service teachers to inte-
grate technology effectively in their teaching practice. In the current
study, the framework helped guide the design of the professional learning
and research focused on developing digital assessment practices.
Preservice teachers are in the process of developing discrete and
integrated technological, pedagogical, and content expertise, whereas
in-service teachers understand the content and pedagogies but often lack
the technological knowledge required to use contemporary technology
to its full potential (Koehler et al., 2014). TPACK helps both preservice
and in-service teachers navigate the complexity of technology in the
classroom setting (Voogt & McKenney, 2017). The TPACK framework
(Mishra & Koehler, 2006) guides teachers in effectively integrating tech-
nology into their teaching once they become proficient with content
knowledge, pedagogical knowledge, and technological knowledge.
Using TPACK as a theoretical frame for the study enabled us to con-
sider the critical dimensions of technology, content, and pedagogical
knowledge needed for developing digital assessment practices theoretic-
ally and practically (Koehler et al., 2014). Each of the three domains can
be combined in different ways to support the diverse needs of a teacher,
their classroom, and students. These three domains come together to
create four additional subdomains: pedagogical content knowledge,
technological content knowledge, technological pedagogical know-
ledge, and technological pedagogical content knowledge. TPACK is the
culmination of these domains, as shown in Figure 1.1. The dotted line
represents the context or differing applications of each of the knowledge
areas and how the relationship between these areas supports the diverse
nature of classrooms.

Figure 1.1 TPACK framework. From "Using the TPACK Image," by M. J.
Koehler, 2011 (http://matt-koehler.com/tpack2/using-the-tpack-
image/). Copyright 2011 by tpack.org. Reproduced with permission.
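The four subdomains named in the text arise combinatorially from the three base domains: every pairwise intersection plus the triple intersection. A minimal sketch of that structure (the string labels are ours and purely illustrative, not part of the framework's formal definition):

```python
from itertools import combinations

# The three base TPACK domains (Mishra & Koehler, 2006).
domains = ["technological", "pedagogical", "content"]

# Every pairwise intersection plus the triple intersection yields the
# four subdomains named in the text (TPK, TCK, PCK, and TPACK itself).
subdomains = [
    " + ".join(combo)
    for size in (2, 3)
    for combo in combinations(domains, size)
]

for name in subdomains:
    print(name)
```

Three pairs and one triple give the four subdomains; the dotted context line in Figure 1.1 has no analogue in this toy enumeration.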

Methodology

Design research in education uses an iterative approach and is often
conducted for interventions, on interventions, or through interventions
(McKenney & Reeves, 2018). In this case, we conducted design research
on an intervention. The intervention was a professional learning series
offered online during the COVID-19 pandemic. In this design-based
study, our team (also the authors of this chapter) included an educational
technology consultant, a university instructor, and an associate dean in
the Faculty of Education. This team formed a partnership to develop
and facilitate a professional learning series to support praxis. Looking
at the research through a lens of what education can be was integral to
the process (Bakker, 2018). The level of authenticity in the design of the
professional learning sessions during unprecedented times supported
innovative and emergent pedagogy (Bakker, 2018; McKenney & Reeves,
2018). Design-based research has been shown to foster partnerships
between researchers and practitioners and harness the intersections of
research and practice (Getenet, 2019).
The five sessions in the series were offered free of charge via Zoom
(https://zoom.us) during the fall of 2021 and advertised through the
Faculty of Education website. Each session offered the opportunity for
participants to explore a different facet of assessment and online peda-
gogy using the TPACK framework (Mishra & Koehler, 2006). The sessions
afforded participants the time to explore at least two to three digital tools,
applications, or practices while considering how best to integrate them
pedagogically.
Although we recommended that participants attend all five sessions
for microcredential eligibility, they were welcome to attend any number
of sessions. In the sessions, participants had opportunities to engage
with the theory, research, and considerations of research-informed
practices related to each theme, as listed in Table 1.1.

Table 1.1 Professional learning series topics and number of attendees.

Session  Date         Topic                                           Attendees (n)
1        October 14   Innovative and student-centred approaches       45
2        October 28   Learning about designing digital lessons        46
3        November 18  The communication of student learning           38
4        December 2   The impact of gamification for learning         40
5        December 9   Benefits of using a learning management system  36

Additionally, teachers were given time to apply digital tools and
platforms using various lesson approaches and practice scenarios. For
example, in the second session, participants learned about personalizing
and differentiating learning for students using guides, such as online
choice boards, to map out a range of options for assignments and diverse
ways for students to demonstrate their learning. Choice boards can
be customized for individual students to complete at their own pace,
and the assignment chart or personalized playlist can be used to
integrate knowledge, give students the ability to explore through internet
links, and show their understanding in digitally oriented ways. During
the fourth session, participants were introduced to ways of gamifying
learning, and they explored how game-based learning can be integrated
into learning designs. Last, participants were encouraged to consider
how they assess, specifically how they evaluate and measure student
learning, and how this can be communicated through different media
such as portfolios, interactive applications, or learning management
systems. The sequence of the design provided entry points for the
participants and building blocks to support a digital repertoire of
competencies from a theoretical and practical level of understanding.
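A choice board of the kind described for the second session is, structurally, a mapping from activity options to the evidence of learning each produces, from which a student's personalized playlist is drawn. A toy sketch under assumed data (the activities, evidence types, and the `personalized_playlist` helper are all hypothetical, not drawn from the session materials):

```python
# A hypothetical digital choice board: each activity option is paired with
# the evidence of learning it produces. All names here are illustrative.
choice_board = {
    "create a podcast episode": "audio recording",
    "build an interactive timeline": "shared web link",
    "write an illustrated explainer": "digital document",
    "record a screencast walkthrough": "video file",
}

def personalized_playlist(board, selections):
    """Return the (activity, evidence) pairs a student selected,
    to be completed at the student's own pace."""
    return [(activity, board[activity]) for activity in selections]

playlist = personalized_playlist(
    choice_board,
    ["create a podcast episode", "record a screencast walkthrough"],
)
print(playlist)
```

The point of the structure is that the board is shared while the playlist is per student, which is what allows the self-paced, differentiated completion the text describes.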
During the interactive synchronous Zoom sessions, we used a tech-
nique referred to as chatterfall. We asked all participants to respond to a
question at the same time by entering their responses in the chatbox to
share their understanding, show how they conceptualize ideas, and reflect
on their practice. The relational capital and information solicited from
participants during each session were used in a design-based way to adapt
the subsequent offerings and better serve the needs of the participants
(Blackmon, 2012). Through the professional learning series and research,
we aimed to understand how the design ideas informed by the TPACK
framework were experienced by asking participants for their feedback
following the sessions. The part of the study discussed in this chapter was
guided by the following question: How do teachers experience a profes-
sional learning series designed to develop digital assessment practices?
Undergraduate students (preservice teachers), graduate students, fac-
ulty, sessional instructors, and K–12 in-service teachers affiliated with the
university attended the professional learning series and were invited to
participate in the study in compliance with the research ethics board cer-
tificate of approval. Attendees were informed that participation in the
study would be voluntary and confidential and was not required in order to attend
the professional learning series. Each session had 36 to 46 participants in
attendance, and 16 individuals provided evidence of completing all five
sessions and requested a digital badge from the administrator.
A questionnaire with 11 questions was administered to participants
after each session. The first question asked participants to identify their
role, and the second question asked about the dates of the professional
learning sessions they attended. Questions 3 to 6 asked participants to
rate items using a four-point scale ranging from strongly agree to strongly
disagree, and Questions 7 to 11 were open-ended. The questions were as
follows:

1. Which best identifies you? (We recognize you may have more than
one of the roles listed. Please select your primary role or use other to
identify a different role that is not listed.) Preservice teacher (under-
graduate student); Instructor (includes faculty of all ranks, contract
faculty, sessional); Graduate student; In-service teacher (includes
practising teacher in the field, mentor teacher, substitute teacher,
learning leader); Other.
2. What date(s) did you attend an online pedagogy series?
3. I had a good experience attending the session.
4. The session helped inform my practice.
5. I will likely use something from the session in my practice.
6. I found the session helpful with my teaching amid the COVID-19
pandemic.
7. Based on your experience, what did you find most helpful that you
plan to use in your teaching or that has informed your teaching
practice?
8. How might this session help with your teaching during the COVID-
19 pandemic?
9. What were the benefits of participating in the online pedagogy series?
10. What recommendations do you have to improve the design of the
online pedagogy series?
11. Provide any additional comments that you would like to share with
the research team regarding the online pedagogy series.

During each session, the researchers provided information about the


study and then shared the link to the questionnaire in the chat at the end
of the webinar. A third-party administrator also sent a follow-up email to
all participants to remind them to complete the questionnaire. Over the
course of the professional learning sessions, 23 participants completed
the questionnaire.
We used the input gathered from participants during the sessions
and the data collected from the post-session questionnaires to inform
the ongoing design for the professional learning. We also analyzed the
questionnaire feedback to gain a deeper understanding of how the series
influenced participants' digital assessment practice using descriptive
statistics for Questions 3 to 6 and open coding for Questions 7 to 11 (Miles
et al., 2020). All members of the research team worked collaboratively
to review the open-ended text responses and generate themes to help
inform future iterations of the online professional learning series.
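The two analysis steps described above, descriptive statistics for the four-point scale items and open coding of the free-text responses, can be sketched with invented data (the responses and codes below are illustrative only, not drawn from the study):

```python
from collections import Counter

# Invented Likert responses for one scaled item (Questions 3 to 6 used a
# four-point scale from strongly agree to strongly disagree).
likert = ["strongly agree", "agree", "agree", "strongly agree", "agree"]

counts = Counter(likert)
agreement = sum(counts[c] for c in ("strongly agree", "agree")) / len(likert)
print(f"Agreement: {agreement:.0%}")

# Open coding of the free-text answers (Questions 7 to 11): tag each
# response with one or more codes, then tally codes into candidate themes.
coded_responses = [
    {"collaboration", "practical application"},
    {"flexibility"},
    {"collaboration", "easy access"},
]
theme_counts = Counter(code for codes in coded_responses for code in codes)
print(theme_counts.most_common(2))
```

In practice the coding itself was done collaboratively by the research team (Miles et al., 2020); only the tallying of agreement levels and code frequencies is mechanical in the way this sketch suggests.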

Findings

The demographic information gathered through the questionnaire
showed that participants included preservice teachers, in-service teachers,
graduate students, and university instructors. Findings demonstrated that
dedicated time for teachers at different career stages to work together,
reflecting on their practice through targeted professional learning, can
support iterative practice. Additionally, findings reaffirmed that this
dedicated time could support opportunities for teachers to develop an
assessment toolbox and a pedagogical repertoire situated around digital
technologies and formative assessment.
The sessions were highly attended across the series, and the majority
of those who attended communicated their appreciation, sharing
comments such as "attending the sessions was highly meaningful"
and that it "highly informed my practice." Additionally, 100%
strongly agreed or agreed they would use something from each session
in their practice and that each session provided practical, informative,
and instructional design-oriented opportunities that could be applied
to their practice. Furthermore, 100% of participants strongly agreed
or agreed that the sessions helped with their teaching, including instances
of teaching online, known as emergency remote teaching (Hodges
et al., 2020).
Specifically, many of the open-ended textual responses for Questions
7 to 11 reflected that participants found the discussion of the theoretical
aspects of assessment helped them develop their learning and adapt to
new challenges as they acclimated to the digital space. Additionally, they
found gamification of learning in math, online applications for commu-
nicating student learning, and the tools introduced to advance formative
assessment to be highly relevant to their learning and subsequent prac-
tice. Moreover, most participants shared their appreciation of the integra-
tion of the concept of digital citizenship, the support shared for planning
for technical difficulties, and the research-informed discussion that was
integrated into every session.
The participants’ responses about how the sessions helped with their
teaching, whether in practicum or in their own classrooms, indicated
that the professional learning was beneficial to their practice. Question 9
asked participants to share their perceived benefits of participating in
an online pedagogy series. The majority of participants saw value in the
series and expressed a high level of satisfaction. Table 1.2 displays the six
most common response themes from the participants.

Table 1.2 Top six response themes about the benefits of an online pedagogy
series.

Benefits of participating in          Sample excerpts from questionnaire
professional learning                 responses

Collaboration with others provides    "Collaboration with others who might
unique insights                       have unique insights."
Current and relevant technologies     "It was most beneficial to see what
used in teaching and learning         is currently being used to ensure
                                      inclusivity and meet the diverse
                                      needs of students."
Sharing knowledge and pedagogical     "Learning, finding out what others are
experiences                           practising, and sharing."
Practical application                 "Training and professionalism. Real
                                      answers rather than more questions."
Flexibility of sessions               "Flexibility of participation. I'm in
                                      the field right now and was able to
                                      seamlessly transition from a staff
                                      meeting to the session."
Easy access of online offering        "Per being online, I literally jumped
                                      on to the course. . . . I didn't have to
                                      choose between being there for my
                                      family, and my continuing education
                                      is invaluable."

Participants reported an overall positive experience with the professional
learning series. They found value in the content as well as the ways
in which engagement was fostered through interactive digital lessons,
tool exploration, and discussions of the connections to practice from
multiple access points. In the session on learning management systems
and portfolio use, tools were discussed for formative and summative
assessment, inquiry, and evaluation of student learning, including tool
capabilities for student voice with Mote (www.mote.com/) and Google
Forms (https://forms.google.com) with extended responses, and digital
lessons embedded with multiple forms of representation. Notable across
the results of this study were the examples of ways the teachers were able
to use these assessment tools. Overall, engaging with other teachers across
different career points helped with understanding the practical
considerations for technology-enhanced teaching and learning environments.

Discussion

Assessment as a construct is difficult to navigate, whether as a preservice
teacher, in-service teacher, instructor, or graduate student (Kurvinen
et al., 2020). For some, assessment is complicated by the expansion of
technology and the digital space. Rich feedback on the significance
of this professional learning series, the design of the theoretical
and practical applications, and the time provided for participants to play
highlighted the value participants found in their experiences and learning
processes. Through the lens of the TPACK theoretical framework, the
participants experienced this professional learning series in ways they
perceived to deepen their pedagogy and confidence in integrating tech-
nology meaningfully into their practice. For example, the teachers shared
that they applied their learning of the program of studies and use of
digital tools in relevant ways for assessment. The exploration of digital
tools provided an opportunity for participants to become familiar with
using them for assessment purposes. Through the content knowledge
dimension of TPACK, the professional learning series provided the
opportunity for participants to share their knowledge and pedagogical
experiences in relation to practical applications in teaching and learning.
Cotton (2021) highlighted in a K–12 study that content knowledge can be
seen as the anchor or foundation that teachers can use as a lever to design
learning for their students.
Technologically, the professional learning was designed to introduce
new technologies related to assessment tools and web platforms, and
to afford participants with time to practice embedding the technology
in instructional practice. Participants noted that technologies such as
Genially (https://genial.ly/), Minecraft (https://education.minecraft.
net/en-us/), Flipgrid (https://info.flipgrid.com/), Google Classroom
(https://classroom.google.com/), and Brightspace/Desire2Learn (www.
d2l.com/) were meaningfully integrated into activities or scenarios
that teachers might encounter in their own classrooms or practicum
experiences. This interplay in the TPACK framework between content
and technology was received well by participants as they valued the pro-
cess, found the engagement with the tools or platforms manageable, and
had time to play in a safe space where trial and error was supported by the
facilitators. Bragg et al. (2021) asserted that the meshing of technology
and content should be seamless so users can engage with technology as
an extension of learning and not as an add-on. The sessions aimed for
this seamless integration, coalescing content knowledge, technological
knowledge, and pedagogical knowledge.
As the focus was on assessment from a theoretical and pragmatic per-
spective, participants were able to conceptualize use of the digital tools and
platforms outside of the sessions more readily because of the professional
learning design. Technology was mapped to assessment purposes
throughout the sessions. From a professional learning stance,
pedagogy was the common thread interwoven throughout the series.
The sessions also focused on practical how-to guidance, and participants
embraced new tools and considerations for application to teaching and learning.
Participants shared their ideas and pedagogical considerations such as
student access points and utility. Bringing together teachers with varying
years of experience helped preservice teachers and in-service teachers
alike to contextualize their pedagogical knowledge and support others
in this process. For example, the relevance of this process during the
COVID-19 pandemic and how teachers traversed online, blended, and
hybrid approaches was widely discussed, providing, in multiple ways,
needed insight for preservice teachers as well as a sense of manageability.
In a study of preservice teachers in Australia, intermittent professional
learning supported deeper considerations of how to navigate technology
integration and shifts due to the COVID-19 pandemic (Scull et al., 2020).
Through the lens of this professional learning, rooted in digital
environments and grounded in the TPACK framework, teachers were
provided context to understand the interconnected dimensions of content,
technological, and pedagogical knowledge in the digital environment
(Koehler et al., 2014). Seeing this process as a holistic and interrelated
system can support recognition of technology as a tool for learning that is
interconnected with pedagogy and content. Engaging in professional
learning that fosters sharing, collaboration, capacity building through
play, and linkages to the program of studies can help teachers recognize
the connections between tools, platforms, and content more authentic-
ally. Providing a forum for educators to share their experiences across the
career span can make the navigation of the evolving digital world more
manageable and create a level of comfort that teachers on their own may
not experience. Though this study had a small group of questionnaire
respondents, their consistent underscoring of positive experiences and
lessons learned can inform the design and implementation of profes-
sional learning to support assessment in a digital age and can provide a
foundation for future study.
Figure 1.2 highlights the interplay and embedded nature of the
dimensions of TPACK, overlaid with descriptors for the design of the
professional learning within the technological, content, and pedagogical
knowledge circles. It also shows how participants experienced the online
professional learning on the outer edges, within the dotted line that
depicts the context of the professional learning series offered during the
COVID-19 pandemic.

Figure 1.2 Interplay of the TPACK framework with the professional learning
design and participant experiences. CK = content knowledge;
PCK = pedagogical content knowledge; PK = pedagogical knowledge;
TCK = technological content knowledge; TK = technological
knowledge; TPACK = technological, pedagogical, and content
knowledge; TPK = technological pedagogical knowledge. Adapted
from "Using the TPACK Image," by M. J. Koehler, 2011
(http://matt-koehler.com/tpack2/using-the-tpack-image/).
Copyright 2011 by tpack.org. Reproduced and adapted with permission.

Conclusion

This chapter provided the space to share the experiences of preservice
and in-service teachers who engaged in a professional learning series
to inform their teaching and assessment practice. We examined the
questionnaire responses provided by participants following each of
the sessions to gain an understanding of their experience and how the
sessions influenced their digital assessment practices. The data were also
used to inform our ongoing design of the professional learning series
using the TPACK framework (Mishra & Koehler, 2006). Overall, the
study demonstrated there were benefits for a university-district partner-
ship in offering an online professional learning series in response to the
pandemic, when teachers had to quickly transition to emergency remote
teaching (Hodges et al., 2020).
Participants reported that they were introduced to assessment theory and
technologies that could be used practically as assessment tools, that they
were provided with time to explore and play with technologies, and that they
developed confidence in using technology-enhanced formative assessment
tools and strategies. Their feedback also demonstrated the value of and
demand for online professional learning for a mixed group of preservice
and in-service teachers. The findings showed a need to continue studying
and improving online professional learning offerings to support teachers’
developing body of knowledge and career-long learning.

References

Aguilar-Cruz, P. J., & Medina, D. L. (September 1, 2021). Pre-service English
teachers' perceptions of their online teaching practice during pandemic
times. Propósitos y Representaciones, 9, Article e925. https://doi.org/10.20511/
pyr2021.v9nSPE1.925
Alberta Education. (2013). Learning and technology policy framework. Government
of Alberta. https://open.alberta.ca/publications/6354617
Alberta Education. (2020). Teaching quality standard. Government of Alberta.
https://open.alberta.ca/publications/teaching-quality-standard-2020
Anderson, M., Turner, A., & Brown, B. (2021). Designing online professional
learning to support in-service and preservice teachers adapting to emergency
remote teaching. The Journal of Applied Instructional Design, 10(3). https://doi.
org/10.51869/103/maatbb
Bakker, A. (2018). Design research in education: A practical guide for early career
researchers. Routledge.
Bereiter, C. (2014). Principled practical knowledge: Not a bridge but a ladder.
Journal of the Learning Sciences, 23(1), 4–17. https://doi.org/10.1080/105084
06.2013.812533
Blackmon, S. (2012). Outcomes of chat and discussion board use in online
learning: A research synthesis. Journal of Educators Online, 9(2). https://doi.
org/10.9743/JEO.2012.2.4
Boltz, L. O., Yadav, A., Dillman, B., & Robertson, C. (2021). Transitioning to
remote learning: Lessons from supporting K–12 teachers through a MOOC.
British Journal of Educational Technology, 52(4), 1377–1393. https://doi.
org/10.1111/bjet.13075
Bozkurt, A., Jung, I., Xiao, J., Vladimirschi, V., Schuwer, R., Egorov, G., Lambert,
S., Al-Freih, M., Pete, J., Olcott, D., Jr., Rodes, V., Aranciaga, I., Bali, M.,
Alvarez, A. J., Roberts, J., Pazurek, A., Raffaghelli, J. E., Panagiotou, N.,
de Coëtlogon, P., . . . Paskevicius, M. (2020). A global outlook to the inter-
ruption of education due to COVID-19 pandemic: Navigating in a time of
uncertainty and crisis. Asian Journal of Distance Education, 15(1), 1–126. www.
asianjde.com/ojs/index.php/AsianJDE/article/view/462
Bragg, L. A., Walsh, C., & Heyeres, M. (2021). Successful design and delivery
of online professional development for teachers: A systematic review of
the literature. Computers & Education, 166, Article 104158. https://doi.
org/10.1016/j.compedu.2021.104158
Charania, A., Bakshani, U., Paltiwale, S., Kaur, I., & Nasrin, N. (2021).
Constructivist teaching and learning with technologies in the COVID‐19
lockdown in eastern India. British Journal of Educational Technology, 52(4),
1478–1493. https://doi.org/10.1111/bjet.13111
Cotton, W. (2021). Examining technology integration in K–12 schools through the
TPACK framework during the COVID-19 pandemic [Doctoral dissertation].
Hampton University. ProQuest Dissertations Publishing.
Crompton, H., Burke, D., Jordan, K., & Wilson, S. W. G. (2021). Learning
with technology during emergencies: A systematic review of K–12 educa-
tion. British Journal of Educational Technology, 52(4), 1554–1575. https://doi.
org/10.1111/bjet.13114
Daoud, R., Starkey, L., Eppel, E., Vo, T. D., & Sylvester, A. (2020). The educa-
tional value of internet use in the home for school children: A systematic
review of literature. Journal of Research on Technology in Education, 53(4), 353–
374. https://doi.org/10.1080/15391523.2020.1783402
Darling-Hammond, L., & Hyler, M. E. (2020). Preparing educators for the time
of COVID . . . and beyond. European Journal of Teacher Education, 43(4), 457–
465. https://doi.org/10.1080/02619768.2020.1816961
DeLuca, C., Volante, L., & Earl, L. (2015). Assessment for learning across
Canada: Where we’ve been and where we’re going. Education Canada, 55(2).
www.edcan.ca/articles/assessment-for-learning-across-canada/
Friesen, S., & Brown, B. (2021). Advancing knowledge creation in education
through tripartite partnerships. Canadian Journal of Learning and Technology,
47(4). https://doi.org/10.21432/cjlt28052
Getenet, S. (2019). Using design-based research to bring partnership between
researchers and practitioners. Educational Research, 61(4), 482–494. https://
doi.org/10.1080/00131881.2019.1677168
Guskey, T. R. (2000). Grading policies that work against standards . . .
and how to fix them. NASSP Bulletin, 84(620), 20–29. https://doi.
org/10.1177/019263650008462003
Hill, C., Rosehart, P., St. Helene, J., & Sadhra, S. (2020). What kind of educator
does the world need today? Reimagining teacher education in post-pandemic
Canada. Journal of Education for Teaching, 46(4), 565–575. https://doi.org/10.
1080/02607476.2020.1797439
Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (March 27, 2020).
The difference between emergency remote teaching and online learning.
Educause Review. https://er.educause.edu/articles/2020/3/the-difference-
between-emergency-remote-teaching-and-online-learning
Kalloo, R. C., Mitchell, B., & Kamalodeen, V. J. (2020). Responding to the
COVID-19 pandemic in Trinidad and Tobago: Challenges and opportun-
ities for teacher education. Journal of Education for Teaching, 46(4), 452–462.
https://doi.org/10.1080/02607476.2020.1800407
Koehler, M. J. (May 11, 2011). Using the TPACK image. TPACK.org. http://matt-
koehler.com/tpack2/using-the-tpack-image/
Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The
technological pedagogical content knowledge framework. In J. M. Spector,
M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational
communications and technology (pp. 101–111). Springer.
Kurt, S. (September 16, 2019). TPACK: Technological pedagogical content knowledge
framework. International Society for Educational Technology. https://educational
technology.net/technological-pedagogical-content-knowledge-tpack-framework/
Kurvinen, E., Kaila, E., Laakso, M.-J., & Salakoski, T. (2020). Long term effects
on technology enhanced learning: The use of weekly digital lessons in math-
ematics. Informatics in Education, 19(1), 51–75. https://doi.org/10.15388/
infedu.2020.04
Marzano, R. J., & Pickering, D. J. (2011). Chapter one: Research and theory. In
The highly engaged classroom (pp. 3–20). Marzano Research Laboratory.
McKenney, S., & Reeves, T. C. (2018). Conducting educational research. Routledge.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2020). Qualitative data analysis:
A methods sourcebook (4th ed.). SAGE.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge:
A framework for integrating technology in teachers’ knowledge. Teachers College
Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
O’Connor, K. (2007). A repair kit for grading: 15 fixes for broken grades. Pearson.
Paulus, M. T., Villegas, S. G., & Howze-Owens, J. (2020). Professional learning
communities: Bridging the technology integration gap through effective pro-
fessional development. Peabody Journal of Education, 95(2), 193–202. https://
doi.org/10.1080/0161956X.2020.1745610
Quezada, R. L., Talbot, C., & Quezada-Parker, K. B. (2020). From bricks and
mortar to remote teaching: A teacher education program’s response to
COVID-19. Journal of Education for Teaching, 46(4), 472–483. https://doi.org/
10.1080/02607476.2020.1801330
Robinson, M. (2021). The virtual teaching experience with Google Classroom
during COVID-19: A phenomenological study [Doctoral dissertation]. St. John’s
University. ProQuest Dissertations Publishing.
Scull, J., Phillips, M., Sharma, U., & Garnier, K. (2020). Innovations in teacher
education at the time of COVID-19: An Australian perspective. Journal of
Education for Teaching, 46(4), 497–506. https://doi.org/10.1080/02607476.20
20.1802701
Selwyn, N., Hillman, T., Eynon, R., Ferreira, G., Knox, J., Macgilchrist, F., &
Sancho-Gil, J. M. (2020). What’s next for ed-tech? Critical hopes and concerns
for the 2020s. Learning, Media and Technology, 45(1), 1–6. https://doi.org/10.1
080/17439884.2020.1694945
Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform.
Harvard Educational Review, 57(1), 1–23. https://doi.org/10.17763/haer.57.1.
j463w79r56455411
Talib, M. A., Bettayeb, A. M., & Omer, R. I. (2021). Analytical study on the impact
of technology in higher education during the age of COVID-19: Systematic
literature review. Education and Information Technologies, 26(6), 6719–6746.
https://doi.org/10.1007/s10639-021-10507-1
Exploring Assessment in a Digital Age 33

United Nations Sustainable Development Group. (2020). Policy brief: Education


during COVID-19 and beyond. https://unsdg.un.org/resources/policy-brief-
education-during-covid-19-and-beyond
Voogt, J., & McKenney, S. (2017). TPACK in teacher education: Are we pre-
paring teachers to use technology for early literacy? Technology, Pedagogy and
Education, 26(1), 69–83. https://doi.org/10.1080/1475939X.2016.1174730
Walker, A., & White, G. (2013). Technology enhanced language learning: Connecting
theory and practice-Oxford handbooks for language teachers. Oxford University
Press.
Wiliam, D. (2017). Embedded formative assessment (2nd ed.). Solution Tree Press.
2
Supporting Student Teacher Assessment Identity Development on Online Practica
A Narrative Inquiry

Jenny Ge

DOI: 10.4324/9781003347972-4
Practica are a critical component of initial teacher education (ITE)
programs in preparing student teachers for their diverse classroom
responsibilities, including leveraging contemporary (i.e., formative,
growth-oriented, student-centred) assessment practices to support stu-
dent learning. Studies have shown that practicum experiences help stu-
dent teachers significantly expand their conceptions of assessment which,
in turn, shape their approaches to assessment (e.g., Xu & He, 2019).
Encountering and resolving ethical and practical dilemmas on practica
has also been found to be instrumental in supporting teacher identity
development, which plays an important role in informing teachers’ con-
fidence, fulfilment, and commitment to their role (e.g., Alsup, 2006,
2019). As beginning teachers continue to report being underprepared
in assessment (Gareis et al., 2019; Rogers et al., 2022), there is a need
for ITE programs and teacher educators to understand how various pro-
grammatic experiences, such as practica, are supporting student teachers’
assessment practice.
Traditionally, ITE programs have focused on developing student
teachers’ assessment literacy—or assessment skills, knowledge, and
practices (Stiggins, 1991). Coombs et al. (2021) analyzed 173 course
descriptions from 12 ITE programs across Canada and found that course
content most commonly focused on assessment strategies (e.g., pro-
viding feedback, creating rubrics), the role of assessment in the teaching
and learning cycle, and accommodations and modifications to support
diverse learners’ needs. However, assessment literacy development
studies continue to show that teachers’ assessment beliefs and practices
are strongly influenced by personal and social factors (e.g., personal
experiences with assessment as students) beyond their learning (Xu &
Brown, 2016). These findings call for a new, more holistic approach to
assessment education.
More recently, researchers have begun to conceptualize and examine
teacher assessment identity (TAI; e.g., Adie, 2013; Looney et al., 2018;
Xu & Brown, 2016). TAI draws on teacher identity and assessment lit-
eracy research to account for the personal and social factors that mediate
teachers’ assessment learning and practice. In other words, the construct
suggests the extension of teacher education’s focus on teacher identity
development to include identity development related to assessment
specifically. Just as teacher identity development can support role sat-
isfaction, self-efficacy, and resiliency in the face of dynamic classroom
demands (e.g., Hong et al., 2017; Kelchtermans, 2005), assessment iden-
tity development may support the navigation of assessment dilemmas
and strengthen teachers’ assessment practice.
Up until the COVID-19 pandemic’s beginnings in Canada in 2020, the
Ontario College of Teachers (OCT), the regulatory college for teaching
in the province, had required student teachers to complete face-to-face
practica (Ontario College of Teachers, 2022). However, as the pandemic
mandated physical distancing, many student teachers were assigned
to online classrooms. With the rise of online learning and the poten-
tial for continued online practica in the future, ITE programs, teacher
educators, associate teachers (ATs), and student teachers themselves need
to understand how student teachers are engaging with assessment—and
developing their assessment identities—on online practica. Accordingly,
this study was led by the following questions:

1. What tensions do student teachers experience in observing and practising assessment on online practica?
2. Through what processes do student teachers negotiate these tensions?

Findings from this study can guide how ITE programs prepare and support
student teachers in not only developing their assessment identities on
online practica, but also navigating online classroom constraints to enact
assessment intentionally in support of student learning.

Conceptualizing Teacher Assessment Identity as Narrative and Dialogical

Drawing on the large base of teacher identity literature, early authors
on TAI have broadly conceptualized it as being iteratively formed and
comprising multiple dimensions or subidentities. Adie (2013) referred
to TAI as teachers’ self-perceptions as assessors and described iden-
tity briefly as narrative (i.e., formed and emergent through narratives).
Her study found that participating in moderated grading conversations
helped teachers recognize beliefs influencing assessment decisions, con-
sider others’ perspectives, and (re)construct their self-perceptions as
assessors. Based on a scoping review of 100 assessment literacy studies,
Xu and Brown (2016) created a new conceptual framework for teacher
assessment literacy in practice (TALiP) that also suggested teachers’ iden-
tities as assessors could be developed through engaging in continual prac-
tice and dialogue to negotiate tensions.
More recently, based on their review of teacher identity and
assessment literacy scales, Looney et al. (2018) proposed a detailed
framework for TAI that presented five interlinked dimensions expressed
through “I” statements: knowledge (“I know”), beliefs (“I believe”),
feelings (“I feel”), self-efficacy (“I am confident”), and role (“my role”;
pp. 455–456). These dimensions seek to capture the personal, social,
and affective influences on teachers’ assessment practices in addition to
teachers’ knowledge. Each dimension represents a subidentity that can
interact and conflict, reflecting teachers’ oft-reported mixed feelings
towards or conflicting conceptions of assessment (Looney et al., 2018).
The framing of these dimensions as “I” statements suggests that Looney
and colleagues were conceptualizing TAI as emerging narratively and/
or dialogically, although the authors did not explicitly refer to these
identity theories.
Given its newness, few empirical studies on TAI and its development
currently exist. To further current understandings of TAI, as well as to
operationalize the construct, this study approached the examination
of participants’ TAIs explicitly through narrative and dialogical lenses,
which are well established in teacher identity research. Narrative identity
theorists (e.g., Mead, 1934; Ricoeur, 1991) have conceptualized identity
as emerging and forming through narratives. Dialogical identity theories
(e.g., Akkerman & Meijer, 2011) further suggest that identity is constructed
through dialogue with others (e.g., through the narration of experiences)
or with the self (e.g., through reflection). With Looney et al.’s (2018)
framework as a guide for identifying dimensions of assessment identities
emergent in reflective narratives, these theoretical underpinnings enabled
an in-depth examination of not only what participants were experiencing,
but also, of importance, how those experiences were (re)shaping their
expressions of their identities over time.

Tensions in Identity Development

Limited empirical studies exist on the role of tensions in TAI development
explicitly. However, tensions in teacher identity development are well
studied and well established (e.g., Alsup, 2006, 2019; Pillen et al., 2013).
Alsup (2006) described tensions as arising due to “identity conflicts” (p. 57),
or the perception of a professional or personal identity as problematic by
another. In her study following six student teachers’ ITE journeys, Alsup
(2006) found that student teachers commonly experience the following
three tensions: (a) tensions between student teachers’ selves as learners
in ITE programs and selves as professional teachers while on practica, (b)
tensions between personal beliefs and professional expectations, and (c)
tensions between the ideologies and practices espoused by ITE programs
and those experienced during practica.
Pillen et al. (2013) reviewed 17 studies conducted between 1981 and
2011 on teacher identity development and found that teachers experienced
13 common tensions. Some of the most commonly experienced tensions
were between (a) the teacher’s desire to be tough or authoritative versus
caring or nurturing, (b) the teacher’s desire to spend time and energy on
work versus their private lives (i.e., finding a work-life balance), and (c) the
teacher’s personal versus others’ (e.g., mentors, colleagues) orientations
towards the profession. Both Pillen et al. and Alsup (2006) concluded that
negotiating and resolving tensions supported teachers’ self-efficacy and
resiliency in their role. This study aimed to identify assessment-specific
tensions and negotiation processes to inform ITE programs in supporting
assessment identity development—and teachers’ self-efficacy and resili-
ency in online assessment—on online practica.
Methods

I conducted this study using narrative inquiry methodology, which facilitated
the elicitation of participants’ identities through narratives and supported
a more nuanced understanding of experiences in context and over time
(Clandinin & Connelly, 2000). Data were collected from September 2020
to August 2021 as part of a larger study that examined TAI develop-
ment through an ITE program more broadly. For this study, I extracted
conversations related to participants’ experiences with learning about
and practising assessment on online practica, with the understanding that
these narratives were produced within other and overarching narratives.
The aim of this study was not to isolate and generalize assessment iden-
tity development on online practica, but rather to identify ways in which
assessment identity development can be supported on online practica.
Seven student teachers within one cohort of an ITE program in
Ontario volunteered to participate in the study (see Table 2.1). Three

Table 2.1 Participant details.

Name    Stream       Teaching subjects           Practicum placements
                                                 Fall                  Winter                  Summer
Paolo   Consecutive  French and math             Grade 9/10 French,    Grade 11 French,        Grade 9/10 French,
                                                 in person             in person               online
Miles   Concurrent   French and First Nations,   Grade 9/10 French,    Alternative             Grade 11/12 English,
                     Métis, and Inuit Studies    online                arrangements due        online
                                                                       to COVID-19
Tanya   Concurrent   Music and drama             Grade 7, online       Grade 11/12 music,      Grade 11/12 music,
                                                                       online                  online
Marisa  Concurrent   -                           Kindergarten,         Grade 5/6, in person    Grade 1/2, online
                                                 in person
Avery   Consecutive  -                           Kindergarten,         Grade 1, in person      Grade 6, online
                                                 in person
Lola    Concurrent   -                           Grade 1, online       Grade 3/4, hybrid       Grade 2/3, online
Rose    Concurrent   -                           Grade 7, in person    Grade 1, in person      Grade 5, online
were enrolled to teach in the intermediate and senior divisions, and four
were enrolled to teach in the primary and junior divisions. Five were
enrolled in the concurrent stream and had completed some teacher
education courses during their undergraduate programs; the student
teachers enrolled in the consecutive stream entered the ITE program
after completing their undergraduate programs. Given their varying foci
and prior teacher education experiences, participants began the study
with varying levels of knowledge of and confidence towards assessment.
However, all participants expressed feeling as though they had limited
assessment experiences relative to teaching experiences before the start
of the program.
Participants chose their own pseudonyms. Each student teacher
participated in three semistructured interviews of 60 minutes: one during
their first practicum, one during their second practicum, and one after their
final practicum. Interviews focused on exploring their experiences with
learning about and conducting assessment online. Given that it was difficult
to predict practicum arrangements during the pandemic, four participants
ended up completing one or two practica in person. In these cases,
interviews focused on their experiences with conducting assessment more
broadly; participants were later prompted to compare their in-person and
online assessment experiences as all participants’ final practica were online.
Interviews were transcribed verbatim by Zoom (https://zoom.us),
and I reviewed them for accuracy before sharing them with participants
so they could clarify their responses as desired. Data were analyzed in
NVivo following three inductive, three-stage coding cycles (Saldaña, 2013)
guided by the principles of discourse analysis (Gee, 1999). First, I analyzed
transcripts using Looney et al.’s (2018) framework to identify the ways in
which participants narrated their own assessment identities. For example,
I assigned a code whenever a participant spoke as an assessor, indicated
by their use of “I know,” “I feel,” “I believe,” and so on (from Looney et
al.’s framework) and other related words as they described their engage-
ment in assessment. My analysis of the transcripts then shifted to iden-
tify experiences of tensions related to participants’ assessment learning
or practice during their online practica, indicated by the expression of
conflicting ideas or feelings, discomfort, and uncertainty, as well as by
moments of hedging or silence. Finally, I analyzed transcripts to iden-
tify the processes through which participants negotiated tensions. Codes
about participants’ assessment identities prior to and after negotiation
processes were compared to identify changes and developments in their
assessment identities.
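As a rough illustration of how the first coding cycle could be operationalized, a keyword-based first pass over transcript utterances might flag candidate “I” statements for each of Looney et al.’s (2018) dimensions. This is only a sketch: the study’s actual coding was done manually in NVivo, and the marker phrases below are simplified assumptions, not the study’s codebook.

```python
# Illustrative sketch only: a first-pass tagger for candidate TAI-dimension
# statements. Phrase lists are hypothetical simplifications for demonstration.
DIMENSION_MARKERS = {
    "knowledge":     ["i know", "i learned", "i understand"],
    "beliefs":       ["i believe", "i think assessment", "assessment should"],
    "feelings":      ["i feel", "i felt", "overwhelmed"],
    "self-efficacy": ["i am confident", "i can", "i'm able"],
    "role":          ["my role", "my job", "as an assessor"],
}

def tag_dimensions(utterance: str) -> list[str]:
    """Return the TAI dimensions whose marker phrases appear in an utterance."""
    text = utterance.lower()
    return [dim for dim, markers in DIMENSION_MARKERS.items()
            if any(marker in text for marker in markers)]

print(tag_dimensions("My job is to get a fair assessment to my students."))
# ['role']
print(tag_dimensions("I feel like assessment should always be fair."))
# ['beliefs', 'feelings']
```

A pass like this could only surface candidates for human review; identifying tensions (hedging, silence, conflicting ideas) and negotiation processes, as in the second and third cycles, requires interpretive analysis.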
Findings and Discussion

Participants’ narratives revealed that they commonly experienced two types
of tensions as they engaged in assessment on online practica: (a) tensions
due to internal conflict between familiar envisioned assessment iden-
tities and unfamiliar desired assessment identities; and (b) tensions due to
external conflict between desired assessment identities and online practica
constraints. Note that participants experienced the first type of tension early
on in the program, whether their practica were online or in person; however,
the online setting amplified experiences of those tensions and suggested the
need for context-specific tension negotiation processes. Participants’ desired
assessment identities (i.e., who and how they wanted to be as assessors)
emerged as a conduit through which participants could simultaneously
express multiple dimensions (Looney et al., 2018) of their assessment iden-
tity. For instance, a participant’s expression of their desire to be a certain
kind of assessor (e.g., fair, inclusive) reflected their knowledge of what
being this kind of assessor entailed, to varying extents, and their beliefs that
this kind of assessor was valuable. Other studies have used similar concepts
of possible selves (e.g., Hamman et al., 2013) and provisional selves (e.g., Ibarra,
1999) to elicit and understand dimensions of teacher identity.

Internal Tensions

Consistent with previous research findings (e.g., Smith et al., 2014),
participants’ narratives revealed that their initial conceptions of how
one could approach assessment were strongly shaped by their personal
experiences with assessment. In their first interviews, Rose, Tanya,
and Paolo, who recounted primarily negative personal experiences,
expressed nervousness about being perceived by students as “scary” or
“unfair” when conducting assessment. All participants (except Miles,
who recounted positive personal experiences) expressed wanting to avoid
being the kinds of assessors they were familiar with (e.g., “intimidating,”
“stressful”). However, whereas all participants mentioned wanting to be
fair and inclusive, they appeared to wrestle with providing details and
tangible examples. Rose, for example, said, “I don’t really have a lot of
descriptive words for it right now.” Lola knew she wanted to be fair but
expressed some hesitation in her response: “I guess I feel like assessment
should always be fair. I feel like the kids should kind of know what they’re
getting themselves into.”
The intermediate/senior division participants were able to describe
relatively more specific assessment ideas (e.g., “give multiple smaller
assignments,” “offer personalized feedback”) but similarly struggled
with reconciling ideas with practice. Tanya, for example, shared that
she wanted to be an “inclusive” and “creative” assessor by offering her
students “a variety of tasks, assignments, hands-on projects, quizzes,
polls, [and] fun games.” Yet, at the end of her first interview, she stated
that her current preferred approach was still “pen and paper” because it
was “dependable,” noting: “We draw on our personal experiences and
what we know. And right now, what I know is the traditional way.” In
other words, participants intuitively knew how they did not want to
be as assessors, reflecting the feelings and beliefs dimensions of their
assessment identity. However, they did not yet know how to be, or have
the confidence to be, a different kind of assessor, reflecting knowledge
and self-efficacy dimensions that conflicted with their feelings and beliefs.
Engaging in assessment on online practica appeared to exacerbate these
tensions. Like many teachers prepandemic (Archambault et al., 2016), the
student teachers had limited training on or personal experiences with
online assessment upon which to base their envisioned or ideal assessor
selves. Tanya, for example, described feeling “overwhelmed” as “assessing
drama online was entirely new” to her when she began her first prac-
ticum online. In their third interview, Paolo, Marisa, Avery, and Lola
similarly struggled to describe detailed ideal online assessment practices,
despite having more prior in-person experience to reference. Paolo, for
example, noted,

I tried to make [assessments] authentic [on my previous practicum],
but . . . I don’t remember being told how [those assessments] could
be adapted because . . . in the online sphere, I don’t actually know
how [those assessments] could be authentic.

Comments like these suggest that participants were wrestling with
translating their familiar envisioned (and enacted) assessor selves to an
unfamiliar context.

External Tensions

Between their first and second practica, the student teachers completed a
foundational assessment course that appeared to significantly support the
development of their desired assessment identities. Amongst other topics,
the course discussed the value and purposes of assessment for, as, and of
learning, as well as authentic assessment. In their second interview, all
participants narrated more detailed and nuanced examples of how they
wanted to conduct assessment, citing new vocabulary and strategies from
the course. Marisa, for example, shared that on her second practicum,
she was actively conducting “diagnostic assessment and even assessment
as learning” to “better understand how [her] students are doing and help
them learn.” She went on to say: “[I’m] definitely getting more experi-
ence and feeling a little bit like I know more.” These comments reflected
developments within—and an alignment between—her knowledge of
assessments, her beliefs about their efficacy, her sense of self-efficacy in
enacting these assessments, and her perception of her role as an assessor
in supporting student learning.
In their second and third interviews, as participants described trying
to apply their new knowledge in ways that reflected desired assessment
identities, their narratives revealed tensions between their efforts and
online practica constraints. Specifically, participants commonly shared
narratives of trying to be “fair,” “inclusive,” and “supportive of student
learning” as assessors but encountering online classroom constraints and
general practica constraints. Although the latter were not unique to online
practica, understanding these constraints can inform ITE programs,
teacher educators, and ATs in guiding student teachers through online
practica holistically.

Online Classroom Constraints

The online setting introduced several new challenges to participants’
assessment practice, each provoking tensions and challenging their
existing conceptions of how one could or should enact assessment. As
all participants were particularly concerned with being fair assessors,
all participants narrated tensions surrounding academic integrity on
online assessments. Miles, for example, described tensions between his
role perception and online assessment constraints: “My job is to get a
fair assessment to my students . . . . That was tricky with the remote
world and privacy and security.” However, participants were also keenly
aware that students could be experiencing inequities and/or extenuating
circumstances that could render an assessment inaccessible and unfair.
Participants noted that students were becoming increasingly
disengaged as a result of “Zoom fatigue” and “burnout,” as well as the
significant impacts of the pandemic on mental health and well-being
(e.g., Hamilton & Gross, 2021). As a result, participants generally agreed
that it would be unfair to penalize students for submitting work late or of
lower quality but were unsure about what to do otherwise. Lola, Miles,
and Paolo discussed trying to be fair by being empathetic or more lenient,
but worried that doing so would mean they were being less objective,
which they initially viewed as less fair. Narratives like these reflected
participants’ internal discourses as they tried to adapt their assessment
practices to the online classroom in ways that still reflected their desired
assessment identities.
More broadly, participants were concerned about conducting
assessments to support student learning and wanted to design authentic
assessments. Paolo, for example, wrestled with getting his students to pre-
sent in French without overly relying on dictionaries or scripts that could
be hidden when students’ cameras were turned off. Other participants
worried that some of the ways their ATs were modifying assessments
to be practical online (e.g., simplifying questions, shortening length)
were not adequately assessing students’ knowledge or preparing them
for the formats or expectations of future in-person assessments (e.g.,
in postsecondary education). Notably, the limited face-to-face commu-
nication with students (as cameras were not required to be on) left all
participants feeling uncertain about how their students were experien-
cing assessments, impeding their ability to adapt their approaches respon-
sively to immediate student needs.

General Practica Constraints

All participants discussed, at least once during the study, feeling com-
pelled to conform to their ATs’ assessment practices while on practica,
a constraint that has been commonly reported by student teachers
(e.g., Oo et al., 2021). In some cases, participants described conforming
because they generally agreed with their ATs’ practices; in other cases,
they conformed even when they disagreed with their ATs’ practices
because they believed they ought to as a student teacher. The need to
conform discouraged some participants from engaging autonomously
with assessment. Lola, for example, shared:
I feel like it’s just because every time that I ever assess, it’s always
been within the constraints of my AT. . . . I always had to, like, do
it, and then say, “Okay, is this good? Is this not good? How do I fix
this?” . . . It wasn’t really me thinking. It was me just kind of doing
whatever they want.

Participants also described how the short time frames of practica made
it difficult to experiment with assessment and observe the outcomes of
their experiments (e.g., whether an assessment was engaging, whether
students believed the assessment was fair) to inform their future practice.
Avery, for example, noted,

I didn’t have enough time to assess all of them. . . . I remember doing
three to four of the final assessments for the project that I had given
them to work on. [My AT] did the rest . . . [because] my placement
ended.

Although some practica constraints may be unavoidable, ITE programs
can support student teachers in negotiating the tensions that arise in
facing these constraints by facilitating and supporting the processes
detailed below.

Tension Negotiation Processes

Participants commonly negotiated tensions in four ways: (a) gaining
knowledge about assessment (general and online), (b) experimenting
with assessment in an online classroom, (c) experiencing affirmation
from others (e.g., students, ATs), and (d) reflecting on and recognizing
the assessor self (i.e., affirming the self ). Negotiation processes often
took place simultaneously and iteratively. These negotiation processes
largely aligned with Xu and Brown’s (2016) suggested TAI develop-
ment processes, which include co-constructing foundational knowledge,
engaging in assessment, participating in conversations about assessment
with colleagues, and reflecting on the assessor self. Negotiating tensions
supported participants in developing new, more detailed or nuanced,
and/or more confident dimensions of their assessment identities, as
well as enacting their assessment identities in different ways to adapt to
different contexts. These developments were reflected in changes to the
ways they narrated their experiences and (re)positioned themselves in
their narratives over time.

Gaining Assessment Knowledge

As participants initially had limited theoretical and practical knowledge
about different ways to conduct assessment, gaining more knowledge played
a key role in helping them develop their conceptions of
assessment—and, concurrently, their desired assessment identities. These
developments, in turn, supported the student teachers in negotiating
both internal and external tensions. Lola, for example, described how
learning more about assessment helped her identify ways to be inclusive
as an assessor, allowing her to move away from being the kind of assessor
she had personally experienced:

I had that course on assessment, [and] I realized that my assessment
should be differentiated as well. I don’t know why I didn’t think
about that. . . . Well, I did think about it [in terms of] differentiating
lessons, but not . . . making different tests. . . . I always thought it had
to be, like, one test because that’s what I’d seen before.

Other studies have similarly reported that student teachers shift their
conceptions of assessment from largely summative to more formative
through ITE learning (Smith et al., 2014) and practica (Xu & He, 2019).
These findings also support Xu and Brown’s (2016) and Looney et al.’s
(2018) emphasis on knowledge in their TAI frameworks.
Given that participants expressed difficulty in translating their
assessment thinking and practices between in-person and online
classrooms, gaining online assessment knowledge specifically helped
ease tensions between participants’ desired assessment identities and
online classroom constraints by expanding their conceptions of how they
could conduct assessment online to navigate those constraints. Avery, for
example, shared how she used her learning to help her be an “engaging”
assessor online as desired: “The [educational technology] course taught
us how to use [Google Earth], and I realized it could be a fun way for the
kids to learn and create a presentation.”
All participants, however, noted that there had been limited, if any,
instruction or resources on online assessment within their ITE courses.
Tanya recalled only one course that had spent one hour, at the instructor’s
discretion, on providing accommodations for online assessments. Miles
and Avery stated that an educational technology course talked about tech-
nology broadly without explicit reference to its potential use in online
assessment. These participants cited their own research and conversations
with ATs as building their knowledge of online assessment.
To prepare student teachers for online assessment, ITE programs
should provide specific information and guidance on online assessment.
Xu and Brown (2016) suggested that an appropriate assessment know-
ledge base should include knowledge of disciplines and pedagogical con-
tent; assessment purposes, content, and methods; grading; feedback;
assessment interpretation and communication; student involvement in
assessment; and assessment ethics. As ITE programs develop their cur-
ricula on online assessment, they may consider these topics, as well as
other contemporary online learning topics (e.g., digital skills, educational
technology), in relation to online classrooms.

Experimenting with Assessment in an Online Classroom

Experimenting with assessment in an online classroom helped participants
negotiate tensions between their familiar envisioned assessment iden-
tities and unfamiliar desired assessment identities by providing them with
opportunities to become familiar with the unfamiliar. Wenger (1998)
wrote: “There is a profound connection between identity and prac-
tice . . . . Practice entails the negotiation of ways of being a person in
that context” (p. 149); autonomy on practica allows student teachers to
learn about and grow their practice. Specifically, participants emphasized
how being given freedom and agency on practica helped them not only
better understand which practices would reflect their assessment iden-
tities, but also feel more professional, which fostered self-efficacy. Avery,
for example, recounted a memorable practicum experience:

The best part was that the AT gave me the floor and was like, “It’s
your choice what you want to do with them.” . . . And I saw the
kids were pretty engaged. . . . Maybe had I been in another class-
room, the AT might say, “Okay, this is not what is accepted; . . . do
this instead.” Or maybe they would keep giving suggestions, so
I wouldn’t know if I’m doing good [according to the AT], or if I’m
actually doing fine.
Supporting Student Teacher Assessment Identity Development 47

Tanya emphasized that implementing her own ideas, given the limited
guidance on online assessment she received, was the primary process
through which she gained confidence in her online assessment abilities.
Participants noted that opportunities to assess varied significantly
between practica and ATs. Additionally, although it was not observed in
this study, Adie (2013) found in her study that the online setting, when
cameras were off, made it easier for newer or more hesitant teachers to
be passive in assessment conversations with other teachers. At the time
of this study, neither the ITE program’s practicum guidelines for student
teachers and ATs nor the OCT’s (2022) Registration Guide outlined spe-
cific expectations for students to engage in assessment on practica. ITE
programs may consider working with ATs to ensure student teachers are
offered authentic opportunities to both practice and discuss assessment
on in-person and online practica. Further, given the value of context-
specific experience, student teachers anticipating teaching online should
be supported in completing online practica.

Experiencing Affirmation from Others

Throughout the study, participants shared how discussing their concerns
and frustrations about online classroom constraints with their ATs
supported them in normalizing both the experience of dilemmas and
their feelings of uncertainty about online assessment. Paolo described
how this affirmation from experienced teachers helped ease tensions
about constraints by reassuring him that he did not need to overcome all
of them all at once:

I felt as though I was being a quote–unquote “bad teacher,” because
I couldn’t get all the stuff done that I wanted to. . . . But hearing from
teachers that this is how it is, especially when you first start, . . . I feel
confident that I’ll be able to not, like, completely drown in my first
year of working.

Tanya similarly shared how she began to feel more confident in online
assessment after her ITE professor copied one of her approaches:

When it came to [my professor], . . . I would say, “Oh, I did this
today,” and he would be like, “Oh, that’s neat,” and then he would
try it in our class. . . . It was nice to know what I’m doing is good.

Comments like these suggest that affirmation supported participants’
confidence and reassured them that they could or should continue to seek
ways to enact their desired assessment identities. Other studies have simi-
larly reported that ATs and instructors can greatly influence the identities
that student teachers believe they should enact (e.g., Rahimi & Bigdeli,
2014). Alongside creating more opportunities for student teachers to
practice online assessment, ITE programs can facilitate this tension
negotiation process by encouraging and facilitating feedback processes
and conversations, including collaborative reflection exercises as detailed
below, between student teachers, teacher educators, and ATs relating to
online assessment.

Reflecting on and Recognizing the Assessor Self

Reflecting on their assessment experiences supported participants in rec-
ognizing changes and growth in the self, reinforcing their growth, and
encouraging self-efficacy, which helped participants feel less tension as
they faced constraints. This recognition often occurred during guided
conversations, including job interviews for teaching positions and
interviews for this study. Marisa shared how reflecting during the study
enabled affirming self-dialogue:

If I hadn’t done the study I probably would have thought more gen-
erally in terms of my growth as a teacher, . . . but I don’t think
I would have had that same awareness of how I’ve grown as an
assessor.

Others shared that reflecting out loud helped ease their experience of
tensions by encouraging them to frame challenges as potential learning
experiences. Rose, for example, noted that recalling her experiences
during the study’s interviews “made it easier for [her] to look at some-
thing and go, ‘Oh, I could try this next time.’ ” This comment also shows
that reflecting helped Rose develop her ability to meta-position, “teachers’
ability to strategically select, blend, and shift between positions depending
on the teaching situation” (Blasco et al., 2021, p. 27).
Although reflective practice is often promoted as an individual pro-
cess (Collin & Karsenti, 2011), findings from this study suggest that ITE
programs should facilitate collaborative reflection exercises such as mock
job interviews and small group discussions to support student teachers not
only in recognizing and affirming growth, but also in examining beliefs
and personal experiences influencing their practice, in considering others’
perspectives, and in making more intentional decisions about assessment
(Adie, 2013; Kramer, 2018). A growing body of literature (e.g., DeLuca
et al., 2022; Kramer, 2018) can be referenced to design and foster more
meaningful, effective, and critical reflective practice in teacher education.

Conclusion

The tensions and tension negotiation processes that emerged from this
study provide evidence that student teachers’ assessment identities may
initially develop to be context-specific. However, it is possible that over
time, through more learning, experience, and tension negotiation, begin-
ning teachers will grow their abilities to meta-position (Blasco et al.,
2021) and become better able to independently adapt their assessment
to new contexts. In the meantime, the increasing prevalence of online
learning calls for dedicated programming on online assessment in teacher
education. Specifically, ITE programs and teacher educators should con-
sider building student teachers’ knowledge of online assessment, cre-
ating authentic opportunities to practice online assessment, encouraging
and facilitating conversations on and feedback processes about online
assessment, and fostering collaborative and critical reflection of online
assessment experiences. These processes can support student teachers in
negotiating tensions arising in their online assessment learning and prac-
tice—and, in doing so, support developments in their budding assessment
identities that will guide their practice.
It is important to note that as the student teachers in this study all
volunteered to participate and expressed interest in developing as
assessors, it is possible they were more inclined to engage in the tension
negotiation processes identified in this study. Other student teachers
may experience other or additional tensions as they practice online
assessment, including those between personal and teacher or assessor
identities, reflecting more general hesitations towards assessment. This
study also took place during the pandemic, which affected the ways
online classrooms were organized (intentionally and unintentionally)
and teacher educators’ and ATs’ preparedness in supporting student
teachers on online practica. Future research should explore how teacher
educators’ planned approaches to online assessment education support
student teachers’ assessment identity development, how beginning
teachers develop their ability to enact identities across contexts, and
how teachers’ online assessment practices intersect with their online
teaching practices, use of educational technology, and student learning
within the online classroom.

References

Adie, L. (2013). The development of teacher assessment identity through par-
ticipation in online moderation. Assessment in Education: Principles, Policy &
Practice, 20(1), 91–106. https://doi.org/10.1080/0969594X.2011.650150
Akkerman, S., & Meijer, P. C. (2011). A dialogical approach to conceptualizing
teacher identity. Teaching and Teacher Education, 27(2), 308–319. https://doi.
org/10.1016/j.tate.2010.08.013
Alsup, J. (2006). Teacher identity discourses: Negotiating personal and professional
spaces. Routledge.
Alsup, J. (2019). Millennial teacher identity discourses: Balancing self and other.
Routledge.
Archambault, L., Kennedy, K., Shelton, C., Dalal, M., McAllister, L., & Huyett, S.
(2016). Incremental progress: Re-examining field experiences in K–12 online
learning contexts in the United States. Journal of Online Learning Research, 2(3),
303–326.
Blasco, M., Kjærgaard, A., & Thomsen, T. U. (2021). Situationally orchestrated
pedagogy: Teacher reflections on positioning as expert, facilitator, and
caregiver. Management Learning, 52(1), 26–46. https://doi.org/10.1177/
1350507620925627
Clandinin, D. J., & Connelly, F. M. (2000). Narrative inquiry: Experience and story in
qualitative research. Jossey-Bass.
Collin, S., & Karsenti, T. (2011). The collective dimension of reflective practice:
The how and why. Reflective Practice, 12(4), 569–581. https://doi.org/10.1080
/14623943.2011.590346
Coombs, A., Ge, J., & DeLuca, C. (2021). From sea to sea: The landscape of
Canadian assessment education. Educational Research, 63(1), 9–25. https://
doi.org/10.1080/00131881.2020.1839353
DeLuca, C., Willis, J., Dorji, K., & Sherman, A. (2022). Cultivating reflective
teachers: Challenging power and promoting pedagogy of self-assessment in
Australian, Bhutanese, and Canadian teacher education programs. Power and
Education, 1–8. https://doi.org/10.1177/17577438221108240
Gareis, C. R., Coombs, A. J., Barnes, N., DeLuca, C., & Uchiyama, M. (April 5–9,
2019). Assessment literacy development: Exploring the influence of assessment
courses and student teaching on beginning teachers. Roundtable presentation at
American Educational Research Association.
Gee, J. P. (1999). An introduction to discourse analysis: Theory and method. Routledge.
Hamilton, L., & Gross, B. (2021). How has the pandemic affected students’ social-
emotional well-being? A review of the evidence to date. Center on Reinventing
Public Education. https://files.eric.ed.gov/fulltext/ED614131.pdf
Hamman, D., Coward, F., Johnson, L., Lambert, M., Zhou, L., & Indiatsi, J.
(2013). Teacher possible selves: How thinking about the future contributes
to the formation of professional identity. Self and Identity, 12(3), 307–336.
https://doi.org/10.1080/15298868.2012.671955
Hong, J., Greene, B., & Lowery, J. (2017). Multiple dimensions of teacher iden-
tity development from pre-service to early years of teaching: A longitudinal
study. Journal of Education for Teaching: JET, 43(1), 84–98. https://doi.org/10.
1080/02607476.2017.1251111
Ibarra, H. (1999). Provisional selves: Experimenting with image and identity
in professional adaptation. Administrative Science Quarterly, 44(4), 764–791.
https://doi.org/10.2307/2667055
Kelchtermans, G. (2005). Teachers’ emotions in educational reforms: Self-
understanding, vulnerable commitment and micropolitical literacy. Teaching
and Teacher Education, 21(8), 995–1006. https://doi.org/10.1016/j.tate.2005.
06.009
Kramer, M. (2018). Promoting teachers’ agency: Reflective practice as trans-
formative disposition. Reflective Practice, 19(2), 211–224. https://doi.org/10.1
080/14623943.2018.1437405
Looney, A., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualising
the role of teachers as assessors: Teacher assessment identity. Assessment in
Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.1080/
0969594X.2016.1268090
Mead, G. H. (1934). Mind, self and society. University of Chicago Press.
Ontario College of Teachers. (June 24, 2022). Registration guide. Ontario College of
Teachers. www.oct.ca/-/media/PDF/Requirements%20General%20Education
%20Teacher/EN/general_education_teacher_e.pdf
Oo, C. Z., Alonzo, D., & Davison, C. (April 2021). Pre-service teachers’ decision-
making and classroom assessment practices. Frontiers in Education, 6, 1–12.
https://doi.org/10.3389/feduc.2021.628100
Pillen, M. T., Den Brok, P., & Beijaard, D. (2013). Profiles and change in begin-
ning teachers’ professional identity tensions. Teaching and Teacher Education,
34, 86–97. https://doi.org/10.1016/j.tate.2013.04.003
Rahimi, A., & Bigdeli, R. A. (2014). The dynamicity and flexibility of EFL
teachers’ role identities in language institutes. ABAC Journal, 34(2), 46–55.

Ricoeur, P. (1991). Narrative identity. Philosophy Today, 35(1), 73–77.
Rogers, A. P., Reagan, E. M., & Ward, C. (2022). Preservice teacher performance
assessment and novice teacher assessment literacy. Teaching Education, 33(2),
175–193. https://doi.org/10.1080/10476210.2020.1840544
Saldaña, J. (2013). The coding manual for qualitative researchers (2nd ed.). SAGE
Publications.
Smith, L. F., Hill, M. F., Cowie, B., & Gilmore, A. (2014). Preparing teachers to
use the enabling power of assessment. In C. Wyatt-Smith, V. Klenowski, &
P. Colbert (Eds.), Designing assessment for quality learning (pp. 303–323). Springer.
Stiggins, R. J. (1991). Assessment literacy. Phi Delta Kappan, 72(7), 534–539.
Wenger, E. (1998). Communities of practice: Learning, meaning and identity.
Cambridge University Press.
Xu, Y., & Brown, G. T. (2016). Teacher assessment literacy in practice:
A reconceptualization. Teaching and Teacher Education, 58, 149–162. https://
doi.org/10.1016/j.tate.2016.05.010
Xu, Y., & He, L. (2019). How pre-service teachers’ conceptions of assessment
change over practicum: Implications for teacher assessment literacy. Frontiers
in Education, 4, 1–16. https://doi.org/10.3389/feduc.2019.00145
3 Assessment as Partnership
The “Co” in Co-Constructing . . . “Not Quite”
Allison Tucker and Marc Husband

As educators, we have spent countless hours since 2020 developing,
refining, reflecting on, and revising our practices as we attempted to virtu-
ally engage with students in meaningful ways. Although there were new
skills to be learned for teaching in virtual settings, relational teaching—
building and maintaining partnerships with students—remained
important. Educators were in a unique position in September 2021: Many
schools were reopening, and both educators and students anticipated the
school year unfolding with time spent on virtual platforms and in person.
As new teacher educators in a faculty of education working with pre-
service teachers, we questioned how we could engage in assessment
practices that would help our students experience assessment in ways
that reflected current research. Our past teaching experiences as class-
room teachers and leaders in K–12 public education systems informed
our understanding of the challenges teachers faced in moving beyond
traditional views of assessment and using research to guide practice.
We define traditional views of assessment as understanding assessment
and learning as separate: the teacher as the authority in determining
success, the student positioned as passive and voiceless. This paradigm
of understanding still exists and emerges in both virtual and in-person
contexts.

DOI: 10.4324/9781003347972-5

The Truth and Reconciliation Commission of Canada’s (2015) Calls to
Action, action number 65, requires educators to engage in research to help
advance reconciliation. As such, it was also important for us, as teacher
educators, to consider how our practices contribute to decolonization.
Our university is located on the unceded land of the Mi’kmaw people,
covered by the treaties of Peace and Friendship. Our effort to advance
reconciliation meant rethinking how we, as teacher educators and settler
allies, might take up our assessment practices in Indigenist ways (Wilson,
2007). Our growing understanding of the ways we could decolonize
our practices in teacher education provoked us to consider that teachers
imposing assessments on students was a marginalizing and colonizing
act (Eizadirad, 2019). As we imagined our year unfolding with preservice
teachers, we were motivated to enact research recommendations, inviting
them into assessment partnerships.
In this chapter, we examine assessment in higher education in a
teacher education program. We investigate how assessment as partner-
ship was experienced by a group of preservice teachers in a foundational
course towards their Bachelor of Education in elementary education.
The course was designed to support them in examining their perceptions
of assessment as partnership. Although the course was scheduled to be in
person, COVID-19 restrictions meant that for periods of time, the classes
also met virtually. Drawing on experiences from both contexts, the study
is framed by the question: “What are the experiences of assessment as
partnership for us as teacher educators and for preservice teachers?”
We describe the theoretical framework and methods for our study
and then share our findings, including quotes and excerpts of how our
awareness of our practices grew in response to noticing. We wrap up
the chapter with a discussion of implications for future work and draw
conclusions from the study.

Theoretical Framework

To study our practices and learn from preservice teachers’ experiences,
we used Mason’s (2002) theory of noticing, which invited us to examine
our practices, a form of research from the inside. Noticing is “a collection
of practices both for living in, and hence learning from, experience and
for informing future practice” (Mason, 2002, p. 44). Mason distinguished
between noticing and professional noticing: “Professional noticing is what
we do when we watch someone else acting professionally and become
aware of something that they do . . . which we think we could use our-
selves” (2002, pp. 29–30). The significance of Mason’s theory speaks to
noticing as intentional work for practitioners; through noticing, our
practices are enhanced.

Professional Noticing

In this study, we intentionally noticed how practices supported conditions
for assessment to be taken up as partnerships. Preservice teachers were also
provoked to notice teaching practices (e.g., co-constructing assignment cri-
teria) and to reflect on how they experienced assessment as partnerships.
Additionally, because participants were watching one another act profes-
sionally, everyone had opportunities to grow awareness of actions they
could see one another doing to support assessment as partnership.
Entering assessment partnerships involved navigating questions of
presumed authority, authorship of one’s learning, and the nuance of doing
so. We understood that there is often a gap between aspirations of teaching
and actual teaching (Hauge, 2021). In response to a gap we perceived in our
teaching practice, we set out to interrogate our actions in the classroom,
virtual and in person, to shift presumed authority from us to a shared
authority with students, and thus narrow the gap in our practices.
In higher education research, both Stefani (1998) and Doyle et al.
(2019) have described assessment as a partnership between academics
(teacher educators) and preservice teachers. In response to this conceptu-
alization, we posed the following general research question to ourselves:
As new teacher educators, how do our professional noticings enhance
our understanding of practices that support assessment as partnership?
To grow our awareness of this question, we explored the following two
aspects: (a) our assessment practices (virtual and in person) in our teacher
education classes, and (b) how preservice teachers perceived assessment
as partnership in those classes.

Assessment as Partnership

Doyle et al.’s (2019) conceptualization of assessment as partnership
informed how we wanted to reframe assessment as inherent in the
learning process, and with preservice teachers. Cook-Sather et al.
(2014) defined the partnership between students and professors as
“a collaborative, reciprocal process through which all participants have
opportunities to contribute equally, although not necessarily in the
same ways” (pp. 6–7). Based on this definition, we sought to cocreate
partnerships with students that were reciprocal and responsive to each
individual. Dale (2017) described reciprocity as “acting ‘with’ and not
‘for’ others, decreasing traditionally hierarchical roles and replacing
power with shared responsibility” (p. 64). In terms of shared responsi-
bility with assessment practices, students have traditionally been viewed
as having “no role other than to subject themselves to the assessment
acts of others” (Boud, 2007, p. 17).
Boud and Soler’s (2016) work also revealed a shared responsibility
between teachers and students; they recommended teachers “shift
assessment discourse away from the notion that assessment is a unilateral
act done to students, to assessment that is mutually constructed between
learners and assessors/teachers” (p. 5). We also viewed the idea of mutu-
ally constructed (co-constructed) partnerships in Morcom and Freeman’s
(2018) description of the Anishinaabemowin understanding of “we”:
shifting from niinwi, “we (but not you)” and kiinwa, “you all (but not us),”
to kiinwi, “you and me/us (together)” (p. 815). The literature supported
our intention to invite students into assessment partnerships. In virtual
and in-person learning environments, we aimed for students to be full
participants; partners in the process; kiinwi.

Methods

As mentioned, we conducted this study with preservice elementary
teachers in their first year of a Bachelor of Education program at a small
rural university in Eastern Canada. As teacher educators in the elemen-
tary program, we taught a required foundational course called Principles
and Practices to two sections of students. One section consisted of
on-campus students who were primarily recent graduates of an under-
graduate degree program; the second section was located off campus and
mainly consisted of mature students who brought substantial work and
life experiences. Of the 45 students enrolled in both sections, all consented
to participate in the study.
Data consisted of our weekly conversations, during which we reflected
on how partnerships with students unfolded during each class. In add-
ition to our reflections, preservice teachers completed an assignment in
which they reflected on their experiences in response to the following
question: “In this course, assessment was approached as a partnership.
What was your experience of assessment as partnership?” Throughout
the course, and at the end with their assignments, we analyzed the data
using a phenomenological approach. Phenomenology seeks to under-
stand and appreciate how individuals make sense of their experiences
(Stolz, 2020). We were interested in understanding how preservice
teachers experienced assessment as a partnership. As such, we individu-
ally read their assignment responses and sought to determine a “pool of
meaning” (Marton, 1994, p. 4428) from these data. Using the experiences
of partnership they described, we each created categories based on these
descriptions, then met to discuss our categories. After this, we sorted our
categories and looked for commonalities in an iterative process that took
place on several occasions.
We began the semester in the midst of newness: The preservice
teachers were new to the Bachelor of Education program and to one
another. Some were also new to the town and campus. To establish
partnerships, we considered how partnerships might unfold in virtual and
in-person learning environments. We invited students into shared respon-
sibility, where co-construction was an essential practice throughout the
term. As a result of COVID-19 protocols, co-construction occurred both
in person and virtually. We invited preservice teachers to co-construct a
representation showing (a) norms of class interactions and (b) assignment
criteria. Preservice teachers’ ideas were the basis for the work for each
assignment.
For their first co-construction, preservice teachers were asked to reflect
on what dispositions and behaviours they needed from one another to be
at their best as learners. Using Mentimeter (www.mentimeter.com/), a
virtual tool that creates live word clouds, they co-constructed a represen-
tation of what they thought were necessary conditions for learning (see
Figure 3.1). It included ways of interacting with peers and establishing how
the class would work together regardless of the context. One suggestion
that surprised us but also captured an essential element of partnership
was that preservice teachers could choose not to speak or share; they
had the right to pass on taking their turn. Providing them
with a choice on whether to participate in whole group conversations
began to shift authority to them to decide when, or if, they would share
their thinking. This shift was purposeful in building relationships, so pre-
service teachers saw their peers and us as partners.

Figure 3.1 Working together: Words describing class interactions.

Figure 3.2 Co-constructed criteria (jamboard).

The preservice teachers’ second shared responsibility was determining
assignment criteria. By positioning assessment as partnership, we hoped they
would believe they had voice and agency in their learning. Conversations
about class assignments were important as we continued to build shared
responsibility and establish partnerships with them. Preservice teachers
were invited to help create the success criteria for each assignment—
another example of our work to create partnerships purposefully.

Figure 3.3 Co-constructed criteria (whiteboard).
Inviting preservice teachers to think about success criteria followed
a similar process for each assignment. Whether virtual or in person,
co-constructing success criteria involved three steps. First, preservice teachers
individually considered the assignment and generated ideas. Second, they
met in virtual breakout rooms or small groups to share and record group
ideas using either Google Jamboard (https://jamboard.google.com/;
see Figure 3.2), or whiteboard space (see Figure 3.3). Preservice teachers
indicated agreement with other groups’ ideas using checkmarks. In the
third and final step, they finalized the criteria and sought agreement among
the group, including the teacher, through whole class discussion.

Noticings

We noticed little difference in how this co-construction process unfolded
between virtual and in-person groups. Some preservice teachers eagerly
brought forth their ideas, whereas others were quieter. Our interactions
with preservice teachers were also similar in that we moved between
small groups, then facilitated whole group discussions to bring all the
ideas together. In our data analysis, we agreed on one theme for teacher
educator noticings—we use what we know—and three themes for pre-
service teacher noticings: (a) previous assessment experiences have a
strong pull, (b) participation and contributions during group discussions
were uneven, and (c) feelings of discomfort arise during the process
of co-construction. To respect participant confidentiality, preservice
teachers’ names are pseudonyms.

Teacher Educator Noticings

We wanted assessment partnerships with preservice teachers to be an
experience of shared responsibility and equal participation (Boud &
Soler, 2016; Dale, 2017). Despite our commitment to invite them into
assessment partnerships virtually and in person, we noticed instances
when our prior teaching experiences were barriers. We summarized this
theme as, “We use what we know.” The ways we had previously under-
stood and enacted teaching, some that had become second nature, were
reminders of the intentional work required of us, particularly when some
of our habits took over. Two vignettes illustrate how challenging it was
for us to rethink teaching, even with our commitment and determination
to disrupt conventional practices and engage in reciprocal practices of
shared responsibility (Cook-Sather et al., 2014).

Vignette 1: “Is Everyone Okay with That?”

In small groups, the preservice teachers generated a list of criteria for
an assignment. Based on their proposed lists, I (Marc) began to lead a
discussion to consolidate and revise the criteria and help bring about
clarity with the whole group. The group addressed the first criterion, and
I started reading the first suggestion from the list. I asked the preservice
teachers if they were okay with it or would like to delete or modify it.
The question elicited many back-and-forth conversations among them to
clarify their wording. I reworded the criterion in response to a few pre-
service teachers’ ideas. Then I said, “Is everyone okay with that?” I looked
for a few nodding heads before ending the class.
Seeking agreement by asking, “Is everyone okay with that?” seemed
fine at the time. Looking back, however, something was off about me
editing their ideas using my words and then asking for agreement in a
way that some might not have felt comfortable saying no. In terms of
developing a partnership, I wondered if this approach could have been
perceived as a unilateral decision.

Vignette 2: “Well, Not Quite . . .”

To establish success criteria for an assignment, preservice teachers indi-
vidually considered criteria for the assignment, then shared ideas in
small groups, and finally shared ideas with the whole group. I (Allison)
invited preservice teachers to get the group started with their ideas.
Sophia offered an idea, and before my brain could slow down and pro-
cess what was happening, I said, “Well, not quite . . .” The rest of the
statement is moot; I sabotaged the exercise with my response. I imme-
diately recognized what I had done and quickly acknowledged it. I said,
“That was a lesson in what not to do!” I apologized and tried to open the
space once more, but the damage was done. As I worked through what
was now a tainted process of establishing success criteria, I sat with the
full realization that I had not contributed to a partnership or created safe
conditions. Rather, I had reestablished the idea that I make the decisions
about what counts as assessment and its quality.
Each vignette could be read from the perspective of teaching vir-
tually or in person, and each revealed teaching moves that had become
entrenched in our practices. We noticed that our language and our inten-
tion to create partnerships with shared responsibility were in contradiction.
Saying, “Is everyone okay with this?” or “Well, not quite . . .” in the con-
text of cocreating partnerships is evidence of what Boud and Soler (2016)
described as acting unilaterally. We must remain mindful that intentions to
create partnerships can go awry regardless of the learning space.

Preservice Teacher Noticings

Research about learning through a lived curriculum supported our con-
jecture that by being immersed in the experience of assessment as part-
nership, preservice teachers would develop a deeper understanding of
how assessment can be woven into the processes of learning. Boud and
Molloy (2013), in discussing sustainable assessment practices, wrote,
“While the curriculum ‘as it is designed’ may influence learners and what
they do, it is the curriculum ‘as it is enacted’ (Barnett & Coate, 2005) that
has a direct impact on students” (p. 708). In other words, although the
preservice teachers may be told that they should involve their students
in assessment (curriculum as designed), the experience of being involved
in their own assessment as partnership (curriculum as enacted) might
have more impact on their future practices. Understanding that what is
experienced and lived might overshadow the theoretical presentations
of assessment has also been the focus of other studies, including Ayalon
and Wilkie (2020), Buck et al. (2010), Deeley and Bovill (2017), and
Grossman et al. (2009). These studies pointed to the importance of stu-
dent assessment experiences as essential components for developing their
future teaching practices. Assessment as partnership would be lived cur-
riculum, enacted both virtually and in person, and infused into students’
emerging beliefs about education. The experience would also be an
alternative to assessment imposed on students (Boud, 2007), which we
anticipated was the experience of many students.
As we engaged with preservice teachers virtually and in person to
create assessment partnerships, we tried to be explicit in our intentions
and practice. Of particular significance, when analyzing the data, we
noticed that there was no evidence that the context, virtual or in person,
played a role in preservice teachers’ experiences of assessment as partner-
ship. In other words, not one preservice teacher mentioned the learning
space as a factor in their experience. This finding is significant because as
we have asserted throughout the chapter, establishing assessment as part-
nership is complex work, whether in person or virtual.
As noted, we invited preservice teachers to see themselves as active
participants, co-constructing elements of the course; they defined a safe
and open learning community and offered suggestions about assignment
criteria. Many of these undertakings were messy and uncomfortable for
everyone. Leaving classes without clarity and answers was new for pre-
service teachers, as was having flexibility and voice in assessment. We
Assessment as Partnership 63

anticipated some discomfort in the process, as learning involves disequi-
librium. Despite their discomfort, we hoped that preservice teachers’
experiences of assessment as partnership would support critical consid-
eration of their previous assessment experiences and inform how they
might envision their future practices. Each theme that follows identifies a
tension preservice teachers experienced with assessment as partnership. We
share how we came to understand their experiences and what we took from
their reflections.

Previous Assessment Experiences Have a Strong Influence

The first theme that emerged deals with preservice teachers’ previous
assessment experiences. The responses revealed that in prior learning
experiences, they adopted a passive role in their learning by doing what
they were told. Although co-constructing assignment criteria offered them
opportunities to take an active role in their learning, the co-constructing
experience brought up desires to remain passive. In other words, pre-
service teachers wanted to be told what to do because they had been
successful in that paradigm.
This theme was characterized by phrases such as the following: “I had
never heard of this before” (Nancy), “I have never experienced [this]
before” (Anna), “I have not had the opportunity to partake in [assessment
as partnership] before” (Yolonda), and “It was so far from anything I’ve
experienced throughout my education journey” (Rachel). The word
“told” was also frequently used to describe their previous assessment
experiences. Preservice teachers relied heavily on their teachers telling
them what to do. As Emma’s comment illustrated, “I grew up in a system
where I was told what to do, and that was the bottom line.” Yeshi drew
on the understanding that success came from doing what one was told:
“Everything [was] always, ‘Do as you are told, and you will do well.’ ”
Anna’s response captured how the role of the teacher and the role of the
student would be enacted—the teacher would tell, and the student would
do exactly as instructed to “get the highest grade”:

We are so accustomed to being told exactly what our educators
want done (with so much detail), so we can then get it done and
make sure we are doing exactly what they want and what will get us
the highest grade.

There was tension for preservice teachers: They recognized the oppor-
tunity that co-constructing offered them but wanted the familiarity of
being told what to do. This friction was evident in Leo’s comment: “I felt
a bit frustrated with the [co-constructing] process, almost as if I just
wanted to be told what to do.” Rachel’s comments echoed this sentiment,
highlighting that in playing the game of school, preservice teachers
were told exactly what the teacher wanted them to do:

I’ve always been a student who knew how to play the game of
school. When a teacher or professor told me what they wanted,
I could usually deliver it. So, at times during our co-construction
process, I found myself thinking, just tell me what you want so I can
give it to you!

In Rachel’s description, success was dependent upon what the teacher
wanted and the student’s ability to deliver just that.
The preservice teachers’ responses fostered our growing awareness of
how our practices continue to be rooted in the idea that teachers tell,
and students do, despite our intention to shift those practices. In trying
to move our practices towards assessment as partnership, we noticed
our own tendencies to revert to teaching moves that reinforced students’
more passive positions. Vignette 2, shared above, illustrates how we
unintentionally shut down a student’s idea, hearkening to a paradigm of
teachers telling and students doing. Overcoming the influence of prior
experience will take time, intentionality, and a willingness to accept the
discomfort and tension.

Participation and Contributions During Group Discussions Were Uneven

Our goal was to weave assessment into the process of learning. The result
was that some preservice teachers felt excluded from the process whereas
others took on active roles, eagerly participating in the whole class discus-
sion; such was the nature of the uneven experience. For some, the experi-
ence was positive; they felt they had a voice and were valued through
the process. For example, Yannick “appreciated having the opportunity
to have a ‘say’ in [their] learning” and Luca thought it was “beneficial to
make students feel seen and valued in the classroom.” For others, however,
sharing their ideas as to how classes were structured was uncomfortable,
and they felt excluded from the process because their voices were not
heard. Thus, this second theme was framed by the uneven experience of
assessment as partnership for preservice teachers.
Phrases from the data in this category speak to the foundational work
of cocreating a partnership. Words and phrases such as “voice,” “say,”
and feeling “seen and valued” characterized the theme of noticing an
uneven participation during whole group classroom discussions. Some
responses included experiences of community that referred to the col-
lective “we,” whereas other responses referred to the individual experi-
ence “I.” For example, Yannick said, “I very much appreciated having the
opportunity to have a ‘say,’ ” whereas Nick’s comment pointed to the idea
that assessment as partnership “helped the classroom feel like a commu-
nity and everyone’s voice was heard.” Jan’s thoughts about assessment
as partnership also spoke to the idea that it provided “a sense of com-
munity among the classroom,” one that allows the preservice teachers
and teacher to work together as a team for a student-centred environ-
ment. Additionally, all preservice teachers had the chance to share their
input when criteria were being created. This approach provided each pre-
service teacher with the chance to voice their approval or disapproval or
ask questions for clarification. Jan’s reflection brought forward the idea of
shared responsibility. Words like “approval and disapproval,” “voice,” and
“sharing their input” all characterized the experience of partnership that
we intended.
Within the responses, the ways in which community was
experienced were further delineated. Luca drew our attention to the idea
that assessment as partnership had “benefits” to how preservice teachers
“feel seen and valued in the classroom.” However, the experiences of
community were uneven here as well. Frankie, who referred to them-
selves as a “more outspoken” student, “found that [they] were able to
contribute during class and have [their] ideas incorporated.” Having their
voices and ideas heard by the class highlighted their experience of feeling
part of the community. Others, however, even though they were part of
the same shared conversations, felt silenced and unheard. Letisha wrote,
“It was difficult to feel like it was all of us and not just the more vocal
few who were part of the decision” because there were so many people.
Furthermore, some preservice teachers felt they did not have a voice and
decisions were made by peers whose voices were “bigger.” This belief
was evident in Anthony’s comment:

Although I really appreciated the collaborative approach, at times
I felt that the bigger voices in the classroom were making the
decisions for me. . . . When we were co-constructing criteria,
speaking against others’ opinions is just not something I am confi-
dent in—especially when their voices feel much bigger than mine in
a classroom. So, as much as I appreciated assessment as partnership,
I did struggle to find my voice in the partnership (in no way did
I feel that you were not allowing me to have a voice, just sometimes
when there are bigger personalities in a class it’s difficult to find your
place!).

The individual tensions seeped beyond personal experiences and emerged
as a group tension: participation was uneven. Avery wrote, “I felt as if
there were times when the differing ideas between the class caused con-
flict.” The conflict arose in response to the perception that participation in
the partnership and community was uneven. Preservice teachers needed
to navigate their experiences of the learning community and assessment
as partnership within the conflict, as illustrated by Ynes’ comments:

The duality of this process though, is that I felt alone in enjoying the
process. It did not feel like the people closest to me agreed with the
methods of assessment as partnership. This led to many awkward
conversations and moments of uncertainty for me as a learner.

Obvious in Ynes’ words were the difficult conditions occurring among
preservice teachers when establishing community and partnerships. It
was uneven.
Last, one preservice teacher, Drew, went so far as to describe the pro-
cess of co-constructing criteria as having to “uncover” the criteria that we
had already decided on. Drew wrote,

I did not feel like I personally had much input into selecting cri-
teria. . . . It did seem like there was a set of criteria that we as a
group had to uncover more than we actually created criteria, in a
true sense.

Drew’s comments speak to the idea that for some preservice teachers,
co-constructing criteria felt like a façade, and they did not have input.
Although some preservice teachers experienced assessment as partnership
as beneficial, these others believed that they did not have a voice and felt
lost in the process of co-constructing criteria.
In beginning this process of assessment as partnership, we drew upon
our own experiences of teaching in classrooms with elementary-aged
students and working with in-service teachers whose students are that
age. It somehow seemed easier to do things differently in an elemen-
tary classroom, where the rules of the game are not firmly entrenched.
It is now obvious to us that the preservice teachers have ideas, rooted in
their past experiences, of what assessment should be. As we invite them
into this partnership, we are asking them to unlearn a great deal about
the nature of learning and classrooms and adopt new understandings
of how educators contribute ideas and establish shared learning goals.
We had underestimated how deeply embedded ideas about decision-
making in classrooms are for the preservice teachers. When asked for
their ideas, their responses ranged from feeling liberated and glad to
have a voice, to distrust (a sense that we were simply asking them to
uncover our ideas).
In thinking about how we might reframe the process of
co-constructing criteria in the future, one obvious insight is that we
need to address trust within this partnership. We also need to think
about other ways to allow spaces for voices. Perhaps the small group
work does not need to be reported to the whole group. Perhaps we
need to make more spaces for the quieter voices. Perhaps we need to
ask preservice teachers how to structure this process. After all, if this is
truly a partnership, they can guide, and they know their own barriers.
Perhaps we need to reflect on our enactment and understanding of the
“co,” so we can follow their lead.

Feelings of Discomfort Arise During the Process of Co-construction

A third theme that emerged from preservice teachers’ experiences of the
process to establish success criteria revealed discomfort among students.
Words and phrases that indicated this feeling are “frustrating,” “confu-
sion,” and “difficult.”
The process of establishing success criteria was reiterated with each
assignment, and each time it was messy. Some preservice teachers shared
that they “were pushed to unfamiliar territory and . . . uncomfortable
and on the edge a little” (Nancy) in what Yannick described as “a bit
conflicting, out of order, and/or unorganized” process. Asher added to
the idea that co-constructing criteria was difficult, saying they “often-
times found this [the process] frustrating” and that it would “cause more
confusion than it did understanding.” Asher wanted the process to have
“firmly established [criteria] by the end of the session.” It
was their “preference to be given success criteria and assessment rubrics
by the professor and then discuss any questions or comments regarding
these requirements.” Asher’s comments highlight the challenge of
shifting thinking from a paradigm where teachers tell and students do,
to a partnership.
Within this same theme, other preservice teachers articulated that
their discomfort with the process had dissipated. They shared that they
had learned through the process and that although they were “confused
at first” (Indigo) and “it was frustrating at first” (Toni), and it was “a bit
difficult to come to fruition” (Adrian), they could see the benefits of the
process and how it supported their learning. This idea was evident in
Toni’s comment about the process for establishing criteria:

[It] got slowly better as time went on. . . . Once the learning
started, and everyone started to co-construct the assessment cri-
teria together, it was easier to see the benefit of how it impacted
everyone’s thinking of assessment.

Toni’s recognition of growth through the process was echoed by Adrian,
who reflected on the process as “becom[ing] more comfortable,” and
although “difficult, . . . [it was] a rich learning experience.”
In addition to the preservice teachers who expressed that the process
got easier over time, others recognized the value in providing clarity about
the assignment. For example, Indigo said, “Now I think it [the process of
co-construction] is almost more effective as it develops a learner’s answer
more in depth.” This was also the experience of others in the course, who
“appreciate[d] the value and meaning behind this practice as students
were building their understandings as to what they will be assessed on”
(Asher). Similarly, others said that they felt they could “better understand
assignment requirements” (Luca) and “remember the criteria” (Drew).
The preservice teachers’ responses give a window into how they per-
ceive the partnerships we have invited them into. Thinking back to the
beginning of the year and the excitement we felt about offering students
voice in their learning, we realized there was a gap in our thinking:
This was a new process and experience for students that required us
to scaffold and support them in these partnerships. Effective teaching
involves knowing and responding to where the preservice teacher is on
their journey as the starting place—we acknowledge that we needed to
be better in this regard. We were okay with the messiness, the unfolding,
and the learning through the process. They, however, came with
expectations of how school functions. We needed to do better to bridge
the gap between where we started and their previous knowledge and
experiences.

Implications for Future Work

Despite calls in the literature for assessment partnerships in higher educa-
tion, our study shows that a gap remains between research and practice. In
other words, the preservice teachers we worked with continued to draw
on experiences rooted in conventional practices; we did not find evidence
that they had previously experienced assessment as partnership. For those
who had no prior experience of such partnership in teaching and learning,
the experience was significant. For example, Nico, who had experienced more conventional
forms of assessment, found the partnership transformative to how they
saw themselves as a student:

With the experience of assessment as partnership, I am a little emo-
tional. I have always struggled with tests, exams, quizzes, etc. In
my undergrad, I had accommodations to assist with my anxiety,
this accommodation took me away from my peers, and I had to
write alone, and everyone knew something was “wrong” with me
because I left to write exams. . . . In this class, in particular, I have
been lucky to have been a part of the assessment process and assist
in co-constructing success criteria. If I was unsure of anything or
wanted to add anything, I was allowed and encouraged. My con-
fidence in my own educational abilities has increased significantly,
and I am not as anxious to speak out anymore. I never feel “stupid”
or like I have the wrong answer. Instead, I feel respected, listened to,
and a part of the learning community.

Nico’s response captures what we hoped would emerge from assessment
as partnership: a preservice teacher who felt “respected, listened to, and
part of the learning community.” This is one inspiring implication.

Final Thoughts

Co-constructing assessment partnerships with preservice teachers resulted
in a fruitful year of learning for us. Taking time to notice the effectiveness of
our teaching through reflections, combined with the preservice teachers’
experiences, yielded many opportunities for us to notice instances when
partnerships were flourishing and when there were obstacles. All parties
come to teaching and learning partnerships with previous experiences:
teacher educators bring their experiences and expectations of being
teachers; preservice teachers bring their experiences and expectations of
being students. These combined experiences are opportunities to notice
how teaching and learning are enacted and speak to both the professional
noticings of teacher educators and what preservice teachers notice about
partnerships. Whether in person or virtual, mutually co-constructed
partnerships are difficult: the process gets messy, seems unorganized, and
feels confusing when pre-established ways of being are interrupted.
Creating new ways of experiencing assessment was challenging.
Our experiences provoked us to reflect on our practices and acknowledge
that sometimes they contradicted our commitment to kiinwi, “you and
me/us (together)” (Morcom & Freeman, 2018, p. 815). Kiinwi means that
preservice teachers have voice and are partners. Our study revealed that
to cocreate the conditions where kiinwi is possible, teacher educators
need to continue to grow their awareness of the potential benefits and
limitations that co-constructing offers assessment partnerships. For pre-
service teachers, the experience of something new is also difficult as they
“[un]learn the game of school” (Rachel) and grow their understanding
of co-constructing partnerships. Essential for this unlearning is what
Barnett and Coate (2005) described as the influence of the enacted curric-
ulum. In other words, when teacher educators enact curriculum, in this
case through assessment as partnership, it provokes preservice teachers
to rethink assessment, and educators can expect a range of influences
on students’ emerging practices. In-person and virtual assessment
partnerships, similar to the ones discussed, can offer ways to co-construct
new understandings of teaching and learning.

References

Ayalon, M., & Wilkie, K. J. (2020). Developing assessment literacy through
approximations of practice: Exploring secondary mathematics pre-service
teachers developing criteria for a rich quadratics task. Teaching and Teacher
Education, 89, Article 103011. https://doi.org/10.1016/j.tate.2019.103011
Barnett, R., & Coate, K. (2005). Engaging the curriculum in higher education. SRHE/
Open University Press.
Boud, D. (2007). Reframing assessment as if it were important. In D. Boud &
N. Falchikov (Eds.), Rethinking assessment in higher education: Learning for the
long term (pp. 14–25). Routledge.
Boud, D., & Molloy, E. (2013). Rethinking models of feedback for learning: The
challenge of design. Assessment & Evaluation in Higher Education, 38(6), 698–
712. https://doi.org/10.1080/02602938.2012.691462
Boud, D., & Soler, R. (2016). Sustainable assessment revisited. Assessment &
Evaluation in Higher Education, 41(3), 400–413. https://doi.org/10.1080/026
02938.2015.1018133
Buck, G. A., Trauth-Nare, A., & Kaftan, J. (2010). Making formative assessment
discernable to pre-service teachers of science. Journal of Research in Science
Teaching, 47(4), 402–421. https://doi.org/10.1002/tea.20344
Cook-Sather, A., Bovill, C., & Felten, P. (2014). Engaging students as partners in
learning and teaching. Jossey Bass.
Dale, E. (2017). Reciprocity as a foundational concept in teaching philanthropic
and nonprofit studies. Philanthropy and Education, 1(1), 64–70. https://doi.
org/10.2979/phileduc.1.1.05
Deeley, S., & Bovill, C. (2017). Staff student partnership in assessment: Enhancing
assessment literacy through democratic practices. Assessment & Evaluation in
Higher Education, 42(3), 463–477. https://doi.org/10.1080/02602938.2015.11
26551
Doyle, E., Buckley, P., & Whelan, J. (2019). Assessment co-creation: An explora-
tory analysis of opportunities and challenges based on student and instructor
perspectives. Teaching in Higher Education, 24(6), 739–754. https://doi.org/10.
1080/13562517.2018.1498077
Eizadirad, A. (2019). Decolonizing educational assessment: Ontario elementary
students and the EQAO. Springer Nature.
Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching,
re-imagining teacher education. Teachers and Teaching, 15(2), 273–289. https://
doi.org/10.1080/13540600902875340
Hauge, S. (2021). Self-study research: Challenges and opportunities in teacher
education. In M. J. Hernandez-Serrano (Ed.), Teacher education in the 21st cen-
tury: Emerging skills for a changing world (pp. 139–157). IntechOpen.
Marton, F. (1994a). Phenomenography. In T. Husén, G. Handal, & T. N.
Postlethwaite (Eds.), The international encyclopedia of education (2nd ed.,
pp. 4424–4429). Pergamon Press.
Mason, J. (2002). Researching your own practice: The discipline of noticing. Routledge.
Morcom, L., & Freeman, K. (2018). Niinwi—kiinwa—kiinwi: Building non-
Indigenous allies in education through Indigenous pedagogy. Canadian
Journal of Education, 41(3), 808–833. https://journals.sfu.ca/cje/index.php/
cje-rce/article/view/3344
Stefani, L. A. (1998). Assessment in partnership with learners. Assessment &
Evaluation in Higher Education, 23(4), 339–350. https://doi.org/10.1080/
0260293980230402
Stolz, S. (2020). Phenomenology and phenomenography in educational research:
A critique. Educational Philosophy and Theory, 52(10), 1077–1096. https://doi.
org/10.1080/00131857.2020.1724088
Truth and Reconciliation Commission of Canada. (2015). Truth and Reconciliation
Commission of Canada: Calls to action. https://ehprnh2mwo3.exactdn.com/
wp-content/uploads/2021/01/Calls_to_Action_English2.pdf
Wilson, S. (2007). What is an Indigenist research paradigm? Canadian Journal of
Native Education, 30(2), 193–195. https://doi.org/10.14288/cjne.v30i2.196422
4 Addressing Challenges of Online Learning through Quality Assessment Principles

Caitlin Fox and Julia Rheaume

As the bridge between teaching and learning, assessment is how
educators “discover whether the instructional activities in which we
engaged our students resulted in the intended learning” (Wiliam, 2013,
p. 15). Assessment is the narrative that describes the quality of an educa-
tional experience. Educators must understand foundational assessment
practices to ensure teaching becomes learning.
In education courses, students look to their instructors to be exem-
plary in the planning of assessment and instruction processes. Great
educators model the mindset that “every student deserves a great edu-
cator, not by chance, but by design” (Fisher et al., 2016, p. 2). Designing
assessment tasks and practices before a course has begun is an important
step in ensuring that quality assessment is integrated into both teaching
and learning processes. Educators can incorporate common principles of
quality assessment, such as clear criteria, feedback, and aligning evidence
into all aspects of the course design and planning processes, whether in
online or face-to-face instructional settings. As a result of this integra-
tion, students’ assessment experiences can lead to increased learning and
engagement rather than limited transfer of knowledge and skills that
leaves students feeling overwhelmed, confused, or frustrated by a lack of
coherence (Conrad & Openo, 2018).

DOI: 10.4324/9781003347972-6

Participants in this study were registered in Years 1 through 4 of
a Bachelor of Education program in Alberta. Students in the first two
years of their program were in both education and other discipline
courses, whereas Year 3 and 4 preservice teachers were taking only edu-
cation courses. As preservice teacher educators, we noticed that student
experiences with assessment varied. The research question that prompted
and guided this study was “What were preservice teacher experiences
with assessment while learning online?”

Literature Review

Educators can create an environment for active participation from learners
by involving them in the assessment process, potentially enhancing their
engagement and clarity. As Davies (2011) described, this involvement can
include having students participate in constructing criteria, engaging with
feedback, and collecting evidence of learning. Educators are encouraged
to view learners as active participants in the learning process, gather
information from a variety of assessment sources, and provide opportun-
ities for deep learning to occur (Hogg et al., 2022). The following guiding
principles were common across assessment frameworks and served
as our conceptual framework: (a) criteria: knowing the learning goal,
(b) feedback: getting to the learning, and (c) evidence: knowing the
learning has occurred.

Criteria: Knowing the Learning Goal

The Alberta Assessment Consortium’s (AAC’s, 2017) visual for assessing
student learning in the classroom illustrated a cycle of planning, for-
mative assessment (coaching), and summative assessment (judging and
reporting) that guides teaching and learning processes. Understanding the
learning goal is the primary focus in the planning stage of an assessment
cycle.
The key to effective assessment and evaluation is the match between
learning outcomes and assessment tasks. This match ensures reliable
and valid results in determining if learning has occurred, supporting
the idea that “assessment methods and approaches need to be focused
on evidence of achievement rather than the ability to regurgitate infor-
mation” (Brown, 2005, p. 82). Our goal as educators is to ensure that
we are measuring what we are supposed to be measuring: student per-
formance of learning outcomes (Hogg et al., 2022; McTighe & Ferrara,
2021). Educators initially study the learning outcomes and unpack them
to understand what students need to know (the nouns in the outcomes
signal the content) and do (the verbs signal the skills), and then plan for
intentional ways to collect evidence of learning through assessment
(Bennett & Mulgrew, 2019; Conrad & Openo, 2018). Assessment must
be included in the planning stages of instructional design to ensure clear
alignment between learning outcomes and assessment tasks (AAC, 2017;
McTighe & Ferrara, 2021).
As part of designing assessment tasks, educators develop evaluation
tools with clear criteria that describe quality learning (Bennett & Mulgrew,
2019). Clear criteria indicate to students what they are expected to dem-
onstrate and inform the depth of instruction that educators need to plan
(Bennett & Mulgrew, 2019). As Brown (2005) stated, “This means that
the assessment criteria need to be clear, explicit, [and] framed in language
that is meaningful to . . . students” (p. 84). Students can use these criteria
to set learning goals and reflect on how they are doing in meeting the
criteria (Bennett & Mulgrew, 2019). Students can then seek purposeful
feedback on the criteria and learning goals (Bennett & Mulgrew, 2013).
Educators need to share assessment and evaluation tools with students at
the onset of teaching and learning, with feedback opportunities consist-
ently incorporated into the lessons (McTighe, 2021).

Feedback: Getting to the Learning

“If assessment is to be integral to learning, feedback must be at the heart
of the process” (Brown, 2005, p. 84). Feedback opportunities should keep
students actively engaged in the learning process and be intentionally
designed as part of instruction (Bennett & Mulgrew, 2010; Bennett et al.,
2016; Wiliam, 2013). As students practice skills and demonstrate pro-
gress towards learning goals, educators must provide opportunities for
them to ask questions and receive formative feedback, which can include
self-assessment and peer feedback (Brown, 2005). This feedback can alle-
viate confusion and address misunderstandings. Students must be able to
articulate what they are learning and why, seek and use feedback to create
learning goals and improve their work, and collaborate with educators
and peers during the learning process (Frey et al., 2018). With intention-
ally planned opportunities for feedback, students have a sense of clarity
and confidence that they can accomplish the tasks on their own (Frey
et al., 2018). Feedback during instruction (for both the educator and the
learner) is a vital component of quality assessment design that supports
student achievement (Black & Wiliam, 1998, 2018).
Educators can provide opportunities for students to use the feedback
they have received during lessons and instruction. Feedback used during
the learning process feeds learning forward and provides opportunities
for students to believe they are meeting the intended learning goals.
Feedback provided only at the end of task completion, with no chance to
use it, is evaluative feedback. Educators can provide evaluative feedback
in the hope that students will consider it in future work, but they are not
providing an opportunity for students to improve or enhance their work
while working towards the learning goals (Davies, 2011).

Evidence: Knowing the Learning has Occurred

Beginning with the end in mind means that educators know what evi-
dence of learning will look like before teaching begins. Starting with a
vision of what and how the evidence will be collected allows educators
and students to focus on clear learning goals (Wiggins & McTighe, 2005).
When summative tasks are aligned to the verbs within the learning
outcomes, educators know they are measuring what they are intending
to measure: student performance of the learning outcomes. A closer
look at an assessment task should allow an educator to infer the learning
outcomes that are being assessed. If the connection to the outcomes is
not obvious, it may mean that the assessment must shift away from a
selected response to a demonstration of skill through a performance or
applied task (McTighe, 2021). Selected response and short-answer sum-
mative tasks can help to measure verbs such as “identify,” “describe,”
and “recall.” Performance tasks and extended thinking summative tasks
can measure verbs like “analyze,” “compare,” and “discuss” (Hogg et al.,
2022). Summative assessment tasks are often products that are evaluated
as evidence of learning. By creating products, students demonstrate what
they know and what they can do (Davies, 2011). Evaluating the products,
described with clear learning criteria and aligned to learning outcomes, is
one method of knowing if learning has occurred.
However, products are not the only way to measure learning. To
know that learning has occurred, observations of students performing
skills and using knowledge are essential (Davies, 2011). Conversations
with students about their learning through feedback opportunities also
support students’ active participation (Davies, 2011). As active participants
in the learning process, students need to be part of conversations during
lessons, and those conversations allow educators to collect evidence of
learning.

Addressing Challenges of Online Learning 77
Cooper and Catania (2022) articulated why triangulation of assessment
is essential for educators and students to know that learning has occurred.
They explained that products are valid in evaluating students’ knowledge
and demonstration of skills, whereas listening to and conversing with
students can reveal depth of understanding that may not be seen within
a product, and observing students as they perform tasks is necessary to
complete the picture of the learning that has occurred (Cooper & Catania,
2022). With clear criteria of summative tasks, educators can collect evi-
dence of learning while using evaluation tools to describe the students’
levels of performance (Bennett & Mulgrew, 2019). Including observa-
tion and conversation as ways to know learning has occurred allows
both educators and students to ask questions for clarity, and students
can use the conversations to review and improve their learning while
working towards summative tasks. When educators include observations
and conversations that occurred during the learning process as part of
evaluating students’ products, they are armed with more knowledge
about what students know and can do in relation to the learning goals
(Cooper & Catania, 2022).
Quality assessment design ensures that students know where they are
going in relation to learning goals, have the tools they need to get there,
and can monitor their own progress through feedback opportunities and
conversations (Frey et al., 2018). Hattie (2018) argued that educators must
possess a frame of mind that appreciates their role as evaluators of impact
on student learning more than as deliverers of information. This edu-
cator mindset requires consistent and intentional clarity of learning goals,
direction to get to the goals, and feedback to support students’ learning.

Methods

We used a survey research design (Creswell, 2012) to gain insight into
the experiences of preservice teachers with assessment practices during
online learning that was necessitated by the COVID-19 pandemic. We
developed an online survey, following the guidelines identified by Dillman
et al. (2014), that included 15 quantitative (Likert scale) and two qualitative
(open-ended) questions to provide insights into trends in assessment
practices during online learning. The questions were grounded in the
literature and our knowledge and experience with quality assessment
practices.
After we received institutional ethics approval, a program assistant
emailed an invitation to participate and the survey link to all 444
Bachelor of Education students (Years 1 to 4) in mid-January 2022,
with a follow-up reminder email sent in early February. Seventy-three
students participated, a 16.4% participation rate. Although demographic
data were not collected, the variety of responses demonstrated a range
of experiences with postsecondary education. All collected data were
anonymized before being shared with the researchers to ensure that no
identifying information was included.
The quantitative survey items are reported as percentages in the results
section. The two open-ended qualitative survey questions were analyzed
with structural and focused coding as described by Saldaña (2009). Both
researchers independently reviewed and initially coded the responses,
leading to analytical categories that were established collaboratively.

Results

Reporting of the survey results is organized by the three aspects of
quality assessment, as discussed in the conceptual framework. Except
where otherwise indicated, a 5-point Likert scale describing frequency
from 1 = never (10% or less of the time) to 5 = consistently (90% or more of the
time) was used for the 15 quantitative items. Percentages were rounded,
so totals may not equal 100%.
In keeping with the purpose of this chapter, focused on assessment
practices in online learning, it was important to determine what percent
of participants’ learning had occurred online, so Question 1 addressed
this factor. During the COVID-19 pandemic, many postsecondary
institutions were required to deliver programming only online. In our
context, although some learners were able to access some courses on
campus, all but three participants (4%) indicated that at least 50% of
their learning was online. Another 26% of participants were learning
online between 50% and 60% of the time, and over 70% indicated that
at least 70% of their learning occurred online. Thus, for the majority
of participants, responses describe experiences with assessment while
learning primarily online.

Criteria

Survey participants were asked three questions pertaining to their
perception of understanding assignment expectations. Table 4.1 shows the
responses to Questions 2, 4, and 5, which were Likert scale items rating
frequency of their experiences. Approximately 70% of respondents
indicated that they consistently or often understood what was needed for
an assignment.
In Question 3, participants were asked to identify how they learned
what was expected to complete an assignment. They selected all choices
that applied from the following list: rubric, video, verbal, written, and
none of the above. All but two students indicated multiple means.
Rubrics and written explanation (selected by 86% and 85%, respectively)
were the most prevalent ways that instructors communicated assignment
expectations to students. Verbal explanation was indicated by 81% of
respondents. Videos were the least common response, although still
indicated by over 60% of respondents.
Question 8 asked students about their feelings when beginning
assignments when learning online. Structural coding (Saldaña, 2009) of
their responses to this open-ended question led to the categories presented

Table 4.1 Students’ initial understanding of the assignment (Questions 2, 4, and 5).

Survey item number and text                                Frequency of each response
                                                           on the Likert scale (%)
                                                            1    2    3    4    5
2. I understood the requirements of the assignment.         0    1   29   41   29
4. I had the information, knowledge, and skills that
   I needed to be successful on the assignment.             0    1   29   44   26
5. I had the opportunity to get clarification from the
   instructor about the assignment prior to submission.     0    4   15   33   48

Note. Response options ranged from 1 = never (10% or less of the time) to 5 = consistently
(90% or more of the time). Percentages were rounded, so totals may not equal 100%.

Table 4.2 Students’ feelings about assessment during online learning (Question 8).
Feeling                            n    %
Capable/confident                 22   30
Overwhelmed                       33   45
Confused                          21   29
Frustrated/challenged/stressed    16   22
Anxious/nervous/depressed          5    7
Alone                              2    3
Neutral                            1    1
Note. Data indicate students’ responses to the question, “How have you felt when begin-
ning an assignment while learning online over the last two years?”

in Table 4.2. Participant statements were sometimes attributed to more
than one category. In this section, we use selected participant responses
to illustrate each of the categories.
Approximately 30% of participants felt capable and confident when
beginning their assignments. Participants discussed the importance of
being organized, being able to get clarification if needed, and having clear
assignment instructions. One participant stated that “online quizzes were
usually open book, so it was beneficial for me in regards to stress manage-
ment and confidence.”
The largest group of participants (45%) indicated feeling overwhelmed
when beginning an assignment in an online environment, as described
in these comments: “I find I don’t always know where to start and get
overwhelmed,” and “You have the assignment in front of you, but you
really are not sure where to start.”
Almost 30% of participants identified confusion and often attributed
it to learning online: “Often confused because online learning is not
a very good way to give students the required knowledge for many
assignments.” Other students were “annoyed, oftentimes confused and
flustered” and “felt confusion, mostly because sometimes it wasn’t clear
what was being asked of us!”
Over 20% of participants reported feeling frustrated, challenged, or
stressed about learning online. One student described the source of their
frustration:

The frustrating aspects about this shift was the lack of preparedness
and clarity from certain professors. At times it felt like I was pulling
teeth in order to get a straight answer from these professors as to
how I could succeed on certain assignments.

A few participants referred to feeling “stressed out,” and found learning
online more difficult or challenging: “It’s been harder online because
online more difficult or challenging: “It’s been harder online because
educators aren’t really able to show their students what they’re supposed
to do, and students aren’t able to get feedback as easily.”
Although one participant was “rather neutral” about learning online,
other respondents referred to feeling alone because “educators were there
but their efforts weren’t always the same as in person” or “nervous only
because it is harder to meet with educator/professor over Zoom calls.”
Student responses to this survey item provided a range of enlightening
responses.

Feedback

This section of the survey pertained to the feedback participants received
during the learning process. Results for Questions 6, 7, and 10 are reported
in Table 4.3.

Table 4.3 Feedback during learning (Questions 6, 7, and 10).

Survey item number and text                                Frequency of each response
                                                           on the Likert scale (%)
                                                            1    2    3    4    5
6. I had the opportunity for feedback about my
   progress on the assignment prior to submission.          3   14   29   32   22
7. I sought feedback on my progress of the assignment
   prior to submission.                                    11   34   25   18   12
10. I used the feedback I received to make changes to
    my assignment prior to my submission.                   4    7   18   33   39

Note. Response options ranged from 1 = never (10% or less of the time) to 5 = consistently
(90% or more of the time). Percentages were rounded, so totals may not equal 100%.
These results show that although 50% of participants indicated they
could access feedback on their assignment prior to submission, the
majority (70%) sometimes, rarely, or never sought it. Question 9 asked
students about the source of feedback on their assignments, prior to sub-
mitting them. The majority of students self-assessed (85%) or sought feed-
back from their instructors (77%), compared to only 38% who indicated
that their peers provided feedback.

Evidence

Questions 11 through 16 addressed the summative assessment experiences
that participants had while learning online. Responses to Questions 11,
12, 13, 15, and 16 are reported in Table 4.4.
These results indicate that approximately 85% of participants reported
a frequency rating of 3 and above, which can be interpreted to mean that
their assessment experiences were aligned with their learning experiences.

Table 4.4 Evaluation of assignment (Questions 11, 12, 13, 15, and 16).

Survey item number and text                                Frequency of each response
                                                           on the Likert scale (%)
                                                            1    2    3    4    5
11. The feedback I received after evaluation aligned
    with the assignment expectations.                       0    0   16   55   29
12. I feel the feedback I received after evaluation was
    helpful for future assignments.                         1   10   32   33   25
13. I feel the assignments motivated me and helped me
    make progress towards my personal and professional
    growth.                                                 3   12   23   33   29
15. I feel that the types of assessments accurately
    captured my learning.                                   0    7   32   45   16
16. I had opportunities to demonstrate my learning and
    understanding (knowledge and skills) of the course
    outcome through a variety of assignment tasks.          0   15   16   52   16

Note. Response options ranged from 1 = never (10% or less of the time) to 5 = consistently
(90% or more of the time). Percentages were rounded, so totals may not equal 100%.

Participants were also asked to indicate the frequency of different types
of assessment when learning online (Question 14). Selected response
assessments (including multiple choice) were the most prevalent answer,
although most types of assessment were experienced at least sometimes
(frequency of 3 and higher) by 85% of participants. The least common
types of assessment while learning online required extended thinking,
such as performance tasks and project-based learning, although these
were experienced by almost half (49%) of participants.

Student Experience

The responses of the 37 participants who provided additional comments
at the end of the survey about their experiences were coded into four
categories: self-perception, instructor, assessment, and other.
In the self-perception category, participants described assessment while
learning online as everything from “liberating” and “convenient” to “an
ongoing struggle” and “awful.” Although individual perceptions varied,
the majority of respondents indicated a preference for in-person learning.
Participants reflected on the role of the instructor related to online
assessment. Although some commented on instructors being supportive
and helpful, others noted that their experiences “varied based on the
instructor,” as described by one participant:

I believe, based on my personal experiences and having taken more
than one course from the same instructors, that generally speaking,
instructors who provided quality assessments in person also did so
online. Similarly, I also found if quality assessment was not as much
of a strength in person, it was also not a strength online.

Some respondents commented on their experience with assessments,
such as preferring multiple choice, being rushed or unmotivated, or
being tempted to “just copy answers on the internet rather than actually
trying to answer it” on their own. One respondent commented about not
understanding the purpose of an assessment:

I think the one thing that stood out was the lack of clear expectations
on assignments and how they related to what we were learning or
doing in the classroom to prepare us for our careers. I felt some of
the assignments felt more like busy work without a real purpose as
educators were adapting their assignments to fit an online setting.

In the category of “other,” participants had a range of comments about
their peers, including that “expecting us to collaborate as peers is
unrealistic” and that “missing the peer connections is a huge loss of feedback.”
Overall, the participant responses to this open-ended question provided
rich insights into their experiences with assessment while learning online.

Discussion

This study on the challenges that preservice educators experienced with
assessment while learning online provides important considerations for
teacher educators and others who teach online. The survey data show
that the majority of participants (70%) understood what was required for
assessments, and the evaluation of assignments was a good indication of their
learning for 85% of participants. However, their experiences between these
two end points of the learning cycle were often marked by feeling frustrated
(20%), overwhelmed (45%), and confused (29%). Our results emphasize the
importance of using quality assessment practices throughout the learning
cycle to alleviate some of the challenges related to online learning.

Criteria

As shown in Table 4.1, most participants (70%) generally understood
what was needed for an assignment and 81% indicated that they had
the opportunity to get clarification if needed. Participants indicated that
rubrics and written explanations about assignments were most often used
to convey the expectations and that videos were indicated by over 60%
of respondents. This shift in practice is likely due to the online environment, as
in-person instruction would likely have more verbal explanations that
provide opportunities for students to ask for clarification in the moment.
Although participants seemed to have access to the information they
needed related to their assessments, only 30% indicated that they felt cap-
able and confident when starting assignments. The accessibility of their
instructor proved to be important, as captured by one respondent: “I have
felt capable because if I was unaware of exactly what I needed to do, it
was often easy for me to get clarification.” Students who were feeling
overwhelmed, confused, or frustrated often attributed those difficulties
to the fact that their instructors were not as readily available. Statements
such as assignments being “hard to understand since it’s not face to
face” and “it’s been harder online because educators aren’t really able
to show their students what they’re supposed to do” demonstrate that
students value live explanations. Whole group, in-person explanations
also provided opportunities for more interaction and clarification, as
indicated by a participant: “More often than not I felt very confused and
overwhelmed. It was difficult to not be in a classroom asking the edu-
cator clarifying questions, or even having classmates ask questions about
the assignments that I hadn’t considered.” Including similar opportun-
ities in online settings may therefore enhance student confidence from
the outset of assessments.
In addition to being available to their students, educators can take a
number of steps to ensure students understand what is expected of them,
thus increasing clarity, reducing stress and frustration, and boosting con-
fidence. For students who struggle with getting started on an assignment,
instructors can review the rubric criteria and levels of performance with
them, provide exemplars, make short videos, and ensure alignment
between the written explanations, the rubrics, and the verbal explanations
(Davies, 2011). Additionally, providing time for students to brainstorm
ideas with peers, or providing check-ins for larger assignments, can help
students get going and keep going with their work.

Feedback

Survey results show that students rarely sought feedback prior to sub-
mitting their assignments. The most common sources of feedback for
participants were their instructor or themselves (self-assessment). Peer
feedback was identified by only 38% of respondents. As one participant
noted, “Missing the peer connections is a huge loss of feedback.”
Feedback is an important part of the learning process that helps students
transition from understanding what they need to know and will be able
to do, to actually demonstrating their understanding through assessment
(Wiliam, 2013). Therefore, educators can build opportunities for different
types of feedback (self, peer, instructor) as part of their lessons (Bennett &
Mulgrew, 2010). By providing structured time dedicated to feedback in
online settings, educators can support students to feel confident that they
are on the right track towards successful completion of assessments.
Prior to due dates, educators can also dedicate online office hours to pro-
viding clarification and feedback on assignments. These strategies pro-
vide students with opportunities to interact with both the instructors and
their peers.

Evidence

When it comes to evidence of learning, the preservice teachers indicated
that their assessment experiences were aligned with the learning goals.
However, selected response items (such as multiple-choice questions)
were used almost twice as often as performance tasks and projects. This
choice of assessment is not typical in the preservice education program,
which tends to be heavily focused on performance when learning occurs
face to face. The increase in selected response items is a shift in practice
that can be attributed to the difficulty of assessing online learning. It calls
into question the alignment between the learning goals of the courses
and the type of evidence that the assessments produce.
Also important to the students was ensuring that they clearly under-
stood the purpose of the assessments and how those assessments
connected to both the courses and their careers as educators. By
considering learning experiences from the perspective of the receiving
end of instruction, educators may recognize a need to re-evaluate some
of their assessment practices.

Student Experience

A few respondents noted that their experience with online assessments
varied by instructor. As noted above, one participant stated, “Generally
speaking, instructors who provided quality assessments in person also
did so online.” Another participant suggested that quality assessment
practices did not always transfer to the online setting:

There were quite a few incidences [sic] where assessment practices
were more “do as I say, not as I do.” We were taught assessment
strategies to avoid or to be only used in certain situations, etc., but
then would be given those types of assessments, so the messaging
was muddy.

A participant discussed a noticeable shift in the quality of feedback they
received: “I am just pointing out the discrepancy between how I received
feedback in person—a beneficial experience—and how I received feed-
back online—rarely helpful.” These statements illustrate some of the
differences that students experienced due to the shift to online learning.
Such challenges could likely be addressed by ensuring that quality
teaching and assessment practices transfer to online settings. To do this,
educators need to find ways to build and maintain relationships with
students and to create opportunities for connection to both the content
and one another. One participant discussed how they valued communica-
tion and connection with their instructor:

I have made connecting with my educators a regular part of my
weekly timetable because if I’m not understanding a part of an
assignment right from the beginning, then I get confused and then
I lose marks for either getting questions wrong or misunderstanding
the assignment in its entirety. . . . Communication with educators is
essential and just as important as coming to class.

Modelling quality assessment practices to preservice teachers is
necessary to build positive assessment experiences that can be transferred
to their future classrooms.

Conclusion

Whether online or in a face-to-face classroom, quality assessment
practices are essential to support student learning. Investing time and
energy in the planning stage and communicating clear criteria are needed
to provide direction about where the learning is going and to give merit to
the work and learning ahead. Educators sometimes feel caught between
their belief systems that value creating meaningful learning experiences
and institutional structures that require evaluation of student learning.
They can reduce this tension by reflecting on what gets them closer to
what they know is true and right for student learning (Fox, 2021). When
educators invest time in the planning, they shift from asking the question,
“What will I teach and mark?” to “What knowledge and skills need to be
learned, and how can I and the students collect evidence of this learning?”
The challenges related to assessment in online learning settings are
not insurmountable. Although many of the preservice teachers who
participated in this study described feeling overwhelmed, frustrated,
and confused when working on assignments, we found that they also
recognized that the educators who provided a quality educational experi-
ence in person were generally able to translate those practices to online
settings. This quality experience included providing clear criteria so that
students understood the learning goals. Educators also need to inten-
tionally create opportunities for feedback to assist students to get to the
learning goals. Finally, participants appreciated educators who were able
to transfer quality teaching and assessment practices into online settings.
In particular, they valued opportunities to connect to their instructors,
the content, and their peers as a means to increase clarity, seek feedback,
and bolster their confidence.

Limitations and Future Research

Survey results indicate that the preservice educators who participated
in this study generally experienced quality assessment practices when
learning online. A limitation of this study was that, in an effort to ensure
participant anonymity, we did not ask for demographic data. In retro-
spect, it would have been helpful to know participants’ years of study.
Students in Years 3 and 4 of their Bachelor of Education degree would
have been registered in only education classes, which often had syn-
chronous components. In contrast, students in Years 1 and 2 of the
program would have been taking primarily noneducation classes, which
tended to be more asynchronous. A high number of participants in
Years 3 and 4 of the program may have skewed the data, as education
professors may have had more opportunities to model excellent teaching
and assessment practices in their online classes than their noneducation
colleagues did. Further, due to the range of percentage of learning that
occurred online (30% to 100%), the results likely pertained to both online
and in-person learning modes, not only online learning. However, as over 70%
of students were online at least 70% of the time, the results do
pertain to online experiences for the majority of students.
Another limitation was inherent in using an online survey meth-
odology. Our interpretation of the results was limited to the partici-
pant selections on the Likert scale and two opportunities for qualitative
statements. Future qualitative research may be needed to further explore
student perceptions of online learning and the related challenges.
One vein of this study that we hope to explore in future research
is the importance of mindset towards online learning. We did not ask
about participants’ perceptions of online learning and wonder if there
might be a correlation between mindset and perceived ability to succeed.
For example, the participant who stated, “I enjoy online learning as the
instructors are always willing to help and the learning is in your hands,”
likely had a different experience than the one who indicated they felt
“very overwhelmed and stressed out.”
Students who articulated a positive experience with assessment were
likely connected to the instructors. They could articulate how they were
getting to the learning goals and how they understood their learning had
occurred. Student experiences in the learning process informed how they
felt about assessments. Placing value on the learning process more than
the teaching process can lead to positive experiences for students and pro-
vide rich learning experiences for everyone involved.

References

Alberta Assessment Consortium. (2017). AAC key visual: Assessing student learning
in the classroom. https://aac.ab.ca/wp-content/uploads/2018/01/AAC-Key-
VisualAUG2017.pdf
Bennett, S., Lore, P., & Mulgrew, A. (2016). What matters most about assessment.
Alberta Assessment Consortium.
Bennett, S., & Mulgrew, A. (2010). Scaffolding for student success. Alberta Assessment
Consortium.
Bennett, S., & Mulgrew, A. (2013). Building better rubrics. Alberta Assessment
Consortium.
Bennett, S., & Mulgrew, A. (2019). Creating credible criteria. Alberta Assessment
Consortium.
Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through class-
room assessment. GL Assessment.
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.1
080/0969594X.2018.1441807
Brown, S. (2005). Assessment for learning. Learning and Teaching in Higher
Education, 1, 81–89. http://eprints.glos.ac.uk/id/eprint/3607
Conrad, D., & Openo, J. (2018). Assessment strategies for online learning: Engagement
and authenticity. AU Press.
Cooper, D., & Catania, J. (2022). Rebooting assessment: A practical guide for balan-
cing conversations, performances, and products (How to establish performance-based,
balanced assessment in the classroom). Solution Tree.
Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating
quantitative and qualitative research (4th ed.). Pearson Education Inc.
Davies, A. (2011). Making classroom assessment work (3rd ed.). Connections
Publishing.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and
mixed-mode surveys: The tailored design method. John Wiley & Sons.
Fisher, D., Frey, N., & Hattie, J. (2016). Visible learning for literacy: Implementing the
practices that work best to accelerate student learning. Corwin A Sage.
Fox, C. (2021, February). Aligning our values and action: Putting the focus on
learning. Canadian Assessment for Learning Network Newsletter. https://us18.
campaign-archive.com/?u=bbaef655f1d5ab9dfe083c0aa&id=f27ae33b5c
Frey, N., Hattie, J., & Fisher, D. (2018). Developing assessment-capable visible
learners, grades K–12: Maximizing skill, will, and thrill (1st ed.). Corwin Literacy.
Hattie, J. (2018). 10 mindframes for visible learning: Teaching for success. Routledge.
Hogg, R., Armstrong, D., & Jones, M. (2022, February 2). An excerpt from
“A framework for student assessment” (3rd ed., pp. 8–15). Alberta Assessment
Consortium.
McTighe, J. (2021, October 20). Leading the conversation: The pedagogy of assessment
[Webinar]. Edmonton Regional Learning Consortium.
McTighe, J., & Ferrara, S. (2021). Assessing student learning by design: Principles and
practices for teachers and school leaders. Teachers College Press.
Saldaña, J. (2009). The coding manual for qualitative researchers. Sage Publications.
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Association
for Supervision and Curriculum Development.
Wiliam, D. (2013). Assessment: The bridge between teaching and learning. Voices
from the Middle, 21(2), 15–20.
5 Assessing the Content Knowledge, Skills, and Competencies of Teacher Candidates in an Online Learning Environment: A Case Study

Renee Bourgoin

As the COVID-19 pandemic spread around the world in 2020, public
health authorities imposed strict health measures, including restricting
public gatherings. These restrictions forced Canadian universities to
quickly move operations and teaching online. Faculty members had to
shift to emergency remote teaching and rethink how courses would be
delivered online. This sudden shift posed challenges but also provided
opportunities to explore the possibilities offered by online teaching (e.g.,
delivery formats, course designs, pedagogical approaches). In profes-
sional programs, such as the Bachelor of Education (BEd) at St. Thomas
University, students are not only expected to gain knowledge of teach-
able subjects, but also acquire the pedagogical skills and practical compe-
tencies required of the profession, including those associated with lesson
planning, teaching, assessment practices, collaboration, communication,
and critical reflection. Teacher candidates learn about and gain these
skills through practice, as well as by seeing their professors model them
when teaching their courses.

DOI: 10.4324/9781003347972-7
The formats of online courses tend to rely on readings and text-based
assignments and discussion (Bonk & Zhang, 2006; Castro & Tumibay,
2021). These approaches do not lend themselves well to many methods
courses in the BEd program, particularly those like the ones I teach, which
are designed to foster cooperative learning, practical teaching techniques,
and a high level of interaction with peers. As a teacher educator, I strive
to deliver strong instruction through exemplary teaching practices. As
I moved my instruction online, it was important for me to consider the
pedagogical implications of various online delivery models to ensure that
exemplary teaching and assessment practices could be retained in the
most authentic and meaningful ways possible. As a result, I wanted to
explore, through an introspective lens, the use of online formative and
summative assessment practices and how teacher candidates responded
to those assessment practices.

Context

Each year, St. Thomas University welcomes 90 teacher candidates into
its 11-month postgraduate BEd program, comprising 60 credit hours
of academic work including two field placements. The program uses a
cohort model, where students are grouped together in the same classes to
foster peer collaboration, teamwork, and cooperative learning. Teacher
candidates are expected to “quickly develop very close relationships with
their own instructors as professors take an active interest in the profes-
sional development of students” (St. Thomas University, n.d., para. 7).
During the 2020–2021 academic year in which this study took place, all
education courses were delivered virtually.
I have been a teacher educator for over 15 years and have also taught
in the public school system. In 2020–2021, my teaching load included two
methods courses, Elementary Social Studies and Elementary Language
Arts, and a foundational course, Exceptionalities & Differentiation. All
three courses started in the fall semester and continued into the winter
term. Prepandemic, I taught these courses face to face. In shifting to
emergency remote delivery, one of my biggest concerns was formative
and summative assessments. Although I do not teach assessment courses,
our program encourages cross-disciplinary teaching approaches. As
such, ongoing formative assessment practices are an integral part of my
pedagogy. As for my summative assessments, they lean heavily on reflective
practice, collaborative skills, task authenticity, and praxis (moving theor-
etical concepts towards classroom practice). Given the research on the
effectiveness and importance of formative assessment practices (Black &
Wiliam, 1998; Hattie, 2008) coupled with teacher candidates’ need to
develop formative and summative assessment skill sets for their own pro-
fessional practice, this was not something I wanted compromised when
shifting my teaching online. I grappled with questions: Would I be able
to move my current formative assessment techniques online? Would they
be as effective, and would students respond well to them? Would I be able
to monitor students’ learning of content and skills and do so meaning-
fully? Would my assessment practices provide me with the data needed to
properly assess teacher candidates’ growth? After considering how best to
deliver my courses and assess my students, I decided to use a synchronous
delivery model using Zoom (https://zoom.us).

Conceptual Framework

This study is grounded in Brunner’s (2006) work on the use of technology
as a “tool for pedagogy” (p. 230) and aspects of Hattie’s (2008) theory of
teaching and learning. According to Brunner, who has written about the
potential of remote teaching, technology “does not make a good teacher;
rather good teaching makes effective use of any tool, including tech-
nology, in support of overall pedagogy and course design” (2006, p. 230).
Hattie’s work identified pedagogical practices found to have the most
impact on students’ learning: a teacher’s passion for helping students
learn and the abilities to adjust approaches based on students’ progress,
set clear learning objectives, monitor learning, engage in reflective prac-
tice, and employ evidence-based strategies such as effective feedback
techniques, direct instruction, teaching of skills, and reciprocal teaching.

Literature Review

Formative and Summative Assessments

Assessment can be defined as a series of systematic procedures that enables
teachers to collect information about students’ learning (Bachman,
2004). Scriven (1967) differentiated between formative and summative
assessments, suggesting that the former is used throughout the teaching/
learning process to collect information that both teachers and students
can use to adjust. Summative assessments are done at the end of an
instructional sequence to make a final judgement call on the extent of
learning that has occurred.
Black and Wiliam (1998) contributed to current understanding of
classroom-based formative assessments and underscored their positive
effects on learning, motivation, and self-efficacy. Specific practices such
as timely and ongoing feedback and clearly defined learning objectives
are integral components of formative assessments (Black, 2009), as they
guide students towards task expectations and allow them to achieve
success (Feldman, 2019). As a daily embedded pedagogical practice, for-
mative assessments allow explicit links to be made between teaching,
learning, and evaluation.

Visible Learning

Hattie (2008) conducted a meta-analysis of over 50,000 research articles
on classroom practices and interventions that highlighted the
importance of visible learning; that is, students making their learning visible
to their teachers. When the learning is made visible, teachers can gauge
the extent to which learning is taking place and whether their teaching
is impactful. Visible learning also refers to teachers making their own
thinking processes visible to students so the latter can internalize them
and apply them to their own learning. Essentially, when these processes
are visible, students know what to do. Visible learning is an integral com-
ponent of formative assessments (or assessment for learning), of which
Hattie (2012) was a strong proponent. Like Black and Wiliam (1998),
Hattie underscored the importance of clearly articulated learning goals as
they provide teachers and students with something by which to measure
learning.

Assessment Practices in the Online Environment

Although experiences during the COVID-19 pandemic are leading to
emerging research on remote teaching in professional programs (Burns
et al., 2020; Danyluk et al., 2022; Van Nuland et al., 2020), there exists a
solid base of existing literature from different disciplines. Research has
identified specific formative and summative assessment practices deemed
effective in online learning environments, including self-paced learning,
student choice and agency, meaningful and engaging tasks, frequent
check-ins, open-ended questioning techniques (Greenhow & Galvin, 2020;
Mortimore et al., 2021), and reinvested feedback (Ambrose et al., 2010).
When learning online, students need to engage with the content,
fellow classmates, and professors in meaningful ways. Through syn-
chronous delivery, professors can make personalized connections with
students, support their learning, facilitate connections among students,
and encourage deep connections with the content (Beilstein et al.,
2020; Duncan & Barnett, 2009; Hodges et al., 2020; Quezada et al.,
2020). To maintain engagement in synchronous learning environments,
Mortimore et al. (2021) suggested check-in activities, random open-ended
questioning techniques, open cameras, polls, quizzes, and small group
breakout discussions.
More recently, Danyluk et al.’s (2022) edited collection synthesized
Canadian teacher educators’ responses to shifting BEd programs online.
In terms of assessments, suggestions included more frequent check-ins,
changes to assignments (flexibility), more opportunities for teamwork and
team problem-solving, increased class time for content exploration, and
limiting discussion boards. Bahula and Kay (2021) and Lauricella and Kay
(2021) suggested the use of video and audio feedback, peer and group
online feedback, and videoconferencing. Online presentations and mini-
teaches were also recommended (MacMath et al., 2022). The online
environment opens new possibilities and tools for conducting formative
assessments. As Lauricella (2021) explained, “A benefit of assessment in
online environments is that technology can often provide instant results
to both students and educators” (p. 249). Online quiz widgets that gen-
erate instant data are useful in quickly assessing progress and adjusting
teaching accordingly.

Research Questions

Because St. Thomas University’s BEd program was not designed to be
offered online, questions emerged about shifting classes to emergency
remote teaching.

1. How do formative assessment measures, already embedded into my
pedagogical practice, transfer to online teaching environments?
2. Can teacher candidates learn, implement, and demonstrate newly
acquired content knowledge, teaching skills, and related competen-
cies in meaningful ways online?
3. Can my formative and summative assessment practices and
assignments capture not only teacher candidates’ content knowledge,
but also the development of skills and competencies?
4. What insights can teacher candidates provide with respect to online
assessment practices and assignments?

Methodology

Case studies as a methodological approach account for the importance of
“the problem, the context, the issue, and the lessons learned” (Creswell,
1998, p. 36). Using an intrinsic case study approach, I set out to describe
and examine my online formative and summative assessment practices
during the first fully online academic year of the COVID-19 pandemic
(2020–2021). I chose this type of case study because my own teaching was
of primary interest to this exploration. By using a case study approach,
I was able to examine how my students (N = 40) and I experienced online
teaching, learning, and assessments within its real-life context.
To provide an in-depth description of this case, various data sets were
collected. At the end of term, teacher candidates filled out university-
mandated student opinion surveys (N = 94) for each of my three courses.
These surveys included two open-ended questions: (a) explain what you
liked the most about this course and why, and (b) explain what you would
change about this course to improve it and why. The answers to these
questions were part of the analysis of my pedagogical practices. For tri-
angulation purposes, I collected additional data sources, including my self-
reflection notes and other course materials (recorded lessons/lectures,
course syllabi, and instructional materials). I analyzed all data sets using
the qualitative coding technique of inductive analysis, typically used in
exploratory and descriptive research. I created initial codes, followed by a
more refined analysis to codify them into emerging themes and patterns.

Findings

Findings are described in three general themes. The first reports on the
analysis of my formative and summative assessment practices in the online
environment. The second theme includes subthemes that emerged from
students’ responses to the university survey’s open-ended questions on
assessment practices they found particularly impactful to their learning.
The final theme describes the transmission of content knowledge and
teaching skills online.

Theme 1: Online Assessment Practices

Formative Assessments Embedded in My Online Teaching Practices

In analyzing the data on my online assessment practices, I found evidence
of my frequent use of comprehension checks to monitor learning. For
example, I asked direct, open-ended questions about specific aspects of
the content being studied. Students in turn discussed these questions in
small groups or answered them individually. This format afforded oppor-
tunities for students to engage with the material, get support from their
peers, and reflect on the extent to which they understood new concepts.
To gauge the class’s overall learning trajectory, to elucidate different
responses, and to include more students, I used equity sticks (popsicle
sticks with individual students’ names pulled randomly for equitable par-
ticipation) and other collaborative techniques (such as think-pair-share,
numbered heads, and assignment of roles such as discussion leader or
detail provider) for students to report their understanding. Students were
aware that active engagement during these comprehension checks was
expected and that their contributions were valued and important learning
opportunities for everyone.
I also provided several opportunities throughout each lesson for
students to ask clarifying or extending questions. To support students
in generating these questions, I provided verbal cues triggering them to
reflect on and monitor their own learning:

We are about to discuss a new concept. Are you ready to move on?
At this point, would it be a good idea to ask any lingering questions?
What may still be unclear to you? What has not yet been answered
for you? What would you like me to discuss before we move on?

By offering up these cues, I made myself receptive to students’ emerging
questions, comments, and feedback.
Data indicate that I often used open-ended reflection questions during
lessons or at the end of class to gather additional information about
students’ progress and promote self-reflection. These reflection questions
provided opportunities for teacher candidates to assess where they were
at with their learning, and based on their responses, I could readjust the
pace or content of my next lessons. For example, I asked,

On a scale of 1 to 5, to what extent do you feel competent in
describing this concept to a colleague? Can you think of a way that
you could apply this instructional practice in a classroom? How con-
fident are you in your abilities to apply this concept? What do you
feel you now need to learn about this concept? Tell me what you
need from me at this point.

Teacher candidates were often grouped in small breakout rooms to
summarize parts of a lesson, teach others about newly learned concepts, or
undertake collaborative learning tasks. I took every opportunity to visit
each breakout room. By listening in, I was able to quickly gather infor-
mation about their learning, provide them with immediate feedback,
refocus them on the learning outcome, or extend their learning. When
I was in breakout rooms, I noticed that students asked questions that
were pertinent, specifically addressed their learning needs, and related to
their experiences.
Virtual breakout room discussions and small group tasks were almost
always accompanied by the creation of a learning artifact. Students
needed to represent their learning in a concrete and tangible way: a proof
of learning that I could see, review, and analyze once the learning had
taken place. Often, artifacts took the form of quick and informal products
capturing the essence of their learning. For example, students provided
a brief oral summary of their discussion, shared a divergent idea that
emerged, created an idea web related to a concept, or filled in a graphic
organizer. More elaborate artifacts included jigsaw activities, posters
reflecting their learning, presentations, or video responses. Learning
artifacts were designed as ways for students to take ownership of their
learning and be accountable to themselves and others.
Finally, I employed visible learning strategies such as explicitly indi-
cating to students when I was using formative assessment techniques,
drawing attention to them, and explaining their purpose for my instruc-
tional practice. I also created activities in which I taught a concept,
followed by asking students to assess the strengths of my lesson and
areas where the lesson could have been improved. These types of activ-
ities, along with the rich discussions that ensued, enabled me to model
reflective practices and notions of lifelong learning while also developing
important critical thinking skills in students. With such activities I made
my teaching processes visible to my students.

Examples of Summative Assessments Used Online

Given COVID-19 restrictions, developing lesson delivery skills needed to
be done online, and the types of assessments I created to measure these
skills are important to review because teacher candidate survey data
revealed much feedback about these assignments. Examples of the types
of summative assessments I used in my online courses are as follows:

1. Discovery box: Groups of four teacher candidates were asked to
curate a collection of primary and secondary sources related to a
learning outcome from the social studies curriculum and describe
their collection in a video presentation. They needed to include an
application component of how the artifacts could be used in the class-
room to teach related concepts and skills.
2. I know my teaching strategy: Each group of three teacher candidates
was assigned a specific instructional strategy. Once I had explained,
modelled, and guided the class in learning an instructional strategy,
assigned students would design lessons highlighting that instructional
approach. They then taught these lessons virtually to the rest of the
teacher candidates, who took on the roles of elementary students.
3. Becoming a reading teacher: Students (individually or in pairs) were
asked to apply newly acquired knowledge of content and skills related
to literacy instruction by designing lessons and recording themselves
delivering these lessons (some pretended to have students in front of
them; others enlisted the help of their own children). Other teacher
candidates and I then watched these recorded lessons to provide
teacher and peer feedback.
4. I am a reflective practitioner: In the first few weeks of the course,
pairs of teacher candidates created two literacy lessons to establish
a baseline of their pedagogical knowledge. As they gained deeper
understandings of literacy instruction throughout the term, they
reviewed their initial lessons and engaged in a critical reflection exer-
cise, analyzing their lessons to identify strengths and changes to make.
At the end of the term, students were asked to reflect on the overall
value of these periodic reflection exercises. They took stock of their
growth, their biggest takeaways, the value of reflective practices, and
the importance of being a lifelong learner.
5. Inclusive teacher showcase: Students were grouped in triads at the
beginning of the term, and every few weeks, the team would meet
during class to share their perspectives on recently covered topics. To
facilitate discussions, teacher candidates documented their group’s
learning and ideas for classroom applications by engaging in collab-
orative notetaking activities. At the end of term, they transformed
their learning logs into culminating projects highlighting their
learning journey throughout the course.

For all assignments, students were provided specific success criteria and/
or assignment exemplars. Additionally, time in class was set aside for
teacher candidates to work on parts of their assignments to ensure they
felt supported. As they worked together in their small breakout rooms,
I moved from room to room and initiated formal and informal mini-
conferences, inviting group members to talk about their work. I provided
a fair amount of targeted oral feedback in terms of strengths and things
they may want to consider moving forward.

Theme 2: Impactful Assessment Practices

Teacher Candidates’ Perceptions of Online Formative Assessment Practices

Although specific questions about formative assessment practices were
not asked on the student opinion surveys, some students mentioned them
as a key element of the course. Specifically, the frequent use of compre-
hension checks was mentioned: “I liked how she consistently did compre-
hension checks,” and “She did an excellent job at ensuring we felt okay
with the material before proceeding. . . . at making sure she gave us time
to grasp concepts and further explore them with our classmates.” A few
students mentioned the use of clear learning expectations and goals,
stating that “learning objectives were discussed regularly.” In addition,
teacher candidates described my use of visible learning; that is, when
I was making my teaching and learning processes visible to my students
so they could internalize and take ownership of them. Two comments
about the use of visible learning stood out: “Having her explain the
strategy, teach us using the strategy, then having us plan for and teach
using this strategy is such a wonderful way to get us to learn and be com-
fortable using these strategies in the future,” and “She gave us the oppor-
tunity to see and practice the teaching strategies instead of just telling us
how we should do it.”

Elements of Summative Assessments of Importance to Teacher Candidates

Students discussed specific elements of my summative assessments they
particularly enjoyed or found beneficial, including clear connections
between concepts taught and assignments (“I liked how each assignment
was clearly linked to what we had been learning”), embedded reflective
practices, peer collaboration, immediate in-class personalized feedback,
choice of assignment, and authenticity of assignments.
In terms of offering opportunities to engage in reflective practices,
the following comments highlight how teacher candidates viewed this
strategy: “I really liked how after every assignment there was a reinvest-
ment period. . . . It allowed me to learn and improve my lesson plans
and teaching methods,” and “We had opportunities to review and reflect
on our assignments to see how we would change them, now that we
are more informed.” These reflection opportunities allowed them to see
their growth as new teachers: “The assignments were fair in that they
evaluated and focused on what we were capable of doing at the time,”
and “Looking back and reflecting on our first assignment really showed
us how much we learned over the last two terms (. . . A LOT).”
Time in class to work on assignments was mentioned often, as was the
collaborative nature of assignments: “I really enjoyed having the oppor-
tunity to work with others,” and “I liked the opportunity to collaborate
with colleagues and work with different groups.” For some, it exposed
them to new ideas and deeper learning. As one student mentioned,
“The assignments really opened up my perspective.” Teacher candidates
appreciated the time to work on assignments with peers in class, in
part because of the feedback I provided during in-class teacher-student
conferences and my interactions in their small breakout rooms: “The prof
was helpful in providing comments and feedback when we were doing
our work.” The following two teacher candidates’ comments illustrate
their thoughts on receiving this type of ongoing and immediate feedback
when working on assignments: “She provides feedback and gets you to
think for yourself while being there to support you through the process,”
and “She provided me with lots of feedback and helpful comments to be
able to complete the tasks.”
Finally, a few students mentioned the importance of choice and the
ability to submit assignments using different modalities (e.g., video
presentations, written work, or mini-teaches). As one explained, “I liked
that there was a lot of choice in this course regarding what final product
would be created to showcase learning.” Students’ comments on the
summative assessment elements they appreciated and found benefi-
cial provide insights into how online instructors may want to construct
their evaluations, including clear connections between concepts taught
and assignments, the addition of reflective practices, peer collabor-
ation, and choice of assignment. The importance of teacher-student
conferences to discuss drafts prior to final submission also cannot be
neglected.
Students seemed to value the authentic nature of summative
assignments. The relevance of assignments and how they connected to
classroom practice was a major theme in the data. Authentic assessments
refer to “tasks students will encounter in their future professional practice”
(Koh, 2017, Introduction, para. 3). Many students spoke about finding the
assignments purposeful, useful, and beneficial to their learning: “I liked
how valuable the assignments were . . . everything we did was purposeful,
meaningful, and could be reinvested,” and “I liked how everything we
did felt relevant and useful.” Students believed that assignments were not
only connected to classroom practice, but that they would be useful as
they transitioned to classroom teaching. As students explained, “Every
assignment was relevant to our future teaching careers,” “I always knew
how the assignments connected to my improvement as a teacher,” and
“We were encouraged to create practical assignments that could be used
in our future classrooms.”
Notably, although my summative assignments were designed to assess
the extent of their learning, students spoke about how they were, in and of
themselves, additional learning opportunities. As one student expressed,
“Assignments made sense and contributed greatly to our learning. They
consolidated what we learned in each module as well as gave us the
opportunity to draw on previous knowledge from prior modules.” Other
students echoed similar sentiments: “I enjoyed how we learned through
doing assignments,” and “Assessments were effective for my learning.”
Assessments for learning in these cases became assessments as learning.

Theme 3: Transmission of Content Knowledge, Teaching Skills, and Competencies

Teacher candidates perceived their online experience as one in which they
learned to be an educator, both in terms of content knowledge and in
developing teaching skills and other related competencies. As they noted,
“She has equipped us with the tools we need to be excellent teachers and
teach passionately,” “I can speak for everyone when saying how much
we’ve learned,” “The course filled a large gap in my knowledge,” and
“I feel more prepared going into my practicum and like I can actually help
students.”
Even online, students spoke of how their teaching abilities grew
“immensely,” how much more confident and prepared they felt going
into a classroom, how more “equipped” they felt, and how they not only
learned information, but also “how to teach.” Students wrote comments
such as, “I now understand WAY more about teaching,” and “I feel like
I know how to effectively teach this course now and have the tools to be
successful.” Their perception of how much they had learned and grown
as educators was reflected in the work they submitted. Through their
assignments, I was able to evaluate the extent of their content knowledge,
but also their teaching abilities and competencies as future teachers. Skill
transmission did occur in my online classes; their assignments clearly
reflected this outcome.

Discussion and Lessons Learned

I started my online teaching experience most concerned about how
I would apply my pedagogical practices in the virtual learning
environment. Would I be able to transfer pedagogical and assessment practices
deemed most impactful to students’ learning (Hattie, 2008), and would
my students respond well to them? To explore this unique case and
context, I used an introspective case study approach to get an in-depth
understanding of my online formative and summative assessment
practices. The results of this study, although not generalizable, revealed
that my formative assessment practices were easily transferable to
online teaching. Both my students and I were able to engage in formative
assessment practices in ways that seem to have supported students’
learning. Formative data collected in the online environment enabled me
to gather information about students’ growth and adjust my teaching as
needed (Black & Wiliam, 1998). This cohort of teacher candidates were
completing their entire BEd online. From my experience, I can say that
their learning mirrored the learning observed in other years with other
cohorts.
Given that I transferred my pedagogical practices to the online
environment rather than re-envisioning them completely, the ways in
which I conducted my formative assessments remained, for the most
part, unchanged. Through Zoom, I was able to use and model for-
mative assessment practices in real-time, and teacher candidates were
able to experience their application in an actual (virtual) learning
context. I believe that frequent comprehension checks, the use of
learning artifacts to document learning and monitor progress, and
the use of effective feedback strategies were easily shifted to the vir-
tual learning platform. In analyzing my online teaching data, I found
that I employed several formative assessment techniques and strat-
egies that had been present in my in-person teaching. These findings
connect to Brunner’s (2006) assertions that technology is but a tool for
delivering content, and that the teacher remains the central element
of pedagogy. Even in the virtual learning space, I was able to use for-
mative assessment to make explicit links between teaching, learning,
and evaluation.
Upon close reflection, I conclude that formative assessments played
an even greater role in my online teaching than in my in-person classes.
I think this is the result of being more cognizant of students' possible feelings of isolation and perceptions of being disconnected from
their peers and professors. Because their whole BEd program was virtual,
I did not want them to believe that they had received a lesser education or
that they had missed out by doing their education degree online. I wanted
the students to feel as supported as they would have been in an in-person
environment. I wanted them to become the best teachers they could be,
and I deemed that my role as a mentor was increasingly important for this
special COVID-19 teacher candidate cohort. The frequent check-ins and
other formative assessment measures were initially designed to support
their learning, but in retrospect, I see that I used them to monitor their overall well-being in the program as well.
Assess Knowledge, Skills, Competencies of Teacher Candidates 105

The nature of my online summative assignments stayed relatively
the same in terms of what they were intended to capture. Synchronous
teaching allowed me to keep intact the collaborative and supportive
nature of my summative assessment. What did change, however, were
the formats by which students could present their work and demonstrate
their learning. I offered more interactive and video options (e.g., recordings
of themselves delivering lessons, recordings of reflection discussions,
prerecorded video presentations, vlogs, role-playing scenarios). These
different assignments formats enabled me to put teaching skills and
related competencies at the forefront, alongside content knowledge,
when assessing their growth as teachers. The approach enabled me to capture their competencies more holistically. Such online assessments
provided me with the data needed to properly assess their growth and
development as teachers. I also think these types of assignments were
more meaningful for students. Moving back to in-person teaching, I plan
to continue the use of these assignment formats.

Conclusion

This contextualized study of my online assessment practices revealed
that I was able to uphold and model effective assessment practices in a
time of emergency remote teaching. The practices I traditionally used
in face-to-face teaching contexts shifted easily online, in part because
certain conditions were present. Namely, consistent student attendance
was expected, as was active engagement and participation. St. Thomas
University’s BEd program upholds a strict attendance policy, and students
are told early on about the role of accountability to themselves, to their
classmates, and to their future profession. This accountability is built
into the professional program and did not change when we pivoted to
remote teaching. Students recognized that class time was designed not
only for the dissemination and absorption of content knowledge, but
also for important opportunities to develop competencies and skills
related to teaching. These factors certainly played a role in how students
experienced and responded to their year of online learning.
Additionally, I believe that my own approach to online teaching
contributed to the positive findings highlighted in this study. I was
committed not to let the online environment hinder my pedagogical
practices, nor to let it impede students’ abilities to grow as teachers.
I recognized that the students would compare their online learning
experiences with face-to-face learning. Perhaps due to this realization,
I focused even more on modelling and using formative assessments to
support their learning than I had in person. I hope that my efforts had an
impact on how positive and successful the online experience was for me
and for my students.

References

Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How
learning works: Seven research-based principles for smart teaching. Jossey-Bass.
Bachman, L. (2004). Statistical analysis for language assessment. Cambridge
University Press.
Bahula, T., & Kay, R. (2021). Video feedback in online learning. In R. Kay &
H. Williams (Eds.), Thriving online: A guide for busy educators (pp. 228–235).
Ontario Tech University.
Beilstein, S. O., Henricks, G. M., Jay, V., Perry, M., Bates, M. S., Moran, C., &
Cimpian, J. R. (2020). Teacher voices from an online elementary mathematics
community: Examining perceptions of professional learning. Journal of
Mathematics Teacher Education, 24, 283–308. https://doi.org/10.1007/s10857-
020-09459-z
Black, P. (2009). Formative assessment issues across the curriculum: The theory
and the practice. TESOL Quarterly, 43(3), 519–524. www.jstor.org/stable/
27785033
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment
in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/
10.1080/0969595980050102
Bonk, C. J., & Zhang, K. (2006). Introducing the R2D2 model: Online learning
for the diverse learners of this world. Distance Education, 27(2), 249–264.
https://doi.org/10.1080/01587910600789670
Brunner, D. (2006). The potential of the hybrid course vis-à-vis online and trad-
itional courses. Teaching Theology and Religion, 9(4), 229–235. https://doi.
org/10.1111/j.1467-9647.2006.00288.x
Burns, A., Danyluk, P., Kapoyannis, T., & Kendrick, A. (2020). Leading the pan-
demic practicum: One teacher education response to the COVID-19 crisis.
International Journal of E-Learning & Distance Education, 35(2), 1–25. www.ijede.ca/
index.php/jde/article/view/1173/1836
Castro, M. D. B., & Tumibay, G. M. (2021). A literature review: Efficacy of online
learning courses for higher education institution using meta-analysis. Education
and Information Technologies, 26, 1367–1385. https://doi.org/10.1007/s10639-019-10027-z
Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five
traditions. Sage Publications.
Danyluk, P., Burns, A., Hill, L. S., & Crawford, K. (Eds.). (2022). Crisis and oppor-
tunity: How Canadian bachelor of education programs responded to the pandemic.
Canadian Association for Teacher Education. https://doi.org/10.11575/
PRISM/39534
Duncan, H. E., & Barnett, J. (2009). Learning to teach online: What works for
pre-service teachers. Journal of Educational Computing Research, 40(3), 357–376.
https://doi.org/10.2190/EC.40.3.f
Feldman, J. (2019). Grading for equity: What it is, why it matters, and how it can trans-
form schools and classrooms. Corwin.
Greenhow, C., & Galvin, S. (2020). Teaching with social media: Evidence-based
strategies for making remote higher education less remote. Information and
Learning Sciences, 121(7/8), 513–524. https://doi.org/10.1108/ILS-04-2020-0138
Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. Routledge.
Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning.
Routledge.
Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020, March 27).
The difference between emergency remote teaching and online learning. Educause
Review. https://er.educause.edu/articles/2020/3/the-difference-between-
emergency-remote-teaching-and-online-learning
Koh, K. H. (2017, February 27). Authentic assessment. In G. Noblit (Ed.),
Oxford research encyclopedia of education. Oxford University Press. https://doi.
org/10.1093/acrefore/9780190264093.013.22
Lauricella, S. (2021). Equitable assessment in online environments. In R. Kay &
H. Williams (Eds.), Thriving online: A guide for busy educators (pp. 248–255).
Ontario Tech University.
Lauricella, S., & Kay, R. (2021). Fair and formative feedback in online learning.
In R. Kay & H. Williams (Eds.), Thriving online: A guide for busy educators
(pp. 236–247). Ontario Tech University.
MacMath, S., Sivia, A., Robertson, J., Salingré, B., Compeau, H., & Britton, V.
(2022). From disruption to innovation: Reimagining teacher education during
a pandemic. In P. Danyluk, A. Burns, S. Hill, & K. Crawford (Eds.), Crisis and
opportunity: How Canadian bachelor of education programs responded to the pan-
demic (pp. 28–41). Canadian Association for Teacher Education.
Mortimore, A., Hoffert, M., Kokas, M. S., Air, E. L., Yeldo, N., Lanfranco, O.
A., & Passalacqua, K. (2021). Synchronous learning for synchronous teaching:
Lessons learned from creating an online seminar to help physician educators
develop best practices for synchronous online instruction [Version 1].
MedEdPublish, 10(85). https://doi.org/10.15694/mep.2021.000085.1
Quezada, R. L., Talbot, C., & Quezada-Parker, K. B. (2020). From bricks and
mortar to remote teaching: A teacher education program’s response to
COVID-19. Journal of Education for Teaching, 46(4), 472–483. https://doi.org/
10.1080/02607476.2020.1801330
Scriven, M. (1967). The methodology of evaluation. In R. W. Tyler, R. M.
Gagne, & M. Scriven (Eds.), Perspectives of curriculum evaluation (Vol. 1,
pp. 39–83). Rand McNally.
St. Thomas University. (n.d.). Program information. https://stu.ca/education/
program-information/
Van Nuland, S., Mandzuk, D., Tucker Petrick, K., & Cooper, T. (2020). COVID-19
and its effects on teacher education in Ontario: A complex adaptive systems
perspective. Journal of Education for Teaching, 46(4), 442–451. https://doi.org/
10.1080/02607476.2020.1803050
Part II
Reconceptualizing Assessment Frameworks for Preservice Teachers
6 Decolonizing Assessment Practices in Teacher Education
Joshua Hill, Christy Thomas, and Allison Robb-Hagg

The ongoing news of discoveries of mass graves at the former sites of
Indian Residential Schools offers a tragic reminder that the work of
advancing the Truth and Reconciliation Commission of Canada’s (2015)
Calls to Action is urgently needed and tremendously important. Policies
in K–12 and postsecondary education have created space for this work
(Alberta Education, 2017; Association of Canadian Deans of Education,
2010; Council of Ministers of Education, 2019), and teacher educators
have a role to play to prepare a new generation of teachers who will con-
tribute to reconciliation (Donald, 2014). Yet, structures and pedagogies in
Canadian universities are dominated by Western epistemologies and con-
tinue to perpetuate colonial legacies of marginalization and oppression
(Louie et al., 2017).
In this chapter, our team of Indigenous and non-Indigenous instructors
chronicle some of our attempts to disrupt this hegemony. Through a col-
laborative action research process, we have made shifts to our assessment
practice, drawing on the decolonizing principles of storytelling, negoti-
ation, and democratization. We do not purport to have decolonized our
practice; rather, our goal has been to take steps towards decolonization
and creating an “ethical space” (Ermine, 2007, p. 193) to engage with
Indigenous knowledges and peoples in our course. We put forward this
work humbly. Our intent is to make visible our learning journey. We hope
that by sharing the challenges we have encountered and the pedagogical

DOI: 10.4324/9781003347972-9
decisions we have made, we might open up space for others to reflect.
We begin by describing our context, then we reflect on the axiological
considerations that framed this work. We outline our collaborative action
research methodology and share how Louie et al. (2017) inspired our
work. We then present two primary ways we have sought to decolonize
our assessment practice: (a) learning task design and (b) implementing
formative and outcomes-based summative assessment. For each prac-
tice we describe how we implemented the practice in an online learning
environment, reflect on the pedagogical decisions we made, and discuss
the questions, tensions, and next steps raised. We conclude by reflecting
on our experience and visioning ahead to our ongoing journey.

Context

As authors of this chapter, we chronicle our experiences working as a
team teaching three sections of the same introductory curriculum
design and program development course. The course is situated in the
first semester of the first year of an elementary-focused Bachelor of
Education after-degree program. Due to the COVID-19 pandemic, in
the Fall 2020 semester we taught this course fully online. It is important
to position ourselves in this work, acknowledging that our positionality
“peels back the layers, uncovering motives, intentions, and subjectivities
while identifying particular epistemic perspectives” (Styres, 2019, p. 39)
that informed this chapter. Our team is made up of Indigenous and non-
Indigenous teachers. Josh is a member of the Métis Nation of Alberta,
and his ancestors trace back to Métis, Eastern European, and British
communities. Christy’s ancestors settled in Canada from Britain and
France. Allison is a descendant of settlers from Scotland and Sweden. We
acknowledge that the university where we teach is located in the trad-
itional territories of the Blackfoot Confederacy, the Tsuut’ina Nation, and
the Stoney Nakoda Nations. It is through our relationships with Elders
from these Indigenous communities in the lands of Treaty 7 that we root
our understanding of Indigenization.

Purpose

Battiste (2013) noted that decolonization is, at the core, about asserting
the presence and humanity of Indigenous peoples. With this idea in mind,
the decolonization of our practice must ultimately lead to including
Indigenous knowledges and peoples in meaningful ways. However, Louie
et al. (2017) cautioned that “the tendency of most university instructors
to employ methods of instruction firmly situated within the epistemo-
logical structure of the dominant culture exacerbates . . . institutional
marginalization” (p. 18). We recognized that if we attempted to take up
Indigenous knowledges through teaching practices rooted in Eurocentric
ways of coming to know, the result could be an incorporating or assimila-
tive effect. Our goal was to create an ethical space from which to include
Indigenous perspectives in our teaching.
We embedded Indigenous voices throughout the course and
positioned them not only to help us learn about Indigenous knowledges
but also to contribute to how we were taking up many of the areas
of our course content. That is, we wanted our students to have the
opportunity to learn not only about Indigenous perspectives but also
from Indigenous perspectives. We hoped to model the inherent value
Indigenous knowledges offer. This practice links to an Indigenizing
practice Louie et al. (2017) presented as celebrating survival and cre-
ating survivance. They spoke to the increasing recognition of the value
of Indigenous thinking and advocated for the inclusion of creative
works from Indigenous peoples in the classroom (Louie et al., 2017).
We drew on a range of ways to include the voices of Indigenous people in our course: having an Indigenous knowledge keeper join our synchronous Zoom (https://zoom.us) class, and interacting with videos, websites, book chapters, journal articles, children's books, and web resources created by Indigenous peoples, included as readings in our weekly schedule alongside texts created by non-Indigenous peoples.
Building from this modelling, we designed a task that invited our
students to develop lesson plans that drew on resources created by
Indigenous people. In one such task, students were asked to plan a
lesson focused on a picture book authored and illustrated by Indigenous
people. Students planned for how they would introduce the author,
Indigenous community, and topic, and how they would scaffold students
to respond to the text. We identified the importance of building on
this step and helping students plan (and enact their plans) to include
Indigenous perspectives across the curriculum at both the lesson
and unit plan levels. With the purpose of our decolonization efforts
presented, we next discuss the axiological considerations that were cen-
tral to our journey.
Axiological Considerations: Seeking to Do This Work in a Good Way

Although we were driven by strong moral and professional mandates, we
wrestled with how to include Indigenous perspectives in our teaching in
ethical ways. The settler members of our team, Christy and Allison, felt
hesitation because of their lack of experience with Indigenous peoples
and knowledges. Donald (2014) reported that it is common for educators
to feel unprepared to include Indigenous perspectives in their classrooms
and attributed this tendency in part to generations of marginalization
of Indigenous peoples in school and society. In addition to this lack of
knowledge, our team was aware of the need to avoid cultural appropri-
ation, misrepresentation, and stereotypical representations of Indigenous
peoples (Battiste & Youngblood, 2000).
We also recognized that the postsecondary educational context in
which we teach features a dominance of Eurocentric knowledge and a his-
tory of contributing to oppression that is embedded in the very structures
and pedagogies that we work within (Louie et al., 2017). If we did not
find ethical ways to include Indigenous perspectives in our teacher edu-
cation program, we could not expect our student teachers to do so in
their future classrooms. We asked ourselves how choosing inaction, far
from being a neutral choice, would exclude by omission and maintain the
legacy of assimilative and colonial processes (Battiste, 2013; Smith, 2012).
We, likewise, reflected on whether we were the right people to publish
research on this topic. We concluded that because the work of decolon-
ization calls on all university instructors to get involved (with all levels
of experience with Indigenous perspectives), our story and collaboration
might resonate (Association of Canadian Deans of Education, 2010). It is
not enough for there to be a need for this work to be done; rather, it is our
responsibility to put in great effort to create an ethical space to take it up
(McKegney, 2008; Zinga, 2019).
Rather than seeking to resolve the tensions that exist in this work, we
have sought a way forward in ongoing reflection on and attentiveness
to axiological consideration (Styres, 2019). Drawing from Styres et al.
(2019), we recognized that decolonizing our practice is a journey that
involves ongoing efforts to examine and shift our own assumptions and
make changes in practice. Attending to our roles and responsibilities in
relationship to “kinship and community ties, to our places, to one another
cross-culturally” (Styres et al., 2019, p. 23) is critical in maintaining the
reflexivity required in this work. We need to remain attentive to how our
actions are perceived, learn from our missteps, and do better as we come
to know better. To this end, we constantly asked of ourselves, Elders,
and our students the essential question: How are we doing this work in a
good way? Reflecting on ethical considerations was a key component of
our action research team meetings.
We also sought out wisdom and guidance from Elders to help create an
ethical space. Josh was able to meet on the land and talk on the telephone
with Elders during the semester to support the ethical decision-making
process. Josh has found that fostering relationships with Indigenous
Elders, seeking out opportunities to learn from them, and drawing
on their wisdom in making decisions is an important part of seeking
to do this work in a good way. Christy and Allison have been building
connections within Indigenous communities. As a faculty, we have had
several professional learning days of experiences with Indigenous Elders,
learning from them. We will continue to build on these relationships and
to find ways for our learning community to give back and contribute to
local Indigenous communities.
Further to seeking to create an ethical space, we welcomed our
students into conversations about how we were attending to and wrest-
ling with axiological considerations (Crawford et al., 2022). This approach
made us feel vulnerable, and we often spoke in our team meetings about
our feelings of vulnerability. We found that students responded to this
vulnerability by being vulnerable themselves. In modelling reflexivity and humility, along with explicitly detailing the axiological decisions we
were making, many of our students responded in kind. We describe in an
upcoming section how we used talking circles to create a space for this
practice. Modelling and explicitly reflecting on our decolonizing peda-
gogical decisions helped move our learning community past silence and
inaction (rooted in fear of saying or doing the wrong thing) and into a
space of careful, reflective steps.

Theoretical Framework

Louie et al. (2017) provided a theoretical framework to support one's
ongoing efforts to decolonize one's practice, and in this chapter, we
have drawn on their work to help us present our experience. Louie et al.
featured a case study of four Indigenous scholars who sought to decolonize
their practice, drawing on 25 decolonizing principles that Smith (2012)
outlined. Louie et al. rooted their article in the same Treaty 7 territory as
our context, and we hold a relational connection as Josh and Christy have
worked with and received mentorship from coauthors Louie and Hanson.
The latter have called for a “reformation of pedagogy in higher education
through the use and valuing of Indigenous epistemologies and ways of
knowing” (Louie et al., 2017, p. 21) and shared stories from their teaching
practices at the University of Calgary. They have contended that in order
to disrupt the dominance of Western epistemologies and corresponding
marginalization of Indigenous perspectives, Indigenization needs to go
beyond teaching about Indigenous content. One of the explicit aims of
their article is to offer Indigenous and non-Indigenous instructors a place
to start in Indigenizing pedagogy in higher education. We drew on Louie
et al. in our action research process to support our weekly dialogue, to
help us understand the problems emerging in our practice, and to guide
our pedagogical decision-making. We have also drawn on Louie et al.
throughout this paper to present and contextualize our experience. We
are grateful for their inspiration and support and hope that by sharing
how we have drawn on their ideas, we honour their work.

Methodology

We used a collaborative action research approach as a framework to collaborate as a teaching team towards iteratively improving our teaching
practice. Collaborative action research positions practitioners to inquire
into problems of practice involving ongoing reflective dialogue (Mills,
2013) and iterative problem-solving (Parsons et al., 2013). An important
aspect of action research is examining current practice (Mertler, 2022),
and as instructors, each week we were examining our practices and
reflecting on our next steps (Mills, 2013). The iterative process allowed us
to analyze our assumptions and to grow professionally (Hendricks, 2017).
Although collaborative action research is not an Indigenous methodology, we position it as being "in alliance with" (Kovach, 2009, p. 152) an
Indigenist research approach. Wilson and Hughes (2019) offered that
Indigenist research involves “working from a relational understanding of
reality [and] . . . engaging respectfully with Indigenous Knowledge and
Indigenous Peoples” (p. 8). We believe that collaborative action research
has the potential to advance the tenets of decolonization within educa-
tional research because of its relational and democratic nature and its
focus on amelioration in a local context (Mills, 2013). Furthermore, we
identified that a collaborative action research methodology provided a
framework for equal partnership between Indigenous and non-Indigenous
researchers, a necessary design feature of decolonizing methodologies
(Kovach, 2009).
To operationalize collaborative action research, we employed a cyc-
lical process that involved engaging in reflective dialogue on our teaching,
identifying emerging problems of practice, drawing on educational
research to consider possible solutions, creating instructional strategies,
testing those strategies in our teaching practice, collecting and analyzing
evidence of student learning, and beginning the cycle anew by engaging
in reflective dialogue on our teaching. During our weekly meetings, we
identified Louie et al. (2017) as a key resource to help us understand our
problems of practice and guide us as we made pedagogical decisions
and examined our current practices. Our data collection consisted of
documenting process notes and researcher reflections throughout the
action research cycle. At the end of the course, we collaboratively analyzed
and reflected on our experience by revisiting the data and engaging in
dialogue as a team. What emerged from this analysis and reflective dia-
logue were themes related to task design, reframing assessment, and the
associated challenges with decolonizing assessment practices. This ana-
lysis provided a foundation for the writing of this chapter. Rather than
reporting findings, in this paper we seek to represent our experience and
share the tensions and insights that emerged through our collaborative
action research process with the aim of opening up an opportunity for
readers to reflect on their own decolonization journeys.

Decolonizing Assessment

Decolonizing Assessment through Task Design

Task design emerged from our reflective dialogue as a key area to change
in our current practices to support our aim of decolonization. Commonly
assigned learning tasks in the postsecondary context, such as tests that
ask students to remember and retell information or essays that prompt
students to write from a third person or depersonalized perspective,
emanate from the Eurocentric views of knowledge as fixed, discoverable,
and external from the knower (Davis et al., 2015). In contrast, Wilson and
Hughes (2019) stated that an Indigenist approach is founded on a relational
and emergent understanding of reality and knowledge. To better align
with Indigenous epistemology, we designed a talking circle learning task
drawing on the decolonizing principle of storytelling (Louie et al., 2017).
We were inspired in the design of our talking circle learning task by
teachings Josh has received from Blackfoot Elder Randy Bottle (personal
communication, January–June 2020). In talking circles facilitated by
Elder Bottle, participants are invited into a circle of trust where, in turn,
everyone around the circle is given the opportunity to speak and have their
perspective heard. In some Indigenous communities, traditional sharing
circles hold a sacred and ceremonial significance and require knowledge,
protocols, and blessings from an Elder from that community. The talking
circle learning task we designed was inspired by this Indigenous way of
being and included teachings from Elder Bottle. We made an explicit dis-
tinction with our students about how the circles we engaged in were dis-
tinct from traditional sharing circles. Hanson and Danyluk (2022) offered
an in-depth exploration of the ethical and pedagogical considerations in
drawing on talking circles as a pedagogical approach. Their informative
article was published following the teaching journey we describe here;
however, we intend to use it to inform our future practice and encourage
those interested to pursue it as well. In creating a talking circle learning
task, we hoped to create a learning environment that would nurture dia-
logue and would support students to make sense of and reflect on their
own and their peers’ perspectives.
We facilitated the talking circle in an online course, designing it both for synchronous small group engagement using breakout rooms and for students to record video reflections using a web-based platform called Flip (https://info.flip.com/). Each week, we introduced a topic through preassigned readings, videos, and podcasts, and designed for students to meet in small groups to share and listen to one another's perspectives
on the topic. Following this exchange, students were assigned to post a
video reflection to synthesize their reflections on the topic. We also asked
students to watch peers’ videos and respond in a video format to build
on those ideas. This strategy supported collective knowledge building
and fostered instructor and peer interactions. As instructors, we provided
video feedback to each student. In our responses we sought to reflect
with students and engage in dialogue with them. To assess this task, self-
assessment, peer assessment, instructor observations, and the video posts
provided evidence to inform grading. The assessment criteria for this task
focused on self-awareness, reflection, and the demonstration of how new
perspectives contribute to an evolution of thinking.
Louie et al. (2017) suggested that "stories can create an environment in which learning emerges from individuals' meaningful
experiences and multiple ways of knowing are honoured” (p. 28).
Reflecting on this pedagogical approach as we experienced in our talking
circle learning task, we identified that storytelling supported students to
position themselves in relation to their learning and reflect upon and
share their assumptions and biases. Along with this strength emerged
a challenge in navigating how students responded to one another’s
reflections. We found that a tension existed in that as we encouraged
open, honest, and personal reflection, the reflections of some students
were perceived as offensive to some other students. Through this process
we identified a need to support students to build trusting relationships
to create the conditions for the talking circle to be a safe and welcoming
place. We led our students in large group conversations about how to
frame reflections and choose vocabulary from a place of awareness
about others’ perspectives. We also established a way for students in
breakout rooms to ask for help when they identified a situation where
tensions existed and instructor presence and coaching would be helpful.
Overall, we found that these strategies to assist students in group work
helped us to build relationships with them and in turn helped them to
build relationships with each other. This finding is supported by Louie
et al., who suggested that the instructor should “respond respectfully
and critically, working relationally to identify discourses and experiences
of power, privilege, and marginalization” (2017, p. 27).
These experiences are leading us to explore other ways to create space for reflective dialogue and additional strategies for how we might
support our students in building relationships and learning in groups.
This work has also inspired us to explore how we might draw on peda-
gogical approaches that promote compassionate and culturally respon-
sive learning experiences for our students. We are seeking ways to support
our students in developing learning designs that affirm diversity and strive
for a more equitable curriculum to support antiracist pedagogies.

Decolonizing Assessment through Formative and Summative Assessment

Instructor assessment practices can be conceptualized as including both
formative assessment and summative assessment (Dixson & Worrell,
2016). Summative assessment is typically used at the end of the learning
120 Joshua Hill, Christy Thomas, and Allison Robb-Hagg

sequence, whereas formative assessment is used throughout the learning
sequence to inform instructional decisions and yield feedback to the
student on their next steps (Wiliam & Leahy, 2015). Drawing on the
decolonizing principle of negotiation, we endeavoured to democratize
the classroom and reframe assessment and grading through formative
assessment and outcomes-based summative assessment.
As an instructor team we provided extensive formative feedback in the
form of video responses, real-time feedback in Zoom, asynchronous feed-
back in Google Docs (https://docs.google.com), and scaffolding for self-
and peer assessment. We modelled how to use evidence and criteria in
giving feedback, how to give productive feedback to promote learning,
and how to give feedback from a place of care and humility. We designed
opportunities for students to receive feedback from peers. We scaffolded
towards shared understanding of criteria and deeper reflection by designing
feedback protocols. Formative feedback design was used not only to drive
student learning but also to improve instruction. We also designed for
students to provide feedback to instructors in a variety of ways: Google
Forms (https://forms.google.com) exit passes, thumbs-up check-ins for
understanding in Zoom, and one-on-one meetings for individualized
support and coaching. We encouraged students to provide us with for-
mative feedback to be able to adapt our teaching to respond to individual
and communal student needs. We created anonymous surveys in Google
Forms and designed opportunities during class sessions for students to
provide feedback. We attempted to model how to receive feedback by
explicitly sharing how we were adapting instruction in response to feed-
back. Over the course, students became more adept and comfortable with
giving feedback, receiving feedback, and using feedback to improve their
work. Students reflected on the value of engaging in formative feedback
and demonstrated growth in designing feedback into their lesson plans.
Self-assessment was another formative assessment strategy we
implemented in seeking to shift the hierarchical culture of our prac-
tice. We designed for students to curate evidence of their best work and
identify strengths and areas for improvement using criteria. Instructors
used student self-assessment as a form of further evidence of learning.
Furthermore, we replied to student self-assessments with the aim of
negotiating a shared understanding of their work in relation to the criteria
and to provide them with specific feedback to stimulate their learning.
Both instructor feedback and self-assessment were supported by
outcomes-based assessment. We developed criteria and linked each
assignment to the course outcomes. We gathered assessment evidence
Decolonizing Assessment Practices in Teacher Education 121

of these course outcomes, and to determine final grades, we reviewed a
range of evidence to inform our professional judgement relative to the
course outcomes and associated criteria. This practice allowed students
to build an understanding of the criteria throughout the course, apply
action feedback from one assignment into future assignments, and grow
throughout the term towards mastery. In determining final grades, we
took recency into account and looked at all evidence to determine what
the body of evidence showed about what students knew and were able to
do. This meant that if a student had struggled with meeting an outcome
at the beginning of the term, but through the teaching and learning pro-
cess (driven by formative assessment) had grown in their understanding
and consistently demonstrated mastery in the final stages of the course,
they would receive a grade that reflected this mastery. We posit that
outcomes-based summative assessment created the space for students to
take risks and provided the opportunity for students to integrate feed-
back to improve learning. Outcomes-based summative assessment also
provided a common and consistent criterion to support self- and peer
assessment.
It is important to state that we are not positioning formative assessment
and outcomes-based summative assessment as Indigenous pedagogical
practices. They are rooted in a Eurocentric tradition tied to the progressive
education movement of the 20th century, underpinned by constructivist
learning theories and a conceptualization of knowing as subjective and
existing within webs of relationship (Davis et al., 2015). Although not
Indigenous, we contend that this frame shares kinship with the decolonizing
principle of negotiation, is less oppressive, and has the potential to be in
alliance with an Indigenous approach (Kovach, 2009, p. 152).

Challenges

We encountered some challenges in making these shifts in our assessment
practice. First, the university grading policy represented a constraint we
needed to work within. In collaboration with our dean, we adapted the
course outline to include rubrics for our learning tasks, a description of
outcomes-based assessment, and a reworked grading scale. Even though
these adaptations allowed us to proceed with the previously described changes
to our assessment practices, the requirement to frame our assessment
practice in relation to grading represented a tension with our intention to
democratize our course.
We faced additional challenges when communicating the changes in
assessment practice to students. Students who were highly motivated by
grades found the shift in assessment culture to be challenging. We attempted
to clearly communicate the rationale and pragmatics of the changes to
assessment and revisited the topic throughout the course. We hoped that
in modelling and talking through our assessment practice with students,
we would not only support their comfort with assessment in our course
but also impact their understanding of assessment in their future practice.
Despite our efforts, the course evaluations indicated that some students
were not satisfied with our assessment methods. We will continue to iterate
the ways in which we communicate changes in assessment to students.
We also intend to support changes in assessment practices throughout our
program, as consistent practice may contribute to student understanding.
A further challenge was that some students expressed hesitation and
lack of confidence in their ability to engage in peer and
self-assessment. Louie et al. (2017) supported this finding and suggested that
it stems from students’ “indoctrination within the ‘universal’ Western
model of student-teacher power dynamics” (p. 25). We sought to coun-
teract this reluctance with open dialogue and humility in how we replied
to student self-assessment and how we positioned self-assessment as
important in the course. All students were able to identify areas for
improvement and strengths, and most students demonstrated reflection
and growth. Thus, the practice of self-assessment worked towards a shift
in the student-teacher power dynamic. We are hopeful that this practice
helped our students see the value in self-assessment and perhaps include
it as a decolonizing principle in their future practice.

Conclusion

Our action research design provided a valuable process for us to work
together as an instructor team. Specifically, we found value in how it
created a space for us to collaboratively reflect on matters of axiology
and current practices, design new pedagogical practices, and document
the actions we took in response to the problems we identified. Our collaboration
was paramount in supporting us to take up the important and challen-
ging work of seeking to decolonize our practice. Through these practices
we made steps towards creating an ethical space to include Indigenous
perspectives related to assessment in our course. Throughout this paper
we have identified questions and next steps. One theme to explore in
conclusion is the need for this work to grow and be woven throughout
our teacher education program. We believe that continuity and collective
effort are required to build sustainable relationships with Indigenous
Elders and communities to create the safe conditions for students to
encounter, reflect on, and reframe deeply held perspectives; and to disrupt
and replace colonial structures. To this end, we plan to build on and grow
this model of collaborative faculty development through scholarship of
teaching and learning by inviting our colleagues across the program to
partner with us. We also plan to explore Métissage (Donald, 2012) as an
Indigenous research methodology to advance this work. We hope that
in making our learning journey visible, we have opened up some new
questions or possibilities for readers. In closing, we want to thank Louie
et al. (2017) for creating a grounding from which we have sought to grow
towards decolonizing our assessment practices.

References

Alberta Education. (2017). Teaching quality standard. https://education.alberta.
ca/media/3739620/standardsdoc-tqs-_fa-web-2018-01–17.pdf
Association of Canadian Deans of Education. (2010). Accord on Indigenous education.
https://csse-scee.ca/acde/wp-content/uploads/sites/7/2017/08/Accord-on-
Indigenous-Education.pdf
Battiste, M. (2013). Decolonizing education: Nourishing the learning spirit. Purich.
Battiste, M., & Youngblood, J. (2000). Protecting Indigenous knowledge and heritage:
A global challenge. UBC Press.
Council of Ministers of Education. (2019). Indigenizing teacher education: Summary
report. www.cmec.ca/Publications/Lists/Publications/Attachments/390/
SITE_2018_Report_no-marks_EN.pdf
Crawford, C., Hill, J. T., Dykema, D., Hilterman, E., Tata, H., & Wong, J. (2022).
(Re)storying education. In E. Lyle (Ed.), Re/humanizing education (pp. 10–20).
Brill. https://doi.org/10.1163/9789004507593_002
Davis, B., Sumara, D., & Luce-Kapler, R. (2015). Engaging minds: Cultures of edu-
cation and practices of teaching (3rd ed.). Routledge.
Dixson, D., & Worrell, F. (2016). Formative and summative assessment in the
classroom. Theory into Practice: Psychological Science at Work in Schools and
Education, 55(2), 153–159. https://doi.org/10.1080/00405841.2016.1148989
Donald, D. (2012). Indigenous Métissage: A decolonizing research sensibility.
International Journal of Qualitative Studies in Education, 25(5), 533–555. https://
doi.org/10.1080/09518398.2011.554449
Donald, D. (2014). Teaching and learning from aboriginal perspectives in the social
studies classroom. CAP Final Report. http://galileo.org/pl/wp-content/
uploads/CAP-Report_D.Donald.docx
Ermine, W. (2007). Ethical space of engagement. Indigenous Law Journal, 6(1),
193–203.
Hanson, A., & Danyluk, P. (2022). Talking circles as Indigenous pedagogy in
online learning. Teaching and Teacher Education, 115, Article 103715. https://
doi.org/10.1016/j.tate.2022.103715
Hendricks, C. (2017). Improving schools through action research: A reflective practice
approach (4th ed.). Pearson Education.
Kovach, M. (2009). Indigenous methodologies: Characteristics, conversations, and
contexts. University of Toronto Press.
Louie, D. L., Poitras Pratt, Y., Hanson, A. J., & Ottmann, J. (2017). Applying indi-
genizing principles of decolonizing methodologies in university classrooms.
Canadian Journal of Higher Education, 47(3), 16–33. https://doi.org/10.47678/
cjhe.v47i3.187948
McKegney, S. (2008). Strategies for ethical engagement: An open letter concerning
non-native scholars of native literatures. Studies in American Indian Literatures,
20(4), 56–67.
Mertler, C. A. (2022). Introduction to educational research (3rd ed.). Sage.
Mills, G. E. (2013). Action research: A guide for the researcher (5th ed.). Pearson.
Parsons, J., Hewson, K., Adrian, L., & Day, N. (2013). Engaging in action research:
A practical guide to teacher-conducted research for educators and school leaders.
Brush Education.
Smith, L. T. (2012). Decolonizing methodologies: Research and indigenous peoples
(2nd ed.). Zed Books.
Styres, D. (2019). Pathways for remembering and (re)cognizing Indigenous
thought in education. In H. Tomlins-Jahnke, S. Styres, S. Lilley, & D. Zinga
(Eds.), Indigenous education: New directions in theory and practice (pp. 39–62).
University of Alberta Press.
Styres, D., Zinga, D., Lilley, S., & Tomlins-Jahnke, H. (2019). Contested spaces
and expanding the Indigenous education agenda. In H. Tomlins-Jahnke,
S. Styres, S. Lilley, & D. Zinga (Eds.), Indigenous education: New directions in
theory and practice (pp. xiii–xxi). University of Alberta Press.
Truth and Reconciliation Commission of Canada. (2015). Honouring the truth, rec-
onciling for the future: Summary of the final report of the Truth and Reconciliation
Commission of Canada. James Lorimer. https://irsi.ubc.ca/sites/default/
files/inline-files/Executive_Summary_English_Web.pdf
Wiliam, D., & Leahy, S. (2015). Embedding formative assessment: Practical techniques
for K–12 classrooms. Learning Sciences International.
Wilson, S., & Hughes, M. (2019). Why research is reconciliation. In S. Wilson,
A. V. Breen, & L. DuPré (Eds.), Research and reconciliation: Unsettling ways of
knowing through Indigenous relationships (pp. 6–19). Canadian Scholars.
Zinga, D. (2019). Teaching as the creation of an ethical space. In H. Tomlins-
Jahnke, S. Styres, S. Lilley, & D. Zinga (Eds.), Indigenous education: New
directions in theory and practice (pp. 277–310). University of Alberta Press.
7 Meaningful Feedback in the Online Learning Environment

Maggie McDonnell

Most teachers, no matter how novice or experienced, recognize that
feedback is an essential element of best practices in assessment and learning.
Ramaprasad’s (1983) seminal work on feedback in teaching and learning
defined feedback as information about the gap between student performance
in an assessment task and the standard against which that performance is
evaluated. Building on that notion of feedback, others, including Wiggins and
McTighe (2005), Hattie and Timperley (2007), and Boud (2015), have made it
clear that feedback is only truly effective if follow-up and interaction are expli-
citly part of the assessment design. In this model of feedback, there is a fun-
damental difference between simply responding to student performance and
engaging in a dialogue or “system of feedback loops” (Wiggins & McTighe,
2005, p. 185). A teacher might respond to a student’s paper with a checkmark,
an X, or the ubiquitous “good work” comment in the margins, but such com-
munication is unidirectional and terminal. There’s no invitation to continue
the dialogue, nor is there any indication of what to do next or what to take
from the exercise of performance and assessment. To be effective—that is,
to be meaningful and supportive of learning through assessment—feedback
must be a two-way communication, with space for reactions and revisions.
I say, then, that meaningful feedback is multifaceted communication about
the relationship between production, performance, and the explicit standard,
with the goal of changing that relationship over time.
In their comprehensive review of the literature, Nicol and Macfarlane-
Dick (2006) proposed seven principles of effective feedback: (a) clarify what

DOI: 10.4324/9781003347972-10
constitutes good performance, (b) facilitate self-assessment, (c) deliver
high-quality information about student learning, (d) encourage dialogue,
(e) encourage positive self-perception, (f ) provide opportunities to close the
gap, and (g) extrapolate information to move forward. Nicol and Milligan
(2006), working from these seven principles, reframed them for technology-
supported assessment. The principles themselves did not change but were
reconsidered in how the online learning environment influences implemen-
tation of the practices. Recent research into online teaching and learning
in the COVID-19 context has underscored the importance of assessment
and feedback in the online learning environment (Gopal et al., 2021) and
suggested that teacher familiarity with the online learning environment
and digital resources is an essential ingredient for effective assessments and
feedback (Dolighan & Owen, 2021). Although Nicol and his colleagues
proposed these principles before the COVID era, educators can use them
as a framework for today's ubiquitous virtual classrooms.
For this analytic literature review, I began with Nicol and Macfarlane-
Dick’s (2006) principles as thematic guidelines and gathered recent research
to examine each principle in turn, exploring strategies and tools to support
feedback practices. Using keywords from the list of effective principles,
I narrowed the search only in terms of temporal proximity; that is, research
published within the past five years. I have included some older articles for
foundation and depth for certain concepts and themes. In the interest of
brevity, for each principle I focused on one or two strategies to support its
implementation; of course, most of the strategies respond to more than
one of the principles of effective feedback. I have incorporated many of
these strategies into my own practice as a teacher of undergraduate com-
position and professional writing courses, and in my graduate teacher edu-
cation courses, I use these strategies and help my preservice college teacher
students adapt them for their disciplines. Adoption, with or without adap-
tation, of any of these approaches will no doubt help teachers and teacher
educators develop in myriad ways their assessment and pedagogy.

The Seven Principles and How to Use Them

Clarifying What Constitutes Good Performance

Some studies (Buck et al., 2007; Ogan-Bekiroglu & Suzuk, 2014) have
suggested that preservice teachers have a good theoretical grasp of
formative assessment and assessment-for-learning practices but revert
to conventional assessment tools and evaluation in practice. The online
learning environment may prove to be a means to combat that ten-
dency by providing teachers, at any stage of their professional life,
with tools to foster better assessment habits before, during, and after
assessment tasks.

Screencasts and Video Feedback

The online learning environment provides opportunities for teachers
to connect with students in real-time and asynchronously. Through
these connections, teachers can support student learning by making
assessments more authentic and criteria more explicit. For instance,
teachers can use screencasting tools such as Loom (www.loom.com/) to
support assessment at all stages of the process, and at the same time,
develop a “personal connection” (Waltemeyer & Cranmore, 2018, para.
2) with students who see their faces and hear their voices. Screencasts
can be used for videos shared to the class, as the instructor presents an
assessment task, explains the rubric, and walks through a sample submis-
sion. Teachers can also use screencasts to provide feedback to individual
students, recording comments and corrections on screen while talking
through the process (Waltemeyer & Cranmore, 2018); students can view
this video feedback more than once and refer to it when preparing the
next assessment task.

Assessment Documentation

Assessment documentation—instructions, templates, rubrics, and
checklists—provides an opportunity to engage students in the assessment
process. It can be challenging for newer teachers to invite students to dis-
cuss, question, amend, or even cocreate assessment documentation; how-
ever, being open to these engagements creates a sense of “authenticity,
reciprocity, and inclusion” (Smith et al., 2021, p. 123) between students
and teachers. Shifting from a focus on the hierarchical relationship of
evaluator and performer to “a learning-focused view emphasizes the
active role of students in making judgments about the quality of their
own work in progress and taking informed actions in enhancing work”
(Yan & Carless, 2022, p. 1117).
Bloxham and Campbell (2010) noted that students understand
assessment tasks better through “repeated cycles of formative and/or
summative assessment” (p. 292); that is, teachers cannot rely solely on
assignment instructions, no matter how clear or well-written. Bloxham
and Campbell extrapolated from this idea that unidirectional feedback
has limited effectiveness, and pointed to previous research that supported
the notion that frequent, dialogic feedback (explored in more detail
below) helped students grasp the objectives, standards, and evaluation
of assessments. Bloxham and Campbell looked specifically at the benefit
of engaging in dialogue with students through interactive cover sheets
on which students could request feedback on specific aspects of their
work. While some students appreciated the time to reflect on their work
prior to submission and get focused feedback, others found the exercise
of limited value, primarily because the dialogue was initiated after the
assessment was completed. Bloxham and Campbell concluded that inter-
active cover sheets had the potential to plant the seeds of academic com-
munities of practice.

Rubrics

Arguably, the interactive cover sheet is but one important part of a
multistage dialogue around the assessment task. Yan and Carless (2022)
proposed cocreated rubrics “to enhance understanding and application
of criteria” (p. 7), or, if co-construction of the rubric was not feasible,
sharing, explaining, and discussing the rubric with students at the outset
of the assessment process. Struve (2006) also explored how cocreated
rubrics supported student learning and motivation and found that
“including students’ voices in co-creating rubrics” (p. 86) made the class-
room environment more motivating and increased self-efficacy among
students. Struve determined that teachers also benefitted from cocreating
rubrics, as the exercise revealed information about students’ prior know-
ledge, readiness for assessment, and understanding of the task.
Rubrics, whether cocreated or not, are widely recognized as essential
tools in educators’ assessment toolbox and are fundamental to clarifying
what constitutes good performance. Stevens and Levi (2013) explained
that rubrics can support teaching and learning in several ways:

1. Rubrics save time when grading, which can be particularly important
with large groups and complex assignment tasks.
2. Teachers can use rubrics to provide relevant feedback to individual
students.
3. Students can use rubrics to understand the relationship between their
work and the evaluation standard.
4. Rubrics encourage critical thinking about the assessment task and the
learning goals, especially if the class can cocreate or discuss the elem-
ents of the rubric at the start of the assessment process.
5. Rubrics make assessment expectations clear not only to students, but
to parents, colleagues, learning support staff, and administrators.
6. Rubrics reveal patterns within a group of students, and these patterns
in turn reveal material that needs revisiting and skills that would
benefit from more practice.
7. Rubrics foster equity and encourage teachers to evaluate student
work based exclusively on the learning objectives being assessed.

As explored in more detail in the following sections, rubrics support
several of the principles of effective feedback. In fact, if the reader adapts
only one new assessment strategy from this study, that strategy should
be using a rubric.

Facilitating Self-assessment

At all levels, self-assessment is a valuable skill and an effective way to
support student learning. In their review of the impact of self-assessment
on teaching and learning, Papanthymou and Darra (2019) found that self-
assessment had comparable benefits for learners at different levels (see
Table 7.1), and concluded that regardless of level, self-assessment was cer-
tainly worth implementing.
Yan and Carless (2022) made the point that self-assessment is a skill that,
like any other skill, needs practice and guidance. Building this skill begins
by showing students how to “effectively seek, process, and use feedback
from different sources” (Yan & Carless, 2022, p. 1116). Andrade (2019)
made the point that poorly defined self-assessment or self-assessment
without subsequent revision did not constitute feedback and did not
appear to influence learning. Truly effective self-assessment includes self-
monitoring and assessing one’s own work according to “developmentally
appropriate criteria” (Andrade, 2019, p. 2) and then revising that work.
There are many ways to encourage and guide effective self-assessment,
including having students keep self-assessment diaries (Yan et al., 2020),
Table 7.1 Positive outcomes of self-assessment by level.

Level              Most beneficial aspect           Other benefits
Primary            Better motivation and            Improvements in academic
                   engagement in learning           performance and learning
Secondary          Improvements in academic         Better self-regulated
                   performance and learning         learning; better motivation;
                                                    increased self-esteem
Higher education   Improvements in academic         Better self-regulated
                   performance and learning         learning; better motivation

directing them to set personal learning goals (Zimmerman, 2008),
including them in setting assessment criteria and creating rubrics, and
breaking complex assessments into scaffolded stages with time for reflec-
tion and self-assessment between steps.
Adding technology and the digital classroom environment to the
assessment toolbox means having even more ways to support self-
assessment. Models, templates, rubrics, and other parallel resources can
be made available online during some or all of the assessment cycle, and
the approaches listed above can all happen online as well; in fact, strat-
egies such as personal learning journals work very well online, and digital
platforms, such as Moodle’s journal module or MS Notebook, allow
teachers and resource staff to interact with students individually through
their journals. There are, of course, more high-tech tools as well: Jones
et al. (2018) looked at using robotic tutors to support self-assessment
and found that students demonstrated better learning and metacogni-
tive skills when the robotic tutors personalized feedback and adjusted
the pace and direction of learning based on each student’s performance.
The research also suggested that students benefitted from teaching robot
peers. Tools such as e-portfolios (Chang et al., 2018) are perhaps more
widely accessible than high-tech tools and have demonstrated efficacy
in terms of self-assessment and self-regulated learning. Developing and
articulating individual learning goals helps students feel more engaged
and invested in their assessment tasks (Chang et al., 2018); recording,
reviewing, reflecting on, and revisiting these goals in an e-portfolio,
online journal, or digital notebook “significantly and positively” (Chang
et al., 2018, p. 1245) influences students’ learning.
Delivering High-quality Information About Student Learning

Students at all levels struggle with feedback for several reasons, but the
three factors most often cited are brevity, negativity, and complexity—
feedback is too short, too negative, or too difficult to interpret (Pinheiro
Cavalcanti et al., 2019). Weaver (as cited in Pinheiro Cavalcanti et al.,
2019) reported that more than half of college students had not been
taught how to read and apply feedback effectively. If students do not
know what to do with teacher feedback, they will struggle to progress.
Delivering high-quality information about students’ learning, then,
tacitly compels teachers to train them to interpret and apply that feed-
back. Carless and Boud (2018) defined this skill as feedback literacy; that
is, “the understandings, capacities and dispositions needed to make sense
of information and use it to enhance work or learning strategies” (p. 2).
Delivery necessarily requires reception: A package is not delivered if it
has not been received. The same is true of feedback: If students do not
know how to use the information teachers provide, then the teachers have
not successfully delivered it. To help students develop their feedback
literacy, teachers must focus on three stages, as outlined in Figure 7.1:
seeking information, processing the information, and finally, acting on the
information (Carless & Boud, 2018). The initial stage, seeking informa-
tion, can be as direct as the student asking for feedback on their own work
or for clarification on instructions, material, or assessment tools. As well
as directly seeking feedback, the student may get information through
monitoring; that is, drawing feedback from context (e.g., comparisons

Figure 7.1 The feedback literacy loop.


with peers, self-perceived progress on the current task or from previous
work, comparison with exemplars or rubrics). This stage implies that
feedback can be sought at any time (which teachers should be encour-
aging), including before performance, so that students can check to be
sure they understand the task and the evaluation criteria. As suggested in
Figure 7.1, the elements of feedback literacy lend themselves naturally to
the feedback loop (Wiggins & McTighe, 2005); students begin by seeking
feedback, then seek further feedback after an iteration of the process.

Encouraging Dialogue

The COVID-19 pandemic, and the recurring disruptions to classrooms and
campuses, reminded many of the indirect advantages of in-person teaching
and learning. Even prior to the collective, global shift to online classrooms
in whatever form those took, it was clear that students present in a shared
physical space felt more engaged with “their learning, their peers and their
instructor” (Burke & Larmar, 2021, p. 601) than when online. The benefit
of online learning—namely, better access for more people—is clear, but
it comes with a lack of the organic connection derived from just being in
the same place at the same time, engaged in the same activity. This lack
of connection has myriad negative implications, from decreased motiv-
ation to attrition. Cole et al. (as cited in Mehall, 2020) found that the “main
source of student dissatisfaction” (p. 184) for online learners in higher edu-
cation was lack of interaction with instructors and classmates.
In the context of assessment, naturally, feedback is teachers’ primary
means of interacting with students. Feedback, as discussed, is necessarily
interactive, but in an online environment, where interaction is neither
organic nor easy, feedback may be more challenging to initiate and main-
tain. In classrooms, teachers more naturally offer verbal and nonverbal
feedback (Mehall, 2020) and can more easily monitor peer interactions.
This is not to say that dialogue is impossible online, and in fact, with
planning, the digital classroom can be a source of rich feedback.
Guasch et al. (2018) described the “new paradigm” (p. 111) of feed-
back in which feedback is an interactive process, rather than the unidirec-
tional monologue of older models. Feedback that encourages interaction
supports formative assessment and assessment as and for learning; because
students are expected, encouraged, and guided to interact with feedback
and demonstrate its implementation, they are more likely to take steps
and make decisions that improve learning. In particular, Guasch et al.
found that epistemic-suggestive feedback—that is, feedback presented as
questions with suggestions for answering them—was effective in terms
of both student learning and metacognitive development.

Encouraging Positive Self-perception

Much of what has been explored in previous sections influences students’
positive self-perception. Self-regulated learning, feedback literacy, and par-
ticipation in setting goals and articulating assessment criteria are essential
elements in better understanding the expectations of assessment and the
student’s relationship to those expectations.
Rubrics have already been presented as helpful tools for better
assessment and effective feedback. In this context, one specific variation,
the single point rubric (SPR), is an excellent strategy to adopt. Gonzalez
(2015) claimed that the SPR was beneficial to students and teachers, per-
haps more so than analytic rubrics. Indeed, Wilson (2018) used an SPR
for her secondary English language arts courses, in part because she
discovered that the conventional analytic rubric “was the wrong tool for
the writing response and assessment that [her] students deserve” (p. 2).
The SPR is criterion-based, but rather than describing levels of pro-
ficiency (excellent > satisfactory > weak > unacceptable), the rubric
descriptors are focused only on the expected standard of performance.
The SPR in Figure 7.2 uses a beginners’ yoga class to illustrate how to
articulate the standard for performance and provides space to comment
on how the class has met or exceeded the standard, and how performance
can be improved.
Gonzalez (2014, 2015) and Chao (2020) have argued that the benefits
of SPRs include

• fewer words, which means that they are quicker and easier to create
and easier to understand;
• more flexible interpretations of the criteria, which means that
teachers do not need to anticipate how students might deviate from
the expected standard;
• better quality feedback, because teachers can focus their remarks on
specific problems and on ways in which the student has surpassed
expectations; and
• clearer objectives and standards, which support more effective peer
and self-assessment.
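Because the SPR pairs each criterion with a single expected standard and open comment fields, its structure is simple enough to sketch as a data structure. The following Python sketch is purely illustrative — the `render_spr` function, field names, and sample criteria are invented for this example, not a published format:

```python
# Illustrative sketch only: a single point rubric (SPR) as a simple data
# structure. Function and field names here are invented examples.

def render_spr(criteria, feedback):
    """Build an SPR feedback sheet as plain text: one block per criterion,
    showing the expected standard plus any 'needs work' and
    'beyond expectations' comments that the assessor recorded."""
    lines = []
    for name, standard in criteria.items():
        notes = feedback.get(name, {})
        lines.append(f"Criterion: {name}")
        lines.append(f"  Standard: {standard}")
        lines.append(f"  Needs work: {notes.get('needs_work', '-')}")
        lines.append(f"  Beyond expectations: {notes.get('beyond', '-')}")
    return "\n".join(lines)

criteria = {
    "Atmosphere": "Comfortable temperature, well-chosen music, adequate space.",
    "Instructor": "Friendly and supportive; cues are easy to understand.",
}
feedback = {
    "Atmosphere": {"beyond": "Lighting was adjusted seamlessly between poses."},
    "Instructor": {"needs_work": "Some cues were hard to hear at the back."},
}
print(render_spr(criteria, feedback))
```

Because the comment fields default to a dash, the rendered sheet makes visible which criteria received feedback and which simply met the standard.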
Meaningful Feedback in the Online Learning Environment 135

Beginners’ Yoga Class


What needs work | Criteria | What was beyond our expectations
Atmosphere:
The room was a comfortable
temperature. The music was well
chosen and played at an appropriate
volume. There was adequate space
for all participants. The lighting did
not interfere with relaxation.
Instructor:
The instructor was friendly
and supportive. Cues were easy
to understand. The instructor
demonstrated new poses and
provided options for different levels.
Yoga poses:
The sequence of poses was easy to
follow. The combination of poses
provided a full-body experience.
There was adequate time for
warm-up and cool down.

Figure 7.2 Example of a single point rubric. Adapted from Meet the Single Point
Rubric by J. Gonzalez, 2015, Cult of Pedagogy (www.cultofpedagogy.
com/single-point-rubric/). Copyright 2015 by Cult of Pedagogy.

There are variations of the SPR. In 2017, Gonzalez proposed a modified
version of the SPR in which the centre column is split into levels, using
the scale to indicate students’ performance relative to the standard, as
shown in Figure 7.3.
Chao (2020) provided models of the SPR with and without ratings for
use in summative and formative assessments. In my own undergraduate
courses, I use a simplified version of the SPR, in which I describe what
constitutes excellence for a given criterion. My feedback then focuses
on how the student’s work compares to that description. I use the same
descriptors to create peer-assessment guidelines. In Figure 7.4, I have
applied my approach to the beginners’ yoga class scenario (adapted from
Gonzalez, 2015); the corresponding suggestions for peer feedback are
illustrated in Figure 7.5.
Both the modified SPR and the corresponding peer-feedback worksheet
can be printed and used effectively as digital resources. At my university,
instructors use Moodle as our course management platform; in Moodle, I use
Criteria (or Standard) | Rating (1 2 3 4) | Feedback
Describe the standard to be met or explain the criteria. | (mark 1–4) | Can be used to explain the number given, offer suggestions for improvement, or give advice for pushing even further.

Scale: 1 = standard not met; 2 = standard partially met; 3 = standard met; 4 = exceeds
expectations.

Figure 7.3 Modified single point rubric. Adapted from Meet the Single Point
Rubric: Another Variation (Added in 2017), by J. Gonzalez, 2017, Cult of
Pedagogy (www.cultofpedagogy.com/single-point-rubric/#:~:text=
ANOTHER%20VARIATION%20(ADDED%20IN%202017)).
Copyright 2017 by Cult of Pedagogy.

Criteria | What excellence looks like | Comments


Atmosphere: The room was the perfect temperature
for the activity, and there was adequate
space for all participants. The music
was well chosen and at a comfortable
volume, and the lighting was adjusted
seamlessly according to the focus of
the class.
Instructor: The instructor was friendly and
supportive. Cues were easy to
understand and apply. The instructor
demonstrated new poses and provided
options for different abilities and
comfort levels.
Yoga poses: The sequence of poses was challenging
but easy to follow. The combination of
poses provided a full-body experience.
There was adequate warm-up and cool
down, and final relaxation was the
ideal length of time.

Figure 7.4 Beginners’ yoga class modified single point rubric.


Criteria | What to look for | Your comments or suggestions
Atmosphere: How is the temperature in the room? Does
it ever feel too chilly or too warm?
Is there enough room for everyone to set up
their mat and move comfortably?
Did the music add to the experience? Was it
ever distracting, too loud, or too quiet?
How was the lighting? Could you see the
instructor? Were the lights ever too bright?
What else could your colleague do in this
area to achieve excellence?
Instructor: Was the instructor friendly and supportive?
Were the cues easy to understand and
apply? Did the instructor demonstrate new
poses? Did the instructor offer alternative
poses or incorporate props to support
students at different levels? What else could
your colleague do in this area to achieve
excellence?
Yoga poses: Were you able to follow the sequence of
poses? Was there enough challenge without
anything being completely beyond you?
Did you feel as though your whole body
was involved? Was there enough warm-up?
Were you able to cool down completely
before final relaxation? Was final relaxation
too short or too long? What else could
your colleague do in this area to achieve
excellence?

Figure 7.5 Corresponding peer-feedback worksheet.

the modified SPR in creating assignments by selecting marking guide as the
grading method (selecting rubric allows only the creation of an analytic, level-
based rubric). Based on the modified SPR, I then create an assessment guide
for peer feedback in Moodle’s Workshop module, using the question format
demonstrated in the peer-feedback worksheet. The Workshop module, per-
haps better understood as a peer review activity, is an excellent tool that does
require significant time and thought to set up and manage. When planned
well, the Workshop, like most forms of peer assessment, helps students see
the connections between course content, assessment, and their own work.
Figure 7.6 Scaffolded essay process for Cégep English course.

Providing Opportunities to Close the Gap

Perhaps the most effective and most obvious way to provide students
opportunities to close the gap between performance and learning goals
is through scaffolded stages of drafting and revision. In my own prac-
tice, most major assignments involve multiple submission stages with
feedback in various forms at each stage. In my Cégep English courses,
for instance, students write three essays over five weeks. Each essay
involves planning, outlining, drafting, and revising prior to submission, as
illustrated in Figure 7.6.
As is clear from Figure 7.6, peer feedback is woven into the scaffolded
process. Peer review is valuable to teaching and learning in many ways
(Debby Ellis Writing Center, 2022): It fosters collaboration and engage-
ment, and students learn from one another and help one another navigate
assignments and material. The skills they develop in providing helpful
feedback to their classmates prepare them for providing and using feed-
back in future. As well, peer feedback adds a natural step to the writing
process that some students might be tempted to skip; because they are
accountable to their peers for submitting draft work and providing feed-
back, they develop parallel skills and are better prepared for their final
submission.

Extrapolating Information to Move Forward

Extrapolating information can be understood in two facets: reflective
practice (internally focused extrapolation) and progress tracking (exter-
nally focused). Tracking cohort progress is, arguably, easier in the digital
workspace than on paper. Course management systems such as Moodle
and Lea typically allow teachers to produce myriad reports to explore
different trends, and the truly reflective practitioner can generate side-by-
side comparisons for members of a single cohort, multiple iterations of
related assessments, different sections of the same course, and different
cohorts in the same course over time.
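As a rough illustration of the kind of side-by-side comparison such reports support, per-cohort averages on a shared assessment could be computed as follows. This is a sketch only: `cohort_averages` and the sample data are invented for this example, and real course management systems export far richer records.

```python
# Illustrative sketch only: compare mean scores on the same assessment
# across cohorts. Cohort names and scores are invented sample data,
# not exported from any real course management system.

def cohort_averages(scores_by_cohort):
    """Return {cohort: mean score rounded to one decimal place},
    skipping cohorts with no recorded scores."""
    return {
        cohort: round(sum(scores) / len(scores), 1)
        for cohort, scores in scores_by_cohort.items()
        if scores
    }

sample = {
    "Fall 2021": [72, 80, 65, 90],
    "Fall 2022": [78, 84, 70, 88],
}
print(cohort_averages(sample))  # {'Fall 2021': 76.8, 'Fall 2022': 80.0}
```

Extending the same idea to multiple sections, assessment iterations, or years is a matter of how the score data are grouped before averaging.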
Teachers who strive towards betterment know that learning does not
end where teaching starts; teaching experiences are, simultaneously,
learning opportunities, and teachers become better by reflecting on
and learning from these experiences. John and Thomas (2018) argued
that technology provides opportunities to engage in the reflection that
is an essential component of personal and professional development.
Several studies (Hamel & Viau-Guay, 2019; Körkkö et al., 2019; Nagro,
2019; Yuan et al., 2020) suggested that video-based reflections, especially
when undertaken in parallel with peer review and mentorship, enhanced
teacher development. Video can be a simple tool in reflective practice if
teachers choose to record then watch themselves teach, or it can be part
of a more sophisticated model of deliberate practice if they use an app
such as VEO to record, share, annotate, and tag videos.
Ultimately, the approach to meaningful feedback in the online learning
environment is grounded in the same principles that guide assessment
practice in the physical classroom. To ensure that, in the digital classroom,
feedback, and assessments more generally, support student learning and
foster a collaborative, positive, and generative learning space, teachers
can apply the seven principles, summarized here:

1. clarify what constitutes good performance by providing models with
correlation and cocreating assessment rubrics;
2. facilitate self-assessment by introducing rubrics early in the process
and making connections with learning objectives;
3. deliver high-quality information about student learning by fostering
feedback literacy and providing focused and specific feedback;
4. encourage dialogue by using respectful inquiry and guiding peer
reviewers;
5. encourage positive self-perception by using single point rubrics and
learner reflection journals to validate progress;
6. provide opportunities to close the gap by scaffolding assessment tasks
and using peer review and revision; and
7. extrapolate information to move forward by tracking cohort progress
and engaging in reflective practice.

Conclusion

I have proposed that meaningful feedback is multifaceted communication
about the relationship between production, performance, and explicit
standards, with the goal of changing that relationship over time. The key
word in this understanding of feedback is “communication.” In the online
learning environment, students feel isolated and remote from their peers
and their teachers (Al-Maskari et al., 2022; Lemay et al., 2021). In con-
sciously embracing best practices in assessments and feedback, teachers
naturally rely on—and create—open channels of communication that
support and motivate online learners.
Feedback is, of course, the essential ingredient in this approach to
assessment, beginning with the understanding that feedback can, and
should, occur at every stage of assessment, from explanations of the
tasks and expectations to final feedback and evaluation. Incorporating
peer- and self-assessment strategies creates engagement, provides feed-
back from multiple perspectives, and supports peer communication in
the online learning environment. Inviting discussion of the assessment
and the feedback given and received fosters a sense of ownership and
accountability in students and assures them that teachers are accessible
and open to dialogue.
Most important, feedback should encourage students and inform
practice through dialogue and reflection. Ramaprasad’s (1983) definition
of feedback as information about the gap between a level-appropriate
standard and student performance provides a useful image to bear in
mind: the space between expectations and deliverables. People tend to
think of that space as a negative—student performance is anticipated to
be less than the desired outcome. Perhaps the most supportive thing to do
for students is to consider the gap as multidirectional and focus feedback
on how student work differs from, rather than fails to meet, the defined
standard. Feedback may indeed provide students with information they
need to improve their performance, and it can also provide insight into
how their performance surpasses or redefines expectations.

References

Al-Maskari, A., Al-Riyami, T., & Kunjumuhammed, S. K. (2022). Students [sic]
academic and social concerns during COVID-19 pandemic. Education and
Information Technologies, 27, 1–21. https://doi.org/10.1007/s10639-021-10592-2
Andrade, H. L. (2019). A critical review of research on student self-assessment.
Frontiers in Education, 4(87), 1–13. https://doi.org/10.3389/feduc.2019.00087
Bloxham, S., & Campbell, L. (2010). Generating dialogue in assessment feedback:
Exploring the use of interactive cover sheets. Assessment & Evaluation in Higher
Education, 35(3), 291–300. https://doi.org/10.1080/02602931003650045
Boud, D. (2015). Feedback: Ensuring it leads to enhanced learning. The Clinical
Teacher, 12, 3–7. https://doi.org/10.1111/tct.12345
Buck, G. A., Macintyre Latta, M. A., & Leslie-Pelecky, D. L. (2007). Learning
how to make inquiry into electricity and magnetism discernable to middle
level teachers. Journal of Science Teacher Education, 18, 377–397. https://doi.
org/10.1007/s10972-007-9053-8
Burke, K., & Larmar, S. (2021). Acknowledging another face in the virtual
crowd: Reimagining the online experience in higher education through an
online pedagogy of care. Journal of Further and Higher Education, 45(5), 601–
615. https://doi.org/10.1080/0309877X.2020.1804536
Carless, D., & Boud, D. (2018). The development of student feedback literacy:
Enabling uptake of feedback. Assessment & Evaluation in Higher Education,
43(8), 1315–1325. https://doi.org/10.1080/02602938.2018.1463354
Chang, C.-C., Liang, C., Chou, P.-N., & Liao, Y.-M. (2018). Using e-portfolio
for learning goal setting to facilitate self-regulated learning of high school
students. Behaviour & Information Technology, 37(12), 1237–1251. https://doi.
org/10.1080/0144929X.2018.1496275
Chao, C. I. (2020). Validity evidence to explore the educational impact of using a single-point
rubric in interprofessional education [Master’s thesis]. University of Alberta,
University of Alberta Education and Research Archive. https://era.library.
ualberta.ca/items/480114b9-2594-4341-8ce8-30ac094cce82/view/58146b93-
756e-44ee-ae42-518712afb1d6/Chao_Cheng_In_202009_MSc.pdf
Debby Ellis Writing Center. (2022). Benefits of peer review. Southwestern University.
www.southwestern.edu/offices/writing/faculty-resources-for-writing-instruc
tion/peer-review/benefits-of-peer-review/
Dolighan, T., & Owen, M. (2021). Teacher efficacy for online teaching during
the COVID-19 pandemic. Brock Education Journal, 30(1), 95. https://doi.
org/10.26522/brocked.v30i1.851
Gonzalez, J. (2014, May 1). Know your terms: Holistic, analytic, and single-point
rubrics. Cult of Pedagogy. www.cultofpedagogy.com/holistic-analytic-single-
point-rubrics/
Gonzalez, J. (2015, February 4). Meet the single point rubric. Cult of Pedagogy.
www.cultofpedagogy.com/single-point-rubric/
Gonzalez, J. (2017). Meet the single point rubric: Another variation (added in 2017).
Cult of Pedagogy. www.cultofpedagogy.com/single-point-rubric/#:~:text=
ANOTHER%20VARIATION%20(ADDED%20IN%202017)
Gopal, R., Singh, V., & Aggarwal, A. (2021). Impact of online classes on the
satisfaction and performance of students during the pandemic period of
COVID 19. Education and Information Technologies, 26, 6923–6947. https://doi.
org/10.1007/s10639-021-10523-1
Guasch, T., Espasa, A., & Martinez-Melo, M. (2018). The art of questioning
in online learning environments: The potential of feedback in writing.
Assessment & Evaluation in Higher Education, 44(1), 111–123. https://doi.org/1
0.1080/02602938.2018.1479373
Hamel, C., & Viau-Guay, A. (2019). Using video to support teachers’ reflective
practice: A literature review. Cogent Education, 6(1), 1–14. https://doi.org/10.
1080/2331186X.2019.1673689
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational
Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
John, T. E., & Thomas, B. (2018). Reflective practitioner: Fostering motiv-
ation, thinking skills and self-regulation to enhance the quality of teaching.
International Journal of Research and Analytical Reviews, 5(2), 1056–1058.
Jones, A., Bull, S., & Castellano, G. (2018). “I know that now, I’m going to learn this
next” Promoting self-regulated learning with a robotic tutor. International Journal
of Social Robotics, 10, 439–454. https://doi.org/10.1007/s12369-017-0430-y
Körkkö, M., Morales Rio, S., & Kyrö-Ämmälä, O. (2019). Using a video app as a
tool for reflective practice. Educational Research, 61(1), 22–37. https://doi.org/
10.1080/00131881.2018.1562954
Lemay, D. J., Bazelais, P., & Doleck, T. (2021). Transition to online learning
during the COVID-19 pandemic. Computers in Human Behavior Reports, 4,
Article 100130. https://doi.org/10.1016/j.chbr.2021.100130
Mehall, S. (2020). Purposeful interpersonal interaction in online learning: What
is it and how is it measured? Online Learning, 24(1), 182–204. https://doi.
org/10.24059/olj.v24i1.2002
Nagro, S. A. (2019). Reflecting on others before reflecting on self: Using video
evidence to guide teacher candidates’ reflective practices. Journal of Teacher
Education, 71(4), 420–433. https://doi.org/10.1177/0022487119872700
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated
learning: A model and seven principles of good feedback practice. Studies in
Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
Nicol, D. J., & Milligan, C. (2006). Rethinking technology-supported assessment
in terms of the seven principles of good feedback practice. In C. Bryan & K.
Clegg (Eds.), Innovative assessment in higher education. Taylor and Francis. www.
taylorfrancis.com/chapters/edit/10.4324/9780203969670-16/rethinking-
technology-supported-assessment-practices-relation-seven-principles-good-
feedback-practice-david-nicol-colin-milligan
Ogan-Bekiroglu, F., & Suzuk, E. (2014). Pre-service teachers’ assessment literacy
and its implementation into practice. The Curriculum Journal, 25(3), 344–371.
https://doi.org/10.1080/09585176.2014.899916
Papanthymou, A., & Darra, M. (2019). The contribution of learner self-assessment
for improvement of learning and teaching process: A review. Journal of Education
and Learning, 8(1), 48–64. https://doi.org/10.5539/jel.v8n1p48
Pinheiro Cavalcanti, A., Rolim, V., André, M., Freitas, F., Ferreira, R., & Gašević,
D. (2019). An analysis of the use of good feedback practices in online learning
courses. IEEE 19th International Conference on Advanced Learning Technologies,
153–157. https://doi.org/10.1109/ICALT.2019.00061
Ramaprasad, A. (1983). On the definition of feedback. Behavioral Science, 28(1),
4–13. https://doi.org/10.1002/bs.3830280103
Smith, S., Akhyani, K., Axson, D., Arnautu, A., & Stanimirova, I. (2021).
Learning together: A case study of a partnership to co-create assessment cri-
teria. International Journal for Students as Partners, 5(2), 123–133. https://doi.
org/10.15173/ijsap.v5i2.4647
Stevens, D. D., & Levi, A. J. (2013). Introduction to rubrics: An assessment tool to
save grading time, convey effective feedback, and promote student learning. Stylus
Publishing.
Struve, M. E. (2006). “Why do you want to co-create rubrics?” Relationship between
co-created rubrics, student motivation, self-efficacy, and achievement [Master’s
thesis]. California State University San Marcos. ScholarWorks. https://
scholarworks.calstate.edu/downloads/6h440s858?locale=en
Waltemeyer, S., & Cranmore, J. (2018). Screencasting technology to increase
engagement in online higher education courses. eLearn, 12, Article 8. https://
doi.org/10.1145/3302261.3236693
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Association
for Supervision and Curriculum Development.
Wilson, J. (2018). Write outside the boxes: The single point rubric in the sec-
ondary ELA classroom. Journal of Writing Assessment, 11(1), 1–9. https://
escholarship.org/uc/item/9175z7zs
Yan, Z., & Carless, D. (2022). Self-assessment is about more than self: The enab-
ling role of feedback literacy. Assessment & Evaluation in Higher Education,
47(7), 1116–1128. https://doi.org/10.1080/02602938.2021.2001431
Yan, Z., Chiu, M. M., & Ko, P. Y. (2020). Effects of self-assessment diaries on aca-
demic achievement, self-regulation, and motivation. Assessment in Education:
Principles, Policy & Practice, 27(5), 562–583. https://doi.org/10.1080/09695
94X.2020.1827221
Yuan, R., Mak, P., & Yang, M. (2020). “We teach, we record, we edit, and we
reflect”: Engaging pre-service language teachers in video-based reflective practice.
Language Teaching Research, 26(3), 552–571. https://doi.org/10.1177/
1362168820906281
Zimmerman, B. J. (2008). Goal setting: A key proactive source of academic self-
regulation. In D. H. Schunk & B. J. Zimmerman (Eds.), Motivation and self-
regulated learning: Theory, research, and applications (pp. 267–295). Lawrence
Erlbaum Associates. https://doi.org/10.4324/9780203831076
8 Analyzing Presence in Online Learning Environments through Student Narratives
An Autoethnographic Study
Tan Xuan Pham, Chi (Linh) Tran, Le Pham Hue Xuan, and Giang Nguyen Hoang Le

The COVID-19 pandemic required a shift in the learning environment
from in-class to online. This shift significantly influenced educational
experiences, including sense of belonging, academic support, and peda-
gogical strategies such as teaching practice and assessment. Given the
limitations and challenges of online learning (OL) in this time of crisis,
it is helpful to reconceptualize its material and discursive practices.
We are early career education researchers, who have used collabora-
tive autoethnography (Chang et al., 2013) to study our OL experiences
during the pandemic and examine pedagogical implications for online
instruction. Garrison et al. (1999) proposed a community of inquiry (CoI)
model for successful OL. The CoI model was our theoretical framework
to uncover the dynamics of our OL experiences through the interaction
of three concepts: cognitive presence, social presence, and teaching
presence (Garrison et al., 2010). Using this framework, we explore how
OL environments have shaped and reshaped our perceptions. We present
four narrative vignettes (Humphreys, 2005) of our personal stories set in
Vietnam, Canada, Australia, and Taiwan. Our narratives challenge the

DOI: 10.4324/9781003347972-11
concept of presence in virtual platforms when it is interpreted as teachers’
academic support, students’ emotional connection, and students’ aca-
demic gains. Our stories, supported by the literature, suggest implications
to inform pedagogy in OL landscapes.

OL at the Outset of COVID-19

OL showed its power in continuing the teaching and learning process
when COVID-19 forced face-to-face educational activities onto virtual
platforms. The impact of the global pandemic required instructors to
deliver courses online (Clark & Mayer, 2016), to shorten the time for
in-class participation, and to enhance self-directed learning (Yang &
Cornelius, 2004). With the support of technology, OL could connect dis-
tant learners to form a community of learning and allow them to interact
with one another, similar to conventional classes. It also provided students
with a global learning environment, facilitating international collabor-
ation and partnership (Appana, 2008).
However, switching from conventional to online schooling also placed
graduate students in an unprecedented position. Although some uni-
versities had prior experience with technology-mediated education,
for most, OL was new (Ali, 2020). As well, as Ali (2020) noted, the shift
was not merely a technical one: It required changes in curriculum,
learning materials, and assessment strategies. Inconsistencies in curric-
ulum and assessment raised concerns about educational fairness and
equity. Evidence has emerged on the toll OL wrought on students (and
teachers), including physical disconnection, sense of unease, uncertainty
with technological skills, inadequate instructional pedagogy, poor mental
health, and lack of academic and mental health supports (Firang &
Mensah, 2022; Sahu, 2020).
The deployment of OL instruction and evaluation was fraught with
challenges in multiple countries. At Melbourne University, according to
a Graduate Student Association (2020) survey, online assessment during
the pandemic lacked timely contact. Communication was late, unex-
pected, and sometimes disregarded student well-being issues. Students
also reported little technological support or training sessions about
online assessment (Graduate Student Association, 2020). Similarly, at the
Chinese University of Hong Kong, only 16.6% of undergraduate and
graduate students were happy with how their online tests were set up
during the pandemic (Lee et al., 2022). The most common obstacle they
encountered was technological difficulties (52.6%). Most students (72.6%)
believed that computer problems and internet connectivity significantly
impacted online assessments. They requested more instructor input,
quick and detailed feedback on their performance, technical assistance,
and standardized metrics to ensure academic honesty (Lee et al., 2022).
In South Korea, a graduate school survey found that during COVID-19,
students experienced prolonged mental and emotional distress, such as
feeling isolated when having to study online because the institution was
not well prepared in terms of educational and technical infrastructure
(Yee et al., 2022). As a final example, graduate students in the United
Arab Emirates pointed out that although student-faculty interactions
are essential to learning, few faculty members used discussion threads
or blogs when they shifted to OL to create that interaction with students
(Omar et al., 2021).

Theoretical Framework

In this study, we used the CoI model (Garrison et al., 1999) to frame the
multifaceted realities of OL landscapes. This model was a useful theor-
etical framework for contextualizing our lived experiences from various
learning and evaluation backgrounds during and after the pandemic.
Garrison and colleagues (1999) developed the concept of presence within
the CoI model, influenced by Dewey’s thinking about collaborative
constructivism in learning, to enable researchers to identify significant
dimensions influencing students’ OL in relation to teachers, peers, and
other stakeholders (Kim & Gurvitch, 2020). As shown in Table 8.1, the
CoI model includes three interconnected elements in the synchronous
learning mode: social presence, cognitive presence, and teaching presence
(Garrison et al., 1999; see also Anderson et al., 2001; Garrison, 2007). We
use synchronous learning to describe face-to-face communication without
physical contact, which has become an increasingly popular method of
communication on virtual platforms such as Zoom (https://zoom.us/)
or Google Meet (https://meet.google.com/).
Given that the global effectiveness of OL for graduate students is
still understudied (Bains et al., 2021; Xie et al., 2020), the CoI model
(Garrison et al., 1999) provided a theoretical framework for investigating
our OL experiences by analyzing each of the three presences in our
autoethnographic stories. We asked two questions: How does the concept
of presence in virtual platforms inform and illuminate our experiences of
Table 8.1 Three elements of the CoI framework.


Element Features
Social presence Having a sense of personal connection and a meaningful
relationship
Creating the conditions for inquiry and quality
interaction through reflective and threaded discussions
Cognitive presence Comprising four phases: a triggering event (identifying
an educational issue or task); exploration (discussing
potential solutions); integration (making connections
between these solutions); and resolution (selecting the
most suitable solution)
Relating to the ability of students to generate knowledge
through ongoing communication and collective
sharing with others
Teaching presence Involving the creation, facilitation, and guidance of
cognitive and social processes to achieve personally
meaningful and educationally valuable learning
outcomes
Note. Adapted from “Critical Inquiry in a Text-based Environment: Computer
Conferencing in Higher Education,” by D. R. Garrison, T. Anderson, and W. Archer, 1999,
Internet and Higher Education, 2(2–3), pp. 87–105 (https://doi.org/10.1016/s1096-7516(00)
00016-6). Copyright 1999 by ScienceDirect.

learning and assessment practices? How should teaching and assessment
practices be developed to enhance student learning in the OL environment?

Researcher Positionality

Autoethnographers should elaborate on their positionalities before
weaving their stories together as a collaborative work (Chang et al., 2013);
thus, we have each written an introduction to set the stage for a better
understanding of our contexts in this research. There are links between
our backgrounds (Vietnamese), areas of expertise (educational studies),
and the focus of this study (OL and assessment practices in the context of
the COVID-19 pandemic).

Tan

I am an English language teacher at a centre in Ho Chi Minh City, Vietnam.
When the national lockdown was imposed, all in-class activities were
moved to a virtual space. As I was shifting my lessons to online platforms,
I was also participating in two online courses. In one, a course on inten-
sive research writing, I met my coauthors. OL enabled us to join the
same course while living and studying in different countries. As both a
teacher and a learner, I had to learn technical skills to acclimate to this
new learning environment. Working and studying from home deprived
me of face-to-face communication with my colleagues and students, but
it also offered me a space for self-study and an opportunity to collaborate
with educators and researchers from around the world. The simultan-
eous challenges and benefits reinforced my interconnected identity as a
teacher and researcher.

Chi

I am a doctoral candidate in educational studies at Swinburne University of Technology, Australia. The COVID-19 pandemic struck when I was in
the second year of my candidature, collecting data in my home country
of Vietnam. Due to a travel ban, I was forced to remain there for more
than two years. The analytical post-qualitative inquiry approaches that
I am using for my degree are emerging and challenging ones. Few rele-
vant materials and resources were available online to support me as
I navigated my learning from afar. Furthermore, I did not have one-on-
one guidance from my supervisors to assist me with research-focused
writing or reading. I completed two significant milestones during OL,
both of which were organized via Zoom meetings with my supervisors
and review panel members. My mental health suffered significantly due to
my distance from the university and my need for more intensive support
and supervision to work with data, writing, and revisions.

Xuan

I am a doctoral student at National Chung Cheng University in Chiayi, Taiwan, studying educational leadership. During the COVID-19 pan-
demic, before switching to education, I spent a semester studying eco-
nomics online at another university. The change to OL caused me a lot
of worry, particularly given the ambiguity about what was occurring
and what would happen next.

150 Pham, Tran, Xuan, and Le

My educational experience during that time, especially the assessment of learning, left me disillusioned with the higher education system. The difficulty in connecting with professors and the superficiality of tests made me question the quality of online teaching and learning.

Giang

I am a candidate in a joint doctoral program of educational studies at Brock University, Canada. My studies pertain to gender and sexuality
issues in schooling. When the COVID-19 pandemic exploded, I was not
able to do my coursework in person, leading to a physical and emotional
disconnect from my colleagues, instructors, supervisor, and committee
members. OL was a new and challenging experience; my colleagues in
the program and I felt that real human connection had been taken away
from us.

Methodology

To explore the three concepts of presence in OL, we used autoethnography as a methodological framing. Autoethnography facilitates the process
of narrating and analyzing (graphy) personal experience (auto) within
shared cultural experience (ethno; N. H. G. Le, 2021, p. 217). In doing
autoethnography, researchers retrospectively and selectively narrate
their significant moments and then place themselves in a specific culture
with particular identities to study their experiences (Jones et al., 2016).
Collaborative autoethnography invites different researchers to share their
self-studies and “collaboratively analyze and interpret them for common-
alities and differences” (Hernandez et al., 2017, p. 251).
As graduate students in teacher education programs, researchers, and
participants, we retrospectively collected our “stay-with fragments of
memory” (Blaikie, 2021, p. 36) to narrate our experiences of OL. Our
stories are in the form of vignettes (Humphreys, 2005), for which we
selected and organized events that described our distinctive experiences.
We extracted these events from journals each of us kept as a praxis of
reflexive learning during the pandemic, choosing those that resonated
with us the most in terms of OL challenges and benefits. To analyze and
interpret our narratives as data, we followed a narrative style of writing
(N. H. G. Le, 2021), using storytelling and first-person voice to frame our experiences and consider social presence, cognitive presence, and teaching presence (Garrison et al., 2010) in OL environments.

Narrative Vignettes

Each vignette represents one of the themes that emerged from our OL
experiences during the COVID-19 pandemic: a lack of belonging (Tan),
the value of a virtual “library” (Chi), lack of engagement and integrity in
OL (Xuan), and chaos and disconnections (Giang). In our narratives, we
link to the literature to illustrate how presence was embedded into our
experiences. We include photographs that encapsulate some of our phys-
ical, emotional, and psychological experiences during OL (all are used
with the permission of those included in the images). We invite readers
to immerse themselves in our feelings, thoughts, and courses of action,
and find resonance with them in terms of the OL challenges and benefits
they highlight.

Tan’s Vignette: A Lack of Belonging in the Learning Community

I have participated in an online (post)qualitative methodologies course and an intensive research writing course since the second wave of the
COVID-19 pandemic. It was OL that gave me and my colleagues the
opportunity to take this distance course and to meet my coauthors to
write this chapter. The scenario of exchanging knowledge with my
colleagues was not as I imagined:

“You are now going to discuss how photovoice works as a teaching method,” said my course instructor. He divided the class into three
breakout rooms. I was randomly put into a group to discuss how
photovoice can be used for individual work. Nine people were in
the room, but only two, a female member and me, turned on our
cameras and engaged in the discussion.

Silence often resulted when the class transferred from lecture presentation to collaborative work in a breakout room. I wondered whether what I was sharing was meaningful because almost nobody turned on their camera or responded to anything (see Figure 8.1). This was not the discourse I had expected. The silence disrupted my iterative process of
“moving between the personal worlds and the shared worlds” (Garrison
et al., 2001, p. 10) and reduced my sense of belonging to the group. I did
not learn much in this breakout room (or others like it).
My classmates’ decision not to display their profile pictures or videos,
hindering a sense of classroom belonging, encapsulated a dilemma
(Hirsch & Smith, 2017) in terms of time and identity engendered by demo-
graphic differences. It was not until the instructor joined our breakout room
that a sense of belonging began to form. He reconfigured our identities and
reconnected our shared worlds by asking questions back and forth, encour-
aging group members to share or comment on others’ ideas, summarizing
the discourse, and (re)constructing knowledge. His course of action built
understanding in terms of social presence and was a precondition for self-
reflection in terms of cognitive presence (Garrison et al., 1999).
My experience in these group discussions emphasized to me that
group discussions could be an assessment approach in OL if the mutual
relationship between cognitive and social presence were balanced (by
the instructor). Given that OL retains several drawbacks related to gaps
in content and understanding, group discussion could serve to help
learners exchange information and co-construct new knowledge. In
terms of student assessment, it could illustrate to the instructor how
much knowledge students have acquired and how they have used their understanding to contribute to the (re)construction of new knowledge (Fall et al., 2000). An assessment—not merely in OL—should go beyond evaluating students’ achievement at the end of the course. It should focus on nurturing students’ learning progress and improving the teaching and learning process through regular assessment activities for continuous feedback and pedagogical adaptation (Baleni, 2015).

Figure 8.1 A screenshot of my online class. Three profile pictures have been covered for anonymity.

Chi’s Vignette: The Value of a Virtual “Library”

OL was a challenge for me right from the first meetings with my supervisors,
as I tried to deal with the complexity of online meetings given the unstable
internet connections, calls dropping, and time differences. These negative
aspects amplified my feelings of disconnection and isolation, resulting in
me experiencing a lack of belonging and an uncertain sense of identity.
When my data generation was finished, I submitted applications to return
to campus. Unfortunately, my applications were rejected. My concern was
that the lack of physical study support and lack of social contact with staff
and other students would jeopardize my capacity to finish my program
on time. All supervision and communication activities were redesigned
and conducted online. It was not until three of my colleagues and I had a
group chat one day, coming up with the idea of creating a virtual “library”
on Zoom, that I reestablished my momentum. We held a daily Zoom
meeting that replicated a library’s quiet space. Every time I looked at the
screen and saw my friends working, it inspired me to do the same.
Our virtual library provided not only intellectual support when we
shared academic knowledge, but also significant emotional support
during our difficult and painful research journey. I was deeply moved when one friend burst into tears after learning that her elderly parents had contracted COVID-19 and were being rushed to hospital. Another
friend broke down after learning that her best friend had died of COVID-
19. I was away from the screen for three weeks during my mother’s
hospitalizations for a spike in liver enzymes, followed by the deaths of
two individuals who were close to me. The library added an element of
humanity by connecting us with others and making our suffering visible
to those who could support us (see Figure 8.2).
Figure 8.2 Attendees in the virtual library. We put our hands over our mouths to de-identify ourselves.

In the second year of the pandemic, I reached a critical point in finishing my thesis. As I was new to post-qualitative inquiry approaches, I anticipated that working closely with the supervision staff during in-person meetings, rather than online meetings, would be more beneficial and increase my chances of success. However, the international travel ban made this option
impossible. Recognizing my academic challenges, my supervisor held
monthly online individual progress meetings that provided a safe space
for me to explore my feelings of insecurity and manage my academic
journey. In addition, my supervisor chaired monthly “silence and write
sessions” and “peer reading sessions,” where she encouraged her doctoral
student cohort to meet and connected us to external reading groups, net-
work activities, podcasts, and conferences related to our fields of study.
Using regular online connection, she created a community of robust
scholarly engagement that fostered new approaches, perspectives, skills,
knowledge, and meaning-making in online workshops and meetings. As
a result, I had an individual meeting, group reading or writing session,
or other academic forums weekly. These sessions were useful sources of
development of cognitive presence (Gibson et al., 2012).
This OL experience was an incredible form of intellectual and cognitive support that offered participants a high level of course engagement (Majeski et al., 2018). During these sessions, my supervisor served as both a mentor and a critical friend, managing her teaching roles and our cognitive presence (Garrison, 2007). I am aware that some of my doctoral
colleagues have had negative outcomes from virtual learning. Among
them are two of my friends. I am the only one in our group of three
who completed my thesis on time. One had to request a three-month
extension; the other returned to her home country and was unable to
finish after six years of full-time study. In terms of assessment, my satis-
factory progress indicates that success is dependent on online mentorship
as well as positive peer relationships and collaborative learning (Homer,
2022). Other students may require the combination of all three types of
presence in order to achieve positive learning outcomes. I believe that the
degree to which OL is positive and meaningful is inextricably linked to
individuals’ level of engagement with those they care about.

Xuan’s Vignette: Lack of Engagement and Integrity in OL

OL had negative impacts on me, despite the fact that it initially looked advantageous. Yes, it allows learners to learn from anywhere, yet I think that OL has caused students to become disengaged from the classroom
(Kuo et al., 2014). I noticed that the hugs and handshakes that normally
occur when meeting others did not take place; instead, I was confronted
with black squares on the computer screen that made my emotions grad-
ually turn negative, leading to loss of motivation. Although my class was
multicultural, no activities facilitated any cultural interactions. The class was merely a monologue presentation by the lecturer. I found that I was
missing out on important elements of building a social presence in an OL
environment because my classmates and I did not have the opportunity
to express our feelings to one another (Garrison, 2009).
I also concluded that the way in which assessment was conducted in
my class did not accurately gauge student learning. I decided to drop out of my doctoral program in economics because my teacher used free
material on the internet to make questions for the test. I was shocked: I did
not think that a professor would test graduate students with questions
sourced from Quizlet (https://quizlet.com/). Although I consulted with
other instructors and other students in the class, no one seemed to care
about the matter. This situation soured my opinion about the integrity
and quality of online teaching and learning during the pandemic.
According to Garrison (2009), to create and maintain a demanding community, teaching presence is essential, and instructional planning must be undertaken to ensure that a welcoming environment is established. As well, activities that involve students in group projects and reflective
exercises with defined goals must be designed. Teachers in my department
were not prepared to transition to OL, evidenced by the lack of discussions
and group activities and the fact that test questions were sourced from
Quizlet. Graham et al. (2018) found that using multiple-choice quizzes,
such as those found on Quizlet, may not accurately assess students’ know-
ledge or skills in online courses. According to their findings, more complex
and authentic assessments, such as case studies or group projects, may be
better suited for evaluating student learning in online courses.
For Garrison (2009), teaching presence and instructional preparation
are crucial to create a challenging community and encourage cognitive
presence, which are necessary for effective OL and evaluation. Cognitive presence, hence, is difficult to achieve without teaching presence. Without proper
communication and formative input from teachers, students will not gain
skills relevant to their careers (Lara et al., 2020). My course lacked this
proper communication and formative input from teachers. In terms of
assessment, switching to OL without planning or preparation resulted in
mismatched strategies and overused multiple-choice exams that did not
effectively evaluate learning (Hodges et al., 2020). Gikandi et al. (2011)
also noted the misalignment of assessment strategies in OL, finding that
many online courses used summative assessment, such as final exams,
rather than formative assessment, which may have provided more feed-
back and interaction. Formative assessment, feedback, and interaction,
which are essential to good assessment processes, may be limited without
group activities and discussions (Hrastinski & Dennen, 2020). Appropriate
planning and preparation for OL and assessment are essential to maintain
the quality of student learning (Garrison, 2011).

Giang’s Vignette: Chaos and Disconnections

My vignette is framed by my experience taking Doctoral Seminar 2, a key course in my joint PhD program in Educational Studies in Canada. In
the summer of 2020, amid a COVID-19 surge that had significant global
impacts on education (Nguyen & Le, 2021; Tran et al., 2022), my sem-
inar moved to Zoom. It caused tremendous disappointment and created
many difficulties for students and instructors in balancing their study,
work, and family obligations. One of my classmates was a single mother
based in Vancouver. Given the time difference, she spent two months engaging in all course activities (e.g., lectures, group work, presentations, and discussions) starting at 5:00 am.
When I saw people on my computer screen—people I knew well
and had waited a year to reconnect with—I did not feel any emotional
connection. I missed hugs, shoulder pats, cuddles, and smiles. According
to Garrison (2009), affective expressions (e.g., sad, frustrated, happy,
joyful, excited, and satisfied) are key to the construction of social presence
in an OL environment, where students can share personal experiences of
emotions and feelings. For this online course, affective expressions were
missing from the main lecture sessions, which were exhausting, hour-
long lectures. Shorter meetings consisting of small group discussions
about the course assignments would have been better. Figure 8.3 captures
a moment of a delightful and relaxing discussion I had with my cohort
after a long, tiring lecture.

Figure 8.3 Virtual presence.

OL was a particular challenge for students who were also parents. To engage in the course, some of my classmates had to hide from their children, who were staying home with them due to school closures across
Canada. Others found it difficult to commit (Garrison, 2009) to learning
activities due to family obligations. I felt sorry for them, but their disen-
gagement affected me, and I had to accommodate their unstable schedules.
Late night meetings occurred regularly as that was the only time they
were free from their children. The social presence was disrupted as the
class struggled to sustain our commitment to learning and interacting
with one another.
Without social presence, cognitive presence is less possible in any edu-
cational context, including OL (Garrison, 2009; Garrison et al., 2010).
Garrison (2009) indicated that cognitive presence cannot be achieved
without socialization in learning and teaching, including teacher-
student and student-student interactions, and teachers’ facilitations
as teaching presence. In the online course, given that my classmates
and I could not concentrate on our collaborative learning and that
we received little support from instructors (online teaching was a
new experience for them too), we did not engage well in course activ-
ities that involved inquiry and exploration in learning. Quarantines,
distractions, and the online format challenged us to stay focused on our
learning. The course syllabus was provided in advance and included fre-
quent breaks, instructors offered individual meetings to identify issues,
and a few assignments and activities were removed. Instructors were
listening to students and trying to show their presence in this online
classroom. However, their efforts did not do much to reduce anxiety or
stress, and from my perspective, the removed assignments would have
helped students to evaluate their own learning outcomes and progress.
Instructor-student dialogue is crucial for student engagement; collab-
orative OL course development would help to ensure courses are built
and refined to meet desired learning outcomes.

Implications and Conclusion

Our experiences as graduate students enrolled in education programs in international contexts have given us a unique perspective on OL during
the pandemic. Our vignettes illustrate benefits and drawbacks of OL, and
we have used our learning experiences to illuminate the social, cognitive, and teaching elements of the CoI framework (Anderson et al., 2001; Garrison, 2007; Garrison et al., 1999). On the positive side, OL opened a new environment to continue formal education during the pandemic.
It was a safe space for live lectures, group discussions, and assessments,
and—mediated by an internet connection—it allowed for international
partnership and collaboration. In Tan and Chi’s stories, the presence of
social interaction and emotional connection in their courses reduced
their anxiety and promoted their cognitive skills.
On the negative side, our stories indicate difficulties in creating an
effective OL environment for teachers and students. All four stories high-
light that OL affected our identities and disconnected us from our peers
and teachers. Our OL did not provide social presence, which resulted
in cognitive deficits; this outcome was evident in Xuan and Giang’s lack
of motivation and exhaustion as their affective expressions and physical
interactions were blocked. The absence of the teaching element in the OL
environment also led to cognitive deficiency. Chi encountered a lack of
academic, professional, and technical support that initially influenced her
task completion. Xuan’s instructors failed to create a space that allowed
for social interaction and did not prepare well for changes in curriculum
and assessment approaches. In contrast, when supported by the instructor in the group discussion, Tan and his classmates became active and conducted meaningful conversations. Together, these experiences illustrate how cognitive, social, and teaching presence are essential in framing an effective OL pedagogical approach (Garrison, 2009).
From our experiences, we suggest more effective pedagogical
approaches from teachers to enhance social, cognitive, and teaching
presence in virtual environments. Approaches to enhance social presence
could encompass dialogical and interactive activities (e.g., open and
inclusive discussions) in which students would feel engaged and have
their learning concerns heard. Reciprocal listening circles could facilitate cognitive presence through teaching presence, allowing both teachers and students to share their thoughts and cocreate knowledge
(G. N. H. Le et al., 2021). In terms of promoting teaching presence in OL,
teachers should receive more technological and academic support from
institutions to improve their preparation for online teaching. Giang and
Xuan’s stories suggest that if their instructors had received support from
their institutions earlier, they would have had more time to gather better
learning materials and formulate better assessment approaches.
We conclude by presenting recommendations for better assessment
practices. First, assessment (not merely in OL) should focus on nurturing students’ learning progress. Rather than summative assessment, regular formative assessment based on selected learning activities allows for continual feedback and pedagogical adaptation. Second, short
meetings about course assignments should be offered to eliminate any
misunderstandings about the requirements or marking scheme. Third,
adopting group discussions as formative assessment (such as in Tan’s
story) could be effective: They provide learners with a space for social
interaction and cognitive development while also demonstrating their
level of understanding and progress to the instructor. Finally, we believe
that all institutions should invest more resources in technology-enabled
teaching and learning. Those efforts should include more information on
modalities of pedagogy in terms of teaching practices, curriculum devel-
opment, and how assessment frameworks are organized.

References

Ali, W. (2020). Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. Higher Education Studies, 10(3),
16–25. https://doi.org/10.5539/hes.v10n3p16
Anderson, T., Liam, R., Garrison, D. R., & Archer, W. (2001). Assessing teaching
presence in a computer conferencing context. Online Learning, 5(2). https://
doi.org/10.24059/olj.v5i2.1875
Appana, S. (2008). A review of benefits and limitations of online learning in the
context of the student, the instructor and the tenured faculty. International
Journal on E-learning, 7(1), 5–22.
Baleni, Z. G. (2015). Online formative assessment in higher education: Its pros
and cons. Electronic Journal of e-Learning, 13(4), 228–236. https://files.eric.
ed.gov/fulltext/EJ1062122.pdf
Bains, M., Goei, K., & Kaliski, D. (2021). Online learning and engagement in the
foundational sciences during the COVID‐19 era: Perceptions and experiences
of graduate students. The FASEB Journal, 35, 21–40. https://doi.org/10.1096/
fasebj.2021.35.S1.03132
Blaikie, F. (2021). Worlding youth: Visual and narrative vignettes embodying
being, becoming, and belonging. In Visual and cultural identity constructs of
global youth and young adults (pp. 36–61). Routledge.
Chang, H., Ngunjiri, F. W., & Hernandez, K. C. (2013). Developing qualitative
inquiry: Collaborative autoethnography. Left Coast Press.
Clark, R. C., & Mayer, R. E. (2016). E-Learning and the science of instruction: Proven
guidelines for consumers and designers of multimedia learning (4th ed.). Wiley.
Fall, R., Webb, N. M., & Chudowsky, N. (2000). Group discussion and large-
scale language arts assessment: Effects on students’ comprehension. American
Educational Research Journal, 37(4), 911–941. https://doi.org/10.2307/1163497
Firang, D., & Mensah, J. (2022). Exploring the effects of the COVID-19 pandemic
on international students and universities in Canada. Journal of International
Students, 12(1), 1–18. https://doi.org/10.32674/jis.v12i1.2881
Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive,
and teaching presence issues. Journal of Asynchronous Learning Networks, 11(1),
61–72. www.learntechlib.org/p/104064/
Garrison, D. R. (2009). Communities of inquiry in online learning. In Encyclopedia
of distance learning (2nd ed., pp. 352–355). IGI Global.
Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and
practice. Routledge.
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-
based environment: Computer conferencing in higher education. The
Internet and Higher Education, 2(2–3), 87–105. https://doi.org/10.1016/
s1096-7516(00)00016-6
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive
presence, and computer conferencing in distance education. American Journal of
Distance Education, 15(1), 7–23. https://doi.org/10.1080/08923640109527071
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the
community of inquiry framework: A retrospective. The Internet and Higher
Education, 13(1–2), 5–9. https://doi.org/10.1016/j.iheduc.2009.10.003
Gibson, A. M., Ice, P., Mitchell, R., & Kupczynski, L. (2012). An inquiry into
relationships between demographic factors and teaching, social, and cogni-
tive presence. Internet Learning, 1(1), 7–17. https://doi.org/10.18278/il.1.1.2
Gikandi, J., Morrow, D. A., & Davis, N. F. (2011). Online formative assessment
in higher education: A review of the literature. Computers & Education, 57(4),
2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
Graduate Student Association. (2020, July). Submission to University of Melbourne’s
Semester 1 2020 assessment review. University of Melbourne. https://gsa.
unimelb.edu.au/wp-content/uploads/2020/07/Online-Assessment-
Submission-FINAL-copy.pdf
Graham, C. R., Woodfield, W., & Harrison, J. B. (2018). A framework for insti-
tutional adoption and implementation of blended learning in higher educa-
tion. The Internet and Higher Education, 18, 4–14. https://doi.org/10.1016/j.
iheduc.2012.09.003
Hernandez, K. A. C., Chang, H., & Ngunjiri, F. W. (2017). Collaborative
autoethnography as multivocal, relational, and democratic research:
Opportunities, challenges, and aspirations. a/b: Auto/Biography Studies, 32(2), 251–254. https://doi.org/10.1080/08989575.2017.1288892
Hirsch, S., & Smith, A. (2017). A view through a window: Social relations,
material objects and locality. The Sociological Review, 66(1), 224–240. https://
doi.org/10.1177/0038026117724068
Hodges, C. B., Moore, S. A., Lockee, B. B., Trust, T., & Bond, M. (2020). The difference between emergency remote teaching and online learning. EDUCAUSE Review. https://vtechworks.lib.vt.edu/handle/10919/104648
Homer, D. (2022). Mature students’ experience: A community of inquiry study
during a COVID-19 pandemic. Journal of Adult and Continuing Education.
https://doi.org/10.1177/14779714221096175
Hrastinski, S., & Dennen, V. P. (2020). Instructional planning for online teaching:
A learning design approach. Routledge.
Humphreys, M. (2005). Getting personal: Reflexivity and autoethnographic
vignettes. Qualitative Inquiry, 11(6), 840–860. https://doi.org/10.1177/
1077800404269425
Jones, S. H., Adams, T. E., & Ellis, C. (Eds.). (2016). Handbook of autoethnography.
Routledge.
Kim, G. C., & Gurvitch, R. (2020). Online education research adopting the com-
munity of inquiry framework: A systematic review. Quest, 72(4), 395–409.
https://doi.org/10.1080/00336297.2020.1761843
Kuo, Y. C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2014). Interaction,
internet self-efficacy, and self-regulated learning as predictors of student sat-
isfaction in online education courses. The Internet and Higher Education, 20,
35–50. https://doi.org/10.1016/j.iheduc.2013.10.001
Lara, J. A., Aljawarneh, S., & Pamplona, S. (2020). Special issue on the current
trends in E-learning assessment. Journal of Computing in Higher Education,
32(1), 1–8. https://doi.org/10.1007/s12528-019-09235-w
Le, G. N. H., Tran, V., & Le, T. T. (2021). Combining photography and
duoethnography for creating a trioethnography approach to reflect upon edu-
cational issues amidst the COVID-19 global pandemic. International Journal of
Qualitative Methods, 20. https://doi.org/10.1177/16094069211031127
Le, G. N. H. (2021). Living a queer life in Vietnam. In F. Blaikie (Ed.), Visual
and cultural identity constructs of global youth and young adults (pp. 213–227).
Routledge.
Lee, V. W. Y., Lam, P. L. C., Lo, J. T. S., Lee, J. L. F., & Li, J. T. S. (2022). Rethinking
online assessment from university students’ perspective in COVID-19 pan-
demic. Cogent Education, 9(1), Article 2082079. https://doi.org/10.1080/233
1186X.2022.2082079
Majeski, R. A., Stover, M., & Valais, T. (2018). The community of inquiry and
emotional presence. Adult Learning, 29(2), 53–61. https://doi.org/10.1177/
1045159518758696
Nguyen, T. M., & Le, G. N. H. (2021). The influence of COVID-19 stress on psy-
chological well-being among Vietnamese adults: The role of self-compassion
and gratitude. Traumatology, 27(1), 86–97. https://doi.org/10.1037/
trm0000295
Omar, H. A., Ali, E. M., & Belbase, S. (2021). Graduate students’ experience and
academic achievements with online learning during COVID-19 pandemic.
Sustainability, 13(23), Article 13055. https://doi.org/10.3390/su132313055
Sahu, P. (2020). Closure of universities due to coronavirus disease 2019 (COVID-
19): Impact on education and mental health of students and academic staff.
Cureus, 12(4).
Tran, V., Le, G. N. H., & Thuy, T. L. (2022). Impacts of international education
shifts through transnational stories of three Vietnamese doctoral students. In
A. W. Wiseman (Ed.), Annual review of comparative and international education
2021, Vol 42A (pp. 93–105). Emerald Publishing House.
Xie, X., Siau, K., & Nah, F. F. H. (2020). COVID-19 pandemic—online education
in the new normal and the next normal. Journal of Information Technology Case
and Application Research, 22(3), 175–187. https://doi.org/10.1080/15228053.2
020.1824884
Yang, Y., & Cornelius, L. F. (2004). Students’ perceptions towards the quality of online
education: A qualitative approach. Association for Educational Communications
and Technology.
Yee, E., Jung, C., Cheriberi, D., Choi, M., & Park, W. (2022). Impacts of
transitioning to an online curriculum at a graduate school in South Korea due
to the COVID-19 pandemic. International Journal of Environmental Research and
Public Health, 19, Article 10847. https://doi.org/10.3390/ijerph191710847
9 The Unintended Influence of COVID-19
Optimizing Student Learning by Advancing Assessment Practices through Technology

Katrina Carbone, Michelle Searle, and Saad Chahine

DOI: 10.4324/9781003347972-12

Classroom assessment is foundational to the current standards-based education system (DeLuca et al., 2016; Popham, 2013). Despite the continuous emphasis placed on classroom assessment (McMillan, 2013; Turner, 2012), assessment education is an overlooked component of teacher education programs (DeLuca et al., 2013; Popham, 2011). Nevertheless, teachers are still expected to be assessment literate, meaning they can understand significant concepts of classroom assessment (DeLuca et al., 2019; Popham, 2013). Stiggins (1991) defined assessment literacy as having the knowledge and understanding of, and ability to apply, foundational assessment principles and practices to support and report student learning. Assessment literacy helps teachers motivate student learning and informs instructional strategies to enhance teacher effectiveness, thus improving the learning environment for all students (Black & Wiliam, 2018). Despite the continued growth of assessment research, teachers—especially beginning teachers who lack confidence in this area—are largely unprepared to successfully integrate assessment practices into the classroom (Brookhart, 2004; DeLuca & Bellara, 2013). Teacher preparation was further complicated
by the COVID-19 pandemic, which redirected attention to practical and
pedagogical issues with a focus on technology integration and its role
in supporting assessment (Carrillo & Flores, 2020; König et al., 2020).
In alignment with efforts to increase assessment education (DeLuca
et al., 2010, 2021), in this chapter we examine a course design oriented
towards assessment literacy, with technology interwoven to adjust to
pandemic circumstances.
Due to the pandemic, we, the instructional team, and authors of
this chapter, were prompted to (re)conceptualize how foundational
assessment learning and skill development for preservice teachers
could be implemented in an online environment. We sought to trans-
form course delivery to provide meaningful opportunities for preservice
teachers to develop assessment literacy by using and integrating available
technology. To explore how technology was integrated into the course,
the technological pedagogical content knowledge (TPACK) framework
was infused with assessment literacy constructs to develop a conceptual
heuristic (see Figure 9.1).
The TPACK framework identifies three types of knowledge needed
for the successful integration of technology (Koehler et al., 2013):
The intersection of course content (what is being taught), instructor
pedagogy (how content is delivered), and technological knowledge
(using technology) forms TPACK (Mishra & Koehler, 2006). Since its
original publication in 2006, TPACK has become one of the leading
theories for educational technology integration in research and prac-
tice (Graham, 2011).
Preservice teacher education programs typically include a single
course on technology integration (Parra et al., 2019). However, that
course focuses on technological skills rather than the intersection of
pedagogy and content knowledge (Chai et al., 2010; Mishra et al., 2009).
This study explored how technology was integrated within assessment
instruction to model and activate assessment literacy while developing
preservice teachers’ confidence in integrating assessment and technology.
Specifically, we asked, in what ways can the TPACK framework provide
insight into the role of assessment and technology in a post-pandemic
educational landscape? How does a preservice teacher course focused on
assessment and evaluation enable learning about the role of assessment
and technology while also exploring ways to integrate them to optimize
learning in preservice and K–12 contexts?

Figure 9.1 Integrating the TPACK framework with assessment literacy constructs. T = technology; P = pedagogy; C = content; PCK = pedagogical content knowledge; TCK = technological content knowledge; TPACK = technological pedagogical content knowledge framework; TPK = technological pedagogical knowledge. Adapted from “Using the TPACK Image,” by M. J. Koehler, 2011 (http://matt-koehler.com/tpack2/using-the-tpack-image/). Copyright 2011 by tpack.org. Reproduced with permission. Also adapted from “Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge,” by P. Mishra and M. J. Koehler, 2006, Teachers College Record, 108(6), pp. 1026–1029 (https://doi.org/10.1111/j.1467-9620.2006.00684.x). Copyright 2006 by Columbia University.

Research Context

This study was conducted in Ontario where preservice teachers are
prepared for the profession through 16 months of coursework and
placements, which because of the pandemic, took place both in person
and online. Courses include foundational courses, professional studies,
and curriculum and concentration courses. Practice opportunities
include four classroom-based placements and one alternative placement
where preservice teachers gain experience beyond the traditional class-
room. Within the program, assessment education is offered through a
blended approach consisting of explicit instruction through a mandatory
three-week foundational course and integration with other courses at
each instructor’s discretion. Preservice teachers enrol in one concentra-
tion course on a topic of interest. The assessment and evaluation concen-
tration (AEC), the focus of this study, offers direct instruction promoting
assessment literacy. The AEC includes two graded courses related to the
theoretical positioning of assessment and evaluation through the prac-
tical application of theory, knowledge, and skills.
Prior to the pandemic, preservice teacher education took place entirely
in person. The health crisis resulted in a swift change to an online-only
format, and then, during this study, we transitioned into a blended com-
bination of in-person and online delivery. We embrace the blended format,
which can optimize learning by intentionally pairing content with peda-
gogical approaches that strategically use technology to complement and
reinforce learning (Garrison & Vaughan, 2008; Linder, 2017). We concur
with Namyssova et al. (2019) that blended learning is an opportunity to inte-
grate instruction and learning using synchronous and asynchronous tools.

Methodology

This qualitative study used a case study design. Yin (2018) defined a case
study as a comprehensive empirical inquiry that is positioned in context
to construct aggregate understandings of a phenomenon. The study
involved preservice teachers enrolled in the AEC. By studying this specific
group within a specific course, we aimed to facilitate a rich description
and understanding of how assessment and technology can support the
growth and development of preservice teachers.

Participants

Data were collected from one AEC cohort of preservice teachers (N = 16)
in Ontario. This cohort included primary/junior (Grades K–6) and inter-
mediate/senior (Grades 7–12) preservice teachers. Thirteen of the 16
preservice teachers enrolled in the AEC consented to their data being
included. Participation in the research was voluntary, and participants
were free to withdraw any data related to their course assignments or
activities. As we were the AEC instructors, participants were a sample of
convenience, given that they were enrolled in the course, but it was also
a purposeful sample, given the intended area of study.

Data Collection

Data were gathered from course activities, assignments, and three focus
groups held at two time points. Although the course syllabi were not
a direct data source, they informed the description of assignments and
course topics. Data collection took place both in person and online to
mirror the course delivery format. As well as consenting to participate in
the focus groups, participants also consented to the use of their course
assignments and activities for research.

Course Activities and Assignments

The preservice teachers in the sample were expected to complete all
course activities and assignments as usual. Figure 9.2 provides an over-
view of the eight assignments that all participants completed during
the course. The assignments were numbered (A1, A2, A3, etc.), and
these numbers are used to refer to these assignments throughout this
chapter.
Throughout the course, technology was used to model the way pre-
service teachers could integrate technology in the classroom. Examples
include Nearpod (a platform that features interactive slide-based lessons;
https://nearpod.com), Plickers (an audience response system that uses
QR codes; https://get.plickers.com), Kahoot (a game-based learning plat-
form; https://kahoot.com), and Poll Everywhere (an audience response
system that uses personal devices; www.polleverywhere.com).

Figure 9.2 AEC course assignments. A = assignment; AEC = assessment and evaluation concentration.

Focus Groups

Participants engaged in reflective check-ins in small groups at two
time points to reflect on their experiences with assessment and tech-
nology in both the AEC and during their placements. The information
provided insights related to developing assessment literacy and the role
of assessment and technology, and it gave us feedback to inform future
course iterations.

Data Analysis

Qualitative data were extracted from the course site and deidentified
before data analysis using a two-phase process. The initial analysis began
with us precoding course activities (e.g., Nearpod reports) and course
assignments the participants had agreed to share. Precoding involved
circling and highlighting significant passages worthy of attention
(Layder, 1998; Saldaña, 2016). The course activities and assignments
were developed to encourage personal and professional growth, and thus
included reflections on the learning process, technology integration, and
course material. We then engaged in open coding of the verbatim focus
group transcripts to identify and discuss emerging themes. As part of
the analytic process, we carefully reviewed and examined the alignment
between the focus group conversations and individual reflections
participants documented in their course activities and assignments.
The initial insights gained from precoding were compared with
the codes elicited from the open coding to highlight similarities, com-
bine codes, or generate new codes. From this process, a code book
was developed, and all data were deductively coded. At multiple stages
throughout the analysis process, we held regular working sessions to
support reflexivity and meaning-making. We used thematic data analysis
to identify and uncover patterns to reveal a deeper understanding of the
possibilities for assessment and technology to optimize student learning
(Braun & Clarke, 2006; Patton, 2014).

Findings

The data extracted from the course activities and assignments were cross-
referenced with the focus group transcripts to provide insight into how
the AEC enabled learning about classroom assessment and the use of
technology. Identifying alignment between the data sources also provided
an opportunity to examine participants’ growth in knowledge and skills
needed for assessment literacy. The TPACK framework (Mishra & Koehler,
2006) was used to identify areas where technology had been integrated and
provided a lens for considering aspects of participants’ contemporaneous
experiences and future possibilities for assessment and technology. Four
overarching themes were found: infinite assessment possibilities, limited
access to resources, increased time commitment, and the learning curve.
The themes reflect preservice teachers’ perspectives about assessment edu-
cation and technology. In the discussion section, we examine our efforts to
develop assessment literacy in relation to the TPACK model.

Infinite Assessment Possibilities

Assessment literacy is a core concept in the AEC that is operationalized
and linked with assessment theory and policy. Then, theory and practice
are integrated when working through the major purposes of assessment
(for, of, and as learning). In the course, participants are encouraged to
think about the arc of assessment as spanning from the moment they meet students through their continuous formative and summative
assessments. Assessment and technology were explored in broad ways
and within their subject areas through instructional examples, course
assignments, and activities.
The use of technology was an expectation in the AEC, not just in the
delivery but also in the linking of activities, assignments, and practice.
One preservice teacher described it this way:

Being in this course in general assessment, I found I went into my practicum more confident about assessment [and] evaluation compared to peers who haven’t been in [the AEC]. . . . Because of this course,
and my broader understanding of assessment and evaluation, I was
able to incorporate lessons much easier with tech[nology]. I found
this course has opened up my eyes to tech[nology].

Across the data, participants expressed an increased willingness during
the course to explore multiple forms of assessment, including diagnostic,
formative, and summative, and a range of modes, including technology.
Another perspective emerged as participants described the possibil-
ities of technology and assessment, and those of teaching more gen-
erally, as immense and ever evolving. Some participants indicated that
their increased learning about teaching, assessment, and technology left
them feeling overwhelmed with the possibilities that technology enables.
One noted that it can be difficult to select the best option for assessment
because they are “constantly discovering better and better tools and
[are] definitely eager to learn more about them.” We worked to culti-
vate an appetite for inquiry and ongoing learning while also emphasizing
planning and assessment strategies that could be used to minimize the
prodigious number of options. Strategies included coplanning with cur-
riculum expectations in mind, seeking advice on useful applications from
mentors, setting time limits when scouring the internet for resources, and
co-constructing success criteria with students. Preservice teachers were
often reminded that the purpose of assessment is to help them achieve
their learning goals.
Participants noted that when selecting tools for assessing students, con-
text is important, such as the preservice teachers’ future subject matter or
division. One preservice teacher noted that the tools the instructors “gave
us weren’t very specific to each of our teachables.” Another asked, in
describing their confidence with technology, “how it can be applied to our
classroom.” In relation to assignment A1, participants sourced a range of
assessment tools, many of which use technology. These included Plickers,
Mindomo (www.mindomo.com), Socrates (www.socrates-software.
com), and Pear Deck (www.peardeck.com), among others. Although we
recognize that applications are used differently in different contexts, iden-
tifying these specific tools made participants’ learning about assessment
and technology more tangible and offered a starting place for them to
think about how to integrate technology and assessment.
Participants also noted that assessment-related technology could be
interchangeable. One preservice teacher described technology as “more
diverse” because there is “no longer such a simple platform of delivery . . .
but it’s more assistive of certain ideal, expected results.” Although
participants noted that context is important, the flexible possibilities
for assessment and technology were seen as essential to optimizing stu-
dent learning. One preservice teacher shared a common refrain: “Your
assessment should always have a purpose, and the technology you use
should align with that purpose.” The data revealed that participants were
developing a more sophisticated understanding of assessment literacy
and of the possibilities of technology and student learning.
Many participants increased their knowledge about assessment
opportunities and became more skilled at identifying how to build their
assessment and technology repertoire while developing their professional
practice. Preservice teachers recognized that assessment purpose is a
foundational consideration, as is making choices about using technology
to achieve it.

Limited Access to Resources

Participants frequently noted limited access to resources as a challenge
because not all students in K–12 environments have or can acquire
personal devices, and even those with devices experience issues with
internet connections during school or online learning from home.
Participants noted that in their placements, it was rare for all students
to have access to a device; usually the preservice teachers had to plan far
in advance to secure devices if they wanted to incorporate any type of
technology into their assessment or instructional practices. For example,
one preservice teacher noted that “the school that I’m at, it’s big; we have
1,600 students, and so there’s only a certain amount of Chromebooks. . . .
You’d have to book all of them.” Another preservice teacher expressed
similar experiences:

In my past practicum there was probably about half of the amount
of [devices] required that we needed, so it was always an ordeal to
go to other classrooms to see if we could borrow them. . . . It always
took a great amount of planning prior, to ensure that we could have
the amount of tech that we needed.

The preservice teachers offered both elementary and secondary
perspectives at a time of instability when classes were sometimes
occurring in person and sometimes online as required by public health.
Teacher preferences, departments, and school contexts, as well as district
and ministry guidelines and policies, play a role in the availability of tech-
nology for teaching or assessment.
Although many preservice teachers reported limited access, a select
few were within school districts that supported funding for technology
and provided access for all students and teachers. For example, one par-
ticipant explained that at the school they were placed at, “all students
were expected to have access to a laptop or tablet.” Although this universal
access to technology was useful for some preservice teachers, for others it
was troublesome because, given that they were not employees, they did not
have the same access as their students did, and thus could not access student
work. A preservice teacher noted their frustration when teaching online:

All the kids had Chromebooks. . . . But because I don’t have a
Chromebook, I couldn’t do some of the quizzes, or I couldn’t see
them because the school had a setting that you can only access
[coursework] from a Chromebook. . . . That way students couldn’t
cheat, but because that happened, I didn’t have access to some of
their tests.

Chromebooks were mentioned by many preservice teachers as a form
of technology available in schools. Many preservice teachers experienced
and overcame challenges with gaining access to district email systems,
identifying district-supported subject software, using online booking
systems, gaining access to grading software, and learning other
systems that support teaching and assessment and technology. Even with
access to technology, preservice teachers noted that their desire to use it
required a flexible approach to teaching. As one participant explained,

You have to be prepared for things to not go as planned necessarily. Believe me, the Wi-Fi just won’t work that day because that
happens, sometimes, so I would say, technology can add a lot, but
have a backup plan.

Another preservice teacher stated, “You have to be versatile enough that
if it doesn’t work, there is a backup plan. Yes, you’ve tested it and looked
into it, but if it doesn’t work, you have to start talking because instruc-
tional time is precious.” However, other participants noted that developing
assessment and technology backup plans requires time and expertise.
Given their desire to maximize their opportunities to teach, participants
found they did not have additional time to develop them. Thus, the next
theme is the increased time commitment that implementing assessment
and technology often entailed.

Increased Time Commitment

Within the AEC course, interactive presentation tools (e.g., Nearpod, Poll
Everywhere, Pear Deck) were used to bolster student engagement and
model the use of technology with multifaceted options as part of peda-
gogy. One preservice teacher saw the use of Nearpod in their classroom
as effective for teaching but felt they “couldn’t do any more than one
lesson, because it just took so long to prepare,” and they were responsible
for three classes each day. After all, preservice teachers told us, “Teachers
are busy,” and instructional time is precious. Participants found that an
increased time commitment is required to plan for and then effectively
use technology for teaching and assessment in K–12 classrooms.
In addition to preparation time, a major hurdle in classroom work or
assessment sessions is the extra time students needed to set up or log
in. For instance, one preservice teacher shared their experience of the
amount of time needed for students to type in “a password that was lit-
erally on the board.” Despite the password being in plain sight, students
“still like to raise their hands to get help, . . . [and technology] uses a
lot of time.” The increased time commitment varied with class size,
students having their own devices, and school access to shared devices.
Where technology can streamline submission and feedback, learners’
technology-related skills and challenges navigating different platforms
may require additional instruction and planning to avoid limiting oppor-
tunities for learning and feedback. Although time was frequently cited as
a challenge, the data from the AEC course activities and assignments and
the focus groups showed that participants would persevere to gain experi-
ence and grow their confidence as professionals who are committed to
using assessment and technology to optimize student learning.

The Learning Curve

When initially learning about assessment and technology in the AEC
course, participants described feeling “anxious and panicked.” In fact,
one participant explained that at the outset of the class, when reviewing
the syllabi, they “thought [they were] definitely going to fail” because
of the scope of new knowledge and content. Nevertheless, they were
surprised “to find out that [assessment and technology] can be extremely
fun, handy, and interesting.” Participants said they found the course had
“opened up” their understanding of the possibilities that exist when peda-
gogy and technology are intentionally and skillfully combined.
When reflecting on the AEC’s blended approach, the participants
spoke about the necessity and value of having time individually and in
groups, both during class time and beyond it, to process thinking and
learning. One preservice teacher explained, “I really liked the [blended]
classroom model.” Another noted the combination of asynchronous and
synchronous learning was very “accessible to the [pandemic] situation.”
An advantage of the multiple modalities (e.g., face-to-face, instructor-
led, online asynchronous components) of blended learning was the
opportunity to access and extend learning beyond the in-person experi-
ence and to engage with peers to navigate challenges or share ideas in an
online space.
When on placements, many participants described implementing tech-
nology and assessment learning in K–12 classrooms. Examples included
using Kahoot to conduct diagnostic assessments, Nearpod to integrate
teaching with assessment, and Google Classroom (https://classroom.
google.com) to promote formative feedback. Unfortunately, our data
show that not all participants had equal opportunities to experiment with
assessment and technology during their placements due to the need to
negotiate new relationships and programmatic requirements alongside
curricular expectations, and because of the resistance they encountered.
They cited associate teachers, department cultures, and school policies
as the main sources of resistance. In some instances, associate teachers
were resistant to allowing technology to be used in the classroom at all,
let alone in connection with assessment tasks.

Discussion

The study findings provide a glimpse into the ways in which preservice
teachers think about assessment education, technology, and assessment
practices (e.g., assessment design, co-constructing learning goals, test
design, scoring) using technology. Given the dynamic nature of learning,
this discussion explores the complexity of developing assessment literacy
and how the many intersections offered through the TPACK framework
(Mishra & Koehler, 2006) align with the AEC course.
Participants enrolled in the course because they had an interest in
the topics or had goals related to eventual educational leadership. Most
course participants expressed uncertainty around assessment and a desire
to feel confident using data to make decisions about instruction and
grading. Our experiences and research show a lack of teacher confidence
about assessment education and also provide evidence of how central it
is to education because a significant amount of time is spent in activities
related to assessment and evaluation (e.g., Black & Wiliam, 2018; Looney
et al., 2018).
As instructors who have codeveloped and taught the AEC course
over multiple years, we often simultaneously conduct inquiry to con-
tinually improve our teaching and learning related to assessment (e.g.,
DeLuca et al., 2021; Searle et al., 2021). Although research into classroom
assessment (e.g., McMillan, 2013; Turner, 2012) provides valuable insight,
assessment literacy remains the core goal of the course. When it comes
to promoting assessment literacy, although the placement assignments
were not our responsibility as the AEC instructors, the different options
(e.g., in-person, remote synchronous, asynchronous modules) proved
to be a barrier to preservice teachers learning to apply assessment and
technology ideas. Course participants expressed regret in not having
in-person K–12 placement opportunities due to the COVID-19 pandemic.
Often, preservice teachers were curious about how virtual placements
would be applicable to in-person classrooms and whether these modified
placements might influence their professional learning or future class-
room practices.
Data confirmed that preservice teachers require placements in
authentic teaching and learning environments to become acquainted
with the requirements and expectations of teaching. Buckworth (2017)
highlighted the importance of these placements for learning about daily
operations and for practising pedagogy, including the ability to develop
assessment literacy. Despite some limited placements, overall, participants
reported that they began to use the AEC as a supportive professional
community, where they could access resources from the course site, post
questions to seek support, collaborate with other AEC participants, and
when not in a placement with K–12 students, work on coteaching to prac-
tice assessment with technology.
Technology was central to assessment education during the pandemic
and required us as AEC instructors to move beyond our usual instruc-
tional practices and assignments to meaningfully create opportunities to
explore technology and assessment education. The TPACK framework
originated in 2006 (Mishra & Koehler, 2006), but the pandemic awakened
an urgency to increase our awareness and embrace this model as an oppor-
tunity for reflective learning. The TPACK framework recognizes that cer-
tain technological tools work best depending on the context. Adjusting
instruction to include collaborative thinking allowed us to make visible
how preservice teachers could respond or modify assessment and tech-
nology goals, depending on the needs of students, the available resources,
and local guidelines or policies.
Layering the TPACK framework (Mishra & Koehler, 2006) into the
AEC provided an aspirational map for considering the integration of
assessment literacy and technology. Being attentive to the alignment
between content and pedagogy, including assessment, has always been
paramount for us as AEC instructors. The requirement to use tech-
nology to teach assessment and foster assessment literacy demanded
further experimentation to sustain alignment. Our emerging findings
reflect the need to remain attentive to the TPACK core values while also
examining the integration of these elements as a productive space for
preservice teachers to learn about assessment identity. Instructionally,
intentionality of assessment with technology integration was evident
in our ongoing use of Zoom (https://zoom.us), Nearpod, and the dis-
cussion forum. Discussion boards, for example, promoted the intersec-
tion of technological pedagogical knowledge by making use of existing
and university-supported technologies such as the learning management
system. With the goal of integration and learning about assessment, the
discussion boards provided a common space for accessing class activities,
storing class resources, promoting discussion, and, critically, hosting links
to access additional platforms. An emphasis on the integration of tech-
nology within AEC led us to rethink the role of attendance and class-
room participation in a blended environment.
All course assignments directly aligned with the TPACK framework
(Mishra & Koehler, 2006) and promoted one or more of its intersections. For example, five assignments (A1, A3, A4, A5, and A8)
allowed participants to use foundational components of assessment lit-
eracy to promote learning and develop strong connections between the
content and pedagogy (pedagogical content knowledge). Similarly, five
assignments (A1, A2, A6, A7, and A8) offered preservice teachers oppor-
tunities to use technology and understand how these various resources
can change and influence pedagogy (technological pedagogical know-
ledge). Finally, participants had the opportunity to explore how tech-
nology and content can influence each other yet conflict. The preservice
teachers identified the websites and resources that aligned best for certain
grades and subjects, and that certain types of technology are associated
with constraints (e.g., cost) or limitations (e.g., access to one-to-one
devices). Examples of the assignments that promoted the technological
content knowledge intersection were A1, A4, A5, A6, and A8.
These reflections highlight the intentionality required when mer-
ging technology with assessment throughout the AEC. We suspect that
the overlap in the TPACK intersections may have positively reinforced
the assessment literacy concepts and how assessment can be connected
effectively to technology. To different extents, all assignments explicitly
promoted the intersection of TPACK domains by identifying how tech-
nology can be applied to subject matter to represent and formulate it
in novel ways. Course assignments needed to be altered in a blended
learning environment to honour the multiple forms of engagement that
were introduced. For example, presentations could be prerecorded and
uploaded to the discussion forum, an offering unimaginable prior to the
blended learning model. The design for the AEC is intended to advance
the core values of assessment literacy while addressing the need to be
mindful about access and the barriers that can arise when integrating
technology.
Enactment of the TPACK framework (Mishra & Koehler, 2006) is shaped by
identity development and practice; thus, having the opportunity to work
with others can promote changes in pedagogy and practice (Adie, 2012;
Phillips, 2017). As the preservice teachers collaborated with the AEC
instructors and peers, and with associate teachers and other colleagues
The Unintended Influence of COVID-19 179

while on teaching placements, they constructed and negotiated new
understandings about assessment and technology.

Conclusion

The TPACK framework highlights and promotes technology integration
in various ways (Mishra & Koehler, 2006), but teachers face barriers when
striving to assess students using technology. In Canada, for example, edu-
cation programs are accredited by provincial bodies and K–12 education
is provincially mandated with funding and oversight through regional
school districts. Thus, mandates about assessment (e.g., Ministry of
Education, 2010) and decisions about technology funding, access, and
resources are led provincially. District-level policies and school-level
guidelines regarding the implementation of assessment and technology
vary widely to tailor educational decisions to local contexts. As a conse-
quence, as highlighted by this study’s data, no matter what values and
messages are espoused in teacher education programs, teachers always
face varying levels of dissonance in practice. For example, our preservice
teachers experienced uneven access to technology and support for the
integration of assessment and technology.
This research highlights challenges and opportunities as context for
thinking about the integration of assessment and technology. Our study
reinforces teaching and learning as fluid, driven by purpose, context,
and desired outcomes. To promote assessment literacy, instructors must
be skilled pedagogues who can model and support the use of different
approaches for teaching and assessing. The TPACK framework offers a
deep understanding of the importance of technology and the nuances to
be considered when integrating assessment knowledge and skills in prac-
tice (Mishra & Koehler, 2006). For example, preservice teachers explored
various types of technology through course assignments and had oppor-
tunities to develop and integrate new knowledge through placements
and reflections. We tried to bridge theory and practice beyond our
in-class instruction by co-constructing tools (e.g., learner inventories)
and resources (e.g., Google Slides) with participants. We also encouraged
the use of assessment with technology and structured reflection on
assessment practice in action.
The AEC course provides opportunities that prioritize professional
learning through goal setting, experience, and reflection. Were it not
for the pandemic, the emphasis on technology might have remained
secondary to other assessment literacy aims. With this cohort, we intentionally
and consistently modelled the use of technology to promote
assessment literacy by developing knowledge and skills related to all
types of assessment practice. Further, course assignments offered choices
(e.g., A1, A3) for meeting learners’ identified goals and overcoming barriers
while enabling the demonstration of assessment knowledge acquisition
and promoting possibilities for assessment pedagogy. Updating the AEC
course design, achieved through successive revisions and cycles of inquiry,
led to prioritizing technology as a catalyst linking assessment knowledge
and application.
We learn by studying our practice as instructors and inviting pre-
service teachers to share what they learned about assessment literacy
in the AEC course. Together, through the data collection and analysis,
we consider the blended space we are teaching in to anticipate how
our teaching of assessment literacy may continue to expand or shift to
accommodate different eventualities or contexts. Mishra and Koehler’s
(2006) TPACK framework informed our thinking and our analysis of
student experiences, and we wonder if in the future we might gen-
erate different understandings from students if they co-construct data
and analysis with us using the framework. Future research might
explicitly include assessment literacy with direct connections to
the TPACK framework and opportunities for preservice teachers to
think about TPACK in their future classrooms. More broadly, future
research could build upon the pandemic experiences to address the
continued purposeful integration of assessment with technology,
using the TPACK framework as part of the revisioning of assessment
education.

References

Adie, L. (2012). Learning as identity and practice through involvement in online
moderation. Educational Assessment, Evaluation and Accountability, 24(1), 43–
56. https://doi.org/10.1007/s11092-011-9132-4
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.
1080/0969594X.2018.1441807
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative
Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Brookhart, S. M. (2004). Classroom assessment: Tensions and intersections in
theory and practice. Teachers College Record, 106(3), 429–458. https://doi.
org/10.1111/j.1467-9620.2004.00346.x
Buckworth, J. (2017). Issues in the teaching practicum. In G. Geng, P. Smith, &
P. Black (Eds.), The challenge of teaching (pp. 9–17). Springer.
Carrillo, C., & Flores, M. A. (2020). COVID-19 and teacher education: A litera-
ture review of online teaching and learning practices. European Journal of
Teacher Education, 43(4), 466–487. https://doi.org/10.1080/02619768.2020.
1821184
Chai, C. S., Koh, J. H. L., & Tsai, C.-C. (2010). Facilitating preservice teachers’
development of technological, pedagogical and content knowledge (TPACK).
Journal of Educational Technology & Society, 13(4), 63–73. www.learntechlib.
org/p/52307
DeLuca, C., & Bellara, A. (2013). The current state of assessment education:
Aligning policy, standards, and teacher education curriculum. Journal of
Teacher Education, 64(4), 356–372. https://doi.org/10.1177/0022487113488144
DeLuca, C., Chavez, T., Bellara, A., & Cao, C. (2013). Pedagogies for preservice
assessment education: Supporting teacher candidates’ assessment literacy
development. The Teacher Educator, 48(2), 128–142. https://doi.org/10.1080/
08878730.2012.760024
DeLuca, C., Coombs, A., MacGregor, S., & Rasooli, A. (2019). Toward a differen-
tial and situated view of assessment literacy: Studying teachers’ responses to
classroom assessment scenarios. Frontiers in Education, 4, Article 94. https://
doi.org/10.3389/feduc.2019.00094
DeLuca, C., Klinger, D. A., Searle, M., & Shulha, L. M. (2010). Developing a
curriculum for assessment education. Assessment Matters, 2, 20–42. https://doi.
org/10.18296/am.0080
DeLuca, C., Lapointe-McEwan, D., & Luhanga, U. (2016). Teacher assessment
literacy: A review of international standards and measures. Educational
Assessment, Evaluation and Accountability, 28(3), 251–272. https://doi.
org/10.1007/s11092-015-9233-6
DeLuca, C., Searle, M., Carbone, K., Ge, J., & LaPointe-McEwan, D. (2021).
Toward a pedagogy for slow and significant learning about assessment
in teacher education. Teaching and Teacher Education, 101, Article 103316.
https://doi.org/10.1016/j.tate.2021.103316
Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education:
Framework, principles, and guidelines. Jossey-Bass.
Graham, C. R. (2011). Theoretical considerations for understanding techno-
logical pedagogical content knowledge (TPACK). Computers & Education,
57(3), 1953–1960. https://doi.org/10.1016/j.compedu.2011.04.010
Koehler, M. J., Mishra, P., & Cain, W. (2013). What is technological pedagogical
content knowledge (TPACK)? Journal of Education, 193(3), 13–19. https://doi.
org/10.1177/002205741319300303
König, J., Jäger-Biela, D. J., & Glutsch, N. (2020). Adapting to online teaching
during COVID-19 school closure: Teacher education and teacher competence
effects among early career teachers in Germany. European Journal of Teacher
Education, 43(4), 608–622. https://doi.org/10.1080/02619768.2020.1809650
Layder, D. (1998). Sociological practice: Linking theory and social research. Sage
Publications.
Linder, K. E. (2017). The blended course design workbook: A practical guide. Stylus
Publishing.
Looney, A., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualising
the role of teachers as assessors: Teacher assessment identity. Assessment in
Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.1080/
0969594X.2016.1268090
McMillan, J. (Ed.). (2013). SAGE handbook of research on classroom assessment. Sage.
Ministry of Education. (2010). Growing success: Assessment, evaluation, and reporting
in Ontario schools. www.edu.gov.on.ca/eng/policyfunding/growSuccess.pdf
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content know-
ledge: A framework for teacher knowledge. Teachers College Record, 108(6),
1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
Mishra, P., Koehler, M. J., & Kereluik, K. (2009). The song remains the same:
Looking back to the future of educational technology. TechTrends, 53(5),
48–53. https://doi.org/10.1007/s11528-009-0325-3
Namyssova, G., Tussupbekova, G., Helmer, J., Malone, K., Mir, A., & Jonbekova,
D. (2019). Challenges and benefits of blended learning in higher education.
International Journal of Technology in Education, 2(1), 22–31. www.ijte.net/
index.php/ijte/article/view/6
Parra, J., Raynor, C., Osanloo, A., & Guillaume, R. O. (2019). (Re)Imagining
an undergraduate integrating technology with teaching course. TechTrends,
63(1), 68–78. https://doi.org/10.1007/s11528-018-0362-x
Patton, M. Q. (2014). Qualitative research & evaluation methods: Integrating theory
and practice. Sage.
Phillips, M. (2017). Processes of practice and identity shaping teachers’ TPACK
enactment in a community of practice. Education and Information Technologies,
22(4), 1771–1796. https://doi.org/10.1007/s10639-016-9512-y
Popham, W. J. (2011). Assessment literacy overlooked: A teacher educator's
confession. The Teacher Educator, 46(4), 265–273. https://doi.org/10.1080/
08878730.2011.605048
Popham, W. J. (2013). Classroom assessment: What teachers need to know (7th ed.).
Pearson.
Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). Sage
Publications.
Searle, M., Ahn, C., Fels, L., & Carbone, K. (2021). Illuminating transformative
learning/assessment: Infusing creativity, reciprocity, and care into higher
education. Journal of Transformative Education, 19(4), 339–365. https://doi.
org/10.1177/15413446211045160
Stiggins, R. J. (1991). Assessment literacy. Phi Delta Kappan, 72(7), 534–539.
Turner, C. E. (2012). Classroom assessment. In G. Fulcher & F. Davidson (Eds.),
The Routledge handbook of language testing (pp. 65–78). Routledge.
Yin, R. K. (2018). Case study research and applications: Design and methods (6th ed.).
Sage.
Part III
Teacher Educators and Assessment in K–12 Contexts
10 Pedagogically Hacking the System
Developing a Competency-based Digital Portfolio
Kathy Sanford, Hong Fu, Timothy Hopper, and Thiago Hinkel

In the last decade, the British Columbia (BC) provincial curriculum has
undergone a seismic restructuring, from a prescribed learning outcomes
curriculum with end-of-school provincial exams to a competency-based
curriculum with performance measures and proficiency scales. The
intent of the restructuring by the BC Ministry of Education and Child
Care (BCMoE) was to enable student success through “a concept-based
approach to learning and a focus on the development of competencies,
to foster deeper, more transferable learning” (BCMoE, n.d.-c, p. 4). In
support of competency-based curricula, Trilling and Fadel (2009) called
for educators to shift their practices towards “a new balance” (p. 38),
which, as we discussed in relation to the BC curriculum, has “indicated a
shift from direct instruction for predetermined and more easily measured
outcomes to evolving and more interactive learning, as well as collective
problem-solving” (Fu et al., 2018, p. 265).
In their book examining the evolution of the new BC curriculum,
Sanford and Hopper (2019) noted that it was developed within the larger
context of a revolution in understanding how people learn, which is
learning about learning. They further stated that for the existing system

DOI: 10.4324/9781003347972-14
to keep pace with the rapidly changing world, many educators were
working to shift BC education from

an individualistic, competitive, teacher-directed system focused on
test scores and grades to one that embraces the uniqueness of each
child, nurturing their strengths and talents and enabling the infinite
potential of working for and with others in a common pursuit.
(Sanford & Hopper, 2019, pp. 4–5)

In addition, Indigenous knowledge and ways of seeing the world,
recognized as good for everyone, have been authentically and respectfully
integrated into the BC assessment and curriculum framework.
Though the new BC curriculum has expanded the capacity of teachers
to direct their practices more to the needs of students and to personalize
the teaching and learning process, the assessment system is still a
source of confusion, particularly the need to assign grades in the form
of letters or percentages. According to the provincial reporting order,
for Grades 4 and above, “formal reports will include letter grades and
written reporting comments to indicate students’ levels of performance
in relation to the learning standards set out in the curriculum” (BCMoE,
n.d.-a, para. 20). However, in the draft K–9 Student Reporting Policy, “the
Four-point Provincial Proficiency Scale will be used to communicate
student progress in all areas of learning, [and] at Grades 4–9, Boards
will only provide letter grades to parents upon request” (BCMoE, n.d.-b,
p. 2). This suggestion that letter grades may be optional for kinder-
garten to Grade 9 but must be provided for high school students adds to
teachers’ confusion. The implied collapsing of multiple rich formative
assessments to a single summative grade creates an artificial measure of
progress that does little to clarify the students’ learning processes or to
inform students about what they did well and where they need to focus
their attention.
In this chapter, we inquire into the emerging competency-based
assessment practices of teachers in BC, focusing on four teachers from
middle and high schools in one school district during the pandemic. We
describe how they reinvented their assessment practices by utilizing the
communication, organizational, and rich media potentials of freely avail-
able, cloud-based software. These teachers modelled what is termed
pedagogical hacking, which refers to the capacity to repurpose an existing
application system for education (Smith et al., 2018).
Assessment, Competency-based Learning, and Pedagogical Hacking

We have advocated previously (Fu et al., 2018) that assessment with
the intention to facilitate learning stands at the foundation of BC’s
redesigned curriculum and reporting practices. Summarizing prior lit-
erature reviews on assessment (Black & Wiliam, 2009; Harlen, 2005;
Stiggins, 2008), we concur that assessment should function to inform
educational decision-making and to motivate students to learn. In con-
trast, numerical scores, or grades, as well as other external rewards and
punishment, can focus students’ attention on their abilities compared
to peers rather than on the importance of effort and personal progress.
This comparison can damage the self-esteem and motivation of many
low achievers. Concerns were also raised that a focus on high-stakes
testing encourages high-achieving students to value grades and high
marks over learning, risk-taking, and personal passions (Black & Wiliam,
2018; Harlen, 2005). We propose going a step further by stressing
that although assessment of and for learning still guides assessment
practices for 21st century learning, assessment and learning cannot be
separated from one another. In fact, assessment, when implemented by
the students themselves, is learning.
Several Canadian researchers have explored the implications of the
new BC curriculum for schools and educators in areas such as music,
science, environmental education, and social and emotional education
(Blades, 2019; Fallon et al., 2017; Prest et al., 2021; Storey, 2017). However,
most of these analyses focused on content, competencies, or rationale
behind the curriculum change. There is a dearth of research specifically
examining assessment practices within the new BC curriculum and how
such practices are implemented in schools.
Historically, assessment in schools has been restricted to summa-
tive forms of grading and evaluation (Schneider & Hutt, 2014). These
assessment practices have become so normalized that school success is
frequently measured in terms of percentages and grades, a currency that
allows teachers to wield considerable power over students who become
compliant for a grade regardless of the quality of the instruction, or
indifferent if the grade seems unattainable. As Black and Wiliam (2018)
found, from countless reviews of classroom assessment, there is a need
to embed a theorizing of assessment within a broader theorizing of peda-
gogy, in which assessment drives the learning process and is not used to
justify inferior teaching practices or to control student engagement with
classroom activities.
Competency-based learning, in contrast to content-based learning,
focuses on the processes through which students achieve an educational
outcome (Neelakandan, 2020). This model of learning is described in the
BC curriculum as a know, understand, and do process. The doing is
seen as a critical element in generating evidence through action or a
product that can be used as evidence of learning. In traditional
content-based learning, grounded in facts and predetermined right answers,
learning is inferred from passing tests and scoring high marks on
timed examinations. Competency learning focuses on students demon-
strating knowledge, skills, and attitudes of predetermined competencies
in the application of concepts and related content in real-life situations.
Essentially, a competency-based curriculum is learner-centric, outcome
based, and differentiated. Assessment is driven by evidence of learning
through an array of rich media artifacts and performance indicators.
In BC, boards of education must still provide parents of students with
“a minimum of five reports describing students’ school progress” (BCMoE,
n.d.-b, p. 1) following a traditional assessment reporting system. The BC
Teachers’ Federation (2021) noted that schools and educational systems
are not set up for a competency-based type of rich media and highly
differentiated curriculum. School systems still report student learning
through a report card summary periodically sent home to parents with
students’ learning collapsed into numerical percentages or letter grades
based on subject-area designations. Teachers in BC are faced with the
dilemma of promoting a competency-based curriculum while grading
in a traditional way as directed by the BC reporting order. To address
this predicament, teachers in this study pedagogically hacked cloud-based
Google Sheets (www.google.ca/sheets/about/) and the school district
reporting systems to avoid the use of grades and percentages, nurture the
learning of all their students, and communicate the learning to parents.
Hacking is often associated with malicious acts using digital
apparatuses. In educational contexts, however, hacking denotes change to
existing systems or tools to advance students’ learning. Smith et al. (2018)
described the mid-20th century use of the word “hack” to mean using
things for purposes different than those for which they were originally
intended. In connection to education, Smith et al. added that “hacking
is ultimately pedagogical, an act wherein people seek to unravel, decon-
struct, devise and create” (2018, p. xii). Along these lines, hacking is used
in this chapter to refer to getting into the heart of things, understanding
how tools and systems work and the powers that control them, and
then enacting novel uses for them based on pedagogical objectives. To
enact a competency-based curriculum, teachers often must overcome
both systemic and technological barriers and hack their way through by
“subverting constraints that undermine authentic knowledge generation”
(Hagerman et al., 2018, p. 122). In this sense, educationally intended
hacking is a response to a system that tends to compartmentalize people
and skills and, in the context of digital technologies, seeks to produce
a workforce that is “compliant and instrumental in their thinking and
practice” (Smith et al., 2018, p. xv). The concept of pedagogical hacking
aligns with a competency-based curriculum by considering the needs and
interests of students as individuals in their specific contexts.

Methodology

This study uses a narrative case study approach (May, 1997), considering
ways in which assessment can be not only more supportive of stu-
dent learning but also attentive to equity and the diversity of learners.
Drawing on Merriam (2009) and Smith and Sparkes (2020), we used a
qualitative descriptive process focused on participant interviews and
anecdotal insights. We also focused on the narratives generated by the
participants to explain their motivations and gained insights from their
experiences.

Data Collection and Analysis

Four teachers participated in this study, with the pseudonyms Cathy,
Greg, Alice, and Christine. The teachers knew each other and formed
a professional learning community to share knowledge and skills about
their practice. Two rounds of interviews were carried out in spring and
fall 2020. Owing to the pandemic, we conducted the interviews over Zoom
and recorded these meetings as data. The four teachers
were at different places on a continuum of technology competency, but
all were embracing digital technology for its potential uses in education.
They taught different subject areas, including English, French, dance,
applied design, skills and technology, accounting, science, and math,
and they all used Google Sheets to develop what they called “compe-
tency charts” that outline different competency levels from “emerging,
developing, proficient, to extending" (BCMoE, n.d.-b, p. 2) as descriptors
of students’ assessment information.
In this study we expanded on our previous use of online tools for
data collection and analysis (Hopper et al., 2021). The YouTube closed
captioning feature, shared Google Docs, and Highlight Tool (https://
workspace.google.com/marketplace/app/highlight_tool/65634221687)
allowed us, among other things, to work collaboratively while maintaining
physical distance, a possibility that was crucial throughout the COVID-
19 pandemic in 2020 to 2022. The cloud-based tools that we employed
in this study made it possible for us to record and transcribe interviews,
share transcripts with participants for member checking (Hopper et al.,
2021), categorize and analyze data both individually and collaboratively,
and communicate research findings. Further details can be located at our
digital assessment research website (https://sites.google.com/view/
digitalassessmentresearch/home). Interview data from the participants
were coded for stories that described the specific pedagogical and
assessment practices of these teachers.

Teachers’ Emerging Hack Using Google Sheets

The significant technology hack that emerged for the participants was
the use of Google Sheets. In this hack, they took the freely available
spreadsheet program designed for numerical calculations and accounting
tasks to create a matrix that represented curricular competencies. See
Figure 10.1 for an example from the Grade 8 math curriculum.
In the matrix chart, the teacher broke the subject-area curriculum
(knowing) into curricular competencies; for example, in Figure 10.1,
one competency was thinking and communicating. Then, based on the
learning standards for that subject and grade level, the teacher divided
the curriculum into topic areas of understanding to be shown through
class activities and assignments that generated artifacts as evidence of
learning. Students then, in consultation with the teacher and based on
the evidence they generated and how they generated that evidence,
self-assessed in relation to the topic areas as "emerging,"
“developing,” “proficient,” or “extending.” As shown in Figure 10.2,
the Google Sheets app allowed the student to shade in an area with the
comment tool to indicate they had achieved that level in a topic either
with hyperlinks to evidence or with reference to an experience witnessed
or confirmed by the teacher.
Figure 10.1 Sample of Grade 8 math matrix.

As demonstrated in Figure 10.2, the high school curriculum standards


were listed, and each student offered evidence that the teacher responded
to (see small shading in top right corner of cell). Google Sheets automat-
ically sent entries and responses to the teacher and the students. Through
the dialogue feature in the Google Sheets comment tool, the teacher could
confirm assessment and shade in the corresponding cell to yellow, suggest
higher assessment, or ask for additional evidence before confirming the
student’s self-assessment. Hyperlinks in the comment box could lead to
an array of digital artifacts collected by students in their Google Drives.
Shaded areas indicate topics the teacher confirmed as passed, and unshaded
aspects were to be addressed in topics generated by the teacher or proposed
by the student.
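The workflow described above can be sketched as a simple data model. The following Python sketch is our own illustration of the chart's logic, not the teachers' actual spreadsheet: all class, method, and variable names (e.g., `CompetencyChart`, `self_assess`, `confirm`) and the sample evidence link are hypothetical, and the teacher's cell shading and comment threads in Google Sheets are represented here by an `evidence` list and a `confirmed` flag.

```python
# Hypothetical sketch of the competency chart's logic; names are
# illustrative and do not reproduce the teachers' Google Sheets setup.
LEVELS = ["emerging", "developing", "proficient", "extending"]

class CompetencyChart:
    def __init__(self, competencies):
        # One cell per (competency, proficiency level); each cell holds
        # student-attached evidence links and a teacher-confirmed flag
        # (the stand-in for the shaded cell in Google Sheets).
        self.cells = {
            (c, level): {"evidence": [], "confirmed": False}
            for c in competencies
            for level in LEVELS
        }

    def self_assess(self, competency, level, evidence_link):
        # Student attaches a hyperlink to an artifact as evidence.
        self.cells[(competency, level)]["evidence"].append(evidence_link)

    def confirm(self, competency, level):
        # Teacher reviews the cell and "shades" it only if evidence
        # has been offered; otherwise the self-assessment stands unconfirmed.
        cell = self.cells[(competency, level)]
        if cell["evidence"]:
            cell["confirmed"] = True
        return cell["confirmed"]

    def current_level(self, competency):
        # Highest confirmed level for a competency, or None.
        achieved = [lv for lv in LEVELS
                    if self.cells[(competency, lv)]["confirmed"]]
        return achieved[-1] if achieved else None

chart = CompetencyChart(["thinking and communicating"])
chart.self_assess("thinking and communicating", "developing",
                  "https://drive.google.com/example-artifact")
chart.confirm("thinking and communicating", "developing")
print(chart.current_level("thinking and communicating"))  # prints: developing
```

In the classroom version, the same moves are carried out through the Google Sheets comment tool (hyperlinked evidence, teacher replies) and cell shading, so the chart doubles as both portfolio index and progress record.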

Findings: Case Descriptions of Key Pedagogical Practices

Case 1: Alice’s Competency Chart and Proficiency Scales

Alice described her chart and scales as the “opportunity for students
to grow and to show their learning in these specific areas or to recog-
nize that they went way above and beyond the expectations.” Alice had
been teaching at a local high school for 14 years, and she was skilled in
Figure 10.2 Example of high school Grade 10 English chart with shading and commenting focused on new media.
educational technology. She watched for new trends in education over


the years, looking at what she considered the most significant aspect
of education: assessment. A lightbulb moment for her was reading
Schimmer's (2016) book on standards-based assessment, with ideas such
as differentiated learning to accommodate all learners, student ownership
of learning and choice, continuous feedback loops, acknowledgment of
natural patterns in learning from emerging to developing to proficient,
and encouragement of creativity.
Together with Christine, a like-minded colleague, Alice started
developing an alternative assessment system. The two teachers developed
a giant spreadsheet or chart for each course. The focus was always on the
learning process; that is, “what has been achieved and what still has to be
done until students have reached a proficient level as described in the cur-
riculum and on the spreadsheet.” As Alice commented, another advan-
tage of the chart was that it enabled personalized assessment:

It also recognizes the strength of different ways of assessment too.
Because some [students] are great at doing tests and some want to
do a final exam, but others can show me the same thing with an
assignment or a project or even just a conversation I have with them.

The challenge for Alice was to report on students at the midterm point of
her courses. Previously, with support from administration, she could put
no mark at midterm and report at the end of term when students had had
the opportunity to demonstrate their ongoing competence. However,
due to a change in administration, she had to fulfil the reporting order and
struggled with identifying a midterm number to report. Alice explained:

I think when we’re assessing based without numbers or any
calculations or percentages to fall back on, and you’re kind of going
to what you would call professional judgment; but I think some
people are afraid that when it goes to that maybe it can’t be backed
up or it’s not valid enough for them; so they need these bins of
percentages and calculations they can go back to.

Other teachers in the district were interested in trying out the compe-
tency chart. After doing a 10-minute presentation at a professional devel-
opment session for 200 secondary teachers, Alice was “flooded with
emails from high school teachers that were like, ‘I’ve been really trying
to figure something out, and we really want to do something like it.’ ”
Teachers needed support to start competency-based assessments like the
chart. Alice and her colleagues' four years of work with their competency
charts has proved successful and could be taken up by more teachers in
the district.
Alice also found that the competency chart was connected to the port-
folio type of assessment, as Google sites were often used to represent
student artifacts in courses:

You would just put a link from the portfolio if it’s online into the
chart, and if you want to see while you’re developing in this area,
you just click and then there is the portfolio evidence right there. So,
it’s perfect in that sense.

She agreed that the competency chart was a kind of educational hacking:

I would definitely say, in the positive term of hacking, for sure.
I think actually Google products themselves lend to educational
hacking so well; there’s so many things that you can do that are not
within the structure of Google Sheets or Google Slides. . . . [Greg]
uses slides like a website presentation tool, essentially, and a lot of
teachers do this. So, there’s just so many things within Google that
allows you to educationally hack them.

Behind all these assessment practices, the new BC curriculum stood as
the larger context for why teachers were doing what they were doing.
In this way, the assessment system set up the course, and it became the
first step in designing a program of study. The assessment became the
learning process.

Case 2: Greg’s Combined Google Hacks

Greg started as a social studies teacher in Calgary and had since taught
a variety of subjects, most recently drafting and design at the local high
school, where he had worked for five years. After joining this school, he became aware of
Christine and Alice’s alternative assessment and was also introduced to
Schimmer’s (2016) ideas.
The competency chart Greg used in class was “basically a roadmap for
the course to show their progression from the emerging levels through to
Pedagogically Hacking the System 197

proficiency in a range of different competencies.” He updated the chart


with shaded yellow sections to indicate that students have shown compe-
tency at certain levels:

What I do is I leverage Google Sheets and Google Slides so that


in the chart I provide direct links to evidence of their work that
would indicate that they’re at that level of competency; so, it’s like
an interactive document where they can click, I can click, [and] they
can also add comments. We use the document to communicate at
different points in the semester; there’s some check-ins surrounding
the chart.

Greg also found it best not to provide a number grade at midterm when
students have moved only halfway through their learning. He believed
the competency chart was “a kind of portfolio.” For Greg, “It’s really
about using technology to have ease of access to their work; just two
clicks and I can see their work.”
Using the comment function in Google was also a way for students to
upload and link their evidence to the chart. Greg started using the term
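Greg's interactive chart — cells that link to evidence of student work, collect comments, and are shaded as competency is demonstrated — can be sketched as a minimal data model. This is a hypothetical illustration of the structure only, not the teachers' actual Google Sheet; the competency names, levels, and link are invented for the example:

```python
from dataclasses import dataclass, field

LEVELS = ["emerging", "developing", "proficient", "extending"]

@dataclass
class Cell:
    """One competency at one level: shaded once demonstrated."""
    shaded: bool = False
    evidence_links: list = field(default_factory=list)  # URLs to portfolio artifacts
    comments: list = field(default_factory=list)        # teacher/student notes

@dataclass
class CompetencyChart:
    """Rows are competencies; columns are proficiency levels."""
    competencies: list
    grid: dict = field(default_factory=dict)

    def __post_init__(self):
        self.grid = {(c, lv): Cell() for c in self.competencies for lv in LEVELS}

    def demonstrate(self, competency, level, link, note=""):
        """Shade a cell and attach the evidence that justifies it."""
        cell = self.grid[(competency, level)]
        cell.shaded = True
        cell.evidence_links.append(link)
        if note:
            cell.comments.append(note)

    def highest_level(self, competency):
        """Highest shaded level for a competency, or None if unstarted."""
        shaded = [lv for lv in LEVELS if self.grid[(competency, lv)].shaded]
        return shaded[-1] if shaded else None

chart = CompetencyChart(["communicates design intent", "applies drafting conventions"])
chart.demonstrate("communicates design intent", "developing",
                  "https://example.com/portfolio/project1", "Strong labelling.")
print(chart.highest_level("communicates design intent"))  # developing
```

Because every cell carries its evidence links rather than a score, reviewing a student's progress is, as Greg put it, "just two clicks" from chart to work.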
“hacking” after he went to an educational technology conference where
other teachers who were presenting were saying, “hack this, hack that”:

Things got hacked to make an improvement or to improve the


device or to take it in a different direction; so, I think you have the
platform of Google Sheet, and it’s a rich spreadsheet and it can do
all kinds of things.

Greg also realized from other teachers that Google Slides could be
combined with Google Sheets to become a powerful assessment tool. As
he stated,

What really launched me into another path was a presentation by


another high school teacher at one of the conferences about Google
Slides and how you could hyperlink everything in Google Slides,
everything, and basically you can take Google Slides and make up
a website out of it, which is what I started to do with my students.
It’s almost like I’m creating “textbooks” with Google Slides, and
then I encourage the students to do the same . . . communicating
the level of competency. Technology plays a role, I think, in being

able to communicate that learning, and using the Google products


makes it instantaneous, live, and quick.

Like Alice, Greg believed that, unlike traditional marking, using the
competency chart could reduce students' focus on comparing marks and
direct more attention to their own learning. The change in curriculum and
Greg's shift in assessment happened at the same time: the rollout of BC's
new competency-based curriculum, along with Greg's knowledge of how
to hack Google apps, created the conditions for him to radically shift his
assessment practices.

Case 3: Christine’s Perspective on Assessment

“If you’re just getting three out of ten all the time, you never feel
successful.” Christine came to teaching with a different perspective on
assessment because she used to be a nurse. This experience influenced
her understanding of assessment in the best interest of students:

As a nurse, you spend your whole day assessing people from head
to toe, and you are not looking to give them a seven or a nine. . . .
This is where you are at; this is what I’m going to do as my course
of treatment; where are you going to get to? . . . When I came to
teaching after my nursing career, I looked at assessing my learner
like a patient. This is where you are at; this is where I want you to
go. . . . Everybody around me was using numbers, but it just did not
fit for me. . . . I worked with a lot of struggling learners, and I really
liked to work with struggling learners, so I know that one of the
things they really need is to experience success, and if you’re just
getting three out of ten all the time, you never feel successful.

Christine started trying alternative forms of assessment that did not focus
on giving numbers:

When I first started playing around, I also wasn’t sure where that
was gonna take me, and I didn’t see how easy it would be to not
use numbers; if you don’t want kids to ask you about numbers, you
have to not use them. It’s the only way; otherwise, it only seems
reasonable that they’ll ask you about them, and that will matter to
them.

The new curriculum seemed to lend itself to diverse forms of authentic


assessment, and Christine, with the support of the administration and
like-minded colleagues, “literally went home and built them [the compe-
tency charts] over the weekend and was right in with my classes.” Along
the way, the competency chart went from a spreadsheet hard copy to
digital form, “which has more potential in being accessible.” She was not
particularly competent with Google, but working with her colleagues
soon shifted her to a digital medium, hacking Google apps for educa-
tional purposes.
Christine’s aim was to get every student to the proficient column in
every competency. With the competency chart, the teacher and the stu-
dent could easily see what had been achieved and what still needed to be
done, thus helping to pedagogically plan their learning.
When it came to students’ reaction to the competency chart, Christine
identified two types of students:

I think for the high achievers, their struggle is that they want to
quantify usually and so they ask questions like, “What does a 90
look like on here?” That tends to be their focus. I must shift them
to think about their learning as opposed to what the number is
because there’s so much more driven by that carrot at the end. The
second time I teach them it’s no problem at all because they know
their mark is not in jeopardy; it’s their learning that we’re focused
on, and they understand that with me. . . . For the struggling kids,
I think it’s probably kind of the bigness of the chart, the language
of the chart.

Therefore, building trust with students was of vital importance, as


Christine put it, through “assessment in relation to the whole child.”

Case 4: Cathy’s Challenging Letter Grades

“I’d ask the students, ‘Where do you think you’re at?’ So we get them to
self-reflect to decide the grade.” After completing her education degree
in Victoria, Cathy went to Calgary to teach middle school, where she
experienced an assessment change to outcome-based assessment. The
students were still getting a grade at the end of the term, but in a variety
of categories (e.g., practical skills or theory). After moving back to BC
from Calgary, she noted,

I’m back to percentages and letter grades, and it kind of boggles my


mind because I was marking things and being like, “Well, how do
I decide if this, especially for English, . . . is a 73 or 74?” You know?
Where 72 and 73 were in BC, that’s like a difference of a letter grade
in one percentage, right, so it felt so uncomfortable and so wrong.

Cathy met the other participants in this study through a collaborative
group of teachers using competency charts. She read Schimmer's (2016)
text with them and knew she had found her professional learning community.
Cathy admitted that the language of the chart was a little intimidating,
but as she took on the language from the new curriculum, it started to
make sense:

All of these [competencies] are things that they’re expected to be


able to do by the end of the term, and we tried our best to kind
of change the language to make it more student friendly. When
we do an assignment, say an assignment in French class, I would
pull up the chart and I would tell the students, “Okay we’re going
to be working on rows 11 and 13 today for this assignment,” and
then each of the students gets their own personal chart. Once
I assess something, I would go in and look to see where they’re
at, and then we would shade it in . . .. It’s very interactive too;
we get them to self-assess. Then they would look at it and they
would go, “Okay, I think I’m in developing,” and so they can
click on it, and with this I can comment, right? So, they’d be like,
“I believe I’m here,” and then they tell me exactly why, giving
their evidence.

By using this approach, the teacher wanted to move students away from
working toward a grade or being driven by a grade to thinking about what
they were learning, what they could do, and what they needed to work on.
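The check-in Cathy describes — the teacher flags which chart rows an assignment targets, the student self-assesses with evidence, and the two judgements are compared — could be modelled along these lines. This is a hypothetical sketch; the row numbers, levels, and evidence labels are invented for illustration:

```python
LEVELS = ["emerging", "developing", "proficient", "extending"]

def check_in(targeted_rows, self_assessment, teacher_assessment):
    """Compare student and teacher judgements for the rows an assignment targets.

    Both assessments map a chart row number to a (level, evidence) pair.
    Returns the rows where the two judgements agree and where they differ,
    as a starting point for conversation rather than a grade.
    """
    agree, discuss = [], []
    for row in targeted_rows:
        student_level, _student_evidence = self_assessment[row]
        teacher_level, _teacher_evidence = teacher_assessment[row]
        (agree if student_level == teacher_level else discuss).append(row)
    return agree, discuss

# A French assignment targeting rows 11 and 13, as in Cathy's example:
agree, discuss = check_in(
    [11, 13],
    {11: ("developing", "journal entry link"), 13: ("proficient", "oral recording link")},
    {11: ("developing", ""), 13: ("developing", "")},
)
print(agree, discuss)  # [11] [13]
```

The rows where judgements diverge are exactly where Cathy's exit-interview dialogue ("I believe I'm here, and here is why") does its work.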
Cathy also talked about the reporting requirement to give a final grade.
With the chart, she found that the process could be more personalized:

I have to give them a letter grade, . . . so I use exit interviews. I would


sit down with them; we’d look at the shading in their charts. . . . I’d
ask the students, “Where do you think you’re at?” So, we get them
to self-reflect to decide the grade. . . . Like what percentage they

thought they were at. And then we would have a discussion around
it, and most of the time the kids know, like they know where they’re
at based on what they’ve seen from their chart.

Students could demonstrate their level in a way of their choosing. For example,
when Cathy worked at the high school, students could still prepare for
a final exam as evidence of learning if they chose to take one. The chart
also allowed for continuity in learning and assessment for students. Cathy
gave the following example:

It worked out well, too, with some of our kids who you know didn’t
pass; so [Christine] and I both taught English 9. So, I had one student
the first semester and he didn’t end up passing English 9. . . . I think
transition to high school was kind of a tough transition for him, so
he just didn’t get a lot of work done, but I just passed his chart on
to her so she could see what he’d already accomplished. She could
just go from there.

Although Cathy believed that it was important for students to learn about
technology in positive ways, her motivation for doing the competency
chart was mainly the assessment and learning pieces:

I just love the authenticity of it, if that makes sense; like for me,
I said it felt so wrong to be just choosing this percentage, but this is
like actually they’re seeing their learning. There’s the evidence of it;
it’s not just like this is a 12 out of 20. It’s, you know, actually looking
at what they’re working on and what they’re able to do, and then
where is the next step? Where do I go from there?

Cathy saw her students moving away from a focus on percentages and
grades to talking about what they could do and what they needed to
work at.

Discussion

Our inquiry focused on the emerging competency assessment practices


of teachers in BC in response to the competency-based curriculum
during the pandemic. Several conditions have supported the teachers

who changed both their thinking and practices in relation to competency-


based curriculum and assessment:

1. The new BC curriculum and its competency-based assessment


principles focusing on personalized assessment outcomes created
a challenge that disrupted established assessment practices. The
teachers in this study formed a professional learning community to
support one another to embrace the shift in assessment.
2. The onset of the COVID-19 pandemic and the subsequent closing
of face-to-face learning in schools for a large proportion of the year
in 2020 prompted the need to use digital technologies to main-
tain ongoing schooling. This need precipitated and augmented
participants’ use of the innovative assessment method, which worked
well with online learning platforms.
3. The teachers embraced digital technology for pedagogical hacking
purposes for assessment and learning as they formed their pro-
fessional learning community (Stoll et al., 2006). Their common
understanding of pedagogical purposes and intentions was inter-
woven with technological tools and strategies and shared within
the group, which enabled them to creatively adapt their assessment
practices during COVID-19.
4. Current educational research on assessment and learning has echoed
the teachers’ educational beliefs on what makes authentic and valid
assessment and 21st century learning (Black & Wiliam, 2018).

Conclusion

During the pandemic particularly, but also more generally, educators could
easily lose sight of the significance of authentic assessment as an inte-
gral aspect of learning. Educators tend to focus on learning activities and
course content while overlooking assessment as a driver for meaningful
learning. However, the four teachers in this study utilized assessment as
dialogue around learning experiences, enabling learning to be perceived
as ongoing, connected, and developmental. They personalized assessment
and worked with students to support their individual and collective needs
rather than viewing assessment as something done to students (Fu et al.,
2018; Schimmer, 2016).
The innovative assessment approaches outlined in this chapter were
important throughout the two years of COVID-19, which demanded
accessible and flexible forms of learning and communication of progress.


The pandemic required educators to reconsider how they were assessing
students’ skills and proficiencies to provide meaningful and engaging
learning and how to work online with parents in connection to students’
work (Hargreaves & Fullan, 2020). In this study, this challenge was
addressed through an accreditation digital portfolio (Fu et al., 2022). The
participants’ assessment approaches fostered dialogue between teachers
and students about their learning in relation to curriculum competen-
cies—dialogue which was at risk of breaking down during the pandemic
due to interrupted face-to-face schooling. As Abaci et al. (2021) stated,
the pandemic hastened the need for appropriate technologies to be used
for supporting student learning, authentic assessment, and in the case
of our participants, enhanced use of digital technologies through mul-
tiple forms of apps within a coherent system of curriculum delivery and
assessment. Although they were required to assign final numerical grades
at the end of their courses, the participants hacked the assessment system
and digital tools to move practice and understanding forward. Forming
clear and ongoing connections between student learning and assessment
is critical in successful teaching for meaningful learning; in fact, educators
must recognize that assessment is learning, and it should be the core basis
for meaningful pedagogy.

References

Abaci, S., Robertson, J., Linklater, H., & McNeill, F. (2021). Supporting
school teachers’ rapid engagement with online education. Educational
Technology Research and Development, 69(1), 29–34. https://doi.org/10.1007/
s11423-020-09839-5
BC Teachers' Federation. (2021, October 27). BCTF response to the draft K–12
reporting order. www.bctf.ca/whats-happening/news-details/2021/10/27/
bctf-response-to-the-draft-k-12-reporting-order
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment.
Educational Assessment, Evaluation and Accountability, 21(1), 5–31. https://doi.
org/10.1007/s11092-008-9068-5
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.1
080/0969594X.2018.1441807
Blades, D. (2019). Science education in British Columbia: A new curriculum
for the 21st century. In C. Tippett & T. Milford (Eds.), Science education in

Canada: Consistencies, commonalities, and distinctions (pp. 13–36). https://doi.


org/10.1007/978-3-030-06191-3
British Columbia Ministry of Education and Child Care. (n.d.-a). Curriculum
redesign: Introduction to BC’s curriculum redesign. Government of British
Columbia. https://curriculum.gov.bc.ca/rethinking-curriculum
British Columbia Ministry of Education and Child Care. (n.d.-b). Draft K–9 stu-
dent reporting policy (2019) for use in 2019/20 pilot 1. Government of British
Columbia. https://curriculum.gov.bc.ca/sites/curriculum.gov.bc.ca/files/
pdf/reporting/draft-k-9-student-reporting-policy.pdf
British Columbia Ministry of Education and Child Care. (n.d.-c). Introduction
to British Columbia’s redesign curriculum. Government of British Columbia.
https://curriculum.gov.bc.ca/rethinking-curriculum; https://24.files.edl.io/
L9ruyqmdMokYjQhTqXr3ijdnSLAPr4NQpF5P0OcsWtfHXiA6.pdf
Fallon, G., Van Wynsberghe, R., & Robertson, P. (2017). Conceptions of sustain-
ability within the redesigned K-12 curriculum in British Columbia, Canada:
Mapping a disputed terrain. Journal of Sustainability Education, 16.
Fu, H., Hopper, T., & Sanford, K. (2018). New BC curriculum and communi-
cating student learning in an age of assessment for learning. Alberta Journal
of Educational Research, 64(3), 264–286. https://doi.org/10.11575/ajer.
v64i3.56425
Fu, H., Hopper, T., Sanford, K., & Monk, D. (2022). Learning with digital
portfolios: Teacher candidates forming an assessment identity. Canadian
Journal for the Scholarship of Teaching and Learning, 13(1), Article 10. https://
doi.org/10.5206/cjsotlrcacea.2022.1.11108
Hagerman, M. S., Wolf, L. G., & Woods, H. (2018). Hacking structures:
Educational technology programs, evaluation, and transformation. In
B. Smith, N. Ng-A-Fook, L. Radford, & S. Pratt (Eds.), Hacking education
in a digital age: Teacher education, curriculum, and literacies (pp. 121–137).
Information Age Publishing.
Hargreaves, A., & Fullan, M. (2020). Professional capital after the pandemic:
Revisiting and revising classic understandings of teachers’ work. Journal of
Professional Capital and Community, 5(3/4), 327–336. https://doi.org/10.1108/
JPCC-06-2020-0039
Harlen, W. (2005). Teachers’ summative practices and assessment for learning—
tensions and synergies. Curriculum Journal, 16(2), 207–223. https://doi.
org/10.1080/09585170500136093
Hopper, T., Fu, H., Sanford, K., & Hinkel, T. (2021). YouTube for transcribing
and Google Drive for collaborative coding: Cost-effective tools for collecting
and analyzing interview data. The Qualitative Report, 26(3), 861–873. https://
doi.org/10.46743/2160-3715/2021.4639

May, S. (1997). Critical ethnography. In N. Hornberger (Ed.), The encyclopedia of


language and education, volume 8: Research methods and education (pp. 197–206).
Kluwer.
Merriam, S. B. (2009). Qualitative research: A guide to design and implementation.
John Wiley & Sons.
Neelakandan, N. (2020, April 25). What are the key concepts of competency-based
learning? eLearning Industry. https://elearningindustry.com/key-concepts-
of-competency-based-learning
Prest, A., Goble, J. S., Vazquez-Cordoba, H., & Tuinstra, B. (2021). Enacting cur-
riculum ‘in a good way:’ Indigenous knowledge, pedagogy, and worldviews in
British Columbia music education classes. Journal of Curriculum Studies, 53(5),
711–728. https://doi.org/10.1080/00220272.2021.1890836
Sanford, K., & Hopper, T. (2019). Educational transformation: The BC story.
Houlihan.
Schimmer, T. (2016). Grading from the inside out: Bringing accuracy to student
assessment through a standards-based mindset. Solution Tree.
Schneider, J., & Hutt, E. (2014). Making the grade: A history of the A–F marking
scheme. Journal of Curriculum Studies, 46(2), 201–224. https://doi.org/10.108
0/00220272.2013.790480
Smith, B., Ng-A-Fook, N., Radford, L., & Smitherman Pratt, S. (Eds.). (2018).
Hacking education in a digital age. Information Age Publishing.
Smith, B., & Sparkes, A. (2020). Qualitative research. In G. Tenenbaum & R.
C. Eklund (Eds.), Handbook of sport psychology (4th ed., pp. 999–1019). John
Wiley & Sons. https://doi.org/10.1002/9781119568124.ch49
Stiggins, R. (2008). Assessment manifesto: A call for the development of balanced
assessment systems. ETS Assessment Training Institute.
Stoll, L., Bolam, R., McMahon, A., Wallace, M., & Thomas, S. (2006). Professional
learning communities: A review of the literature. Journal of Educational
Change, 7(4), 251–258. https://doi.org/10.1007/s10833-006-0001-8
Storey, M. (2017). Considering core competencies: Social and emotional learning
in British Columbia’s redesigned curriculum. Journal of Contemporary Issues in
Education, 12(2), 34–49. https://doi.org/10.20355/C5J91R
Trilling, B., & Fadel, C. (2009). 21st century skills: Learning for life in our times.
Jossey-Bass/Wiley.
Malfunction 11
Regressive and Reductive Online
Assessment Practices
Shannon D. M. Moore, Bruno de Oliveira
Jayme, and Kathy Sanford

Since the mid-2000s, private companies offering learning management


systems (LMS), applications, and platforms (EdTech) have been clamouring
for influence in the K–12 context. EdTech is sold as a cost-effective solu-
tion to modernize education. The cost-cutting label is misleading in both
the literal and figurative senses (Moore et al., 2021). Over the past two
decades, K–12 classrooms have become graveyards to costly gadgets that
were meant to revolutionize education. Educators have been inundated
with corporate hype supported by provincial governments about the
latest EdTech that will innovate classrooms; these messages are delivered
by corporate employees with no pedagogical background and are about
EdTech not designed by educators. Like the peddling of textbooks in
previous generations, these standardized solutions divert public funds to
private corporations. However, these new digital tools also create oppor-
tunities for surveillance, data mining, and targeted advertising (Durrani &
Alphonso, 2022). The surveillance is not an unlucky consequence but
rather the intentional design of EdTech that mistakenly conflates data
collection with teaching and learning (Morris, 2018). These design elem-
ents are worrisome for multiple reasons: (a) There are risks to student
well-being, safety, and privacy when corporations drive practice; (b)
the tendency towards surveillance and data mining undermines neces-
sary trusting relationships between students and teachers; and (c) the

DOI: 10.4324/9781003347972-15
Malfunction 207

functionality of EdTech imposes and encourages transmissive pedagogy


and regressive assessment practices. Despite these concerns, neoliberal
rationale is used to increase investments in EdTech and advance online
learning.
The pandemic had a dual impact: By necessity, it increased the role
of EdTech in public education, and as governments and administrators
increasingly turned to corporations to offer educational solutions, it
increased the influence of private companies in the future direction
of public education. Investments in e-infrastructure made through
the pandemic are now being used to mobilize online learning across
Canada (Moore & de Oliveira Jayme, 2022b) even though mandatory
online learning previously faced widescale opposition from parents,
students, and teachers (Parker, 2020). Before a global pandemic
required school boards across Canada to move online, online learning
was neither provincially imposed nor widely adopted; rather, most
often, high school students self-selected online courses to supplement
their school schedules or improve their grades. As a result, critical
scholarship in the K–12 context to support increased online learning is
lacking (Farhadi, 2019).
In this chapter, we outline our concerns about online learning gen-
erally and online assessment practices specifically. Current assessment
literature encourages practices that decentre content, centre students,
and require teacher professionalism. We draw on data fragments from
interviews we conducted with teachers who pivoted online through the
pandemic to demonstrate how teaching online motivated regressive
assessment practices. Finally, we offer a few suggestions for how to main-
tain humanizing assessment practices online.
The warnings offered through this research, in particular about the
way online environments motivate regressive pedagogy and assessment
practices, are important to the teacher-education environment. That is,
although educators may be invested in practices that support student
learning, the (dys)functionality of EdTech can interfere with practice.
As such, teacher-education programs need to ensure that preservice
teachers have a solid understanding of the purpose of education, current
curricula, and sound assessment practices. Teacher-education programs
also need to highlight the way schooling structures and teaching environ-
ments can encourage pedagogy and assessment that undermines current
theories in teaching and learning. As such, reconceptualizing assessment
frameworks for preservice and in-service teachers involves making the
208 Moore, de Oliveira Jayme, and Sanford

default structures online overt, resisting functionality that reduces edu-


cation to individual completion of tasks and summative assessments, and
offering alternatives that humanize pedagogy online.

Online Pedagogy

Education, at its best, supports all learners to be critical, engaged, active


citizens. Over the past decade, shifts to enduring understanding in the
curriculum and student-centred pedagogy in the classroom better reflect
students’ diverse and complex needs. Current curricula invite com-
plexity, deep understanding, and meaningful applications of knowledge
(Manitoba Government, 2022; Province of British Columbia, 2022). The
curriculum engages pressing ecological matters, invites students to con-
sider how Canadians might collectively address historical injustices, and
challenges the primacy of Western knowledge systems. These shifts
challenge commodified understandings of knowledge, transmissive
forms of pedagogy, and the primacy of traditional literacies (Manitoba
Education, Citizenship and Youth [MECY], 2006). Despite curricular and
pedagogical progress, the tools meant to modernize education encourage
regressive forms of teaching and outdated understandings of learning.
Online learning, in its most common form, reduces learning to the
consumption of prescribed, standardized, linear content (Farhadi, 2019;
Stommel, 2020). Online pedagogy that is propelled by teachers, with their
students in mind, has the potential to reject some of the individualized,
standardized, content-focused instruction that is motivated online.
Some online pedagogues are fostering criticality, particularly about the
very technology they use for teaching and learning (see Stommel et al.,
2020). However, by and large, the default online is towards credential-
ling rather than education. Learning becomes an economic transaction
in which the student is the consumer, the teacher is the provider, and
education is a commodity (Biesta, 2005; Ramiel, 2019). By design, online
learning encourages “individualized surveilled data-driven learning [and
discourages] . . . collective curiosity and risk-taking” (Farhadi, 2019,
pp. 7–8). When efficiency is prioritized, students lose the opportunity and
the patience for deliberation and dialogue with a diverse group of people.
Yet, education is necessarily relational and collaborative (Stommel, 2018).
Without relationships, online learning inculcates individualism and com-
petition, and delimits empathy (Boyd, 2016; Fain, 2019; Farhadi, 2019;
Moore et al., 2021; Ross, 2008).

Although many pedagogical elements elucidate the way online


learning reduces and regresses pedagogy, assessment practices are par-
ticularly emblematic. Assessment, of course, is untetherable from peda-
gogy; assessment practices demonstrate a teacher’s understanding of
and approach to knowledge, learning, and students. Sound assessment
bridges teaching and learning, informs practice, and enhances student
learning (Black et al., 2004; MECY, 2006; Wiliam, 2011).

Sound Assessment

There are three forms of assessment: assessment of, for, and as learning
(Earl, 2003). Assessment of learning is a form of summative assessment
that often translates to marks and grades, whereas assessment for learning
(AfL) centres learning over grading (Black & Wiliam, 1998) and assessment as
learning invites students into the assessment process (Earl, 2003). Current
perspectives discourage competition, judging, and ranking, and instead
recognize that assessment should support ongoing active learning and
joy (Fu et al., 2022). In this way, assessment is no longer something that
is done to students, but is done with, by, and for students (MECY, 2006).
This approach to assessment is in keeping with the etymological roots
of the term assessment—to sit beside: “Assessment seen as ‘sitting beside’
implies particular roles and relationships for learner and teacher, different
from those associated with assessment as ‘standing in front of,’ ‘looking
down on,’ or ‘peering over the shoulder” (Swaffield, 2011, p. 440).
Research has repeatedly demonstrated that AfL is a powerful prac-
tice to promote learning (Earl et al., 2011); for this reason, AfL should
represent the largest share of a teacher’s practice (Earl, 2003). The pri-
macy of AfL is evident in the policy directives, resources, and professional
development of nearly every Canadian province and territory (Earl et al.,
2011; MECY, 2006). AfL is anchored to three key elements: decentre con-
tent, centre students, and teacher professionalism.

Decentre Content

AfL decentres the primacy of content. Content is at direct odds with


learning (Stommel, 2018). Curricula in British Columbia and Manitoba,
the provinces in which we live, have taken a significant turn away from
privileging content and measuring output: “A common but problematic

scenario when considering a student’s work is to pay particular attention


to the content knowledge that the student has demonstrated in the unit
or course” (MECY, 2006, p. 10). Although the Manitoba curriculum
focuses on enduring understandings, the British Columbia curriculum
uses the term essential learning, which places an “emphasis on deeper
understanding and application of concepts” (British Columbia Ministry
of Education, 2022, p. 2).

Centre Students

AfL focuses on how students learn (MECY, 2006). Current approaches


to assessment emphasize the role of the student in the assessment pro-
cess (Suurtamm & Koch, 2014). In this way, pedagogy recognizes and
accommodates students’ unique learning patterns, strengths, and
challenges (MECY, 2006). AfL centres the students’ experiences, cultures,
and ways of knowing: “Pupils are cast as partners in the learning process,
rather than as passive recipients of knowledge transmitted or delivered by
the teacher” (Swaffield, 2011, p. 440). AfL does not privilege or validate
certain types of learning or evidence of learning. Instead, it encourages
a strength-based approach to feedback that “recognizes that student
learning is dynamic and holistic, and that students demonstrate their
learning in different ways and rates” (Dueck, 2021, p. 58). When students
have choice in their learning, they feel a sense of ownership and are more
likely to be motivated to learn (MECY, 2006). In this way, teacher feed-
back is intended to highlight strengths and offer explicit guidance for
improvement; it also encourages students to develop an understanding
of their learning and to take responsibility for it (Swaffield, 2011). This
type of feedback changes traditional roles in the classroom and requires
meaningful, trusting relationships between students and teachers.

Teacher Professionalism

AfL is a complex process that requires teachers’ professional judgement
in order to modify pedagogy and be responsive to student needs (MECY,
2006). “AfL depends on the knowledge and expertise of teachers—their
knowledge of students, of unlocking students’ thinking, of feedback,
of curriculum, of teaching, of pedagogical content knowledge, and of
Malfunction 211

learning theory” (Earl et al., 2011, para. 17). All elements of this process
require strong, trusting relationships between teachers and students.
Because EdTech is designed within and for a neoliberal frame, it
perpetuates a culture of individualized task completion, grading, com-
petition, surveillance, and standardization that directly contradicts
assessment literature.

Context for Study

In a research project conducted by two of the authors (Moore &
de Oliveira Jayme, 2022a, 2022b), 14 K–12 educators were interviewed
about the transition to online environments due to COVID-19. The
research had three principal objectives:

• to understand how the move to online platforms through the pandemic
impacted educators’ pedagogical methods and curricular engagement;
• to consider how the move to online platforms unleashed or delimited
creativity and criticality; and
• to consider the increasing role of EdTech companies in public
education.

Participants in that project spoke of the “hostility” of online learning,
the impact of tech functionality on pedagogy, and the absence of care or
ethics within “the machine.” The teacher participants overwhelmingly
resisted the ways technological functionality and imposition regressed and
reduced their pedagogy and consequently dehumanized, demoralized,
and deprofessionalized their practice (Moore & de Oliveira Jayme, 2022a,
2022b). This chapter summarizes findings from that research.

Findings

Within the interview transcripts, we identified two major themes that
elucidate the coercive and corrupting tendencies of EdTech: The func-
tionality of EdTech, and the overall environment online, interfered with
necessary relationships in teaching and learning. As a result, teachers
defaulted to regressive practices that centred content, task completion, and
summative assessment. In turn, the teacher participants felt demoralized,
as these practices did not align with their informed understandings of
pedagogy and assessment. Teacher participants also indicated that EdTech
interfered with the visceral, lively, and relational aspects of teaching and
learning. Online, teaching and learning are dehumanized, teachers are
deprofessionalized by coercive technology, and teachers feel complicit in
systems that promote distrust and surveillance of students.

Regressed and Reduced

The educators in our study made clear that the intent of education is to
encourage critical and creative thinking, motivate student inquiry, foster
relationships, develop problem-solving skills, and practice discussion and
deliberation. With this understanding in mind, the participants could
not reconcile their participation in transmissive pedagogy that deepened
inequities and abandoned students with exceptionalities. They spoke
of the time they spent making videos, sending emails, uploading con-
tent, and tallying assignments. Most of their time was spent preparing
and uploading content and materials. These activities did not reflect the
participants’ understandings of pedagogy, assessment, or the purpose of
education. This incongruity was especially true for assessment:

It [online learning] made it so much more about summative
assessment. Formative assessment was so impossible to gather
because a lot of it is just done informally . . .. The formative stuff
was near impossible. I couldn’t just go around and take a look over
someone’s shoulder and ask them to hand in a draft of what they
were working on or whatever. It became all or nothing, which was
super inauthentic, and it felt really gross. (Dakota, a pseudonym, as
are all participant names in this chapter)

Despite the participants’ views on teaching and learning, education online
was often reduced to the dissemination and completion of assignments.
The teacher participants reverted to practices that centred the teacher and
fetishized individual products of learning: “Actually, I would give them
like a menu board, and they would choose which activities they wanted
to do and then submit those” (Alex). Although Alex’s use of student
choice aligned with the assessment literature, the way in which learning
was reduced to individual assignment completion did not. As the focus of
their pedagogy became about the completion of assignments, assessment
became more about product than process. This emphasis on the product
counters the focus on process in the assessment literature: “Students
should be afforded multiple opportunities throughout the learning pro-
cess to demonstrate what they know and can do” (Suurtamm & Koch,
2014, p. 264). AfL requires teachers to help students throughout the
learning process, not just create assignments and mark submissions.
Otherwise, education becomes what another of our participants called
“self-checkout education” (Dakota).
The banking nature of LMS encourages repositories of resources in
which students complete tasks and master content (Boyd, 2016). Yet,
students need meaningful interactions with people in their classroom
community, not just access to materials (Protopsaltis & Baum, 2019). The
turn to summative assessment online, a single event in which students
express knowledge, ignores decades of research and writing on the essen-
tial role of AfL. Our findings parallel a recent report released about the
impact of the pandemic on pedagogical practices (Doucet et al., 2020);
authentic formative assessment is more difficult online. Because group
collaboration, dialogue, drama, and art making are more difficult online,
there is a risk of “slipping back to a reliance on traditional summative
assessments like tests, exams, and essays” (LaPointe-McEwan et al., 2021,
para. 3).

Dehumanized, Demoralized, and Deprofessionalized

[Online teaching has] severely impacted my relationship with students,
you know. I don’t see their faces. I don’t always recognize the sound of
their voice. I can’t identify their work by looking at it. I don’t know what
they’re thinking. I can’t see facial expressions, so I don’t know how to read
any of them anymore. I don’t know how to anticipate their needs.
(Riley)

Teacher participants spoke of being cut off from the bodily, the visceral,
experience of teaching and learning. These comments follow Stommel’s
(2018) critiques of the ways LMS “replace the playful work of teachers
and students with overly simplified algorithms that interface with far too
few of the dynamic variables that make learning so visceral and lively”
(p. 78). As Freire (2005) wrote, “We study, we learn, we teach, we know
with our entire body” (p. 5). Teaching is a full-blooded, social, human
process (Connell, 1993). Teaching is “fundamentally emotional work
that involves getting up close to students and drawing heavily on social,
emotional resources and energy necessary for continual improvisation”
(Smyth, 2012, p. 15). The lack of visceral and emotional connection
online impacted the teachers’ capacity to respond to the particular
needs of their students. As Riley’s quote above elucidates, they were left
searching for facial expressions, voices, a sense of who their students
were. Without these connections, teachers lack the knowledge required
to properly engage AfL. Riley could not read their students, and in turn
did not feel the connection required to guide learning. Riley’s sentiments
were echoed by other participants:

And at home it’s really hard because I can’t see their little furrowed
brow, you know. Like I don’t know how to do it . . .. I don’t know how
to explain it, but like you know what, teachers know what kids need
from their body language. I know who needs help with something by
how they’re sitting or how they’re looking. You don’t know that online.
(Noah)

Online teachers lack the necessary interaction with students to make the
daily professional assessments required. AfL requires teacher professional
judgement, which requires the ability to see, understand, and connect
with students.
Beyond a lack of recognition of their students, the teacher participants
often did not recognize themselves: “And I had no interest in interacting
with my students, which is the absolute opposite of what I’m normally
like” (Dakota). For Dakota, the inability to be in the same physical envir-
onment with their students, to teach with their full body—alongside the
goosebumps, smiles, furrowed eyebrows, crossed arms, and exchanged
glances—destroyed their desire to connect with students. This disem-
bodied form of teaching contributed to demoralized feelings, as the teacher
participants could not fulfil the ethical requirements of their profession.
The functionality of EdTech not only interfered with the participants’
professional obligations but also often served as a reminder of these
failures:

Having the freaking Google Hangout [https://hangouts.google.com]
notification. I think it will be in my nightmares for the rest
of my life, where I want to help each and every one of those kids,
and the tyranny of that noise, and they make it seem so efficient
and so fluid. And it’s actually so awkward and time-consuming and
restrictive.
(Jules)

As Jules’s response demonstrates, they were haunted by the sounds of the
machine that restricts and regulates their interactions with students. The
rigid structures of LMS strictly control and reduce honest interactions
(Stommel, 2018). Ignoring the human connection and relationships
required in a classroom, EdTech redefines teaching and learning by what
can be measured and rewarded. These neoliberal approaches supplant
teacher knowledge and ethical responsibility with output and production.
Moreover, they mistake automation for professionalism. Connell (2013)
referred to this approach as a system of remote control that undermines
the capacity of teachers to make autonomous professional judgements in
the interests of their students.
Beyond hindering relationships, EdTech functionality corrupted
relationships in the classroom. Teachers could see how many times
students had logged on, the duration, and the exact time that they sub-
mitted their assignments. The machine audits, manages, and surveils, and
fundamentally changes the relationship between teachers and students.
Yet, the assessment literature stipulates that teachers must sit beside, not
surveil from above (Swaffield, 2011). The insidious managerialism, built
directly into technology, recasts teaching as the regulation of students
(Ball, 2016). In turn, teachers lack the necessary trusting connection with
their students to foster learning.
The EdTech industry is built on a distrust of students, evidenced in
surveillance testing software and plagiarism detection. These tools erode
necessary relationships between teachers and students. Moreover, these
tools place student privacy at risk, subject students to commodification
and dataveillance, penalize students for socioeconomic factors, per-
petuate racism, and regulate exceptionalities (Swauger, 2020).
The priority for EdTech developers is the collection and analysis of
student data; student value is derived from the commodification of data
assemblages (Lupton & Williamson, 2017): “The data collected profits
EdTech companies; however, it also alters students’ understanding of
educational worth, of societal values, and of themselves. Students start to
perform to please the indicators, and teachers come to see their students
as data points” (Moore & de Oliveira Jayme, 2022b, para. 23). This focus
on data fundamentally changes the relationship between students and
teachers.
Beyond our serious concerns about students’ rights and necessary
classroom relationships, we question the value and validity of assessment
that is administered through testing software. Put simply, if one can cheat
so easily (a fear fuelled by marketing), does the assessment tool really
align with the current assessment literature? Or is the teacher simply
using a fancy tool to evaluate student recall? Moreover, these techno-
logical solutions miss the foundations of assessment: to be fair and valid.
Is it fair to expect someone to be recorded while they are taking a test?
Moreover, if the research shows that students are motivated and engaged
when they feel they are being treated fairly (Tierney, 2014), how would
this nonconsensual surveillance impact student achievement? When one
is being recorded, are their test results valid? Does any of it truly reflect
what a student knows or understands?

Discussion

“You know, technology does not lead to better outcomes; better teachers
will lead to better outcomes” (Riley). As educators who have embraced
selected technology in our classrooms, we recognize certain pedagogical
advantages. Online pedagogy, however, goes beyond the integration of
technology to be reliant on it. In this way, education risks becoming a
mere exercise of technology (Freire, 2014). In turn, teachers must actively
work against the default settings that centre teachers and content, regu-
late and surveil students, and conflate data collection with teaching
and learning. Moreover, as LaPointe-McEwan et al. (2021) reiterated,
although the principles of sound assessment remain, the way teachers
achieve these principles may look different online. When students and
teachers, rather than technology companies, propel curricula and
methods online, learning better reflects current understandings of cur-
riculum and pedagogy.

Humanizing Assessment

Standardization, objectification, dehumanization, control, surveillance,
and measurement are traits of neoliberalism; current schooling practices,
particularly those commonly practised and encouraged online, naturalize
these traits. A truly liberating and emancipatory education offers alter-
native ways of being in the classroom and in the world (Freire, 2001).
Emancipatory education, as opposed to banking education, inspires
students and teachers to take ownership of their dreams and curiosities,
and to assume responsibility for their own learning. This emancipation is
achieved by bringing to the classroom what students already know; that
is, their intra- and interpersonal experiences (Vygotsky, 1962).
This shift is possible only if teachers reject the ranking systems that are
encouraged by design in LMS. Instead, teachers need to invite learning
and assessment that are intimately interwoven and produced with active
student participation. When students are active participants in how they
want to be assessed, what kind of questions deserve their answers, and
how they can demonstrate links between what they learn in the class-
room and what they already know (and vice versa), more inclusive and fair
learning opportunities can be contemplated. If, during AfL, students have
the opportunity to share their dreams, curiosities, fears, and doubts about
the subject matter and the world, teachers and students can all experi-
ence stronger and healthier relationships. There will be no empty minds
and no sole knower in this process: Everybody knows, everybody teaches, and
everybody learns.
To illustrate emancipatory assessment practices that decentre con-
tent, centre students, and value teachers’ professionalism, we provide an
example of how one of the authors developed assessment practices that
operate upon (a) centredness, (b) consciousness, and (c) responsive con-
nectedness to the whole collective. These practices follow a framework
that Freire (1994) identified as dialectical methodology (see Figure 11.1).
Freirian dialectical methodology starts with people (Freire, 1994); in
this case, students and teachers verbalizing and reflecting upon their
own existence and how they experience their worlds. For this reflection,
students and teachers bring to the classroom their previous knowledge,
personal histories, and how they practice their existence in the world.
Once participants share this background, students and teachers are
invited to analyze individual practices. One easy way to start the analysis
of the individual practices is to encourage students to ask themselves,
“Why do I operate in the world in this way?” and “Why do I think this
way in this situation?” Students are then invited to act and react upon
these experiences by breaking patterns and traditional ways of experien-
cing the world, thus developing new practices. The dialectical relation-
ship between old practices and new systems of operation, according to
Figure 11.1 Freirian dialectical methodology.

Freire (1978), leads to individual and collective transformation. And this
transformation is paramount in providing a different reality for every-
body involved in this emancipatory assessment practice.
In this light, the next example is the materialization of this dialectic
relationship and an example of an alternative way students can demon-
strate what they know: a counter narrative to extensive and exhausting
written documents (read only by the teacher) and standard exams. In add-
ition, this emancipatory assessment practice, according to Sanford et al.
(2012), aims to (a) put the learning of others before the teacher’s own,
(b) deconstruct banking models of education, (c) benefit future generations,
and (d) help participants to find and explore their own passion to enhance
the well-being of their community.
Emancipatory assessment was the framework of a course that, due
to the COVID-19 pandemic, was facilitated online once a week for three
months with graduate students who were also practising teachers.
Although the course was online, the graduate students were teaching in
person in their respective schools. As part of the course, these graduate
students were tasked with using Freirian dialectical methodology to
engage their own students in conversations that mattered to them and
to develop a community arts project that reflected these conversations.
The conversations started from the graduate students’ personal
experiences during their time in the school and became broader as
their peers started noticing patterns and differences in everybody’s
responses. This dialogue amongst graduate students and their students
in their respective schools generated keywords and themes for the
community art project. The themes included a wide range of topics,
such as LGBTQ2+ identities and experiences, anxiety about the pan-
demic, social media, and body image. The community art projects
produced during this task included murals, theatre performances,
poetry, and music.
Once the graduate students completed this process, the class started
our online emancipatory AfL. The graduate students did not submit a
written report or a reflective paper about their experience. Rather, they
made short audio recordings where they had the opportunity to express
their personal experiences (practice) while developing their community
art project, their reflection upon their experiences (theory), and their
reaction upon their experiences (practice). At this time, the graduate
students were invited to create a series of questions that they would like
to answer. This process aligned with the principles of assessment outlined
by MECY (2006) because it opened space for students to have owner-
ship of their own learning and to express what mattered to them. The
graduate students generated questions in three categories designed to
decentre content and centre students (the first two categories) and to
value their judgement and professionalism as teachers (the last category).
In the personal experiences category, the graduate students created
these questions:

1. What excited me about this project? Why?
2. What was I scared of? Why?
3. What was my previous knowledge that was useful in this project?
4. What did I learn while conducting this project?

For reflection upon experiences, decentring content and centring students,
the graduate students created the following questions:

1. What does our community art project communicate and/or represent?
2. What did I notice my students had difficulty with?
3. How did they overcome such obstacles?

Finally, for valuing teachers’ professionalism, the graduate students listed
these questions:

1. What did I do to help them overcome such obstacles?
2. What are the things I wish I had done differently and why?
3. What will I teach my following generations about the knowledge
that was cocreated during this project?

Approaching online AfL from an emancipatory perspective moved dialogue
to the forefront of the learning process while opening opportunities
for the graduate students to exercise their agency by inviting
them to decide what questions were worth their answers. In turn, this
humanized and centred the graduate students over content because
emancipatory AfL valued their previous experiences, cultures, and
knowledge. It embraced the graduate students’ strengths and challenges.
Moreover, it valued their judgement and professionalism as teachers by
moving them from the object of assessment to the subject of their own
historical process of becoming emancipated teachers. This humanization
was possible because emancipatory online AfL helped them to critic-
ally analyze what they did, how they did it, and what they could have
done differently. Emancipatory AfL that is student centred, dialogical,
and dialectical (re)establishes and perpetuates strong and trustworthy
relationships between teachers and students. This type of assessment,
in turn, opens spaces for students to develop their problem-solving
skills and, most important, is strong evidence of an alternative way of
humanizing assessments that are fair and inclusive for both teachers and
students.

Conclusion

This emancipatory online experience was possible because
the teachers in the graduate class entered with a clear understanding
of the purpose of education and an informed understanding of fair,
valid, and ethical assessment practices. Moreover, the pedagogy
of this course worked against the default functionality online and
instead focused on humanizing pedagogy and centring relationships.
This focus is a reminder that instead of teacher educators focusing on
how to use technology, they need to first ensure that educators have a
clear vision of teaching and learning and are steeped in the literature
on pedagogy and assessment. Otherwise, the risk is that technology
will undermine necessary relationships and reduce pedagogy. In add-
ition, all teachers need to be attuned to the ways that technology
can corrupt and coerce trusting relationships between teachers
and students, and actively resist that corruption. As professionals,
educators need to step back and ask whether we should be moving
our classrooms online. When online, teachers should drive the peda-
gogical practices.
The professionalism of teachers cannot be programmed, and pedagogy
should not be confined and defined by either the limits and functionality
of software or the interests of private corporations. Teacher educators
and educational researchers must challenge the default functionality and
ideological impositions of EdTech. Educators need to maintain their vigi-
lance against the erosion of their professionalism, the dismissal of their
experience and education, and the tendency online to reduce education
to credentialling. Although the classroom context is obviously different
online, the principles of sound pedagogy and assessment remain.
Educators need to reassert their professionalism:

Rather than simply transplanting the Lego castle of education from
one platform to another, we need to start dismantling it piece by
piece . . .. Nothing can be taken for granted. Everything must be
broken in order to be creatively and ethically rebuilt.
(Stommel, 2018, pp. 78–79)

This rebuilding is the work of professional educators, not corporations or
neoliberal reformers.

References

Ball, S. J. (2016). Neoliberal education? Confronting the slouching beast. Policy
Futures in Education, 14(8), 1046–1059. https://doi.org/10.1177/14782103
16664259
Biesta, G. (2005). Against learning. Reclaiming a language for education in an
age of learning. Nordisk Pedagogik, 25, 54–66. www.idunn.no/doi/10.18261/
ISSN1891-5949-2005-01-06
Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Assessment for
learning: Putting it into practice. Open University Press.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment
in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/
0969595980050102
Boyd, D. (2016). What would Paulo Freire think of Blackboard: Critical peda-
gogy in an age of online learning. The International Journal of Critical Pedagogy,
7(1), 165–186.
British Columbia Ministry of Education. (2022). BC’s redesigned curriculum: An
orientation guide [Infographic]. Government of British Columbia. https://
curriculum.gov.bc.ca/sites/curriculum.gov.bc.ca/files/pdf/supports/curric
ulum_brochure.pdf
Connell, R. (1993). Schools and social justice. Temple University Press.
Connell, R. (2013). The neoliberal cascade and education: An essay on the
market agenda and its consequences. Critical Studies in Education, 54(2),
99–112. https://doi.org/10.1080/17508487.2013.776990
Doucet, A., Netolicky, D. M., Timmers, K., & Tuscano, F. J. (2020). Thinking about
pedagogy in an unfolding pandemic (Version 2.0). Education International. https://
issuu.com/educationinternational/docs/2020_research_covid-19_eng
Dueck, M. (2021). Giving students a say: Smarter assessment practices to empower and
engage. ASCD.
Durrani, T., & Alphonso, C. (May 24, 2022). Technology used by educators in
abrupt switch to online school shared kids’ personal information, investi-
gation shows. The Globe & Mail. www.theglobeandmail.com/canada/
article-online-school-kids-privacy-data
Earl, L. (2003). Assessment as learning: Using classroom assessment to maximize stu-
dent learning. Corwin Press.
Earl, L., Volante, D. L., & Katz, S. (June 17, 2011). Unleashing the promise of
assessment for learning. EdCan Network. www.edcan.ca/articles/unleashing-
the-promise-of-assessment-for-learning/
Fain, P. (2019). Takedown of online education. Inside Higher Ed. www.inside
highered.com/digital-learning/article/2019/01/16/online-learning-fails-
deliver-finds-report-aimed-discouraging
Farhadi, B. (2019). “The sky’s the limit”: On the impossible promise of e-learning in
the Toronto District School Board [Doctoral dissertation]. University of Toronto.
TSpace Repository. https://tspace.library.utoronto.ca/handle/1807/9744
Freire, P. (1978). The pedagogy of the oppressed. Seabury.
Freire, P. (1994). Pedagogia da esperança: Um reecontro com a pedagogia do oprimido
[Pedagogy of hope: A reunion with the pedagogy of the oppressed] (3rd ed.).
Paz e Terra.
Freire, P. (2001). Pedagogia dos sonhos possíveis [Pedagogy of possible dreams].
UNESP.
Freire, P. (2005). Teachers as cultural workers: Letters to those who dare teach. Taylor &
Francis.
Freire, P. (2014). Pedagogy of commitment. Routledge. www.routledge.com/
Pedagogy-of-Commitment/Freire/p/book/9781594519734
Fu, H., Hopper, T., Sanford, K., & Monk, D. (2022). Learning with digital
portfolios: Teacher candidates forming an assessment identity. The Canadian
Journal for the Scholarship of Teaching and Learning, 13(1). https://doi.
org/10.5206/cjsotlrcacea.2022.1.11108
LaPointe-McEwan, D., Lam, D. C. Y., & DeLuca, D. C. (May 27, 2021). The
challenge of online assessment. EdCan Network. www.edcan.ca/articles/
the-challenge-of-online-assessment/
Lupton, D., & Williamson, B. (2017). The datafied child: The dataveillance of
children and implications for their rights. New Media & Society, 19(5), 780–794.
https://doi.org/10.1177/1461444816686328
Manitoba Education, Citizenship and Youth. (2006). Rethinking classroom
assessment with purpose in mind: Assessment for learning, assessment as learning,
assessment of learning. www.edu.gov.mb.ca/k12/assess/wncp/full_doc.pdf
Manitoba Government. (2022). Assessment and evaluation. www.edu.gov.mb.ca/
k12/assess/report_cards/grading/competence.html
Moore, S. D. M., & de Oliveira Jayme, B. (2022a, April). Control, alt,
deprofessionalize: The ethical consequences of online learning and the advancement
of neoliberalism [Conference presentation]. AERA Annual Meeting.
Moore, S. D. M., & de Oliveira Jayme, B. (January 6, 2022b). Self-checkout edu-
cation: The deprofessionalizing, dehumanizing and demoralizing impacts
of online education. The Monitor. https://monitormag.ca/articles/self-check
out-education
Moore, S. D. M., de Oliveira Jayme, B., & Black, J. (2021). Disaster capitalism,
rampant EdTech opportunism, and the advancement of online learning in
the era of COVID19. Critical Education, 12(2), 1–21. https://ices.library.ubc.
ca/index.php/criticaled/article/view/186587
Morris, S. M. (2018). Reading the LMS against the backdrop of critical pedagogy.
In S. M. Morris & J. Stommel (Eds.), An urgency of teachers: The work of critical
digital pedagogy (pp. 91–101). Hybrid Pedagogy.
Parker, L. (March 9, 2020). Mandatory e-learning is a problem in Ontario
high schools. The Conversation. http://theconversation.com/mandatory-e-
learning-is-a-problem-in-ontario-high-schools-133041
Protopsaltis, S., & Baum, S. (2019). Does online education live up to its promise?
A look at the evidence and implications for federal policy. Center for Educational
Policy Evaluation.
Province of British Columbia. (2022). Search BC’s course curriculum. https://cur-
riculum.gov.bc.ca/
Ramiel, H. (2019). User or student: Constructing the subject in Edtech incu-
bator. Discourse: Studies in the Cultural Politics of Education, 40(4), 487–499.
https://doi.org/10.1080/01596306.2017.1365694
Ross, E. W. (2008). E-learning. In S. Mathison & E. W. Ross (Eds.), Battleground
schools (pp. 221–228). Greenwood Publishing.
Sanford, K., Williams, L., Hopper, T., & McGregor, C. (2012). Indigenous
principles decolonizing teacher education: What we have learned. Education,
18(2), 18–33. https://doi.org/10.37119/ojs2012.v18i2.61
Smyth, J. (2012). Problematising teachers’ work in dangerous times. In
B. Down & J. Smyth (Eds.), Critical voices in teacher education: Teaching for
social justice in conservative times (pp. 13–25). Springer. https://doi.org/10.1007/
978-94-007-3974-1_2
Stommel, J. (2018). A user’s guide to forking education. In S. M. Morris &
J. Stommel (Eds.), An urgency of teachers: The work of critical digital pedagogy
(pp. 77–82). Hybrid Pedagogy.
Stommel, J. (May 11, 2020). Love and other data assets. www.jessestommel.com/
love-and-other-data-assets/
Stommel, J., Friend, C., & Morris, S. M. (Eds.). (2020). Critical digital pedagogy:
A collection. Hybrid Pedagogy.
Suurtamm, C., & Koch, M. J. (2014). Navigating dilemmas in transforming
assessment practices: Experiences of mathematics teachers in Ontario,
Canada. Educational Assessment, Evaluation and Accountability, 26(3), 263–287.
https://doi.org/10.1007/s11092-014-9195-0
Swaffield, S. (2011). Getting to the heart of authentic assessment for learning.
Assessment in Education: Principles, Policy & Practice, 18(4), 433–449. https://
doi.org/10.1080/0969594X.2011.582838
Swauger, S. (April 2, 2020). Our bodies encoded: Algorithmic test proctoring in higher
education. Hybrid Pedagogy. https://hybridpedagogy.org/our-bodies-encoded-
algorithmic-test-proctoring-in-higher-education/
Tierney, R. D. (2014). Fairness as a multifaceted quality in classroom assessment.
Studies in Educational Evaluation, 43, 55–69. https://doi.org/10.1016/j.stue
duc.2013.12.003
Vygotsky, L. S. (1962). Mind in society: The development of higher psychological
processes. Harvard University Press.
Wiliam, D. (2011). Embedded formative assessment. Solution Tree Press.
12 Leveraging the Relationship Between Assessment, Learning, and Educational Technology

Katrina Carbone, Michelle Searle, and Lori Kirkpatrick

Teachers need to understand the theoretical and philosophical
underpinnings of how to measure student learning and use assessment
appropriately (Fives & Barnes, 2020). In Ontario, this responsibility
is mandated through Growing Success, a policy document that clari-
fies approaches to assessment, evaluation, and reporting (Ministry
of Education, 2010). The policy outlines formative and summa-
tive assessment purposes, data-driven assessment, quality feedback,
reporting, and communication. Teacher assessment practices impact stu-
dent learning, motivation, well-being, and future admission opportun-
ities, all while informing future instructional decisions (Black & Wiliam,
1998; Moss, 2013). Assessment is of critical importance, and it becomes
complex when mediated through technology. Technology has been
increasingly present in education over the past decades, and it can provide
students with access to educational opportunities beyond the physical
spaces of teaching and learning (Khan & Jawaid, 2020). Research suggests
that technology has the potential to support student learning (DeCoito &
Richardson, 2018) and make learning more accessible and inclusive for
students with different needs (Amiel & Reeves, 2008; Kirkpatrick et al.,

DOI: 10.4324/9781003347972-16
2017). Across various grade levels and subject content, technology can
also make learning more engaging and collaborative, increase motivation,
and encourage self-paced learning with student independence (Ciampa,
2014; Palaigeorgiou & Papadopoulou, 2018; Raja & Nagasubramani,
2018). However, the positive impacts of technology in learning environ-
ments remain contested by some (D’Arcy et al., 2014), and debates con-
tinue as researchers and educators caution that technological tools can be
distracting, increase opportunities for cheating, disconnect learners, and
lead to the deterioration of students’ competencies in reading, writing,
and mathematics (Alhumaid, 2019).
When the COVID-19 pandemic erupted and physical school spaces
were closed, much of education became mediated through technology.
North America and many other regions quickly transitioned to fully
remote schooling with a huge reliance on synchronous platforms such
as Zoom (https://zoom.us), Microsoft Teams (www.microsoft.com/
microsoft-teams), and Google Classroom (https://classroom.google.
com; LaBonte et al., 2021). In keeping with this shift, educators needed
to modify their assessment methods so that assessment could be offered
through technology. Research from prior to the pandemic found that
many educators reported feeling unprepared or uncomfortable using tech-
nology for teaching and assessment (Johnson et al., 2016). As restrictions
related to the pandemic have eased, the predominance of technology in
education has remained (Willcott, 2021). For example, Ontario continues
to offer online classes as well as fully virtual schools.
Given the predominance of technology in education, it is imperative
to question how technology could support and enhance educational
practices, including assessment. Teachers need to adapt to the digital
environment and provide students with opportunities to develop the
skills required to be successful in the technological era. Figure 12.1 shows
the substitution, augmentation, modification, and redefinition (SAMR)
model developed by Puentedura (2006) to provide a heuristic for examining
technology in assessment, by which we mean all forms
of assessment delivered or completed with a device. Because limited
research has explored the use of the SAMR model in K–12 contexts, the
distinctions between each category would benefit from empirical evi-
dence about assessment and technology (Hamilton et al., 2016). With this
model in mind, in this chapter we ask, how do classroom teachers leverage
technology to support the evolution of their classroom assessment
conceptualizations and practices, and how might these understandings
translate to online learning contexts?
Figure 12.1 The SAMR model. Adapted from The SAMR Model Explained (With
15 Practical Examples), by J. Best, 2015 (www.3plearning.com/blog/
connectingsamrmodel/) and from Learning, Technology, and the
SAMR Model: Goals, Processes, and Practice, by R. Puentedura, 2014
(www.hippasus.com/rrpweblog/archives/2014/06/29/.pdf).

Research Context

Data were collected from a three-year collaborative program evaluation
(Shulha et al., 2016) in a rural school district in Ontario. We, the authors
of this chapter, are teacher educators who conducted this research in a
K–12 context. The program evaluated was a ministry-funded technology
initiative that enabled the district to purchase iPads for every teacher and
student in Grades 7 to 10. Over 2,000 iPads were distributed and preloaded
with basic district-supported software, and individual teachers and school
communities decided on their specific use. The district retained the own-
ership of the devices, but they temporarily became the property of the
teacher or student for the school year.
One element of the technology initiative with which we were particu-
larly involved, and which is the focus of this chapter, was a voluntary
professional learning community (PLC) about assessment. In the PLC,
participants engaged with us as lead evaluators and with other educators
from the district about assessment with technology.
PLCs are viewed as a model for professional learning in Ontario
(Ontario Educators, 2007). PLCs are continuous, job-embedded learning
opportunities where educators meet regularly to share expertise, discuss
student work, and plan for instruction (Woodland & Mazur, 2015). The
primary purpose of this PLC was to provide structure, encouragement,
and resources for participating teachers while simultaneously gathering
data about emerging classroom assessment practices with the devices.
Participation was voluntary; teachers participated because of an interest
in learning and sharing about assessment with technology.

Methodology

Initially, data related to the assessment PLC were collected for district
purposes related to understanding how teachers leveraged technology
as they developed assessment conceptualizations and practice. The PLC
was the driver for understanding how teachers integrated technology
with assessment. Now, we are analyzing the same data through quali-
tative secondary analysis (QSA) so that the findings can be shared with
academic and research audiences. QSA has a practical advantage of maxi-
mizing the use of existing data and extending original studies (Tate &
Happ, 2018).

Participants

Although all teachers in the district who received an iPad were invited to
join the PLC, only a small subset participated. In total, 61 teachers from
24 schools took part. Participants were predominantly female (74%;
n = 44), and the majority of teachers were experienced, with 85% (n = 52)
indicating they had between 5 and 25 years of experience. Representation
across divisions was fairly even, with 43% (n = 26) elementary teacher
participants and 52% (n = 32) secondary teacher participants, and 5%
(n = 3) of participants identified themselves as program support coaches
assisting with implementation. Participants represented an array of cur-
riculum areas; the three most common were English, science, and math.

Data Collection

A substantial amount of data were collected from the PLC as part of the
program evaluation. For the purposes of this chapter, data from surveys,
online discussion board posts in the PLC, and arts-based activities are
included.
Survey Data

Prior to the launch of the PLC, the inquiry team co-constructed a survey
for use at the launch and wrap-up of the PLC. At the introductory session,
participants completed the anonymous survey with a total of 11 questions:
nine closed and two open-ended. Questions included demographic infor-
mation and iPad-specific questions (e.g., comfort using the iPads). During
the concluding session, participants completed an anonymous survey
(n = 28) that included seven questions (one closed and six open-ended) to
share their experiences with the PLC and key takeaways about personal
and professional shifts. Both surveys were offered online using Qualtrics
(www.qualtrics.com); the online offering was in alignment with the
overall PLC experience.

Discussion Board Postings

The discussion board was an online platform that allowed for asyn-
chronous engagement in the PLC (Osborne et al., 2018). Participants
posted an answer to a question posed by the inquiry team (e.g., When
thinking about assessment and the devices, what are you celebrating and
what is challenging for you?), uploaded resources or exemplars, described
their experiences, reflected on their growth, and interacted with one
another using the comment feature on the forum. There were 167 posts
in total (n = 55); 64% of participants posted 10 or more times, and 30% of
participants posted 20 or more times. The length of the postings varied;
some were short comments and others were in-depth paragraphs.

Arts-informed Data

Arts-informed inquiry is "a mode and form of qualitative research that
is influenced by, but not based in, the arts broadly conceived" (Cole &
Knowles, 2008, p. 6). Incorporating artful inquiry allowed the inquiry team
to “explore, understand, represent and even challenge human action and
experience” (Savin-Baden & Wimpenny, 2014, p. 1). During face-to-face
sessions, participants engaged in artful activities such as image elicitation
(at the launch of the PLC) and creative poetry (at the conclusion of the
PLC) that provoked thinking about using technology-assisted assessment.
During the session to launch the PLC, image elicitation (Harper, 2002)
was used with images provided by the inquiry team to develop community
amongst the PLC members and generate data about participant perceptions
of assessment and technology. Participants were provided with prompts
to provoke their selecting and sharing with images. Prompts included
“Thinking about assessing with technology makes me wonder . . . ,”
and “A question that lingers for me is . . .” Participants responded to the
prompt with words, word clusters, or full sentences. At the last session of
the PLC, poetic inquiry was used. Poetic inquiry is an emergent research
methodology that has no fixed definition “because the work undertaken
through the methodology is not limited solely to artistic, aesthetic, educa-
tional or research-focused spaces” (Vincent, 2018, p. 49). In the PLC con-
text, poetic inquiry was used during the concluding face-to-face session so
that participants could reflect upon their experiences. Participants were
divided into groups and each group created folded poems, pieces
of anonymous collaborative writing, by responding to a series of prompts
(e.g., “Three words that best describe my assessment practices with the
devices,” and “One wish I still have is . . .”).

Data Analysis

Data (survey data, discussion board postings, arts-informed data) from the
PLC were analyzed using an inductive three-step coding cycle (Saldaña,
2013) with sessions for the authors of this chapter to engage in collabora-
tive dialogical sensemaking. In the first cycle, precoding, which involved
circling, highlighting, bolding, underlining, or colouring rich or significant
participant quotes or passages (Layder, 1998), was used to become familiar
with the specificities of the data and key technology and assessment ideas.
Next, the first author used the initial coding method to provide analytic
indications for future exploration (Charmaz, 2014). Using findings from
the initial coding approach, focused coding allowed for categorization
based on thematic or conceptual similarities (Saldaña, 2013).

Results

Two salient themes arose from the data. The first was related to the
PLC community. The second and most dominant theme was related
to expanded assessment practices. Subthemes for expanded assessment
practices were student choice, meaningful feedback, differentiated
instruction, and a learning culture.

Community in the PLC

The PLC was a place where teachers shared their excitement for learning
and explored how inquiry, in the form of questioning, can nourish passion. The
intentional efforts to build a community through the blended model
were well received by participants because they “felt a real sense of com-
munity with everyone” and explained “it was REALLY important to see
and talk with colleagues at the beginning and end of the project.” When
asked at the end of the PLC survey if the community was a useful space
for professional collaboration, 91% of respondents indicated they agreed
(n = 23) or strongly agreed (n = 7). Teachers were asked if they would
engage in a similar online community in the future, and 88% (n = 30)
responded “yes.”

Expanded Assessment Practices

Expanding assessment practices was coded as the most salient theme
because although there were differences and nuances in the individual
assessment practices, most participants identified that technology had
contributed to growth in how they thought about and enacted assessment.
Teachers noted that technology allowed their assessment practices to
move beyond traditional assessment methods while serving similar and
expected purposes regarding feedback and grading. Some teachers self-
described their “old school” formative assessment methods but reflected
on how their teaching style was improved upon by locating “apps [they]
can use to achieve the same tasks.” Other teachers indicated that tech-
nology enabled them to rethink the relationship between teaching,
learning, and assessment.

Student Choice

Technology made it practical to offer students choice in assessment. In
many examples, teachers offered assessment criteria, expectations, or a
problem, and students had the opportunity to choose the application or
website they wanted to use to demonstrate their understanding, learning,
or process. By incorporating choice, teachers noted higher levels of stu-
dent engagement, motivation, comfort, and creativity. One teacher
described: “I find that the students are more engaged in daily writing and
responding through this method versus traditional writing prompts and
journaling.” Another participant wrote that “when planning, [I] try to
offer variety for the students. . . . This is what has allowed students to
be engaged, and as a result, the output is much greater.” A third teacher
explained that student choice “provided information around the students’
comfort and ability level based on the programs they choose for their
degree of customization and creativeness.”
Although offering student choice emerged as significant in the col-
lective data, one teacher said that student choice was not always benefi-
cial; they shared this insight:

[We are] overcoming the myth that students are proficient at all
these neat apps/formats and are itching to use a variety of them
if we give them choice. I find that is true for a very small number
of them. Most are either overwhelmed by the choice, and/or
revert back to what they know. So mandating certain formats, and
instructing them on the effective use of each, is still a focus in my
junior classes.

The quote is a reminder that although choice may enhance education, it
is important to include explicit teaching that embeds technology support
into instructional and assessment processes.

Meaningful Feedback

Throughout the PLC, teachers identified meaningful feedback as a
core component to improve student learning. One teacher described
a process with separate due dates “in an attempt to provide more for-
mative feedback that promotes self-assessment. The first due date is to
receive feedback, not editing but comments that will require students
to reflect, self-assess and make changes.” Similarly, another participant
noted “assessing with descriptive feedback allows a greater focus on what
learning needs improvement.” Yet, one teacher identified that “students
need time to reflect on their returned assignments with feedback. Rather
than requiring students to correct and resubmit the same day, extra time is
required to digest the feedback and work on their next steps.” Comments
related to teacher feedback signalled an overall desire to provide mean-
ingful feedback to promote learning and a recognition that this process
required an investment of time, for both educators and students.
Providing meaningful feedback through technology presented a
challenge for many teachers. Statements from two teachers included
“I still have work to do with regards to kids using feedback provided,”
and “Writing feedback on a screen is difficult and can be hard to read.”
Additional challenges noted included "screen fatigue" and technology as
"tedious." Some comments reflected frustration with learning a
new approach that might or might not prove useful or effective in promoting
student learning. Perennial issues around getting students to read and
apply feedback persisted in the online space with additional issues around
navigating technology and streamlining across multiple applications.
One way participants addressed the use of feedback was to distribute
classroom leadership by encouraging students to engage in peer-to-peer
teaching. Teachers used the devices as a way for students to “becom[e]
great leaders both with each other in class and with other classes.” By pro-
viding leadership opportunities to students, students had increased access
to and investment in feedback as well as an opportunity to develop skills
that they could carry into adulthood.

Differentiated Instruction

Differentiated instruction is an expected part of Ontario classrooms that
aims to maximize each student's growth through an individualized approach
(Landrum & McDuffie, 2010). Participants engaged in image elicitation to
identify their beliefs of the past, present, and future of technology-assisted
assessment. The most common image to describe the past depicted sharks
sitting in a row reading a book; one teacher described their selection
by asking, "Could we be doing things differently?" When thinking
about the future, the most common image represented space.
The future of technology and assessment was imagined as full of
unknowns and possibilities with innovation in education soaring. One
quote provided a portrayal of the future of technology with assessment:

Going into a black hole. Uncertain future for assessment. Endless
possibilities + tech advancing. A whole new world x2. Students
won’t be afraid to explore. . . . Finding what works and building
a relationship. Propel learning to new heights. . . . Tech makes
thinking more endless, helping kids be the best they can be rather
than teaching to an assessment.

With the images, teachers spoke to technology as a space of possibility,
enabling education to better meet the diverse needs of learners and adapt
for unknown changes. Teachers described how technology could support
assessment practices, with one teacher stating, “This assessment . . .
allows me to know my [students] as readers and to individualize my
instruction for each student. [Many students] are excited because they
have never read so much on their own.” Using technology, teachers were
able to celebrate success as they saw learners revealing new aspects of
themselves and making progress.
Technology enabled opportunities for differentiated instruction by
offering explicit teaching time dedicated to learning how to use tech-
nology. One teacher noted “a need to teach students how to appro-
priately decipher and critically analyze information from the internet
and encourage them to do more than cut and paste.” Before teachers
could use the devices for differentiation and assessment purposes, it was
important to teach all students how to access content online and strat-
egies to identify misinformation and good sources. Once teachers had
taught key competencies, they could more easily manipulate technology
to better differentiate for learners.
Given the foundation students developed while learning about tech-
nology, there were additional benefits to using various websites and
applications for assessment. For instance, assessment with technology
connected learner progression and student engagement; teachers stated,
“Student engagement is up and there is daily and consistent access to a
customized tool that fits the needs of all diverse learners and learning styles,”
and "Differentiation is still key. . . . It is important that students know they
have options on how to represent their knowledge and understanding.”
Technology propelled learner progress by aligning instructional and
assessment strategies with student needs and goals. Differentiated instruction
was commonly discussed in conjunction with learning culture, the next
subtheme of expanded assessment practices.

Learning Culture

The use of technology altered the relationship between assessment and
the learning culture within the classroom. According to Shepard (2019),
practices need to change so that students and teachers view assessment
as a source of insight and support instead of the opportunity to reward or
punish. This shift towards assessment was evident when teachers described
movement from an emphasis on grading towards a culture of learning.
One teacher explained how they promoted learning dialogues through
conferencing. Another teacher identified: “No marks are provided or
recorded; instead students receive feedback that identifies areas that need
improvement and asks reflective questions. . . . Previously if students
were satisfied with the mark, there was no incentive to improve.” A real-
ization that learning culture and feedback work cooperatively was also
noted: “Descriptive feedback on class work submitted, as opposed to a
grade, became an important learning tool. . . . As a result of this culture
of feedback and improvement, the student performed better on summa-
tive tasks.” By offering descriptive feedback efficiently online, students
were aware of the next steps in their learning and were able to adjust as
they progressed.
Learning culture extended beyond grades; teachers articulated how
assessing with technology altered their view of assessment. One teacher
described a eureka moment:

My reflection on assessment has been at an all-time high in the past
few months. . . . I am realizing that assessment [is] everything. Sixteen
years ago when I started teaching, I thought I was doing a good job
because I had fun lessons. When I reflected, however, I had no evi-
dence that they were actually learning. And, more importantly, the
students who struggled the most and needed my most support in
assessment probably didn’t actually learn as much as I would have
liked.

The inclusion of technology challenged teachers to change the way
they perceived and applied assessment in their classroom, which in turn
altered the learning culture. Teachers incorporated assessments that
merged higher-order thinking questions with critical thinking oppor-
tunities; they expanded their assessment practices to include questions
that could not, as one teacher explained, “just be googled.” As a result,
more tasks that involved real-world and authentic applications
were being integrated. One teacher noted they were "rethinking [their]
assessment practices, making them more authentic to real problems that
would engage the students” and another expressed the desire to “create
rich assessment tasks [because they] know where to start and who to ask
for help.” Teachers also began to realize that technology was a tool that
went beyond being merely a flashy reward for students; technology could
be used to make “assessments meaningful, relatable, and authentic.”
In sum, teachers supported their reconceptualization of assessment
through expanded assessment practices by providing students with
choice, meaningful feedback, opportunities for differentiated instruction,
and emphasizing a culture of learning. Although these ideas are often
valued by educators, data showed that some participants were resistant
because the impact and benefit of these strategies were classroom-specific,
depending on learner perceptions, interests, and skill sets. Nevertheless,
the PLC provided an opportunity to explicitly integrate assessment and
technology, and over time, devices were no longer seen as free-time activ-
ities; rather, they were perceived as powerful educational tools by both
teachers and students.

Discussion

Participants included in the PLC represent a subset of educators with an
interest in learning about assessment with technology. At the time of the
PLC, technology in the K–12 classroom was considered to be emerging
(Herold, 2016). Technology is predominantly viewed as an influential
and evolving tool for transforming learning (Bush & Mott, 2009; Huang
et al., 2019) that, not surprisingly, remains a growing area of interest for
researchers, teachers, and educational stakeholders (Amiel & Reeves,
2008). Over the last few years, the implementation of devices has been
critical throughout the pandemic and related school closures. Using the
SAMR model as a heuristic for examining assessment with technology
(Puentedura, 2006), the findings in this chapter demonstrate the ways
in which technology can be used to support elementary and secondary
classroom assessment practices in a digital context.
The learning culture is not simply directed at the students; everyone
in education can share in the value of continuous learning and
reflection. At the outset, teachers described incorporating technology
predominantly through methods of substitution (Puentedura, 2006).
Teachers would swap out previously established assessment practices
to use technology. In the theme of student choice, teachers identified
how technology could replace past assessment practices to afford more
opportunities for student creativity and by doing so, enhance student
engagement. When swapping in technology as a part of the assignment,
teachers noticed that student motivation increased. Additionally, the use
of technology provided a context for easily sharing developing ideas in
ways that enabled teachers to provide targeted and timely feedback to
support learning.
As the PLC experience unfolded, teachers recognized and experimented
with ways technology could enhance student learning and sought out
different experiences to augment and modify previous assessment practices
(Puentedura, 2006). Teachers provided examples of assessment practices
that spoke to differentiation; most often, teachers expressed that the tech-
nology facilitated easier and less visible support for diverse learners. An
example that stands out is a teacher who took an existing activity that used
a performance assessment to demonstrate a skill and incorporated tech-
nology that involved individual practice, video recording, and reviewing.
The video enabled a functional improvement because students could
review the skill as part of their self-assessment to continue to improve
before sharing the final video for peer and teacher feedback. Although
a performance task would normally involve episodic assessment (in the
moment), videos enabled students to stop and restart segments to identify
and target improvement. This example, among others described by PLC
teachers, shows how tasks were initially augmented by the integration
of technology and could also be modified by creating cycles of learning,
reflecting, and feedback that would not otherwise have been available
without the technology.
Data from the end of the PLC show a definite shift, in that teachers
recognized and valued how the devices afforded different opportun-
ities than traditional resources for teaching and expanded assessment
practices. Many PLC teachers were working on redefining assessment
(Puentedura, 2006) as they created and supported new tasks for students to
demonstrate learning. For example, teachers redesigned and accelerated
learning through the creation of new open-ended tasks with a range of
applications. Technology also enabled different kinds of teaching with
visual and virtual demonstrations of learning, access to works-in-progress,
visible thinking processes, and multidirectional exchanges of feedback
that included peers and educators. As part of the redesign, teachers noted
that a breadth of technology could be both invigorating and potentially
overwhelming as students required explicit teaching and ongoing support
to use different applications effectively. Teachers have the capacity to act
as change agents amid the increased demand for technology, supporting
the progression of technology for learning and growth. As devices and
applications evolve, “teachers need to be prepared to constantly learn—
and relearn” (Hubbard, 2018, p. 2). Data showed that the teachers were
enthusiastic to discover the possibilities of the devices for their
instructional and assessment practices.
This study makes clear that technology influences not only student
learning, but also the ways in which teachers enact education. Teachers
who are willing to explore technologies and their role in assessment
may be better positioned to embrace the near-constant change required
and continue to identify how technological tools can be integrated into
assessment practices and purposes. Teachers must be included in the
shift by educational stakeholders (e.g., policymakers, administrators)
and recognized as pedagogues who are experimenting with new ideas,
sharing knowledge around developing practices, and taking risks to make
technology and their learning visible (Hattie, 2008; Palmer, 2017). PLC
teachers were asking questions to support their continued development
with technology integration in assessment.
Although our goal was to look at the ways teachers leverage tech-
nology to support assessment practices and conceptions, we now recog-
nize that it is difficult to extrapolate the conceptualizations of teachers.
It seems evident that assessment with technology enabled the teachers in
the PLC to embrace all the different purposes—assessment for, of, and as
learning—in the practices they embedded in the classroom (Ministry of
Education, 2010). The emphasis on formative assessment has been part of
the research and policy for the past decade, but implementation has been
persistently difficult because of a lack of professional development to bring
coherence to assessment practice (DeLuca, 2012). Teachers’ conceptions
of assessment were at the heart of the PLC, where assessment with tech-
nology was the focus, and technology was also a method for professional
development that enabled a greater level of connectivity for educators.
In this case, the PLC teachers not only used the devices to improve stu-
dent learning, but also found the technology shifted conceptualizations
of assessment with technology.
Participants experienced and confronted uncertainties while engaging
in the PLC, but they formed an online community of support, explor-
ation, and conversation that contributed to new ideas and understandings
of how to assess with technology. Teachers showed that they think care-
fully about how they are using technology to extend existing practices
while seeking ways to improve educational processes that are responsive
to students. The trajectory of experiences within this PLC aligns with
stakeholders who suggest that technology in classrooms can “enable,
expand, and accelerate learning in ways previously unimaginable”
(Fullan & Langworthy, 2014, p. 30).
Assessment, Learning, and Educational Technology 239

Conclusion

Educational stakeholders must keep in mind that “technology cannot be effective in the classroom without teachers who are knowledgeable
about both the technology itself and its implementation to meet edu-
cational goals” (DeCoito & Richardson, 2018, p. 362). Whether using
technology in person or in online environments, it is important to rec-
ognize the need for continual learning. Interweaving assessment and
educational technology points to the possibilities inherent in the ongoing
learning about technology in the field of education. Teachers in the
PLC demonstrated a desire to learn more about the ways in which tech-
nology could support assessment, and through assessment, learning.
We used Puentedura’s (2006) SAMR model in our research as a reflexive
lens for understanding teacher assessment with technology. It would
be interesting to examine similar constructs using the SAMR model
from the outset to explore the many options for integration between
assessment and technology.
We also discovered that how teachers used technology varied by sub-
ject, learning objectives, available resources, and the skills and expertise
of students within the classroom. Future research might use the SAMR
model (Puentedura, 2006) as an iterative framework that can be coupled
with other learning frameworks, such as assessment identity (Looney
et al., 2018), assessment competencies (Popham, 2009), or assessment lit-
eracy (Xu & Brown, 2016), rather than used in isolation. Using and integrating these models, as assessment with technology is reimagined in a postpandemic era, can support professional learning and collaboration for educators who are ready to leverage technology for assessment.

References

Alhumaid, K. (2019). Four ways technology has negatively changed educa-


tion. Journal of Educational and Social Research, 9(4), 10–20. https://doi.
org/10.2478/jesr-2019-0049
Amiel, T., & Reeves, T. C. (2008). Design-based research and educational
technology: Rethinking technology and the research agenda. Educational
Technology & Society, 11(4), 29–40.
Best, J. (2015). The SAMR model explained (with 15 practical examples). 3P Learning.
www.3plearning.com/blog/connectingsamrmodel/
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment
in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/
0969595980050102
Bush, M. D., & Mott, J. D. (2009). The transformation of learning with tech-
nology: Learner-centricity, content and tool malleability, and network effects.
Educational Technology, 3–20. www.jstor.org/stable/44429655
Charmaz, K. (2014). Constructing grounded theory (2nd ed.). Sage.
Ciampa, K. (2014). Learning in a mobile age: An investigation of student motiv-
ation: Learning in a mobile age. Journal of Computer Assisted Learning, 30(1),
82–96. https://doi.org/10.1111/jcal.12036
Cole, A., & Knowles, J. (2008). Arts-informed research. In J. G. Knowles & A. L.
Cole (Eds.), Handbook of the arts in qualitative research: Perspectives, methodolo-
gies, examples, and issues (pp. 55–71). SAGE Publications.
D’Arcy, J., Gupta, A., Tarafdar, M., & Turel, O. (2014). Reflecting on the “dark
side” of information technology use. Communications of the Association for
Information Systems, 35(5), 109–118. https://doi.org/10.17705/1CAIS.03505
DeCoito, I., & Richardson, T. (2018). Teachers and technology: Present practice
and future directions. Contemporary Issues in Technology and Teacher Education,
18(2), 362–378.
DeLuca, C. (2012). Preparing teachers for the age of accountability: Toward a
framework for assessment education. Action in Teacher Education, 34(5–6),
576–591. https://doi.org/10.1080/01626620.2012.730347
Fives, H., & Barnes, N. (2020). Navigating the complex cognitive task of class-
room assessment. Teaching and Teacher Education, 92, Article 103063. https://
doi.org/10.1016/j.tate.2020.103063
Fullan, M., & Langworthy, M. (2014). A rich seam: How new pedagogies find deep
learning. Pearson.
Hamilton, E., Rosenberg, J. M., & Akcaoglu, M. (2016). The substitution aug-
mentation modification redefinition (SAMR) model: A critical review and
suggestions for its use. TechTrends, 60(5), 433–441. https://doi.org/10.1007/
s11528-016-0091-y
Harper, D. (2002). Talking about pictures: A case for photo elicitation. Visual
Studies, 17(1), 13–26. https://doi.org/10.1080/14725860220137345
Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. Routledge.
Herold, B. (2016). Technology in education: An overview. Education Week. www.
edweek.org/technology/technology-in-education-an-overview/2016/02
Huang, R., Spector, J. M., & Yang, J. (2019). Educational technology: A primer for the
21st century. Springer.
Hubbard, P. (2018). Technology and professional development. In J. Liontas
(Ed.), The TESOL encyclopedia of English language teaching (pp. 1–6). Wiley &
Sons. https://doi.org/10.1002/9781118784235.eelt0426
Johnson, A. M., Jacovina, M. E., Russell, D. E., & Soto, C. M. (2016). Challenges
and solutions when using technologies in the classroom. In S. A. Crossley &
D. S. McNamara (Eds.), Adaptive educational technologies for literacy instruction
(pp. 13–29). Taylor & Francis.
Khan, R., & Jawaid, M. (2020). Technology enhanced assessment (TEA) in
COVID-19 pandemic. Pakistan Journal of Medical Sciences, 36(COVID19-S4),
S108–S110. https://doi.org/10.12669/pjms.36.COVID19-S4.2795
Kirkpatrick, L. C., Brown, H. M., Searle, M. J., Sauder, A. E., & Smiley, E. (2017).
The impact of a school board’s one-to-one iPad initiative on equity and inclu-
sion. Exceptionality Education International, 27(2). https://doi.org/10.5206/
eei.v27i2.7751
LaBonte, R., Barbour, M. K., & Nagle, J. (2021, November). Pandemic pedagogy
in Canada: Lessons from the first 18 months. Canadian E-Learning Network.
https://canelearn.net/wp-content/uploads/2021/11/CANeLearn-
Pandemic-Pedagogy-in-Canada.pdf
Landrum, T., & McDuffie, K. A. (2010). Learning styles in the age of
differentiated instruction. Exceptionality: The Official Journal of the Division
for Research of the Council for Exceptional Children, 18(1), 6–17. https://doi.
org/10.1080/09362830903462441
Layder, D. (1998). Sociological practice: Linking theory and social research. Sage.
Looney, L., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualising
the role of teachers as assessors: Teacher assessment identity. Assessment in
Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.108
0/0969594X.2016.1268090
Ministry of Education. (2010). Growing success: Assessment, evaluation, and
reporting in Ontario schools. Government of Ontario. www.edu.gov.on.ca/
eng/policyfunding/growSuccess.pdf
Moss, C. M. (2013). Research on classroom summative assessment. In J. H.
McMillan (Ed.), Handbook of research on classroom assessment (pp. 235–255).
Sage.
Ontario Educators. (2007). Professional learning communities: A model for Ontario
Schools (Capacity Building Series). https://drive.google.com/file/d/1IVUN
NqsyQGBW2RmmKOojNF_0YG1Q8m68/view
Osborne, D. M., Byrne, J. H., Massey, D. L., & Johnston, A. N. (2018). Use of
online asynchronous discussion boards to engage students, enhance critical
thinking, and foster staff-student/student-student collaboration: A mixed
method study. Nurse Education Today, 70, 40–46. https://doi.org/10.1016/j.nedt.2018.08.014
Palaigeorgiou, G., & Papadopoulou, A. (2018). Promoting self-paced learning in
the elementary classroom with interactive video, an online course platform
and tablets. Education and Information Technologies, 24(1), 805–823. https://
doi.org/10.1007/s10639-018-9804-5
Palmer, P. J. (2017). The courage to teach: Exploring the inner landscape of a teacher’s
life. John Wiley & Sons.
Popham, W. J. (2009). Assessment literacy for teachers: Faddish or fundamental?
Theory into Practice, 48(1), 4–11. https://doi.org/10.1080/00405840802577536
Puentedura, R. (2006). Transformation, technology, and education [Blog post].
http://hippasus.com/resources/tte/
Puentedura, R. (2014). Learning, technology, and the SAMR model: Goals, processes, and
practice [Blog post]. www.hippasus.com/rrpweblog/archives/2014/06/29/
LearningTechnologySAMRModel.pdf
Raja, R., & Nagasubramani, P. C. (2018). Impact of modern technology in edu-
cation. Journal of Applied and Advanced Research, 3(1), 33–35. http://dx.doi.
org/10.21839/jaar.2018.v3iS1.165
Saldaña, J. (2013). The coding manual for qualitative researchers (2nd ed.). SAGE.
Savin-Baden, M., & Wimpenny, K. (2014). A practical guide to arts-related research.
Sense.
Shepard, L. (2019). Classroom assessment to support teaching and learning. The
Annals of the American Academy of Political and Social Science, 683(1), 183–200.
https://doi.org/10.1177/0002716219843818
Shulha, L., Whitmore, E., Cousins, J. B., Gilbert, N., & al Hudib, H. (2016).
Introducing evidence-based principles to guide collaborative approaches to
evaluation: Results of an empirical process. The American Journal of Evaluation,
37(2), 193–215. https://doi.org/10.1177/1098214015615230
Tate, J., & Happ, M. B. (2018). Qualitative secondary analysis: A case exemplar.
Journal of Pediatric Health Care, 32(3), 308–312. https://doi.org/10.1016/j.
pedhc.2017.09.007
Vincent, A. (2018). Is there a definition? Ruminating on poetic inquiry, straw-
berries and the continued growth of the field. Art/Research International:
A Transdisciplinary Journal, 3(2), 48–76. https://doi.org/10.18432/ari29356
Willcott, J. (2021). Pandemic and post-pandemic use of immersive learning tech-
nology. In G. Panconesi & M. Guida (Eds.), Handbook of research on teaching
with virtual environments and AI (pp. 1–16). IGI Global.
Woodland, R. H., & Mazur, R. (2015). Beyond hammers versus hugs:
Leveraging educator evaluation and professional learning communities into
job-embedded professional development. NASSP Bulletin, 99(1), 5–25. https://doi.org/10.1177/0192636515571934
Xu, X., & Brown, G. T. (2016). Teacher assessment literacy in practice:
A reconceptualization. Teaching and Teacher Education, 58, 149–162. https://
doi.org/10.1016/j.tate.2016.05.010
13
Isolation/Adaptation/Education
Moving Hands-on Secondary Visual Art Classes to a Virtual Platform
Christina Yarmol

Moving from face-to-face, hands-on visual art classes to virtual synchronous and asynchronous learning models is complex.
raised are directly related to the delivery of visual art education online.
The processes, teaching methods, assessment, and evaluation examples
noted in this chapter are not meant to be a prescriptive recipe for online
art education; instead, they offer the reader entry points and possibil-
ities for approaching curriculum delivery, assessment, and evaluation
based on individual experience, comfort level, and goals within specific
art programs. Readers are encouraged to reflect upon their own roles
in elementary and secondary panels or teacher education to rework the
ideas presented as they plan online and in-person art education programs
to suit the specific needs of the student populations they teach.
Organizing a hands-on curriculum in an online setting presented
questions about the equitable and practical reality of students’ access to
art making media that would typically be available in the visual art class-
room. The first question posed was, What art supplies do students have in
their homes and how can I ensure equitable access to art making media?
DOI: 10.4324/9781003347972-17

A variety of posters, anchor charts, technical samples, and art making media are at hand in the art classroom. Routinely, students watch the teacher demonstrate a specific technique and take opportunities to practice with art media while accessing visual sources and tools available. They pose questions and then move to their own creative art
projects. To expand their studio skills, students view examples where
such skills have been historically executed. To invoke critical thinking
skills, teachers also plan art field trips in the community (Greene et al.,
2014). These elements prompted the second question: How can the
diverse aspects of the art curriculum, including art history, art theory,
art critique, technical art skills, and art making, be taught to students
in virtual classrooms?
With in-person learning, art assessments occur in day-to-day forma-
tive judgements made by art teachers about students’ artwork, informal
peer- and self-assessment made to student artists privately (Boughton,
2013), and summative assessments of completed work made publicly
(Boughton, 2013). These assessments were hampered in online learning.
Both teachers and students expressed concerns about assessment and
evaluation methods when first pivoting to online learning, sparking the
final two questions explored in this chapter: In a virtual setting, how does
an art educator mimic informal assessments done organically through
the creative working process in an in-person art classroom? When final
artworks are completed, how can learner outcomes be evaluated and
documented?

Researcher’s Background

I have been a visual artist since my youth. I am currently the head of a


visual art and drama department in a secondary school. As a passionately
committed teacher, I keep students’ needs, interests, and well-being at
the centre of all activities. I try to both nurture and challenge students’
intellectual and imaginative capacities by using art educational theory to
inform my art teaching practice. Students’ willing engagement in new
content, inspiration of self-expression, and mastery of visual learning are
greatly impacted by their teacher’s distinctly human dimensions, including
personality traits, attitudes, and relationship skills (Zehm & Kottler, 1993,
p. 2). I believe that having passion for what I do can have a positive impact
(Hattie, 2012) on all students. To pivot to online learning, I drew upon
my lived experience as an artist, researcher, and teacher (Irwin, 2008),
continually monitoring and revising my teaching practice after viewing
students’ responses to lessons, creative processes, and resulting projects.
I looked to a/r/tographic methodology for this self-study and describe it
in detail in the following sections.

Literature Framework

The Creative Process: A Credo for Art Making

The creative process is used by artists around the globe. Lubart (2001)
defined it as a succession of thoughts and actions that result in ori-
ginal and appropriate productions. To establish the curriculum for
online art learning, I looked to the Creative Process section of the
Ontario curriculum (Ministry of Education, 2010, pp. 14–16). The
creative process embodies the basic tenets of art creation, announ-
cing art making as a process requiring both creativity and skill. The
credo sets visual and performing arts apart from other mandatory,
scholastic courses; its components are essential in all stages of art
thinking and creation, and a crucial consideration in art curriculum
design.
The circular creative process is integrated with critical analysis and
thinking entailing questioning, evaluating, making rational judgements,
finding logical connections, and categorizing; critical thinking demands
openness to a plurality of world views. Reflection and feedback from
peers and teachers are at the core of the art making process, branching
outwards in a radial configuration to include challenging and inspiring;
imagining and generating; planning and focusing; exploring and
experimenting; producing preliminary work; revising and refining;
presenting and performing; and reflecting and evaluating. These stages
enable art makers to navigate the process in a studio setting (Ministry
of Education, 2010, p. 16). As students gain confidence in the pro-
cess, they can fluidly, deliberately, and consciously move between the
stages—varying the order, where appropriate, to move art projects to
their fruition.
Art educators habitually witness students actively absorbed in the cre-
ative process in brick-and-mortar environments through classroom activ-
ities like viewing exemplars of work, drawing in sketchbooks, attempting
art skills or techniques, observing peers’ creative process then witnessing
their finished artworks, sharing work with peers, conferencing with the
teacher, and engaging in class critiques. The grand challenge in a virtual
environment was to offer opportunities to exercise aspects of the Ministry
of Education’s (2010) creative process in an online delivery model where
students were working in isolated environments.

Backward Design for Curriculum Planning

I reviewed Wiggins and McTighe’s (2011) concept of understanding by design, also known as backward design, for program planning. Understanding by design
urges scaffolding of learning activities that also support the creative pro-
cess. The concept encourages teachers, as curriculum planners, to think
about assessment prior to designing lessons that fit into larger units.
A three-pronged approach to curriculum planning, assessment, and
evaluation is applied: (a) desired results, (b) evidence, and (c) learning
plan (Wiggins & McTighe, 2011).
In desired results, teachers ask what students should know, understand,
and be able to do according to the curriculum expectations (Wiggins &
McTighe, 2011). For example, an expectation in the Ontario Grade 9
Visual Arts curriculum asks students to “explore elements and principles
of design and apply them to create artworks that express personal
feelings and/or communicate emotions to an audience” (Ministry of
Education, 2010, p. 120). This expectation has three goal types (transfer, meaning-making, and acquisition) that can be addressed through a variety of instructional methods, but the key is to ensure goals are approached in
instruction to achieve the desired results.
For evidence, teachers determine assessment substantiation by asking
how they will know if students have achieved the desired results through
performance tasks or other means (Wiggins & McTighe, 2011). Grasping
concepts means that someone can explain concepts, interpret informa-
tion, apply ideas by usage and adaptation, demonstrate perspective, and
show self-knowledge. Teachers consider which fair and consistent assignments can act as evidence of how students have transferred their learning to new situations, and they use that evidence to sharpen their teaching practices.
Employing a learning plan involves planning daily instruction so that
students understand, process, and autonomously transfer their learning
to new situations, projects, or performances (Wiggins & McTighe, 2011).
Teachers think about the knowledge or skills required to perform tasks,
the sequence of activities needed to get students to meet the curriculum
expectations, and the evidence to know they have achieved the goal,
followed by considering how to evaluate authentic student performance in
fair and consistent ways. Modelling basic skills is only a starting point that
requires extension if students are to be actively involved in making meaning,
drawing inferences, practising the transfer of their acquired knowledge,
and conveying their learning (Wiggins & McTighe, 2011). Frequent, timely
feedback on performances through both verbal and written assessments
helps students to improve their skills for future performances.

Assessment and Evaluation

Cooper’s (2010) methods offer ways to devise actual assessment and evaluation strategies to measure the evidence collected from student
artists. Cooper expressed the idea that assessment and instruction are
inseparable, as one informs the other for the adjustment of instructional
strategies. Planning is crucial so that students know how they are going to
be graded. Assessments must be purposeful to align with the curriculum,
and they serve different purposes at different times to inform instruction,
including formative assessment as learning; assessment for learning to
improve achievement, understanding, and motivation; and assessment of
learning to measure a student’s achievement in relation to an established
standard to produce a grade. Assessment should be a collaborative process
done with self, peers, and the teacher to encourage reflective learning.
Cooper asserted that effective assessments are criterion-referenced with
performance standards, and they must be balanced and flexible to include
oral, written, and performance tasks that include proactive, helpful
words outlining positive results and how to improve, not just numerical
scores or letter grades. Using these types of assessments to report stu-
dent achievement is a responsive, human process that requires teachers
to exercise their professional judgement (Cooper, 2010).
Through a global evidence-based study examining factors that improve
student learning, Hattie (2009) asserted that the biggest effects occur
when teachers become learners of their own teaching and when students
become their own teachers. When students become their own teachers,
they exhibit attributes that are most desirable for learning, including self-
monitoring, self-assessing, and self-teaching (Hattie, 2009). Hattie (2012)
also emphasized that positive teacher-student interaction is an important
factor in effective teaching.

Engagement

In his book, Failure to Disrupt: Why Technology Alone Can’t Transform Education, Reich (2020) shared research showing that learners who are
most successful in self-paced courses are those who are already successful
in school—being self-motivated and academically well prepared because
they know how to learn and assess the quality of their own work. Reich
contended that educators working online face the issue that a majority of
students are not autodidacts but rather are dependent on their teachers
to direct their learning. Human contact with teachers for instruction,
assessment, feedback, and experience is thus essential for learning. The
goal of developing a growth mindset in students, rather than just meas-
uring learning at the end of the course, is an integral part of the learning
process (Earl, 2012).

A/r/tography

A/r/tography is an arts and education-focused methodology involving planning, acting, observing, and reflecting without prescribing a concrete plan or method. The “a” represents the artist, “/r” represents
the researcher, and “/t” represents the teacher. The slash between the
letters expresses that the artist, the researcher, and the teacher are
equal, coexisting identities (Irwin et al., 2006) and contiguity, or the
concept that ideas exist with one another (Irwin & Springgay, 2008).
A/r/tography is enacted inquiry through six renderings: contiguity,
living inquiry, openings, metaphor/metonymy, reverberations, and
excess (Springgay et al., 2005). Irwin et al. (2006) stated that contiguity
“is a coming together of art and graphy, or image and word,” (p. 900)
which is important when including both visual and written process
and products of a research text; each enhances the others’ meanings
(Irwin & Sinner, 2013).
A/r/tography is related to Aristotle’s three concepts of knowledge:
theoria, praxis, and poesis, integrating knowing, doing, and making
(Irwin & de Cosson, 2004). A/r/tography merges artistic understanding with
the continual process of lifelong learning, researching, and teaching
with a commitment to inquiry over time (Irwin, 2008). Who people
are is embedded in what they know and do, so theory and practice
become interlinked and folded together (Irwin, 2008), like a kind
of living inquiry (Irwin & Sinner, 2013). My artistic knowledge and
desire to research, coupled with ideas about how to teach art online,
intertwined into a rhizomatic metaphor (Irwin, 2008) that branched
outward from self-inquiry towards a collective inquiry with my peers
(Irwin, 2008), making a/r/tography a fitting methodological choice
for this study.

Methods

I selected a/r/tography because it poses questions, enacts interventions, gathers information, and analyzes learned information before asking
more questions and enacting further inquiry (Irwin, 2008), deepening the
research in a continual process of interconnected art making and writing
(see Figure 13.1). I examined my daily learning plans, digital handouts,
assessment rubrics, sketchbook notes, and drawings about teaching
online. I also reviewed student-generated assignments and online class-
room data from three visual art classes. These data included students’
studio art and written projects, my online feedback, completion of
checklists and evaluation project rubrics, and students’ digital portfolios of
artwork. I analyzed the data for responses about the teaching techniques
that supported and engaged students in art learning online.
A vital practice in studio-focused art courses is assessment and evalu-
ation of individual portfolio items and the portfolio as a whole; the portfolio
is defined as a purposeful collection of student work often accompanied

Figure 13.1 The a/r/tographic process, 2020. Christina Yarmol, ink, felt tip and
metallic markers, found book. This example is the teacher’s altered
book demonstration shown under the document camera. Artwork
was completed off camera asynchronously.
by reflective and explanatory written data (Blaikie et al., 2004). A portfolio exhibits students’ efforts, progress, and final products, and it can
encourage students’ reflections and self-directed learning or be used to
represent a collection of students’ best works and documents, recording
growth and development towards identified outcomes (Haanstra &
Schönau, 2007). Both reflective narrative accounts of students’ responses
about online learning modules at the end of the course and notes about
the discourse with students when we returned to in-person learning
contributed to the results of this self-inquiry. Specific procedures for each
research question are described with the findings.
The belief that all students have the potential to achieve success (Hattie,
2012) and the notion that autonomy and self-expression are central to
students’ art making are concepts that have underpinned art education
since the creative free expression movement of the 1940s (Boughton, 2013).
These premises were the basis for the assessments in this study. Names of
participants have been altered to pseudonyms to preserve confidentiality.

Findings

Access to Materials

The first question in tackling visual art learning online was “What art
supplies do students have in their homes and how can we ensure equit-
able access to art-making media?” Answering this question was necessary
to devise projects that met curricular expectations and could be equitably
accessed by all students. I polled students through the hand-raising feature
on Google Classroom (https://classroom.google.com/) and a Google
Form (https://docs.google.com/forms/). I learned that few students had
access to a printer, but all students had access to a cell phone to photo-
graph their artwork. Many students had minimal art supplies at the ready
and were not able to purchase them with the lockdown in effect. For
students who required supplies, I made art kits that they picked up at the parking-lot door of the art room. These kits contained many media: tem-
pera or watercolour paints, paint brushes, charcoal, soft graphite pencils,
pencil crayons, and a range of papers, including Bristol board, water-
colour paper, cartridge paper, and a sketchbook. Other supplies typic-
ally used in the classroom, like cardboard packaging, newspaper, sewing
needles, scissors, thread, and flour or cornstarch, were readily available to
students in their homes. When they were not available, alternative media
were found. Encouraging the use of available mixed media with their art
kit materials enticed students to think creatively. Overcoming this initial
challenge set me up for achievable assignment planning, assessment, and
evaluation for students’ virtual learning (Wiggins & McTighe, 2011).

Teaching Techniques: Basic Art Skills Knowledge

With media readily available to them, students had the opportunity to manipulate and experiment with a given medium before creating more
complex, finished artwork. After formative assessment for learning
through initial drawings, I noticed that Grade 9 students who had joined
in the 2020–2021 school year required confidence building in their art
making abilities before engaging in their own projects. A question
emerged: How can the diverse aspects of the art curriculum, including
art history, art theory, art critique, technical art skills, and art making, be
taught to students in virtual classrooms?
An invaluable asset for teaching art online was access to a document
camera. The document camera enabled me to demonstrate techniques
to students in real-time, even for more complex projects. For example,
Grade 9 students had to create a three-dimensional papier mâché sculp-
ture of a creature who had either physical attributes or habits that they
felt best represented their own (see Figure 13.2). As we worked together
synchronously, students observed my idea-generation process (me
buzzing about the classroom), viewed the step-by-step techniques as
they unfolded, posed questions, and shared their own progress with me
through their cameras or progress checks. I could repeat a skill at a slower
rate or provide suggestions about how to use the skill with their projects
as needed.
Students also engaged in assessment as learning by reviewing, prac-
tising, and improving their skills, then applying them in new contexts.
To promote engagement in art theory and experimentation with spe-
cific media, students uploaded their trials to the Google Classroom at a
checkpoint time. An example demonstrating tempera skills is shown in
Figure 13.3. Through this process, I assessed progress, provided feedback
about how to improve, and commented on what additional skills to try for
specific projects. As an assessment for learning tool, I could ask, What do
I need to review or reteach? This instructional method increased student
engagement with the media as students learned about each medium’s
potential drawbacks and advantages.

Figure 13.2 Educator’s fly on the wall, papier mâché with Roo’s centipede papier
mâché on right.

Using screen sharing for YouTube (https://youtube.com) demonstrations, online examples of skills, and art history examples
reinforced technical skill learning. I invited students to share relevant
online examples they had found during asynchronous independent
learning time as an additional way to share new knowledge and engage
with one another through Zoom (https://zoom.us) breakout rooms.
This approach encouraged small-scale social encounters and supported
the cooperative learning usually experienced during in-person studio class
sessions. After spending time online all day, mostly listening to lessons,
students said that they appreciated watching and working alongside me
virtually while actively engaged with hands-on sessions because it was like
“a real in-person class.”
Through online virtual gallery visits, students experienced art history
learning usually acquired through field trips. One example of a virtual
field trip was the exhibit Printed Textiles from Kinngait Studios at the Textile
Museum of Canada (2019–2022), where students observed historical
examples of printmaking by Inuit artists. As a class we watched a video
tour of the exhibit and historical films from the National Film Board of
Canada (2020) about Inuit printmakers. The museum’s interactive app
allowed students to deepen their experience of the exhibition’s content
254 Christina Yarmol

Figure 13.3 Ree’s tempera painting techniques.



and complete art critiques of artworks they selected. Although the virtual
tour was not as immersive as an in-person field trip, students applied the
Inuit idea of portraying subject matter of daily life to their own
stencil-printmaking projects.
With this newfound knowledge of historical artwork, photographic
sources, and a wide range of technical skills available online, students
began extending their learning to the exploration of their self-selected
themes in other projects. Elle’s Year of the Tiger composition in Figure 13.4
is an example of how skills learned through cultural study, colour
association, ink, and painting techniques yielded successful results in a
new context, enabling her to win an online contest.

Engaging Students through Practical Teaching and Assessment Strategies

With Reich’s (2020) research in mind, I thought about how I could


engage students in performative strategies for online visual art learning.
I tried to tap into the idea that students had selected art as an optional
course. During in-person studio visual arts classes, much time is spent on
assessment for learning (with formative and continuous teacher and peer
feedback) and assessment as learning (through self-assessment) before
assessment of learning. In the online classroom, this normally organic,
flexible, and fluid process was curbed as participants were hidden behind
their screens. The customary chatter—“How does this look? Do you
think that this colour works?” and so on—was hampered by the barrier
of online technological learning and raised further questions: In a vir-
tual setting, how does an art educator mimic informal assessments done
organically through the creative working process in an in-person class-
room? When final artworks are completed, how can learner outcomes be
evaluated and documented?
I knew that I needed to witness and support students’ creative process
so they could successfully complete their artworks. To apply Wiggins and
McTighe’s (2011) curriculum planning framework and Cooper’s (2010)
assessment and evaluation strategies, I carefully considered the purpose
behind assignments, the process of attaining the established curricular
goals, and the use of various assessments as, of, and for learning by
streamlining online tasks while also considering students’ mental health.
I acted as a facilitator of the creative process (Ministry of Education, 2010)

Figure 13.4 Elle’s Year of the Tiger, digital art.

by presenting lectures, readings of historical techniques, and themes
linked with the topic of study that we read together as a class. I offered
opportunities for students to sketch what they imagined, devised
assignments that portrayed the self as the key subject matter (see
Figure 13.5), provided technical instruction, generated opportunities for
independent creation through scheduled studio time, held tutorial time
for students to ask questions, facilitated art class critiques, and
cocreated assignment rubrics to actively engage students.
Rubric cocreation engaged students in the assessment and evaluation
process. Students cocreated a criterion-based rubric (see Table 13.1) for
a still life at-home project with pencil shading after learning the basic
graphite shading techniques and information about how to create various
compositional arrangements on a two-dimensional picture plane. The

Figure 13.5 Elizabeth’s watercolour paint on paper shows her preferred activ-


ities and favourite objects. This artwork is a historiated initial derived
from a look at medieval illuminated manuscripts.

Table 13.1 Grade 9 rubric for observing and drawing still life compositions with
pencil shading.

Variety of pencil shades
• Level 4: Excellent display of a variety of tones (5+).
• Level 3: Very good display of a variety of tones (4–5).
• Level 2: Good display of a variety of tones (3–4).
• Level 1: Greater display of a variety of tones (1–3) required. Some
contour line is used.

Composition
• Level 4: All objects are extremely well grounded on the picture plane.
A dynamic, formally or informally balanced composition is clearly
achieved so that all objects can be viewed and recognized by the viewer.
• Level 3: All objects are well grounded on the picture plane. A somewhat
dynamic, formally or informally balanced composition is achieved so
that most objects can be recognized by the viewer.
• Level 2: Some objects are well grounded on the picture plane. A formally
or an informally balanced composition is attempted so that some objects
can be recognized by the viewer.
• Level 1: Objects need to be more clearly grounded on the picture plane.
Some objects can be recognized by the viewer, but it is difficult to
discern what is represented.

Likeness of drawing
• Level 4: Still life objects are very easily recognized due to the accuracy
of the shading and the addition of shadows below objects. Shading
defines all forms, not line, giving them a three-dimensional quality.
• Level 3: Still life objects are easily recognized due to mostly accurate,
believable shading. Shading defines almost all forms, giving them a
three-dimensional quality.
• Level 2: Still life objects are somewhat easy to recognize. Shading is
accurate on some forms. Shading defines some forms, giving them some
three-dimensionality.
• Level 1: Still life objects are difficult to recognize. Shading requires
further attention; more attention to shading is required to give the
impression of three dimensions.

Workmanship
• Level 4: Artwork has a professional quality. Smoothly blended pencil
shading is everywhere. Work is extremely highly finished. Artist's name
is written on the bottom right-hand corner. Work is extremely well
photographed and cropped.
• Level 3: Artwork has a near-professional quality. Smoothly blended
pencil shading is almost everywhere. Work is finished in most areas.
Artist's name is written on the bottom right-hand corner. Work is
clearly photographed and cropped.
• Level 2: Artwork has a semiprofessional quality. Some smooth shading.
Work is finished in most areas. Artist's name is written on the bottom
right-hand corner. Work is partially cropped or is somewhat out of focus.
• Level 1: Artwork could be more professional-looking. Rough pencil
shading is everywhere. Work is incomplete. Artist's name is not yet on
the bottom right-hand corner. Work requires rephotographing and/or
cropping.

Name: /16 Written Assessment:

rubric was posted, and students created the artwork. They were given
written feedback through the chat function based on the look-fors: variety
of pencil shades, composition, likeness of drawing, and workmanship.
Students could improve their work after it was assessed; they were
able to resubmit it for final evaluation. In face-to-face learning, I readily
offer this strategy, but it was more difficult during the pandemic
due to the timelines imposed by a fast-paced quadmester schedule.
Nevertheless, many students took the opportunity to better their work,
and our conversations helped students develop independent
self-assessment strategies. Students communicated that without these
conversations, they often felt immobilized, unable to take the next steps
to adjust or change their work and become more successful.
As part of the regular assessment process, I asked students to photo-
graph multiple iterations of their work and post them online for me or
others to add written or verbal commentary. On the online platform, the
creative process (Ministry of Education, 2010) was activated by
inviting students to turn on their cameras to share their trials with the
class or by asking students to reveal their work through screen sharing.
Some students felt comfortable with this option and frequently exercised

Figure 13.6 Ree’s “Dear, Deer: I Am a Beautiful Creature,” tempera on paper. From


left to right, posts showing Ree’s progress in the submission section of
the online classroom. Written feedback was provided at each submis-
sion stage to arrive at the final artwork pictured on the right.

it, whereas others appreciated the private commentary given after posting
in the Google Classroom.
As noted earlier, Figure 13.3 is an example of technical painting skills;
it was completed before the painting project. The same student artist
completed the project in Figure 13.6, which shows the progress on her
landscape painting. Ree requested feedback during tutorial sessions and
in-class critiques, and the chat box enabled that feedback. I acknowledged
that artist-students could take my advice or discard it. It was up to them
to decide whether to alter their work.
In the large virtual class setting, students informally critiqued artwork
through the thumbs up emoji and notes in the chat box. Opening class
critiques in small group settings through breakout rooms, akin to elbow
partners in an in-person class, was helpful for students who were still
gaining confidence in their abilities. It was also useful for gathering
assessment feedback about initial project ideas, rereading artists'
statements out loud to one another, or dividing work for group art
history projects. I randomly
visited these breakout rooms in session just as I would during an in-person
class setting. These interactive assessment for learning strategies helped build
relationships with my students and supported their sense of inclusion in the
online class community through a period of prolonged isolation.

Submitting Original Artwork? Encouraging the Creative Process

Students communicated that when they could not physically interact
with their peers during the pandemic, they spent a great deal of
time alone online, communicating by cell phone or sharing websites. They
surfed the internet and viewed voluminous amounts of imagery from our
visually saturated world that sometimes tempted them to submit work

Artist's Statement: Now that you have seen Kelsey Viola Wiskirchen and
a range of other drawing with thread artwork and artists' statements,
write your statement about your work. Use the questions below to guide
your paragraph writing. The viewer can learn about your subject matter,
process, and technical decisions from this statement. /12

Process: Discuss the process: How did you go about creating this work?
What challenges did you face while creating it? (2)
Aesthetics: Why did you select a particular thread/floss colour? Why did
you select a particular medium/media? (2)
Back story: What is the back story? Why did you select this subject
matter? What creates a connection with the viewer? (4)
Grammar & Syntax: (4)

Figure 13.7 Example of artist’s statement look-fors.

that was not their own. To encourage the posting of original work created
by the student artists themselves, I made specific assessment look-fors
or guiding questions (see Figure 13.7). I also collected the scaffolded
process work I assigned, including sketchbook brainstorming, initial
sketches, completion of proposal sheets, and mounting of final progress
imagery.
Preparatory work not only underlined the importance of the creative
process that I would have witnessed in person, but also taught students
that process is imperative in art production. The active recording of images
of their creative process continued throughout the online course delivery
model, and I provided guided templates for posting art processes and artists’
statements. Luke’s artistic process and artist’s statement for the Drawing
with Thread unit of the Grade 11 craft course are shown in Figure 13.8.

Finding Public Exhibition Venues

Exhibition is the ultimate display of students' successful artwork; it is a
kind of assessment of learning judged by public viewers. Public displays
are paramount in the creative process (Ministry of Education, 2010),
providing ways to celebrate the success of students' work by acknow-
ledging their creative endeavours, offering feedback from individuals
outside the classroom, and showing students that their community cares

Figure 13.8 Luke's Drawing with Thread. Luke met specific look-fors on the
assigned template. Panel A: Source photograph.

about the work they have created. Exhibition supports not only students’
growth as learners, but also their development as individuals; it offers
them an additional opportunity to engage in self-assessment and self-
reflection about their artwork that serves to support their self-regulation
and mental health.

Figure 13.8 Panel C: Student's embroidery work in progress.

I usually organize a school art show, partner with outside agencies,
or participate in school board projects to exhibit work, but during the
pandemic, the gallery systems were closed. Two ways I embraced online
exhibition were (a) creating PowerPoint presentations of students’ works
that were presented to the school community and eventually mounted in

Figure 13.8 Panel D: Full image of completed work.

the yearbook, and (b) presenting work at the school board level. Grade 11
and 12 students had the opportunity to work virtually with a street artist
in a school board show that was eventually mounted on the web as a still
show and compilation video on the board’s website. Students interpreted
the theme of adaptation in diverse ways, and we mounted the work
as a cohesive whole by using the backdrop of our reality—the Google
Classroom screen. Students expressed their ideas with acrylic paint on

Figure 13.8 (Continued)

wood panels (see Figure 13.9) and wrote truncated artist statements that
scrolled through the chat box on the right-hand side of the screen as a
1-minute GIF.
A second still frame of the artwork could be opened on the school
board website so that interested viewers could enlarge specific artworks.
The virtual graphic murals were created by a Grade 12 student, Nor. Nor’s
individual work with her artist’s statement is displayed in Figure 13.10.

Digital Portfolios

In my in-person art classes, students reviewed their physical portfolio work
at several points in the school year. At the end of the year, students

Figure 13.9 Still of adaptation project, digital file. The exhibition submission was
a PowerPoint of artwork accompanied by a GIF file showing one-
line artists’ statements.

Figure 13.10 Nor’s For Wheelchair Access, Please Go Through This Maze, acrylic on
wood panel.

reflected on what they had created, provided a commentary beside every
piece of artwork, and selected their favourite in a brief written activity.
During the pandemic, we moved from a physical portfolio to a digital
portfolio (see Figure 13.11). The ongoing assessment of students' work in
a portfolio enabled students to take responsibility for their own
development, activating the metacognitive awareness needed to monitor
their own learning (Dikici, 2009).
The review of student final course portfolios was instrumental in
answering the question of how to evaluate and document students'
artistic achievements online. A few students in the Grade 10 course noted
that making a digital portfolio online was an onerous process, but that
putting the collection of work together in a single file made them feel
positive about what they had created. Students in all my classes shared
that viewing their digital portfolio at the end of the course bolstered their
confidence in their abilities and achievements. The culminating portfolios
are excellent examples of assessment as learning, with students acknow-
ledging the successful artwork they created over the course of the year,
and assessment of learning as I gave marks for the final collection of
work.

Discussion: New Understandings

This chapter outlined ways in which a visual art curriculum was
reimagined, translated, taught, assessed, and evaluated in a virtual
realm. By combining Wiggins and McTighe's (2011) backward
design ideas with Cooper’s (2010) ideas of assessment, Hattie’s
(2012) growth mindset, and Reich’s (2020) concept of students’ need
for human interaction, I scaffolded purposeful curriculum planning
that supported the creative process (Ministry of Education, 2010)
that is at the centre of product-driven art courses. New challenges
that manifested themselves in online classes were overcome with
creativity: providing access to art materials, mitigating disengagement,
teaching basic art skills, encouraging the submission of original
artwork, finding ways to readily see students' art-making processes,
reimagining art assessment and evaluation strategies, enabling public
exhibition opportunities, and presenting culminating portfolios.
I shared solutions in this chapter that worked for me, and they can
be adapted or modified by other educators for online teaching,
assessment, and evaluation. The ensuing curriculum was not equal
to in-person art learning, but it did result in a rich experience for
student artists that culminated in the acquisition of skills, the
development of art appreciation, and the creation of successful art
projects by students who made a concerted effort to remain engrossed in
their learning.

Figure 13.11 Sample of digital portfolio. This sample depicts four out of 15 items
from Elizabeth's 2020 Grade 10 culminating digital portfolio.
This a/r/tographic self-study presents an experienced art educator’s
pivot to deliver a hands-on, in-person art curriculum on a virtual learning
platform, offering innovative ways to mobilize knowledge. Art teachers
need models of ways to serve students in both in-person and virtual
school environments as online learning becomes more prevalent. The
theoretical practices of assessment and evaluation of visual arts, and the
practical, inventive ways to deliver the curriculum to meet provincial
standards offered in this chapter, can be used to support in-service or pre-
service teacher programs for visual art, technology, home economics, or
English/media classrooms.

References

Blaikie, F., Schönau, D., & Steers, J. (2004). Preparing for portfolio assessment
in art and design: A study of the opinions and experiences of exiting
secondary school students in Canada, England and the Netherlands.
International Journal of Art & Design Education, 23(3), 302–315. https://doi.
org/10.1111/j.1476-8070.2004.00409.x
Boughton, D. (2013). Assessment of performance in the visual arts: What, how,
and why. In A. Karpati & E. Gaul (Eds.), From child art to visual culture of
youth–New models and tools for assessment of learning and creation in art educa-
tion. Intellect Press.
Cooper, D. (2010). Talk about assessment: Strategies and tools to improve learning.
Nelson.
Dikici, A. (2009). An application of digital portfolio with the peer, self, and
instructor assessments in art education. Egitim Arastirmalari-Eurasian Journal
of Educational Research, 36, 91–108.
Earl, L. M. (2012). Assessment as learning: Using classroom assessment to maximize
student learning (2nd ed.). Corwin Press.
Greene, J. P., Kisida, B., & Bowen, D. (2014). Value of field trips. Taking students
to an arts museum improves critical thinking skills, and more. Education Next,
14(1), 79–86.
Haanstra, F., & Schönau, D. W. (2007). Evaluation research in visual arts edu-
cation. In L. Bresler (Ed.), International handbook of research in arts edu-
cation (Vol. 16, pp. 427–444). Springer International. https://doi.org/
10.1007/978-1-4020-3052-9_27

Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. Routledge.
Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning.
Routledge. https://inventorium.com.au/wp-content/uploads/2020/09/Hattie-Visible-Learning.pdf
Irwin, R., & Springgay, S. (2008). A/r/tography as practice based research. In
M. Cahnmann-Taylor & R. Siegesmund (Eds.), Arts-based research in education:
Foundations for practice (pp. 103–124). Routledge.
Irwin, R. L. (2008). A/r/tography. In L. Given (Ed.), The SAGE encyclopedia of quali-
tative research methods. Sage. https://doi.org/10.4135/9781412963909.n16
Irwin, R. L., Beer, R., Springgay, S., Grauer, K., Xiong, G., & Bickel, B. (2006).
The rhizomatic relations of a/r/tography. Studies in Art Education, 48(1),
70–88. https://doi.org/10.1080/00393541.2006.11650500
Irwin, R. L., & De Cosson, A. (Eds.). (2004). A/r/tography: Rendering self through
arts-based living inquiry. Pacific Educational Press.
Irwin, R. L., & Sinner, A. (Eds.). (2013). A/r/tography and the arts. UNESCO
Observatory Multi-Disciplinary Journal in the Arts, 3(1). www.unescoejournal.
com/wp-content/uploads/2020/03/3-1-11-HUNG_PAPER.pdf
Lubart, T. I. (2001). Models of the creative process: Past, present and future.
Creativity Research Journal, 13(3–4), 295–308. https://doi.org/10.1207/
S15326934CRJ1334_07
Ministry of Education. (2010). The Ontario curriculum Grades 9 and 10: The arts.
Government of Ontario. www.edu.gov.on.ca/eng/curriculum/secondary/
arts910curr2010.pdf
National Film Board of Canada. (November 20, 2020). Printmaking 5. www.nfb.
ca/subjects/visual-arts/printmaking/
Reich, J. (2020). Failure to disrupt: Why technology alone can’t transform education.
Harvard University Press.
Springgay, S., Irwin, R. L., & Kind, S. W. (2005). A/r/tography as living
inquiry through art and text. Qualitative Inquiry, 11(6), 897–912. https://doi.
org/10.1177/1077800405280696
The Textile Museum of Canada. (December 7, 2019–January 29, 2022). Printed
textiles from Kinngait Studios [Art exhibit]. The Textile Museum of Canada.
https://aggp.ca/exhibitions/printed-textiles-from-kinngait-studios/
Wiggins, G., & McTighe, J. (2011). Understanding by design: Guide to creating high-
quality units. ASCD.
Zehm, S. J., & Kottler, J. A. (1993). On being a teacher: The human dimension.
Corwin Press.
14 Using the SAMR Model to Design Online K–12 Assessments

Sheryl MacMath

This chapter outlines how to use the SAMR model (Puentedura, 2006),
referring to substitution, augmentation, modification, and redefin-
ition to redesign assessments and assignments for K–12 classrooms
for the online environment. The examples and descriptions emerged
out of my literature review and in reflection on my work with teacher
candidates having to complete the remainder of their certifying prac-
ticum in a remote setting given the onset of COVID-19 in the spring of
2020. Although not a formal research project, a great deal of learning
occurred as teacher candidates and teacher educators had to figure out
new ways of teaching and assessing. This chapter aims to share those
lessons learned that I found most valuable when working with teacher
candidates to support them in adapting to remote instruction during
their field experiences.
I look first at the criteria used to implement effective assessment
practices that align with 21st century learning regardless of the envir-
onment: face-to-face, online, or hybrid. Second, I look at how these cri-
teria impact planning and assessment when teaching online or remotely.
Then I explore the SAMR model (Puentedura, 2006) and its implications
for planning and assessment. I also look at how the online environment
impacts both progressive skill development and discrete content know-
ledge attainment. I end with a section on benefits and considerations for
technology use.

DOI: 10.4324/9781003347972-18

Comparing Assessment and Evaluation

It is important that the terms assessment and evaluation are clearly under-
stood. Assessment is the gathering of evidence to determine what
students do or do not know or can or cannot do so that teachers can plan
activities that meet students where they are (Black & Wiliam, 1998;
Dueck, 2014; O’Connor, 2011; Schwimmer, 2011; Stiggins & Chappuis,
2012; Venables, 2020). Teachers teach, assess, reteach, or move on to the
next concept in a continual cycle designed to maximize student success.
In contrast, evaluation is about making a judgement about students’
learning to date. Does it meet expectations? Is it proficient? What per-
centage does it reflect? In essence, evaluation is summative (Brookhart
et al., 2020; Dueck, 2014; Feldman, 2020; Schwimmer, 2011; Venables,
2020). It represents a moment in time rather than a progression over time.
Conversely, assessment is formative. Recognizing this key difference, it
is important to consider where teachers spend their time: Is most time
spent assessing or evaluating student work? Classrooms that maximize
student success are focused more on assessment activities than evaluative
activities (Black & Wiliam, 1998; Earl, 2003; Himmele & Himmele, 2011).
I often get asked, “What does that look like, really?” It means that there
are usually at least three times as many assessments that are not for marks
as there are ones for marks. Instead of giving an assessment a grade or
score, either look at it and make decisions about the next teaching activity
or provide descriptive feedback for students: There is no grade, score, or
percentage. It also means a focus on observing and listening to students
to adjust instruction; assessment becomes intertwined with the teaching
process (Cooper, 2022).
This distribution of assessment and evaluation time has an interesting
impact on the classroom and the assignments used in class (Feldman,
2020). In an assessment-focused classroom, the focus on learning tends to
increase students’ intrinsic motivation, so they soon stop asking, “Is this
for marks?” In addition, the amount of marking time tends to decrease.
When teachers are not focused on gathering scores, they can minimize
what is collected from students (Dueck, 2014). Rather than a worksheet of
20 questions, one could have students complete three exit ticket questions
to see if they understood the concepts taught that day. Assessment-
focused teachers are more selective about what they collect, as assessment
is about gauging student understanding rather than collecting home-
work or checking completion. An important caveat is that students may

still be assigned homework and/or practice questions; however, these
assignments may be used for a variety of assessment purposes. Teachers
can look at how students are doing with their homework, or even better,
homework becomes an opportunity for students to self-assess their own
work (Cooper, 2022).

Conceptual Framework: Assessments as, for, and of Learning

A powerful conceptual framework to guide assessment and evaluation
activities is the structure of assessments as, for, and of learning (Earl, 2003;
Western Northern Canadian Protocol for Collaboration in Education,
2006). Assessments of learning are traditional summative evaluations.
Assessment for learning activities are formative assessments that are used
to gauge student understanding and plan for future lessons. The interesting
addition is assessments as learning. Often folded into formative assessment
categories, assessments as learning are valuable to consider separately.
Assessment as learning focuses on students learning how to assess them-
selves (Earl, 2003). Self-assessment is a skill unto itself; a skill that needs to
be developed (Clark, 2014). Rather than being an additional assignment,
assessment as learning activities help students understand the tasks they
are engaged in for their assessments for and of learning; if students do
not understand how they are being assessed, they are unable to self-assess.
These activities include sharing an example with students (or on a work-
sheet or test, doing the first one as a class), comparing strong and weak
examples, sharing rubrics or rating scales, allowing peer feedback, using
self-assessment, or having students help generate the criteria or rubric
used for an assignment. These activities (rather than assignments) help
students understand how they are to complete an assessment. Applied to
a classroom, most activities and assessments are assessments as and for
learning, with a few assessments of learning (evaluations) to determine
final grades or proficiency levels (Earl, 2003).

Assessment in the 21st Century

So how do assessments as, for, and of learning connect with assessment
for the 21st century? When I think about education in the 21st century,

the most critical component is being skill driven rather than content
driven; this shift is a huge one for teachers. Traditionally, school has been
about acquiring content: learning dates, times, events, concepts, and so
on. However, with the advent of the internet and search engines, content
is easy to acquire at any moment (Cooper, 2022; Dumont et al., 2010).
Students do not need to memorize facts and concepts when they can be
looked up on the spot. What they need are the critical thinking skills to
evaluate which sources are credible. They need to know how to collab-
orate effectively with others to analyze ideas. They need to be able to
apply concepts to new problems. They need to be able to build, construct,
analyze, evaluate, organize, and present. Teachers have always wanted
students to develop these skills, but with recent technology explosions,
they have become critical. Content still has a place; students need con-
tent to analyze, but there is no need to acquire content that is not used to
develop skills.
How does this consideration impact assessments and evaluation? To
start, teachers do not need the traditional long tests that require students
to regurgitate memorized content. They need more projects where
students are applying what they know (Fogleman et al., 2011; Halvorsen
et al., 2012; Han et al., 2015). Tests have a place (especially if they are
more skill focused), but they should not be the primary source of evi-
dence. Some practical applications include the following:

• Science assessments are less about tests of content and more about lab
reports and analyses.
• Social studies assessments focus on students critically thinking through
historical and current events and how these events inform the types of
citizens students will become, rather than tests of dates, names, and
places.
• Math assessments have more real-life problems and applications than
procedural questions.
• Language literacy is about writing for purpose and audience and
reading to analyze and critique.

With this focus on critical thinking, evaluating, and problem-solving
skills, assessments as, for, and of learning can be extremely powerful.
Assessment in the 21st century is most often about learning a skill over
time; it is a progression. As such, there should be an emphasis on
formative assessment rather than evaluation (Cooper, 2022).

Using the SAMR Model to Design Online K–12 Assessments 275

Given that evaluation (or assessment of learning) is a snapshot in time rather
than development over time, it is important to delay this step if possible
(Brookhart et al., 2020; Feldman, 2020; MacMath, 2017; Venables, 2020).
As a result, there is a synchronicity between the conceptual framework
of assessment as, for, and of learning and 21st century learning and
education. Consequently, what does this synchronicity look like in an
online environment?

Online Assessments

The good news is that everything mentioned above applies to online or
remote assessments. The majority of time is spent on assessments rather
than evaluations; the more students are involved in the assessment pro-
cess (i.e., assessment as learning), the better; and emphasis is on skill
development rather than isolated content. A bonus when working with
online assessment and evaluation is that it can often be easier for teachers
to let go of years of experience with more traditional views of assessment
and evaluation (Feldman & Reeves, 2020). There are, however, some key
things to remember when designing assessments and evaluations for
online environments.

• When I think about teaching in an online environment, I often use
the terms consumption and creation (Tucker, 2020). Consumption is
how students are learning or acquiring knowledge and skills (e.g.,
videos, books, lectures, discussions). Creation refers to students
demonstrating their knowledge and skills and creating evidence for
assessment. This aspect is the focus in this chapter.
• Given how new working online was at the onset of COVID-19, I found
that most instruction took longer. Therefore, it
was important to truly live the idea of less is more. The focus should be
on quality over quantity.
• Teacher candidates need to purposefully think through which
assessments are to be synchronous (whole class or groups coming
together online at a specific time) and which are to be asynchronous
(students working independently).
• Teacher candidates need to be aware of how much screen time students
have to put in. Is there asynchronous time that can be spent offline?
Can students interact with books rather than virtual environments?
Can they go outside? Can they visit areas in their community? Can
they be building or experimenting with their hands?

One of the big considerations when working online (or anytime when
choosing to work with digital technologies) is the selection of programs,
apps, and tools (Burns, 2017). When making their selections, teacher
candidates should maintain focus on the types of evidence they are
gathering. They want to select a program, app, or tool that best aligns
with their evidence, not one that seems cool. Teacher candidates should
therefore not start by looking at a program or app. Instead, they should
focus on the type of evidence they want to gather and then look for an
option that enables them to get that evidence in an easy-to-use format
that also (one hopes) enables some variation for multimodalities. Teacher
candidates may also want to think about how they will organize or track
that evidence and remain focused on assessments as and for learning.
A fantastic resource for this is the book Tasks Before Apps (Burns, 2017).
A conceptual framework that can help teacher candidates focus on the
evidence they need is the SAMR model (Puentedura, 2006).

SAMR Model for Assessments

When my teacher candidates had to move online with their teaching, they
often asked me, “How do I take what I usually do and move it online?”
First, it is important to remember that the online environment required
some alterations to tasks. Teacher candidates were not able to move
through a classroom to observe, so consideration needed to be given to
how they would ensure regular check-ins as well as time for collaboration
and peer work. They needed to think about how they could use side-by-
side assessments and station rotations with their assessment tasks. Second,
technology enabled teacher candidates to engage students in multimodal
assessments. As such, rather than seeking to simply move an assessment
online, they could think about how they could engage students in more
creative ways. For this purpose, the SAMR model became very useful
(Puentedura, 2006).
The SAMR model’s (Puentedura, 2006) elements of substitution,
augmentation, modification, and redefinition represent a progres-
sion for classroom teachers as they move from assigning paper and
pencil tasks to using technology: Each step moves further from the
original task to maximize multimodal opportunities. Notably, these
applications of digital technology are not isolated to an online envir-
onment. These alterations can be used in face-to-face classrooms that
integrate the use of technology in day-to-day lessons. I look at each
stage with supporting examples. In the next section, I include specific
student examples.

Substitution

Substitution is the first stage in the model and involves just a replacement
of tools (Puentedura, 2006). The actual task does not change. For example,
rather than creating a poster in class, students could use a program like
Adobe Express (www.adobe.com/express/) or Poster Creator by Crested
Developers (bit.ly/3HnXFKC) to make an online poster. Other than
completing the assignment using technology, no additional multimodal
opportunities are integrated compared to the original task.

Augmentation

The augmentation stage involves a tool replacement with an added
benefit—something students would not be able to do with a paper and
pencil task (Puentedura, 2006). For example, teacher candidates could
have students complete four or five multiplication problems on paper. If
students simply completed them using technology (e.g., whiteboard.fi;
https://whiteboard.fi/), that would be considered substitution. To move
up to augmentation, students could use a program like SeeSaw (https://
web.seesaw.me/) that enables them to add a voice thread. That way, as
students complete their math problems, they could talk through their
decision-making processes. This augmentation gives the teacher candi-
date a better understanding of why students completed the problems the
way they did, and it provides students with the opportunity to “show
what they know” both verbally and symbolically. Another example would
be having students create a poster or essay online (substitution) but
include hyperlinks to online content to augment what they are summar-
izing. This task could also be an individual placemat task. Both examples
take the original tasks and incorporate technology such that there is an
added benefit.
Modification

The third stage in the SAMR model is modification (Puentedura, 2006).
This step involves significant alterations to the original task. These
alterations usually enable students to engage in more collaborative work
and/or significant multimodal opportunities. For example, students
can use shared Google Docs (https://docs.google.com/) to create joint
presentations, essays, or brainstorming sessions. These shared documents
could be part of a station rotation or exist in a work plan that details how
and when students worked together. Alternatively, students could use a
program such as Book Creator (https://bookcreator.com/), whereby
rather than just writing a story, they add videos, voice-overs, hyperlinks,
pictures, and text in creative ways. In this way, students leverage multi-
modal opportunities to engage a variety of senses while providing mul-
tiple ways to show their learning. In both examples, students move beyond
an added benefit (augmentation) to collaboratively and/or multimodally
create different assessment evidence.

Redefinition

The last stage of the SAMR model involves the creation of a task that
does not exist without the chosen digital technology (Puentedura, 2006).
At this stage, teachers are not simply altering a previous assignment; they
are creating a new one. For example, students could create a blog, a social
media post, a website, or an iPhone (e.g., “Show what was learned in
an author study by creating the iPhone of that individual. What music
and apps would the author have on their phone, who would be in their
contacts, and what pictures would be in their photos?”). These tasks,
which were not used in schools until those digital technologies came
into existence, have students think about what they are learning in
different ways. Students may find these tasks engaging given the novelty
and potential creativity involved. They also require students to work at
a much higher level of critical thinking: Rather than just summarizing,
they are having to apply, evaluate, and critique in a variety of multimodal
ways. Redefinition tasks can exist within a work plan or as placemat tasks.
I found that as teacher candidates moved through the SAMR model
(Puentedura, 2006), opportunities increased for students to collaborate
and show their learning in multimodal ways. Whether using technology
in the face-to-face classroom or online, digital technologies can help
students (and teacher candidates) focus on critical thinking, skill
development, collaboration, and creation that move classroom assessments into
the 21st century (Burns, 2017; Dumont et al., 2010; Farmer & Ramsdale,
2016; Puentedura, 2006; Selwyn, 2012; Tucker, 2020).
I now turn to the actual design and implementation of online
assessments the teacher candidates and I used during the early months
of the pandemic. The first step I shared with teacher candidates was that
when designing online assessments, they should think through whether
they were gathering evidence on progressive skill development or dis-
crete skills and content. This organization of evidence impacted the types
of assignments teacher candidates would design.

Progressive Skill Development

I define progressive skills as those that are developed over time—and the
order in which they are learned matters. For example, when teaching
addition, teachers may start with using manipulatives to add, then draw
pictures of addition sentences, and then use symbols. After that, teachers
may introduce regrouping using 10 frames, manipulatives or pictures,
and finally symbols on a place value chart. The skill progression scaffolds
over time. Another progression example would be teaching students how
to write an essay. They learn how to write a thesis statement, they create
an outline for their paragraphs, they research for their paragraphs, they
write the body, they add in connecting statements and imagery, and then
they edit and publish. Math and language literacies tend to have a great
deal of progressive skill development. There are many different learning
progressions; the two above are just examples.
Beyond being a progression, it is important to note that here I am
talking about skills, not just knowledge. In these instances, how a stu-
dent completes the task is the focus, not just what a student knows.
To evaluate a skill, teacher candidates should focus on observing and
talking with students. In the literature, this was most evidenced in math
interviews, such as having a student solve a problem and talking out loud
while they did it (Forbringer & Fuchs, 2014), and using guided reading
groups (Fountas & Pinnell, 2016). As such, evidence is usually gathered
in small group settings or individually to enable feedback (Tucker, 2020).
The focus is on connecting with students: Assessments need to occur on a
regular basis to maintain the focus on student creation of evidence rather
than how the teacher candidates will teach.
Small Group Assessments: Station Rotation

In small group assessment, students are organized into small groups
(maximum of five per group) that meet synchronously in breakout
rooms. Each room has students involved in different activities. Every 15
minutes, either the teacher candidates or the students rotate through
the groups. This structure is known as a station rotation (Tucker, 2020).
An example of this type of assessment is guided reading groups. One
room could be with the teacher candidate, where students are reading
and discussing their book. Other breakout rooms could have a writing
task for students to do independently, a listening activity, and students
reading together. To use a math example, the different groups could be
watching a video on addition, playing a game involving addition, com-
pleting a worksheet, giving each other problems to solve, and being with
the teacher candidate. The group that is with the teacher candidate could
be given problems to answer in real-time.
In this informal study, whiteboard.fi enabled the teacher candidates to
observe each student’s whiteboard area so they could see what students
were doing in real time (a type of substitution). Teacher candidates could
ask questions of students while they worked, or they could choose to
share one student’s whiteboard with the whole group to illustrate either
a problem or solution. During this time, teacher candidates kept a check-
list of the progression steps. While they were watching students com-
plete their work and asking them questions to clarify, they could check off
which skills students had acquired and which ones needed more work.
The more specific the checklist, the easier it was to assess in the moment.
The checklist that teacher candidates used could be for the whole class or
for individual students. Figure 14.1 gives an example of a class checklist
for math.
The recording sheet (or checklist) in Figure 14.1, designed for an entire
term, was completed while teacher candidates were working with small
groups of students rotating through different stations. It covered multiple
grade-level expectations given the different student capabilities that exist
in any one class. As individual students demonstrated success with one
step, they moved to the next. As a result, this recording sheet could be
used to help teacher candidates redesign groups into similar ability levels
if appropriate.
In contrast, Figure 14.2 is a recording sheet for an individual student
for a single term. It covers all the learning standards (or outcomes) for
that term in English language arts, aligned with the British Columbia
K–2 Numeracy Assessment Recording Sheet

[Figure 14.1 is a class recording sheet with student names down the side and a
"Date Tested" row. Progression steps run across the top in three groups: Number
Sense (e.g., correct symbols, correct order, left to right, start at 3, skip by
2s, 5s, and 10s, ways to make 9 and 17, highest number), Addition (e.g., 6 + 3,
13 + 8, 24 + 37, 5 + ? = 11, make a story, regroup 10s, symbols), and
Subtraction (e.g., build 7 - 3, build 13 - 8, build 20 - 6, build 34 - 18,
make a story, un-group, symbols).]

Figure 14.1 Checklist for K–2 math term.


Name: ______________ Term: ______________
Decoding
Word patterns at, an
Match letters to sounds
Basic long vowels
Word wall words
Ready for first sound
Picture clues
Stretch sounds
Looks right
Sounds right
Makes sense
Comprehension
Prediction
B-M-E
Text to self
Prior knowledge
Retell events
Use pictures to make inferences
Author's message
Oral/Listen
Listens and talks to peers
Give 2 criteria for good speaking
Give reasons why listening is important
Follow 2 oral instructions
Give 2 criteria for good listening

Figure 14.2 Individual recording sheet for an English language arts term.


Name: ______________ Term: ______________


Writing
Chooses personal topic
Writes likes/dislikes
Remains on topic
Experiments with 2 forms of writing (diary, list, letter)
Writes full sentences with spaces between words (Grade 1)
Beginning and end sounds of words (Grade 1)
Uses familiar words (Grade 1)
Detailed pictures (Grade 1)
Letter sounds to spell words (initial, middle, final consonants, short vowels) (Grade 2)
Five full sentences (Grade 2)
Capitals and periods (Grade 2)
Graphic organizers to organize ideas (Grade 2)

Figure 14.2 (Continued)

curriculum (British Columbia Ministry of Education, n.d.). It could be
used for one-on-one meetings with students as well as station rotations.
If teacher candidates were meeting with a group of five students, they
would have five sheets in front of them at that time. Whereas columns
and rows could be added in, this example illustrates blank areas for writing
notes. The single column after each progression step enabled teachers to
check off when that student had demonstrated that step.

Side-by-Side Assessments

Alternatively, sometimes assessments are best completed one-on-one. In
this case, a more in-depth conversation usually needed to take place as
students updated the teacher candidates on their progress. During these
side-by-side assessments (Tucker, 2020), teacher candidates focused on
one specific task progression, such as the development of an essay, the
creation of a poster or presentation, the completion of a series of science
labs, or the completion of a series of history projects. In each of these
cases, students were making their way through a larger task (or collection
of smaller tasks). It was critical for the teacher candidate and student to
connect on a regular basis to check progress and discuss next steps. When
teaching face-to-face, it is easy to walk around the room and check on
students’ progress. When teaching remotely, teacher candidates needed
an alternative to that movement through a room.
These check-ins were accomplished through individual meetings
when students presented what they had done, received brief and
descriptive feedback on what they had completed, and discussed next
steps (see Brookhart, 2017). These meetings were usually kept short:
no more than 5 to 10 minutes. This timeline was achieved only if both
the teacher candidate and the student were clear on what they were
reviewing, with specific things to look for. As a result, the recording
sheet for the side-by-side assessment needed to clearly outline the steps
needed so that the teacher candidate and the student could focus on only
one or two things at a time. Figure 14.3 illustrates a recording sheet for

Language Arts Grade 1: Side-by-Side Assessments


Name: __________________ Term: __________________
Writing

Chooses personal topic
Includes likes/dislikes
Remains on topic
Experiments with two forms of writing (e.g., diary,
list, letter)
Writes full sentences with space between words
Knows beginning and end sounds of words
Uses familiar words
Uses detailed pictures
Uses letter sounds to spell words (initial, middle,
final consonants, short vowels)
Uses capitals and periods
Includes five full sentences
(Grade 2)
Uses graphic organizers to organize ideas (Grade 2)

Figure 14.3 Grade 1 side-by-side checklist for a personal writing assignment.


an elementary-level side-by-side assessment. Side-by-side assessments
for intermediate and secondary grades often included a few different
pieces and were usually connected to discrete units; rather than a pro-
gression for a skill development, students were working through a
body of content and skills as part of a unit. This then involved a work-
flow plan for students.

Discrete Units

A discrete unit involves a set number of learning standards or outcomes
to be achieved. When the unit is done, teacher candidates should have
enough evidence to report on those outcomes and move on to a different
unit. After the switch to remote learning, teacher candidates completed
side-by-side assessments on a regular basis to keep track of student pro-
gress. These meetings were critical for students as the focus was on feed-
back and the development of next steps. These meetings represented
assessments for learning. Teacher candidates wanted to keep track of
how students were progressing and the feedback that they had provided.
As such, students had a workflow plan to guide them, and the teacher
candidate had a recording tool.
In middle school and secondary grades, it was powerful for the stu-
dent to also have a recording tool to keep them on track. This tool acted
as an assessment as learning opportunity, enabling students to monitor
and set goals for their learning. When looking at building a workflow
plan for these grades, working remotely with technology offered a var-
iety of opportunities for students to demonstrate their knowledge and
skills in multimodal ways. Figure 14.4 provides an example of a placemat
assessment. It outlines how students were expected to demonstrate what
they had learned. Notice how this placemat enabled students to choose
how they wanted to demonstrate what they had learned. Across the top
are the learning outcomes being addressed in that unit, with the first row
representing a task that everyone had to complete and the bottom row
representing a variety of assignments (substitution, augmentation, and
modification) that students could complete.
Whereas Figure 14.4 outlines the assignments to be completed,
Figure 14.5 is the workflow plan that students used for each assignment.
In this example, because students completed six assignments in total,
they had six workflow plans: one for each assignment. Notice how the
teacher candidates included specified teacher check-ins. These check-ins
Grade 9 Social Studies
Unit: War of 1812 Placemat Assessments

Unit Outcomes: Historical Significance | Colonialism | Development of the Nation State

Everyone Must: Create an ebook using Book Creator of a timeline of important
events leading up to and occurring during the War of 1812. For each event,
indicate who was involved, why it was important, and what was the consequence.
Include an image for each event. The final page of the book should include the
references that were used.

Complete 5 of these activities, ensuring you do at least one per outcome:

Historical Significance
• Write a three-paragraph essay justifying who of the following was the
greatest hero: Brock, Secord, Tecumseh, or Salaberry.
• Use Poster Creator comparing two of the following events: Battle of
Tippecanoe, Capture of Detroit, Laura Secord's Walk, or the Battle of Lundy
Lane. Include a map, description of both events, why they are significant, and
similarities and differences.
• Use Screencast-O-Matic to compare and contrast the Battles of Lake Erie,
Thames, and Chateauguay. What were they, which was most significant, and why?

Colonialism
• Create an audio recording to Tecumseh providing reasons for/against joining
the war. Provide a recommendation and justification.
• Write a response to the YouTube clip from "The Agenda with Steve Paikin" on
who really won the War of 1812 (June 14, 2012). What were the key points raised
by each panelist? Was anyone missing from the panel?
• Using Adobe Spark, create a four-square comparison reviewing the causes and
consequences of Indigenous Peoples going to war. Include statistics and maps as
part of your evidence.

Development of the Nation State
• Choose the three most important antecedents leading to the War of 1812.
Represent, explain, and justify them in an Adobe infographic.
• In a concept map, compare the three-pronged battles along Upper Canada, Lower
Canada, and Oceanside and explain how they each impacted the formation of
Canada (BrainPop Make-a-Map).
• Write a three-paragraph essay on how the War of 1812 contributed to the
creation of Canada.

Figure 14.4 Placemat assessment activities for a unit on the War of 1812.

were completed as a side-by-side assessment meeting. Students reported
on their progress, received feedback on the work they had done, and
developed their next steps. This workflow plan functioned best as a
shared Google Doc (or similar) so that the teacher candidate had real-time
information on how students were progressing. If a teacher candidate
Grade 9 Social Studies
War of 1812
Example Workflow Plan

All steps to be kept in your digital portfolio.
Placemat Task: ____________________

Steps | Date completed? | What's next? | Keep in mind?
1 Identify two key teacher-supplied
resources that will be needed for this
task.
2 Make key points from those
resources.
3 Identify and evaluate two additional
resources that you find yourself.
4 Make key points from your two
additional resources.
5 Teacher check-in
6 Organize key points to be included
into a graphic organizer (this may
include a draft blueprint for visual
tasks).
7 Teacher check-in
8 Create your first draft.
9 Get peer feedback.
10 Self-assess: what are your strengths
and challenges?
11 Complete your final draft.
12 Do side-by-side assessment with
teacher.

Figure 14.5 Workflow plan for each placemat task.

noticed that a student was not adding to their workflow plan, the teacher
candidate contacted the student to check in. This responsiveness was important
during remote schooling given that students were often working on their
own. An added value of these one-on-one check-ins was that there was no
additional marking after class was done: It took place during the meeting.
In addition, the workflow plan included a specific step for peer feedback.
This was one way to incorporate a variety of peer-to-peer interactions
that also represented an assessment as learning opportunity. Given the
isolation of remote schooling, it was important to plan specific opportun-
ities for peer-to-peer interaction.
When designing workflow plans for K–5 classrooms, students needed to
have a clear set of steps to complete. Although there could still be choices,
it was often easier for students if the teacher candidates included both the
consumption (learning) and creation (assessments) activities in the work-
flow plan. This approach gave students greater independence, and parents
or guardians could guide them if they were stuck. Figure 14.6 includes
the start of a PowerPoint presentation used with primary students for
a discrete unit on water resources. Each underlined section is a hyper-
link to take students to a document, video, or reading (substitution). The
assessments that students completed are clearly identified. This sequence
of activities provided opportunities to work away from the computer to
reduce screen time.

Figure 14.6 Example workflow PowerPoint for primary unit.


Figure 14.6 (Continued)

It is important to note the first task in Week 2: Students were to listen to
audio feedback on their previous week’s assignments (see Figure 14.6). This
feedback was audio or video recorded and provided descriptive feedback
for students (augmentation): what the teacher candidate noticed about the
work students had completed and things to focus on for the next week’s
assignments. The assessments listed in this example were all assessments
for learning. It would not be until later in the unit that students completed
assessments of learning. Given that this sequence of activities was asyn-
chronous, the teacher candidate also scheduled a side-by-side assessment
conversation to connect with students on their progress.

Benefits and Considerations for Technology Use

As illustrated in the examples provided, technology can enable teacher
candidates (and teachers) to increase the number of opportunities for
students to engage in multimodal activities (Burns, 2017). Although this
is also true for consumption (teaching) activities, this chapter has focused
on opportunities for multimodal assessments (creation). Technology can
enable students to show what they know via written, audio, or video
means, or combinations thereof. For example, using programs such as
Book Creator, students can include all three modalities in their work.
This capability can increase student engagement and interest in their
schoolwork; as well, it gives students who may be challenged to write the
opportunity to show what they have learned.
These benefits and opportunities need to be balanced with the
recognition that it takes time to learn any new technology; that, in and of itself,
is a skill. Learning to use different technologies aligns with 21st century
learning needs and will benefit students in the long run (Dumont et al.,
2010; Farmer & Ramsdale, 2016; International Society for Technology
in Education, 2019; Mishra & Koehler, 2006; Selwyn, 2012). However,
students must be given the chance to work with a new technology a
number of times. As a result, I believe it is wise to choose just a few tech-
nologies for students to work with, and then use them repeatedly. This
strategy enables students to develop their proficiency so they can compre-
hensively show what they have learned. Other things to consider when
choosing technologies include cost of use; compatibility with different
computers, laptops, and iPads; and ease of use. Table 14.1 includes a list
of different technologies for specific types of tasks that may be worth
exploring.
As a final note, one of the key things I learned from working with
teacher candidates who had to move to online teaching in the COVID-19
pandemic was the importance of taking things one step at a time. I rec-
ommend teacher candidates start with one assessment they would like to
alter; this experience will then lead to a second and a third. Start with a

Table 14.1 Technologies to investigate.


What do you want them to make? | Possible programs/apps
An ebook | Book Creator
Collaborative or shared whiteboard where you can see individual work | Whiteboard.fi
A poster | Adobe Spark Poster; Canva; Piktochart; Poster Creator by Desyne
Models or 3D projects | Tynker; Minecraft (education edition)
Multimedia poster creation | Explain Everything ($); Glogster ($)
Audio recorded over a whiteboard or screen | Screencast-O-Matic; Easy Teach
Timeline and collaborative multimedia loading | Padlet
Video or audio | SeeSaw; Adobe Spark Video
E-portfolios | Wakelet; LiveBinders
Concept mapping | Mural by LUMA Institute
Note. Adapted from Tasks Before Apps: Designing Rigorous Learning in a Tech-Rich Classroom,
by M. Burns, 2017, p. 25. Copyright 2017 by ASCD.
substitution, then try an augmentation, then a modification. Not every-


thing has to be completely redefined. Although remote assessment does
require some rethinking, strong assessment principles apply in both face-
to-face and remote classrooms. By using the SAMR model (Puentedura,
2006), the teacher candidates I worked with in the spring of 2020 were able
to rethink assessments and incorporate assessments as and for learning in
their remote classrooms.

References

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through
classroom assessment. Phi Delta Kappan, 92(1). https://doi.org/10.1177/
003172171009200119
British Columbia Ministry of Education. (n.d.). English language arts: Get
started. Government of British Columbia. https://curriculum.gov.bc.ca/
curriculum/english-language-arts
Brookhart, S. (2017). How to give effective feedback to your students (2nd ed.). ASCD.
Brookhart, S., Guskey, T. R., McTighe, J., & Wiliam, D. (2020). Eight essen-
tial principles for improving grading. Educational Leadership, 78(1). https://
uknowledge.uky.edu/cgi/viewcontent.cgi?article=1050
Burns, M. (2017). Tasks before apps: Designing rigorous learning in a tech-rich class-
room. ASCD.
Clark, S. (2014). Outstanding formative assessment: Culture and practice. Hodder
Education.
Cooper, D., & Catania, J. (2022). Rebooting assessment: A practical guide for balan-
cing conversations, performances, and products. Solution Tree.
Dueck, M. (2014). Grading smarter, not harder: Assessment strategies that motivate
kids and help them learn. ASCD.
Dumont, H., Istance, D., & Benavides, F. (2010). The nature of learning: Using
research to inspire practice: Practitioner guide to the innovative learning environ-
ments project. Centre for Educational Research and Innovation. www.oecd.
org/education/ceri/50300814.pdf
Earl, L. M. (2003). Assessment as learning: Using classroom assessment to maximize
student learning. Corwin.
Farmer, H. M., & Ramsdale, J. (2016). Teaching competencies for the online
environment. Canadian Journal of Learning and Technology, 42(3), 1–17. https://
files.eric.ed.gov/fulltext/EJ1110313.pdf
Feldman, J. (2020). Taking the stress out of grading. Educational Leadership, 78(1).
www.ascd.org/el/articles/taking-the-stress-out-of-grading
Feldman, J., & Reeves, D. (2020). Grading during the pandemic: A conversation.
Educational Leadership, 78(1). www.ascd.org/el/articles/grading-during-
the-pandemic-a-conversation
Fogleman, J., McNeill, K. L., & Krajcik, J. (2011). Examining the effect of teachers’
adaptations of a middle school science inquiry-oriented curriculum unit on
student learning. Journal of Research in Science Teaching, 48(2), 149–169.
Forbringer, L., & Fuchs, W. (2014). Response to instruction (RtI) in math:
Evidence-based interventions for struggling students. Routledge. https://doi.
org/10.4324/9781315852270
Fountas, I. C., & Pinnell, G. S. (2016). Guided reading: Responsive teaching across the
grades (2nd ed.). Heinemann.
Halvorsen, A.-L., Duke, N. K., Brugar, K. A., Block, M. K., Strachan, S. L., Berka,
M. B., & Brown, J. M. (2012). Narrowing the achievement gap in second-
grade social studies and content area literacy: The promise of a project-based
approach. Theory and Research in Social Education, 40(3), 198–229. https://doi.
org/10.1080/00933104.2012.705954
Han, S., Capraro, R., & Capraro, M. M. (2015). How science, technology, engin-
eering, and mathematics (STEM) project-based learning (PBL) affects high,
middle, and low achievers differently: The impact of student factors on
achievement. International Journal of Science and Mathematics Education, 13,
1089–1113. https://doi.org/10.1007/s10763-014-9526-0
Himmele, P., & Himmele, W. (2011). Total participation techniques: Making every
student an active learner. ASCD.
International Society for Technology in Education. (2019). ISTE standards:
Coaches. www.iste.org/standards/iste-standards-for-coaches
MacMath, S. (2017). Assessment driven planning. Pearson Canada.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content know-
ledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–
1054. http://one2oneheights.pbworks.com/f/MISHRA_PUNYA.pdf
O’Connor, K. (2011). A repair kit for grading: 15 fixes for broken grades (2nd ed.).
Pearson.
Puentedura, R. R. (2006, August). Transformation, technology, and education
[PowerPoint slides]. Maine Department of Education. http://hippasus.com/
resources/tte/puentedura_tte.pdf
Schimmer, T. (2011). 10 things that matter from assessment to grading. Pearson
Canada.
Selwyn, N. (2012). School 2.0: Rethinking the future of schools in the digital age. In
A. Jimoyiannis (Ed.), Research on e-learning and ICT in education (pp. 3–16). Springer.
Stiggins, R. J., & Chappuis, J. (2012). An introduction to student-involved assessment
for learning (6th ed.). Pearson.
Tucker, C. R. (2020). Balance with blended learning. Corwin.


Venables, D. R. (2020). Five inconvenient truths about how we grade. Educational
Leadership, 78(1). www.ascd.org/el/articles/five-inconvenient-truths-about-
how-we-grade
Western and Northern Canadian Protocol for Collaboration in Education. (2006).
Rethinking classroom assessment with purpose in mind: Assessment for learning,
assessment as learning, assessment of learning. Manitoba Education, Citizenship,
and Youth. www.wncp.ca/media/40539/rethink.pdf
15 Adapting Classroom Assessment Practices for Online Learning

Lessons Learned From Secondary School Teachers in the Early Days of COVID-19

Michael Holden, Christopher DeLuca, Stephen MacGregor, and Amanda Cooper

In this chapter, we focus on contemporary research in formative classroom
assessment and its implications for online learning. We examine
the question, how did secondary school teachers adapt their classroom
assessment practices for online learning at the onset of COVID-19?
We take this focus for two reasons. First, we argue that the widespread
disruptions of the pandemic exacerbated, rather than caused, preexisting
challenges with equal and equitable access to education (Corbett &
Gereluk, 2020; Doucet et al., 2020; National Academy of Education,
2021). Second, we recognize that online learning is not new, nor will
it disappear completely once the pandemic subsides (Hodges et al.,
2020; Weller, 2020). Many teachers—including us, the authors—used
online environments to supplement in-person education long before
COVID-19. In some jurisdictions, policies mandate online learning, and
therefore online assessment, for large groups of students and teachers
(e.g., Ministry of Education [MOE], 2022). Rather than viewing the pandemic
as a temporary issue, this context suggests that lasting research into
online and classroom assessment is critical.

DOI: 10.4324/9781003347972-19
As we have written elsewhere (Cooper et al., 2022), changes to
assessment and evaluation practices during the pandemic contributed
to widespread concern about the quality of teaching, learning, and
assessment opportunities available to students in online environments
(Education Endowment Foundation [EEF], 2020). We refer interested
readers to our synthesis of these changes in Cooper and colleagues (2022).
Likewise, Gallagher-Mackay et al. (2021) have offered an especially useful
description of the online learning landscape in Canada during COVID-19.
Gallagher-Mackay and Brown (2021) provided a similar Ontario-specific
account. This chapter examines how 17 Ontario secondary school
teachers used formative classroom assessment to meet their students’
learning needs and grapple with systemic challenges during an especially
difficult time for educators. Our findings describe emergency remote
teaching and emergency assessment (Cooper et al., 2022; Hodges et al.,
2020), and offer direct insights into how teachers, teacher educators, and
other stakeholders can adapt classroom assessment practices to diverse
online learning environments.

Centring Formative Classroom Assessment in Online Learning

We assert that classroom assessment is pedagogy (Black & Wiliam, 2018).
The primary purpose of classroom assessment is to improve student
learning (Baird et al., 2017; MOE, 2010). Assessment certainly has other
purposes—documentation, communication, accountability, feedback,
and so on (Andrade, 2013; Bennett, 2011; McMillan, 2013)—but funda-
mentally, classroom assessment aims to further student learning itself
(Bonner, 2013; Stobart, 2008). In discussing formative assessment, we use
the Council of Chief State School Officers’ (2023) definition:

Formative assessment is a planned, ongoing process used by all
students and teachers during learning and teaching to elicit and
use evidence of student learning to improve student understanding
of intended disciplinary learning outcomes and support students
to become self-directed learners. Effective use of the formative
assessment process requires students and teachers to integrate and
embed the following practices in a collaborative and respectful classroom
environment: (a) clarifying learning goals and success criteria
within a broader progression of learning; (b) eliciting and analyzing
evidence of student thinking; (c) engaging in self-assessment and
peer feedback; (d) providing actionable feedback; and (e) using evi-
dence and feedback to move learning forward by adjusting learning
strategies, goals, or next instructional steps.
(pp. 2–3)

This approach to formative assessment is consistent with widespread
consensus among classroom assessment researchers that teachers and
students should use formative classroom assessment to guide teaching
and learning efforts (Andrade, 2013; Shepard et al., 2018; Swaffield,
2011), and to support students and teachers in using actionable feed-
back to achieve students’ learning goals (Cowie & Harrison, 2016;
DeLuca & Johnson, 2017; Shepard, 2019). Thus, when we consider
adapting classroom assessment practices for online learning, we must
ensure that the adaptive, inferential spirit of formative assessment is
upheld (Crossouard, 2011).
The notion of adaptation in assessment merits further discus-
sion. Formative assessment is not merely summative assessment that
happens more frequently (Baird et al., 2017; Newmann et al., 1996).
A central crux of formative assessment pedagogy is to use assessment
information to adapt instruction, curriculum, and learning based on
students’ current needs in context (Cuban, 2013; DeLuca & Johnson,
2017; Mason, 2008). In formative assessment, teachers use assessment
data to inform next steps in their instruction, and students use
assessment data to inform next steps in their learning (Andrade, 2013;
Baird et al., 2017; Bonner, 2013). Formative assessment posits that
teaching and learning are still being formed: The course of instruction
may change based on what the assessment data yield and how teachers
and students interpret that information. As Friesen (2009) contended,
this process “require[s] teachers to enter an iterative cycle of defining,
creating, assessing, and redesigning that is essential in creating effective
learning environments” (p. 5). This iteration goes beyond making
minor adjustments or correcting student errors. Teachers must listen
closely to how students are engaging with key concepts and make in
situ decisions about how to guide and further each student’s learning
(Black & Wiliam, 2018; Cowie & Harrison, 2016). The validity of
those decisions depends upon a teacher’s ability to modify their view
of students’ learning and adapt that understanding into new contexts
(Kane, 2006).
Formative assessment, or its contemporary designation, assessment
for learning, necessarily leads to a view of assessment that is teacher- and
student-led. Research has consistently shown that teacher-led efforts,
rather than top-down initiatives, are some of the most effective means
to advance sustainable change in education (Hargreaves & Fullan,
2012; McNamara & O’Hara, 2004; Tschannen-Moran, 2009). Moreover,
research in classroom assessment suggests that students must be actively
involved in monitoring, regulating, and assessing their own learning in
addition to the feedback they receive from teachers and peers (Andrade,
2013; Shepard et al., 2018). Brookhart (2013) argued that “self-regulation
strategies and capabilities, or the lack of them, may be the defining
feature that separates successful and unsuccessful students” (p. 43).
Classroom assessment is therefore a contextualized endeavour, one
rooted in the local, changing needs of teachers and students (McMillan,
2013). Our firm belief in formative classroom assessment underpins this
chapter. By examining how individual teachers adapted their classroom
assessment practices for online learning, we contend that researchers
and policymakers will be better positioned to support and enhance the
assessment practices taking place both online and in brick-and-mortar
schools.

Methods

The present study was designed as part of a broader investigation into
the circumstances and impacts of the COVID-19 pandemic for Ontario
teachers (see Cooper et al., 2021, 2022). In both this chapter and in the
teachers (see Cooper et al., 2021, 2022). In both this chapter and in the
broader study, we used a qualitative, phenomenological design (Patton,
2002; Sokolowski, 2000). As Patton (2002) explained, phenomenology
focuses on “how human beings make sense of experience and trans-
form experience into consciousness . . . how they perceive it, describe it,
feel about it, judge it, remember it, make sense of it, and talk about it
with others” (p. 104). This design was appropriate for our focus on the
lived experiences of teachers as they sought to adapt their assessment
practices to the online environment necessitated by the onset of the
pandemic.

Sample and Data Collection

Following ethics approval, secondary school teachers were invited to
participate via social media networks, allowing for a broad sampling
of educators across Ontario. Ontario was chosen as a site of study for
three reasons: (a) Ontario is Canada’s most populous province and has
an articulated assessment framework (MOE, 2010); (b) to extend existing
analyses of the effects of COVID-19 on teaching, learning, and assessment
in the province (e.g., Gallagher-Mackay et al., 2021); and (c) because the
authors are closely familiar with the assessment contexts under investiga-
tion (e.g., DeLuca & Volante, 2016; Earl et al., 2015).
All teachers who volunteered to participate were included in the study.
The participants (N = 17) taught a range of subjects and grade levels across
Ontario’s intermediate and senior divisions (Grades 7–12, ages 12–18; see
Table 15.1), and were drawn from schools in urban, suburban, and rural
contexts. Participants had an average of 16 years of experience teaching
in Ontario secondary schools. Although these professional characteristics
may suggest a variety of perspectives among participants, the data are not
meant to generalize across all teachers in all classrooms (Stake, 2010). For
example, the potential biases of sampling using social media (e.g., demographic
tendencies, see Morstatter & Liu, 2017) mean that the sample
should not be read as representative of all Ontario secondary school
teachers. Nor do we suggest that the extant data allow for generalizations
about assessment variations within specific subject areas or grade levels.
Rather, participants offered insights into their individual practices during
the initial weeks of the pandemic—insights which we contend will be of
interest to researchers, teacher educators, and policymakers.

Table 15.1 Participants' subject areas and grade levels.

Pseudonym | Subjects taught | Grades taught
Anita | Guidance | 9–12
Anna | English, ESL | 10, 11
Aziz | French, physical education | 9, 11
Hala | English | 9–12
Helena | Math | 10, 12
Jodi | Math | 10, 12
Kiley | Math, science, physical education | 7, 8
Kim | Business studies and co-op | 11, 12
Luke | Religion | 10
Martha | English, library | 9–12
Maryanne | English, social science | 10, 12
Raphael | Math | 9, 10
Robin | Music | 11, 12
Simon | All subjects | 7, 8
Stuart | French, law | 9–12
Tim | Tech (construction), computers | 9–12
Wanda | Math | 9–12
Note. ESL = English as a second language.
Semistructured interviews (Merriam & Tisdell, 2015; Patton, 2002)
lasting upwards of an hour were conducted with each participant using
Zoom, focusing on topics related to planning, instruction, and assessment
during the remote learning period of the pandemic (see Table 15.2).

Table 15.2 Major dimensions of the semistructured interview protocol with example questions.

Major dimension | Example questions
Demographics | What is your current role? What grades and subjects are you currently teaching? How many years of teaching experience do you have at the current grade level?
Resources and supports from the school or school board | As COVID-19 has continued, what supports have you found most beneficial to your teaching practice and to students? What have you found to be the most significant teaching challenges over the course of the pandemic?
Resources and supports from the Ministry of Education | How has your relationship between the ministry and your school board developed over the course of the pandemic? Have supports or guidance been coming from the ministry? How has the way the ministry rolled out policies and communications been for you and your colleagues?
Impacts of assessment in the online environment | How do you assess learning that takes place remotely? What new assessment strategies have you used as a result of the pandemic conditions? Have these changes been beneficial and valuable? Provide examples. What role has formative assessment played in your practice over the course of the pandemic? In what ways have you used assessment and teaching to monitor and support student well-being, if at all, throughout the pandemic?
Each interview was audio recorded, transcribed, and supplemented with
field notes that aided in developing rich descriptions of the experiences,
perspectives, and contexts represented in the data (Corwin & Clemens,
2020; Phillippi & Lauderdale, 2018). Participants were also asked to share
recent artifacts relevant to their practice. Pseudonyms and gender-neutral
pronouns are used for all participants.

Data Analysis

Transcripts were thematically analyzed using the constant comparative
method and NVivo 12, a qualitative analysis software (Glaser &
Strauss, 1967). To enhance reliability and provide a systematic pro-
cess for analyzing data across interviews, we created a coding manual
that mirrored major dimensions and elements from the interview
protocol. As DeCuir-Gunby et al. (2011) highlighted, “codes can be
developed a priori from existing theory or concepts (theory-driven);
they can emerge from the raw data (data-driven); or they can grow
from a specific project’s research goals and questions (structural)”
(pp. 137–138). Our coding manual was both theory driven and struc-
tural in relation to our research goals. Theoretical areas of interest
were developed following a general scan of the emerging scholarly
and grey literature examining the influence of the pandemic on
teaching and learning in online environments (Cooper et al., 2021,
2022), particularly the planning, implementation, and outcomes of
assessment practices.
After the interview protocol and coding manual were created, two
members of the research team independently analyzed the same five
interview transcripts (30% of the full data set). Each analyst followed
a systematic process of reading and rereading the transcripts to break
down, examine, conceptualize, and categorize the data (Thomas,
2006). They subsequently met to scrutinize and compare their coding
results. Intercoder reliability of the coding scheme was assessed
using Cohen’s (1960) kappa, which takes chance agreement into
account. The resulting value of .71 indicated a substantial strength
of agreement between the coders (Landis & Koch, 1977). Minor
refinements were made to descriptive codes and categories before the
primary investigator used the revised coding scheme to analyze the
full data set.
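As a minimal illustration of how Cohen's (1960) statistic corrects raw agreement for chance (the chapter reports a kappa of .71 on 30% of the transcripts), the following Python sketch computes kappa for two coders; the code labels and excerpt data here are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's (1960) kappa: agreement between two coders, corrected for chance."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Proportion of items on which the two coders assigned the same code.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: two coders labelling ten interview excerpts.
a = ["feedback", "feedback", "adapt", "adapt", "adapt", "perf", "perf", "self", "self", "feedback"]
b = ["feedback", "adapt",    "adapt", "adapt", "perf", "perf", "perf", "self", "self", "feedback"]
print(round(cohens_kappa(a, b), 2))  # → 0.73
```

In this sketch the coders agree on 8 of 10 excerpts (80% raw agreement), but because chance agreement is 25%, kappa falls to about .73, a value Landis and Koch (1977) would classify as substantial.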

Findings and Lessons Learned

Extending from our previous analyses of teachers' emergency assessment
efforts (Cooper et al., 2022), this chapter highlights three core themes
related to how teachers adapted their classroom assessment practices
for online learning: teachers’ (a) successes and challenges in adapting to
online learning; (b) use of performance assessments as authentic, defens-
ible practice; and (c) engagement of students in cycles of feedback and
self-assessment. Table 15.3 offers a summary of the frequency of these
practices among our participants. Rather than presenting our findings
separately from a literature-linking discussion, the following sections pre-
sent these themes alongside key areas of resonance with contemporary
classroom assessment research.

Table 15.3 Summary of participants' formative assessment practices.

Formative assessment practice | Participants using the practice, n (%)
Adapted their assessment practices for the online environment | 17 (100.0)
Authentic performance assessments | 6 (35.3)
Cyclical feedback processes | 6 (35.3)
Emphasizing context and relationship | 13 (76.5)
Frequent, just-in-time feedback | 11 (84.6)
Recording audio or video feedback | 6 (35.3)
Reduced emphasis on policing plagiarism | 5 (29.4)
Reduced emphasis on summative testing | 4 (23.5)
Student feedback on teacher practices | 7 (41.2)
Student self-assessments | 4 (23.5)
Note. N = 17.

Successes and Challenges in Adapting to Online Learning

Despite common pedagogical underpinnings—particularly around the
central goal of improving student learning—adapting assessment for
online learning is no simple task (König et al., 2020). Teachers across our
sample repeatedly emphasized needing to make changes to what and how
they assessed to support students' learning needs. Anna, who has more
than a decade's experience teaching online, explained that "it's a completely
different time and space. The entire reality of what learning looks
like in eLearning is completely antithetical to what we do in a face-to-face
environment." Similarly, Raphael argued that the online environment "is
absolutely not a digital version of your course" because of fundamental
differences in the opportunities and limitations of both mediums.
We were troubled to observe wide disparities in teachers’ experiences
adapting to these differences. Wanda admitted the experience was “pretty
brutal” and involved “a lot of tears,” recalling “I felt incompetent because
I couldn’t learn this stuff quickly enough.” Raphael, likewise, worried
that “everything we know about teaching in the classroom is completely
irrelevant” in the context of teaching, learning, and assessing in an online
environment. Often, teachers’ success in adapting to online learning was
predicated on (a) their knowledge of learning management systems,
(b) their comfort with learning new technology routines, and (c) the
extent to which their pre-pandemic experiences involved technology,
online learning, and teaching and assessment beyond face-to-face
practices. Participants with pre-pandemic online learning experiences
tended to report greater ease in adapting to the emergency assessment
routines necessitated by COVID-19 than those without related prior
experience. Such teachers referenced teacher-, board-, and ministry-
created resources for online learning (Kim, Jodi); had already developed
content using various digital tools (Helena, Maryanne); and recognized
that their learning curve was not as steep as others’ because teaching and
assessing in online environments was already part of their professional
practice (Tim, Kiley).
We are not suggesting that teachers with limited experience in online
learning performed poorly. We reiterate that teachers’ challenges at the
onset of the pandemic reflected the reality that emergency remote teaching
and conventional online learning are fundamentally different (Doucet
et al., 2020). This chapter is firmly rooted in the belief that teachers are
capable professionals whose ongoing professional learning is central to
student success (Cochran-Smith, 2003; Timperley, 2015). Acknowledging
teachers’ difficulties enables classroom teachers, teacher educators, and
other stakeholders to consider how best to support necessary adaptations
beyond the context of emergency assessment. For example, participating
teachers were divided on how to create consistent and contextualized
online learning environments. On the one hand, multiple participants
praised the flexibility school boards gave teachers to use platforms, tools,
and practices that reflected their individual expertise and contexts. As Jodi
recalled, “There was a lot of concern that we were going to be forced to
teach a certain way online. And I’m so thankful that they didn’t.” Raphael
and Stuart echoed Jodi’s sentiments: Both appreciated not being forced
into modes of assessment that did not align with their skill sets and com-
fort levels. Yet, on the other hand, participants recognized that varying
practices from teacher to teacher introduced significant challenges for
students who were learning online and for their parents. Kiley observed,

There's no consistency with when the teacher is posting things on
Google Classroom, [or] when the daily assignments are coming
out. . . . The navigation of it [the online environment] is some-
thing that we’ve really taken to heart in terms of trying to make it
smoother for parents.

Vera, likewise, recognized that "if we're going to successfully mark
things, everything needs to be in one place, not in bits and pieces all over
the place.” As participants explained, inconsistencies in how assessments
were organized and implemented exacerbated confusion in a medium
many teachers and students were unfamiliar with. In addition to regular
curricular goals, online learning requires that teachers, students, and
parents receive an orientation to what platforms are being used, where
to access content within those platforms, and how those platforms
can be used to enhance assessment practices. Given teachers’ varying
experiences, skills, and comfort levels, we are led to ask, how do we
support and provide professional learning for teachers as they continue
to create online learning environments? How might such supports be
responsive to teachers’ varying entry points, including those teachers
who have never taught online before, those teachers who have experi-
ence with only emergency remote assessment, and those teachers who
have many years of experience with online learning in different contexts?
Such questions seem pivotal in a not-quite-post-pandemic world and are
addressed in other chapters of this volume.

Performance Assessments as Authentic, Defensible Practice

One of the most frequent adaptations participating teachers described
was a shift towards using authentic performance assessments. Authentic
assessments have long been used as part of formative classroom
assessment (Friesen, 2009; Mueller, 2018; Wiggins, 1993). As Mueller
(2018) described, authentic assessments are “a form of assessment in
which students are asked to perform real-world tasks that demonstrate
meaningful application of essential knowledge and skills” (para. 1).
Rather than being artificial or inapplicable, authentic assessments are
directly relevant to the curriculum and to students’ lives. Performance
assessments more broadly require students to do something (i.e., per-
form) to apply key skills and content knowledge in a novel or mean-
ingful way (Association for Supervision and Curriculum Development,
2011).
Specific performance tasks varied according to grade level and sub-
ject area. For example, Jodi described a research project in their Grade 12
college-level math class where students were asked to examine “how to get
a house, [making a] comparison between renting and buying,” including
“researching mortgages, and what are mortgage rates, and what do terms
mean.” Anna offered two similar examples from Grade 11 English. In
one performance task, students were asked to analyze a story by anno-
tating a one-page excerpt and marking up the page to call attention to the
authors’ use of certain concepts. In another activity, students conducted
a theme analysis using 1-minute clips of movies students enjoyed that
were relevant to the issues and ideas on a theme analysis chart. Hala, like-
wise, shifted from asking closed questions with predetermined answers
(e.g., “Which character did x in Act III?”) to asking students to apply their
knowledge to novel scenarios (e.g., “Based on your knowledge of this
character and their behaviour in the text, how might they behave in the
future?”). Each example prioritizes a linking of curricular concepts with
students’ interpretations, ideas, or personal interests.
Consistent with extant research (Association for Supervision and
Curriculum Development, 2011; Mueller, 2018), some participants ( Jodi,
Anna, Kiley) explicitly contrasted performance assessments with tests,
quizzes, and other summative tasks. Teachers emphasized performance
assessments as process focused, rather than product focused, often in line
with the realities of the online environment. For example, Kim explained
that accounting students often struggle with process mistakes, so their
online assessments needed to provide opportunities to apply specific
processes and receive targeted feedback. Teachers repeatedly discussed
performance assessments as a proactive way of addressing plagiarism
(Cooper et al., 2022; Eaton et al., 2017). As Anna observed,
Assessment is also around revisioning what it means to assess
learning when your tools are narrower, because you cannot con-
trol what a student is doing on the other end. So, you must produce
assessments . . . that assume students are going to be there with an
open book, that are going to assume that students are going to col-
laborate with their parents or caregivers.

Rather than attempting to police plagiarism—which some participants
saw as explicitly futile—performance assessments and similar adaptations
assume that students will access the other resources they have on hand,
and they shift the expectation to using those resources to link curricular
concepts to individual research, analysis, or interpretation. Thus, for many
of our participants, performance assessments represented an authentic,
professionally defensible way of assessing student learning. We therefore
see performance assessments as a promising practice for both face-to-face
and online environments.

Engaging Students in Cycles of Feedback and Self-assessment

Previous analyses of emergency assessment have highlighted the critical
challenge of meaningfully engaging students in online learning (Cooper
et al., 2022; Jimenez, 2020). As the EEF (2020) observed, “Many pupils
are not engaging in high-quality home learning, and . . . disadvantaged
pupils appear to be learning less than their peers” (p. 19). Our participants
echoed these concerns. Multiple teachers noted that student and parent
engagement varied noticeably and often posed a substantial barrier to
forming accurate assessments of students’ abilities. Jodi reflected,

The last 8 weeks has kind of felt like I’m talking to a wall because
there’s no return from the students. I have no sense that they’re
getting it. As much as I can call them and ask them, and say “reach
out if you’ve got any questions,” they’re not going to. They say that
they’re doing fine.

Participants explained that although some students were able to thrive as
online learners, others lost access to the rapport and relationships so cen-
tral to classroom assessment as a community-based practice. In response,
teachers in our sample described using cycles of feedback and student
self-assessment to combat low engagement.
Cycles of feedback were rooted in providing fit-for-purpose guidance
to improve the teaching and learning process. For example, Maryanne
recorded audio feedback to respond to students’ written submissions, and
Kiley prepared video tutorials that would “automatically pop up with me
explaining the topic a little bit more” when students answered a question
incorrectly during asynchronous work time. Teachers like Helena
recognized that feedback needed to be delivered frequently and in man-
ageable chunks for students to engage because late, overly granular, or
conceptually complex feedback did not always translate well in an online
environment. Critically, participants explained that feedback was not
just a teacher-to-student affair: Teachers explicitly valued opportunities
to receive feedback from their students about their teaching, learning,
and assessment practices. Aziz explained, “It’s one thing for me to make
all these green marks on their Google document [i.e., suggestions via
track changes]. Are they looking at that?” To meet students’ varying
needs, Jodi polled their classes to learn whether students wanted learning
activities outlined day by day or in more spaced-out frequencies (e.g., an
overview posted at the start of each week). Jodi, like other teachers in
our sample, then adapted their approach based on students’ feedback.
Such approaches align directly with the tenets of formative classroom
assessment (Council of Chief State School Officers, 2023). In addition
to providing students with targeted feedback about their learning,
teachers used formative classroom assessment to gather information
about what practices were (or were not) working, including information
about pacing, workload difficulty, individual challenges, and how specific
platforms were being used.
Student self-assessment also featured in some teachers’ approach
to online learning, although to a lesser extent than performance
assessments. Kiley, for example, described their practice of providing
both “a challenging option and a more challenging option,” encour-
aging students to self-select what difficulty level would help to extend
their understanding. Hala explicitly asked students to evaluate their
own progress and compile evidence of their achievement of individual
learning goals, such that the teacher was no longer the person most
responsible for collecting proof of individual student achievement. Aziz,
likewise, recalled how one typically disengaged student began “emailing
me asking questions so she can contribute to her group,” which was later
306 Holden, DeLuca, MacGregor, and Cooper

reflected in the group’s peer feedback data. Such assessment as learning
processes already exist in person (Earl, 2007), but they take on special sig-
nificance when students and teachers are not able to interact and develop
self-regulation strategies together in conventional ways (Andrade, 2013).
As Stuart explained, “I’m a very over-the-shoulder teacher, and it’s hard
to do that in this medium.” When teachers cannot directly observe
much of the work students do in online environments (Eaton et al.,
2017), and when meaningful engagement is a central goal (DeLuca &
Johnson, 2017), it follows that teachers are not the only actors important
to classroom assessment. Students—as self-assessors and sources of crit-
ical feedback—can directly contribute to assessment processes in online
environments.

Final Thoughts

The COVID-19 pandemic was an unwanted and unanticipated crisis for
educators worldwide. Despite the devastating effects and understand-
able concerns for students’ postpandemic futures, the pandemic provided
opportunities to learn from two years of emergency remote teaching and
emergency assessment (International Educational Assessment Network,
2021). The 17 secondary school teachers who participated in this study
illustrate that teachers—those who struggled with and those who thrived
in adapting to online learning—have much to offer as capable professionals
engaged in formative classroom assessment (DeLuca & Johnson, 2017;
Timperley, 2015). The findings call attention to how teachers navigated
the successes and challenges of adapting to online learning; their use
of performance assessments as authentic, defensible practice; and their
engagement of students in cycles of feedback and self-assessment. We
concur with Stuart’s observation that

there is a hot iron here that we need to strike to figure out as much
as we can about what worked and what did not work in this situ-
ation, especially because we know the [Ontario] government is
moving forward with furthering eLearning.

Emergency assessment is necessarily different from conventional online
learning, and teaching in online environments involves critical differences
from in-person classroom assessment (Cooper et al., 2022; Hodges et al.,
2020). Fundamentally, however, the findings from this study represent a
clarion call for centring formative classroom assessment as part of sub-
stantive online learning. We believe these teachers’ insights directly con-
tribute to widespread efforts to leverage such assessment practices as part
of a new, better normal in both in-person and online education (EEF,
2020; International Educational Assessment Network, 2021; National
Academy of Education, 2021).

References

Andrade, H. L. (2013). Classroom assessment in the context of learning theory
and research. In J. H. McMillan (Ed.), SAGE handbook of research on class-
room assessment (pp. 17–34). SAGE Publications. https://doi.org/10.4135/
9781452218649
Association for Supervision and Curriculum Development. (2011). What is per-
formance assessment? https://pdo.ascd.org/lmscourses/PD11OC108/media/
Designing_Performance_Assessment_M2_Reading_Assessment.pdf
Baird, J. A., Andrich, D., Hopfenbeck, T. N., & Stobart, G. (2017). Assessment and
learning: Fields apart? Assessment in Education: Principles, Policy and Practice,
24(3), 317–350. https://doi.org/10.1080/0969594X.2017.1319337
Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in
Education: Principles, Policy and Practice, 18(1), 5–25. https://doi.org/10.1080/
0969594X.2010.513678
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy and Practice, 25(6), 551–575. https://doi.org/10.
1080/0969594X.2018.1441807
Bonner, S. M. (2013). Validity in classroom assessment: Purposes, properties,
and principles. In J. H. McMillan (Ed.), SAGE handbook of research on class-
room assessment (pp. 87–106). SAGE Publications. https://doi.org/10.4135/
9781452218649.n6
Brookhart, S. M. (2013). Classroom assessment in the context of motivation
theory and research. In J. H. McMillan (Ed.), SAGE handbook of research on
classroom assessment (pp. 35–54). SAGE Publications.
Cochran-Smith, M. (2003). Teaching quality matters. Journal of Teacher Education,
54(2), 95–98. https://doi.org/10.1177/0022487102250283
Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational
and Psychological Measurement, 20(1), 37–46. https://doi.org/10.1177/001316446
002000104
Cooper, A., DeLuca, C., Holden, M., & MacGregor, S. (2022). Emergency
assessment: Rethinking classroom practices and priorities amid remote
teaching. Assessment in Education: Principles, Policy & Practice, 29(5), 534–554.
https://doi.org/10.1080/0969594X.2022.2069084
Cooper, A., Timmons, K., & MacGregor, S. (2021). Exploring how Ontario
teachers adapted to learn-at-home initiatives during COVID-19: Blending
technological and pedagogical expertise in a time of growing inequities.
Journal of Teaching and Learning, 15(2), 81–101. https://doi.org/10.22329/jtl.
v15i2.6726
Corbett, M., & Gereluk, D. (2020). Rural teacher education: Connecting land and
people. Springer. https://doi.org/10.1007/978-981-15-2560-5
Corwin, Z. B., & Clemens, R. F. (2020). Analyzing fieldnotes: A practical guide. In
M. R. M. Ward & S. Delamont (Eds.), Handbook of qualitative research in education
(pp. 409–419). Edward Elgar. https://doi.org/10.4337/9781788977159.00047
Council of Chief State School Officers. (2023, April 20). Revising the definition
of formative assessment. https://ccsso.org/resource-library/revising-definition-
formative-assessment
Cowie, B., & Harrison, C. (2016). Classroom processes that support effective
assessment. In G. T. L. Brown & L. R. Harris (Eds.), Handbook of human and
social conditions in assessment (pp. 335–350). Routledge. https://doi.org/10.4324/
9781315749136
Crossouard, B. (2011). Using formative assessment to support complex learning
in conditions of social adversity. Assessment in Education: Principles, Policy and
Practice, 18(1), 59–72. https://doi.org/10.1080/0969594X.2011.536034
Cuban, L. (2013). Inside the black box of classroom practice: Change without reform in
American education. Harvard Education Press.
DeCuir-Gunby, J., Marshall, P. L., & McCulloch, A. W. (2011). Developing and
using a codebook for the analysis of interview data: An example from a pro-
fessional development research project. Field Methods, 23(2), 136–155. https://
doi.org/10.1177/1525822X10388468
DeLuca, C., & Johnson, S. (2017). Developing assessment capable teachers in
this age of accountability. Assessment in Education: Principles, Policy and Practice,
24(2), 121–126. https://doi.org/10.1080/0969594X.2017.1297010
DeLuca, C., & Volante, L. (2016). Assessment for learning in teacher education
programs: Navigating the juxtaposition of theory and praxis. Journal of the
International Society for Teacher Education, 20(1), 19–31.
Doucet, A., Netolicky, D., Timmers, K., & Tuscano, F. (2020). Thinking about
pedagogy in an unfolding pandemic: An independent report on approaches to distance
learning during COVID-19 school closures. UNESCO. www.oitcinterfor.org/en/
node/7809
Earl, L. (2007). Assessment—a powerful lever for learning. Brock Education
Journal, 16(1), 1–15. https://doi.org/10.26522/brocked.v16i1.29
Earl, L., Volante, L., & DeLuca, C. (2015). Assessment for learning across Canada:
Where we’ve been and where we’re going. EdCan Network. www.edcan.ca/
articles/assessment-for-learning-across-canada/
Eaton, S. E., Guglielmin, M., & Otoo, B. (2017). Plagiarism: Moving from puni-
tive to pro-active approaches. In A. P. P. Babb, L. Yeworiew, & S. Sabbaghan
(Eds.), Selected proceedings of the IDEAS conference 2017: Leading educational
change (pp. 28–36). Werklund School of Education, University of Calgary.
https://prism.ucalgary.ca/items/0ada9efa-bf13-4d47-948d-61063258113d
Education Endowment Foundation. (2020, June). Impact of school closures on
the attainment gap: Rapid evidence assessment. https://educationendowment
foundation.org.uk/public/files/EEF_(2020)_-_Impact_of_School_Closures_
on_the_Attainment_Gap.pdf
Friesen, S. (2009). What did you do in school today? Teaching effectiveness: A frame-
work and rubric. Canadian Education Association. www.galileo.org/cea-2009-
wdydist-teaching.pdf
Gallagher-Mackay, K., & Brown, R. S. (2021). The impact of school closures and emer-
gency remote learning on Grade 12 students in spring 2020: Preliminary findings from
Toronto. The Higher Education Quality Council of Ontario. https://heqco.ca/
wp-content/uploads/2021/07/Formatted_COVID-Impacts-in-the-TDSB.pdf
Gallagher-Mackay, K., Srivastava, P., Underwood, K., Dhuey, E., McCready, L.,
Born, K. B., Maltsev, A., Perkhun, A., & Steiner, R. (2021). COVID-19 and
education disruption in Ontario: Emerging evidence on impacts. Science
Briefs of the Ontario COVID-19 Science Advisory Table, 2(34), 1–36. https://doi.
org/10.47326/ocsat.2021.02.34.1.0
Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for
qualitative research. Aldine.
Hargreaves, A., & Fullan, M. (2012). Professional capital: Transforming teaching in
every school. Teachers College Press.
Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020, March). The diffe-
rence between emergency remote teaching and online learning. EDUCAUSE
Review. https://er.educause.edu/articles/2020/3/the-difference-between-emer
gency-remote-teaching-and-online-learning
International Educational Assessment Network. (2021). Imperatives for a better
assessment future during and post COVID [Position paper]. www.iean.network/
gallery/iean-assessment-imperatives-covid-may2021.pdf
Jimenez, L. (2020, September 10). Student assessment during COVID-19. Center
for American Progress. www.americanprogress.org/article/student-assess
ment-covid-19/
Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement
(4th ed., pp. 17–58). Praeger Publishers; National Council on Measurement in
Education; American Council on Education.
König, J., Jäger-Biela, D. J., & Glutsch, N. (2020). Adapting to online teaching
during COVID-19 school closure: Teacher education and teacher compe-
tence effects among early career teachers in Germany. European Journal of
Teacher Education, 43(4), 608–622. https://doi.org/10.1080/02619768.2020.
1809650
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for
categorical data. Biometrics, 33, 159–174.
Mason, M. (2008). Complexity theory and the philosophy of education.
Educational Philosophy and Theory, 40(1), 4–18. https://doi.org/10.1111/
j.1469-5812.2007.00412.x
McMillan, J. H. (2013). Why we need research on classroom assessment. In SAGE
handbook of research on classroom assessment (pp. 3–16). SAGE Publications.
https://doi.org/10.4135/9781452218649
McNamara, G., & O’Hara, J. (2004). Trusting the teacher: Evaluating edu-
cational innovation. Evaluation, 10(4), 463–474. https://doi.org/10.1177/
1356389004050219
Merriam, S. B., & Tisdell, E. J. (2015). Conducting effective interviews. In S. B.
Merriam & E. J. Tisdell (Eds.), Qualitative research: A guide to design and imple-
mentation (4th ed., pp. 107–136). Jossey-Bass.
Ministry of Education. (2010). Growing success: Assessment, evaluation, and
reporting in Ontario schools. Government of Ontario. www.edu.gov.on.ca/
eng/policyfunding/growsuccess.pdf
Ministry of Education. (2022, February 3). Online learning for sec-
ondary students. Government of Ontario. www.ontario.ca/page/
online-learning-secondary-students
Morstatter, F., & Liu, H. (2017). Discovering, assessing, and mitigating data
bias in social media. Online Social Networks and Media, 1, 1–13. https://doi.
org/10.1016/j.osnem.2017.01.001
Mueller, J. (2018). What is authentic assessment? Authentic Assessment Toolbox.
http://jfmueller.faculty.noctrl.edu/toolbox/whatisit.htm
National Academy of Education. (2021, February). Educational assessments
in the COVID-19 era and beyond. https://naeducation.org/wp-content/
uploads/2021/02/Educational-Assessments-in-the-COVID-19-Era-and-
Beyond.pdf
Newmann, F. M., Marks, H. M., & Gamoran, A. (1996). Authentic pedagogy and
student performance. American Journal of Education, 104(4), 280–312. www.
jstor.org/stable/1085433
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Sage.
Phillippi, J., & Lauderdale, J. (2018). A guide to field notes for qualitative research:
Context and conversation. Qualitative Health Research, 28(3), 381–388. https://
doi.org/10.1177/1049732317697102
Shepard, L. A. (2019). Classroom assessment to support teaching and learning.
Annals of the American Academy of Political and Social Science, 683(1), 183–200.
https://doi.org/10.1177/0002716219843818
Shepard, L. A., Penuel, W. R., & Pellegrino, J. W. (2018). Using learning and motiv-
ation theories to coherently link formative assessment, grading practices,
and large-scale assessment. Educational Measurement: Issues and Practice, 37(1),
21–34. https://doi.org/10.1111/emip.12189
Sokolowski, R. (2000). Introduction to phenomenology. Cambridge University Press.
Stake, R. E. (2010). Qualitative research: Studying how things work. Guilford Press.
Stobart, G. (2008). Testing times: The uses and abuses of assessment. Routledge.
Swaffield, S. (2011). Getting to the heart of authentic assessment for learning.
Assessment in Education: Principles, Policy and Practice, 18(4), 433–449. https://
doi.org/10.1080/0969594X.2011.582838
Thomas, D. R. (2006). A general inductive approach for analyzing qualitative
evaluation data. American Journal of Evaluation, 27(2), 237–246. https://doi.
org/10.1177/1098214005283748
Timperley, H. S. (2015). Continuing professional development. International
Encyclopedia of the Social & Behavioral Sciences (2nd ed.), 4, 796–802. https://
doi.org/10.1016/B978-0-08-097086-8.92134-2
Tschannen-Moran, M. (2009). Fostering teacher professionalism in schools: The
role of leadership orientation and trust. Educational Administration Quarterly,
45(2), 217–247. https://doi.org/10.1177/0013161X08330501
Weller, M. (2020). 25 years of ed tech. Athabasca University Press. https://doi.
org/10.15215/aupress/9781771993050.01
Wiggins, G. P. (1993). Assessing student performance. Jossey-Bass.
16 Equity in Action
Virtual Learning Environment Considerations
Sharlene McHolm

In March 2020 the World Health Organization declared COVID-19 a
pandemic, and schools made the radical shift to online learning. In April
of that year, the Global Online Learning Alliance acknowledged that
“twenty years of talk of digital literacy and educational preparedness for
the knowledge economy has been condensed into 20 days of urgency”
(Caldwell, 2020, p. 11). Those initial days were stressful for students, fam-
ilies, and educators alike as the world shut down.
Since 2020, students’ and educators’ familiarity and comfort with
learning in digital spaces have increased exponentially. Even those who
were reluctant online students or teachers in prepandemic times learned
to manage within digital platforms. But just managing does not suffice
when considering the long-term effects on students’ education. The inequi-
ties throughout the educational system became glaring and undeniable
with online learning. Students with strong home support systems could
manage, and those who benefitted from direct one-to-one attention from
a household member seemed to grow. Those without that attention did
not. The delivery model, highly reliant on reading and writing skills, left
many of the youngest learners and special education students behind. As
described in this chapter, a remarkable shift happened as virtual learning
environments (VLEs) and educators evolved, and VLEs became places for
many students to thrive.
Now in 2023, the political push for digital learning as an instructional
platform is a reality, and the practices witnessed in the acute stage of the

DOI: 10.4324/9781003347972-20
transition are no longer accepted (Porter et al., 2021). As Ontario school
districts formalize permanent virtual learning options for students, edu-
cational leaders and teachers are examining their approaches to teaching
and assessments in VLEs. Using the Ontario context, this chapter examines
the conditions for learning within the VLE and explores meaningful edu-
cator considerations and actions. The virtual setting has both limitations
(Mukhtar et al., 2020) and advantages over brick-and-mortar schools
(Hantoobi et al., 2021). Virtual classrooms, teacher planning models, and
differentiation approaches for effective learning are explored. The chapter
concludes with implications for future educators.
In the 21st century, particularly in a post-acute-pandemic world,
educators must masterfully leverage technology in the classroom
(Harris & Jones, 2020). Recent experiences have highlighted educators’
need for competency in creating equitable, engaging, and commu-
nally connected VLEs (Caprara & Caprara, 2022). How educators build
student-teacher and student-student connections is different than when
teaching in person, but what makes in-person classrooms successful often
enhances learning in virtual settings (Ally, 2008). Similar to traditional
schools, teachers need networks of support as they navigate through
VLEs (Schmitz et al., 2022).

Thinking Narratively

My starting point for this chapter was my interest in how teacher know-
ledge of in-person classrooms translated to the online environment with
the sudden transition into the VLE. Using Clandinin and Connelly’s
(2004) framework of narrative inquiry, I stepped back to consider and
contemplate the sudden shift. Their framework “entails thinking within
the three commonplaces of narrative inquiry—temporality, sociality,
and place” (Clandinin, 2013, p. 38). In this chapter, I draw on the notion
that narrative inquiry is a way of understanding experience across time,
places, and relationships (Clandinin, 2013).
The initial weeks of the pandemic were ones of trauma, confusion,
and discovery for both students and teachers. As a seasoned principal of
17 years, I found myself in the unexpected position of leading teachers
in a domain that I had relatively little experience with. I had taught in
a virtual high school, and I had completed three degrees through dis-
tance learning, but all those environments used asynchronous models.
The direct transposition of in-person teaching to the synchronous world
of online learning left me searching for best practices in the Wild West
of newly established VLEs. As did other educational leaders, I grappled
with questions about who these classrooms were serving and who they
were not. The historical patterns of underserved groups and the wide
variety of technological skills of teachers, students, and their families
began to illuminate a disparate effect on underserved groups. I focused
on themes of equity for neurodivergent and marginalized learners.
This chapter chronicles my personal reflections and the rich
discussions I shared with three fellow administrators of stand-alone
VLE elementary schools, which served over 15,000 students at their peak
and 500 students currently. Our joint inquiry included interviews, pro-
fessional sharing of documents, and emails from 2020 to 2022, with a
culminating semistructured interview I conducted with them in the fall
of 2022 to consolidate my reflections through our shared experiences.
The questions that shaped my inquiry into the VLE were framed around
what constituted excellent teaching environments (in person and vir-
tually), how teachers selected digital tools that supported all types of
learners, and what insights from these practices might improve educa-
tional outcomes for students. Discussions and written communication
documented our journey as teachers struggled to become more compe-
tent virtual learning educators.
As a researcher and practising administrator, I am positioned as a
non-Indigenous individual with expertise in special education and educa-
tional leadership. I have worked in two publicly funded school districts in
Ontario. I acknowledge that my lived experiences are and were different
from those of many of the students and families that I was trying to support. Thus,
as a part of the dominant structures within the educational landscape,
it was critical to reflectively account for my roles (both intentional and
unintentional) within the current systems.
In-person learning and VLEs share fundamental principles of learning.
These principles include establishing a rich learning environment, cultivating
strong relationships amongst students and teachers (social constructivism
and connectivism), and using evidence-based pedagogical understandings
(cognitive processes) for specific content learning (Hollingshead & Carr-
Chellman, 2019). Superseding these conditions for learning, educators
must focus on the equity and human rights process of removing sys-
temic barriers experienced by Indigenous, racialized, 2SILGBTQ++,
and neurodivergent learners (Masta, 2018). All students must have edu-
cational opportunities in public education that lead to equal life oppor-
tunities after graduation (Masta, 2018). My shared experiences with the
three remote learning administrators confirmed that several underserved
communities benefitted from VLE, and I examine how digital platforms
can also amplify barriers.

Theories of Online Learning

The process of distance learning was discussed as early as the 1970s
(Ally, 2008). Remote learning contexts have existed and been debated
over the past 50 years with implications for a small number of remote
learners. Since the 1990s, the use of the internet has shifted focus from
emails to virtual portals (Ally, 2008). As adoption rates of electronic
communication increased, software engineers began to build platforms
to allow for synchronous communication and modern-day remote
learning was born. No longer was the internet a mere vehicle for the
delivery of information with tasks completed individually by students;
remote learning began to include real-time interactions and changed
the conversation about learning in the VLE. Kozma (2001) posited
that in the process of acquiring meaningful knowledge within the VLE,
instructional strategies, not the technology itself, determined the quality of
learning. The complexity of learning theories in the online environ-
ment involved the intersection of psychology, neuroscience, and edu-
cational disciplines.
At its foundation, pedagogy must incorporate behaviourism, cog-
nitivism, and social construction theories (Picciano, 2017). Although
in-depth investigation of these theories is beyond the scope of this chapter,
they were strands woven through my thinking and the reflections of the
researcher practitioners. Equally important in this chapter’s exploration
is Graham et al.’s (2013) work considering the three-part taxonomy of
exploring what currently exists, explaining why it unfolds in the way it
does, and designing a better system to achieve preferred outcomes.
Garrison et al. (2000) noted that “community of inquiry” (p. 87) in the
VLE was necessary to provide the cognitive, social, and teaching supports
for learning to result. Examples and explanations in this chapter indicate
both success and disengagement related to the degree of a VLE’s syner-
getic connectivism. The chapter therefore infuses the social context into
the initial stage of mass enrolment into VLE during 2020 (Phase 1), a
stabilizing phase later in 2021 (Phase 2), and a permanent phase (Phase 3)
within the publicly funded elementary educational system in Ontario
in 2022. The shifts to the VLE are framed within these three phases
to understand and acknowledge the changing context of learning and
teacher competencies.

Definitions

In the chaotic start of the pandemic, terms such as distance learning,
e-learning, virtual classrooms, and remote learning were used interchange-
ably. For the purposes of this chapter, these terms are distinctly defined.
First, distance learning is an umbrella term that includes all forms of
non-in-person learning, including asynchronous digital learning, VLEs,
synchronous learning virtually, and mail-in pencil-and-paper learning.
E-learning refers to the digital platform of delivery: It does not require
real-time attendance of either the teacher or the student. Virtual
classrooms are the specific software used to hold the VLE; examples
include Brightspace (www.d2l.com/brightspace/), Desire2Learn
(www.d2l.com/), and Google Classroom (https://classroom.google.
com/). Finally, remote learning refers to a parallel virtual learning
environment that requires synchronous learning, replicating many of
the aspects of traditional brick-and-mortar classrooms. One partici-
pating administrator summarized an e-learning program as “where
they [teachers] give all the kids their assignments for the week and
they’re [teachers] just there to help,” in contrast to remote learning,
“where it’s similar to a traditional classroom, where the teacher slowly
scaffolds the learning.” In remote learning, the teacher bases assessment
on the triangulation of observation, products, and conversations. My
experience has shown that the ideal VLE is one that is delivered syn-
chronously, including tenets of universal design, differentiation, and
scaffolded learning.
Mishra (2019) studied how technology is used to deliver learning,
and their technological, pedagogical, and content knowledge (TPACK)
framework explored the interplay of those elements in the VLE. The
TPACK diagram (see Figure 16.1) shows the interconnected nature of
technology, pedagogical decisions, and content delivery decisions within
VLEs. What content an educator decides to teach impacts what tech-
nology they should employ and what pedagogical purposes it meets.
Each area is interconnected. Note that two elements, student voices and
culturally responsive and relevant teaching practices, are missing from
the framework.
Figure 16.1 The TPACK framework. From “Using the TPACK Image,” by M.
J. Koehler, 2011 (http://matt-koehler.com/tpack2/using-the-tpack-
image/). Copyright 2011 by tpack.org. Reproduced with permission.

Over the past decade, research at higher education institutions has
examined technological adoption in educational environments
(Appiah & Van Tonder, 2018; Schmid et al., 2014), but findings applicable
to the K–12 environment have been absent. With the onset of the pan-
demic, elementary and secondary researchers started to study adoption
of technology in their settings (Trust & Whalen, 2021). Investigations
linking academic achievement and digital assessment within VLEs are
now emerging (Caprara & Caprara, 2022; Shamir-Inbal & Blau, 2021;
Trust & Whalen, 2021).

Phase 1: VLE in 2020

In the beginning stages of mass enrolment in VLEs, teachers and
students started from similar places and similar realities. Teachers
asynchronously posted pencil-and-paper type work and marked
that work later. With cameras off and caregivers directly assisting
students, many educators felt uncomfortable ascertaining ownership
of submitted work, leading to pragmatic reflection on teaching and
assessment practices within the VLE. One administrator reflected,
“People were scrambling to find a digital tool, as opposed to the right
digital tool. Little thought about best practices, and pedagogically
sound decision-making was not at the forefront of their planning.”
Another remote learning administrator described assessment: “The
majority of the teachers were assessing students based upon products.
Fill out a worksheet, fill out the Google Form, or create a slide deck.
They did ‘matching’ types of worksheets digitally.” In those initial
stages, there was little conversation about what was good for students,
and one administrator described the types of assessment as “very
‘teacher-pay-teacher.’ ” However, as that administrator continued,
“Great teachers, and there were lots out there, those teachers who
are really exceptional, used conversations and observations with some
products [as assessment].”
This phase presented other challenges: inexperience with the learning
platforms, lack of technology, lack of internet access, lack of supervi-
sion, and phantom attendance (students online but not engaging). Many
students shared with me that the temptations of gaming and social media
during the school day made being mentally present in the VLE difficult
for them. One colleague suggested that the “loss of social learning where
you learn from your neighbour” was the greatest casualty during the ini-
tial stages of online learning. Each administrator shared their concern for
the mental health of all learners and educators during that initial stage of
the pandemic.
During Phase 1 of VLE adoption, educators were forced onto
digital platforms, highlighting gaps in their technological proficien-
cies (Schmitz et al., 2022). With efforts focused on learning how to
deliver online materials, many teachers concentrated on techno-
logical skill development first and pedagogical decisions second.
I saw that all teachers became first-year teachers in their pedagogical
instruction. As teachers settled into this new reality, the initial transi-
tion was over. VLE in Ontario became a sustained method of instruc-
tional delivery. During Phase 2, many families preferred to remain in
VLEs. They wanted the predictability of a VLE classroom for their
children.
Phase 2: Spring 2021 to Spring 2022

During this second phase, educators and students settled into the work
of learning again. The families that selected the VLE demanded a
higher quality education. School leaders engaged with staff to reflect on
the practices during Phase 1 and to consider what could be improved.
Kimmons et al. (2020) provided the PICRAT matrix (see Figure 16.2)
for this analysis.

Figure 16.2 The PICRAT model. Reproduced from “The PICRAT Model for
Technology Integration in Teacher Preparation” by R. Kimmons,
C. Graham, & R. West, 2020, Contemporary Issues in Technology
and Teacher Education, 20(1) (https://citejournal.org/volume-20/
issue-1-20/general/the-picrat-model-for-technology-integration-in-
teacher-preparation/). Copyright 2020 by the Society for Information
Technology and Teacher Education. Reproduced by permission of
Creative Commons Attribution 3.0 United States (CC BY 3.0 US).

320 Sharlene McHolm

The PICRAT matrix focuses on the student, requiring the educator to
reflect on their planning decisions, the purpose of individual tasks, and
which skills the students are developing to complete the
given task. As the educational system entered Phase 2, I and my three
colleagues used this matrix to discuss teaching practices that did or did
not support student learning.
The PICRAT matrix is divided along two axes: PIC, which describes
students’ interactions as passive, interactive, or creative; and RAT,
describing teachers’ use of technology for replacement, amplification, or
transformation. Kimmons et al. (2020) emphasized that teachers need to
carefully consider the effect that technology has on their practice while
considering what the students are doing with that technology. This model
acknowledges that a passive-replacement (PR) activity can have value
in the VLE, but that a creative-transformation (CT) activity will induce
higher-level thinking in students. Passive learning and direct instruction
are important components of every classroom, including the VLE, but
in the initial stage of distance learning, PR activities were being used
without reflection. During this second phase, I asked teachers to reflect
on their pedagogical decisions while emphasizing deliberate digital tool
selections. Classroom time was spent on culminating tasks involving cre-
ativity, amplification, or transformation, thereby eliciting learner growth.
With the appropriate triangulation of data (observation, conversation,
and product), deep learning was reached authentically.

Phase 3: The Stable Establishment of VLE in Ontario’s Publicly Funded Schools

Phase 3 of widely available, publicly funded VLE education in Ontario
came with many changes and stabilization factors. During this phase,
enrolments decreased from earlier phases but were less transient, with
more stable teacher and student cohorts. Remote learning schools tried
to distance themselves from the makeshift classrooms of Phase 1. One
administrator said, “People need to recognize that it’s not what emergency
learning online was. It’s not what distance learning was. . . . It’s an
excellent option for many families. Learning happens as authentically as
in an in-person classroom.” The flexibility of remote learning created a
place for many students with different types of needs: to keep their fam-
ilies safe, to keep themselves safe, to receive an education while receiving
mental health treatment, and so on. Vulnerable students drove remote
learning schools to find better resources and to implement research-based
assessment skills.
As teachers moved from being low on the substitution, augmentation,
modification, and redefinition (SAMR) framework to becoming more masterful
VLE educators, engagement increased, connections increased, and
teachers and students found their places in the VLE. In the next step of
Phase 3 stabilization, systematic evaluation of digital tools was under-
taken to ensure cyber security and to provide digital warehousing of
onboarding supports for new teachers. Many of the tasks in the VLE need
digital footprints to determine who is doing what, and change-tracking
tools gave insight into individual activities. Teachers began using keyword
searches (e.g., using hypothes.is; https://web.hypothes.is/) to quickly
see patterns and misconceptions among students.
Students in VLE had a combination of hands-on tools (manipulatives,
whiteboards, markers) to supplement digital tools for their learning. Tools
that allowed students to colour code important, concurring, and opposing
ideas, such as Diigo (www.diigo.com/) and Google Docs (https://docs.
google.com/), created debate amongst students. This interactive work
with peers in the VLE was critical for engagement and created summaries
for teachers to use as formative assessment.
The set-up and philosophy of the VLE had significant implications for
the types of assessment practices that educators used effectively. In Phase
3, educators also began to purposefully include gamification tools in their
classrooms. AI-adapting programs such as Dreambox (www.dreambox.
com/), Blooket (www.blooket.com/), Quizlet (https://quizlet.com/),
Kahoot (https://kahoot.com/), GoNoodle (www.gonoodle.com/), and
Lexia (www.lexialearning.com/) were used for formative assessment.
With each passing day, more digital game-based learning came online,
but structures and processes to determine their fit in the VLE needed
attention. Tsekleves et al. (2016) measured educational games using nine
criteria: (a) game and curriculum alignment, (b) supporting assessment,
(c) providing enhanced learning opportunities for students, (d) rather
than just a source of “fun,” (e) transferability of learning to real-life situ-
ations, (f ) technical challenges associated with the design and develop-
ment of serious games, (g) game platform maintenance and support, (h)
production costs associated with the game development, and (i) promo-
tion and distribution of games (p. 168).
As remote learning schools grappled with limited resources and pur-
chasing decisions, educators began to critically determine if the tools
selected matched the intended purposes. Only in Phase 3 did the staff
know enough about why they were selecting a particular digital tool
and how they would use it. Having district-approved tools ensured that
teachers had pragmatic choices to advance learning.
The VLE has great options for diagnostics. In Phase 3, I saw
conversations, observations, video conference platform polls, and the use
of shared brainstorming tools, including Jamboard (https://jamboard.
google.com/), Prezi (https://prezi.com/), and Kidspiration Maps (www.
diagrammingapps.com/kidspirationmaps) to invite conversation and
collaboration. I also saw how students used the applications to solidify
their thinking, making their metacognition visible.
The power of the VLE is its potential to differentiate for the various needs
in a classroom. One area I pondered was why universal design was not
happening to the degree that I would have hoped: In this third phase,
I thought that it would be automatic. One administrator remarked,
“The majority of teachers, about 85%, still did not differentiate in the
fall of 2022. They had their slide decks and they just continued.” But
through strategic scaffolding of professional development, strides were
made across schools. Early adopters were paired with each other, and
more resistant staff were paired with moderate-to-adopt individuals who
shared personality traits.
Although still early in this work, all three administrators using this
approach noted positive gains. One stated, “In classrooms, I am seeing
differences in practices. I can talk with teachers, and they can articulate
why they are doing something in a particular way.” Another adminis-
trator shared, “I am proud of the progress they are making. We are doing
things like moderated marking to create benchmarks for VLE teachers.”
This practice, although simple, was beginning to show promise. A third
administrator said, “We are beginning to see a change and our student
survey supports our thinking that student voice and differentiation makes
all the difference for engagement.” As in all schools, some people are nat-
ural leaders and some need support to move towards better pedagogical
and assessment practices.

Supporting Exceptional Students in the VLE

In the past, educators would have described an idealized VLE learner
profile as someone having strong independent learning skills, high lit-
eracy rates (both decoding and encoding), and a capacity to stay focused
for long periods of time. The ideal VLE student would also have sufficient internet
bandwidth, skills using technology, and self-motivation. In the earliest
stages of province-wide distance learning, these parameters were true,
but as my colleagues and I took the time to develop the structures,
improve teachers’ technological knowledge, and support them in peda-
gogically sound practices in the VLE, I saw that our assumptions were
flawed.
Many students with special education needs benefitted academically
from VLEs, particularly during Phases 2 and 3. This, at first glance, seems
illogical, as providing hands-on assessments was found to be more diffi-
cult. One administrator noted that teachers felt it was “difficult to triangu-
late data, as there were too many students to speak to one on one,” but in
that circumstance, it was evident that tailored professional development
was required. When asked whether these academic improvements were
related to the learning platform or a family member redirecting the stu-
dent to stay on task, an administrator responded, “It worked really well
for the ones whose parents could do that one-to-one.” With sensitivities,
distractions, and the mental taxation of unstructured social landscapes
removed, students with behavioural needs could focus their energies
solely on the learning happening in the classroom.
Readily available instructional tools assisted multilingual and
neurodivergent learners. Closed captioning with real-time translation
and transcripts (for hearing-impaired students and those with verbal
processing disorders) was an effective accommodation. The use of pacing
options available through flipped classroom methods, slowing recordings
down when viewing, screen readers, speech-to-text plugins, online
summarizers, repetitious viewing, and noise-cancelling headphones
all assisted students with learning differences. Digitally chunking
assignments helped those with focus and attentional issues. Teacher cre-
ativity shone through different information delivery approaches while
keeping student interest high (Hollingshead & Carr-Chellman, 2019).
Picciano (2021) reinforced the importance of student engagement
within the VLE, and student-centred collaborative projects with process-
oriented guided inquiry naturally included students with disabilities in
the learning.
Mental health issues such as anxiety were also mitigated through the
VLE. One administrator suggested that

VLE for kids with anxiety was actually really, really good for a lot of
them because they didn’t have to face the daily going to school. . . .
In general, kids with anxiety were thriving because they knew what
to expect, [and those anxiety provoking situations were] removed.

All the additional social situations, such as facing bullies or being in large
groups, were controlled for, allowing students to focus on their learning.
The four of us involved in this inquiry agreed that the reduction in stress
allowed many students to attend more consistently and to receive their
necessary accommodations in an almost invisible way.
One remarkable finding was that many of the students with individual
education plans (IEPs)—that is, those who needed formal extra supports—
could be served in the VLE without needing those IEPs. One adminis-
trator found that 46.7% of their IEPs were based on accommodations
used by all students in the VLE. The administrator recalled the excite-
ment of families when their child could come off an IEP. One remote
learning school pledged to try using the VLE without IEPs for one year,
and “every single parent agreed to have their child taken off their IEPs,”
according to the school’s principal. The empowerment of some students
previously thought of as disadvantaged has been amazing to see.
One of the greatest strengths the VLE offers is confidentiality. Students
with special learning requirements can access it seamlessly alongside
their peers and receive
real-time accommodations. Simple tools like voice to text, real-time lan-
guage translation, viewing videos at a slower speed, or viewing videos
repeatedly can be used with complete discretion. Products, success cri-
teria, and modified programs can be provided without classroom peers
seeing accommodations or students being withdrawn to work with a spe-
cial education or resource teacher. This is, by far, the greatest outcome of
the VLE for students with special needs.

Weaknesses of VLE

The biggest problem with the VLE during Phases 1 and 2 was related to its
context. In short, people were selecting the VLE for its safety, rather than
for its educational power. The wrong people flocked to the platform ini-
tially. The transitions back and forth between the learning environments
of Phases 1 and 2 resulted in low commitment to building competency
and applying tools judiciously. Engagement (of both staff and students)
also varied. In Phase 3, a stabilization occurred, which addressed many of
these weaknesses.
In the initial periods of the pandemic, some students were enrolled
for non-educational reasons. One administrator noted varying degrees
of engagement in the VLE learning during the initial stages of the
pandemic. Parents of young children worried about the amount
of time being spent on technology, and one shared with an admin-
istrator, “This is too much. I can’t do this every day.” The process
of tracking and reengaging students was challenging. One admin-
istrator shared, “[Students] would just show up enough to stay on
the register.” Students in junior, intermediate, and senior divisions
often had their cameras off while online. They were logged in but not
engaged because nobody was refocusing them. In those cases, success
in learning was not possible. Some struggling teachers simply did not
know how to hook students into their classes or how to engage them.
During the stabilization period of Phase 3, more students knew what
to expect than in earlier phases. More teachers knew how to leverage
technology to enhance engagement. A higher proportion of students
who wanted to learn through this modality were the ones enrolled. In
this phase, the number of students enrolled in VLEs was small, representing between
0.5% and 1% of student populations in districts that our administrator
team were in contact with. In Phase 3, one administrator estimated
that only 2% of students in the VLE kept their cameras off. This clue to
engagement relates to a better fit in the VLE, better teaching, and better
assessment practices.

Discussion and Call to Action

The original concerns that the VLE was isolating or unavailable to
students with diverse needs proved unfounded. Although remote learning is
not the best learning environment for all students, for some it is superior
to in-person options. I found that some teachers in VLEs do not provide
the differentiation that would make learning accessible to all students.
They fail to take advantage of the supports offered within the technology.
One administrator said, “Some teachers made the mistake of thinking
that a VLE was aligned with the linear work of a computer—being the
sage on the stage. They missed the power of flexible private groupings
and breakout rooms.”
By 2023, remote learning VLEs had evolved to a stable, functioning
method of teaching. The onboarding process of teachers included
ready-built training to set teaching expectations, and training videos
allowed for consistent messaging. Then with the baseline of expectations
established, personal support from administrators to teachers was more
individualized than in earlier phases. The VLE administrators all agreed
that individual supports assisted remote learning teachers to adopt
stronger pedagogical and assessment practices.

Implications for Future Educators

The VLE represents a vast opportunity for both educators and students.
As the world shifts to more personalized options in many facets of life,
the VLE is an expression of this trend. It is not a one-size-fits-all type of
educational system (Sugmawati & Winarni, 2023). It allows learners who
need repetition to have it. It facilitates accommodations seamlessly and
invisibly, leveraging technological functionality for privacy (Chen et al.,
2023). It has the potential to move hundreds of students off IEPs because
their needs are met through the modality. As communities look to the
future of work, with increasing virtual working environments, students
who have been learning in a VLE may have advantages.
VLEs have challenges: Teachers must be precise with lesson planning,
success criteria, and assessment tools (Means et al., 2014). Teachers in
the VLE have fewer opportunities to organically hear misconceptions
than they do when teaching in person. Interactions in the VLE need to
be orchestrated to maximize purposeful student-student interactions
(Hodges et al., 2020). The VLE requires educators to be explicit about
what they are teaching and reflective as to why they are teaching it in
a particular way (Mukhtar et al., 2020). Assessment within the VLE
holds the same challenges and possibilities as in a brick-and-mortar
school.
Just as it is critical for a teacher to have strong mathematical skills
or an understanding of literacy development, a teacher in the VLE needs
specific skills in the precision of their activities (Hodges et al.,
2020; Mukhtar et al., 2020). In the VLE, educators can track and reflect
on what works and what could be improved. The permanency of digital
resources, recordings, and multimedia creates accountability that is not
available in the in-person classroom. Anyone can see the activities and
content and review them: Although differentiation can be invisible, all
the educators’ decisions are more visible than in the brick-and-mortar
school.

Conclusion

Throughout the pandemic, Ontario, like all provinces and territories, was
forced to invest in the infrastructure, technological skills, and educator
capacity needed to shift VLE from a temporary educational necessity to a
powerful learning environment. Reflection on shifts in the VLE landscape
has been uplifting for me. I can see the possibilities within the virtual
space. The VLE can be a rich and ideal place for students to
attend school. The evolution of the platforms used in virtual classrooms
and the educators leading them accelerated exponentially in the pan-
demic. This tragic impetus does have its silver lining. Now, a space that
traditionally excluded neurodiverse, medically fragile, and marginalized
students is acting as a haven for some. It is not the space for everyone, but
with great attention to building relationships, critically thinking about
pedagogy, and implementing strong assessment practices, the VLE has
become more than a viable option for many children.

References

Ally, M. (2008). Foundations of educational theory for online learning. In T.
Anderson (Ed.), The theory and practice of online learning (2nd ed., pp. 15–44).
Athabasca University Press. www.aupress.ca/books/120146/ebook/01_
Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf
Appiah, M., & van Tonder, F. (2018). E-assessment in higher education: A review.
International Journal of Business Management & Economic Research, 9(6), 1454–
1460. www.ijbmer.com/docs/volumes/vol9issue6/ijbmer2018090601.pdf
Caldwell, B. J. (2020). Leadership of special schools on the other side. International
Studies in Educational Administration (Commonwealth Council for Educational
Administration & Management (CCEAM)), 48(1), 11–16.
Caprara, L., & Caprara, C. (2022). Effects of virtual learning environments:
A scoping review of literature. Education & Information Technologies, 27(3),
3683–3722. https://doi.org/10.1007/s10639-021-10768-w
Chen, H., Evans, D., & Luu, B. (2023). Moving towards inclusive education:
Secondary school teacher attitudes towards universal design for learning in
Australia. Australasian Journal of Special and Inclusive Education, 1–13. https://
doi.org/10.1017/jsi.2023.1
Clandinin, D. J. (2013). Engaging in narrative inquiry. Routledge.
Clandinin, D. J., & Connelly, M. (2004). Knowledge, narrative and self-study. In J.
J. Loughran, M. L. Hamilton, V. K. LaBoskey, & T. Russell (Eds.), International
handbook of self-study of teaching and teacher education practices (pp. 575–600).
Springer. https://doi.org/10.1007/978-1-4020-6545-3_15
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-
based environment: Computer conferencing in higher education. The
Internet and Higher Education, 2(2–3), 87–105.
Graham, C. R., Henrie, C. R., & Gibbons, A. S. (2013). Developing models and
theory for blended learning research. In A. G. Picciano, C. D. Dziuban, & C. R.
Graham (Eds.), Blended learning: Research perspectives, Volume 2. Routledge. www.
taylorfrancis.com/chapters/edit/10.4324/9781315880310-3/developing-models-
theory-blended-learning-research-charles-graham-curtis-henrie-andrew-gibbons
Hantoobi, S., Wahdan, A., Salloum, S. A., & Shaalan, K. (2021). Integration of
knowledge management in a virtual learning environment: A systematic
review. In M. Al-Emran & K. Shaalan (Eds.), Recent advances in technology
acceptance models and theories (pp. 247–272). Springer Nature. https://link.
springer.com/chapter/10.1007/978-3-030-64987-6_15
Harris, A., & Jones, M. (2020). COVID-19–School leadership in disruptive times.
School Leadership & Management, 40(4), 243–247. https://doi.org/10.1080/13
632434.2020.1811479
Hodges, C. B., Moore, S., Lockee, B. B., Trust, T., & Bond, M. A. (2020). The diffe-
rence between emergency remote teaching and online learning. Virginia Tech School
of Education Publishing. https://vtechworks.lib.vt.edu/handle/10919/104648
Hollingshead, A., & Carr-Chellman, D. (2019). Engaging learners in online envir-
onments utilizing universal design for learning principles. eLearn. https://
elearnmag.acm.org/archive.cfm?aid=3310383#
Kimmons, R., Graham, C., & West, R. (2020). The PICRAT model for technology
integration in teacher preparation. Contemporary Issues in Technology and Teacher
Education, 20(1). https://citejournal.org/volume-20/issue-1-20/general/
the-picrat-model-for-technology-integration-in-teacher-preparation/
Kozma, R. B. (2001). Counterpoint theory of “learning with media.” In R. E.
Clark (Ed.), Learning from Media: Arguments, Analysis, and Evidence (pp. 137–
178). Information Age Publishing.
Masta, S. (2018). Settler colonial legacies: Indigenous student reflections on
K–12 social studies curriculum. Intersections: Critical Issues in Education, 2(2), 4.
https://digitalrepository.unm.edu/intersections/vol2/iss2/4/
Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells
us about whether, when and how. Routledge. www.routledge.com/Learning-
Online-What-Research-Tells-Us-About-Whether-When-and-How-1st/
Means-Bakia-Murphy/p/book/9780415630290
Mishra, P. (2019). Considering contextual knowledge: The TPACK diagram
gets an upgrade. Journal of Digital Learning in Teacher Education, 35(2), 76–78.
https://doi.org/10.1080/21532974.2019.1588611
Mukhtar, K., Javed, K., Arooj, M., & Sethi, A. (2020). Advantages, limitations
and recommendations for online learning during COVID-19 pandemic.
Pakistan Journal of Medical Sciences, 36(COVID19-S4), S27–S31. https://doi.
org/10.12669/pjms.36.covid19-s4.2785
Picciano, A. G. (2017). Theories and frameworks for online education: Seeking an
integrated model. Online Learning, 21(3), 166–190. https://doi.org/10.24059/
olj.v21i3.1225
Picciano, A. G. (2021). Seeking an integrated model. In C. Fuentes (Ed.), A
guide to administering distance learning (pp. 79–103). Brill. https://doi.
org/10.1163/9789004471382_005
Porter, S. G., Greene, K., & Esposito, M. C. (2021). Access and inclusion of
students with disabilities in virtual learning environments: Implications for
post-pandemic teaching. International Journal of Multicultural Education, 23(3),
43–61. https://doi.org/10.18251/ijme.v23i3.3011
Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes,
M. A., Wade, C. A., & Woods, J. (2014). The effects of technology use in
postsecondary education: A meta-analysis of classroom applications. Computers &
Education, 72, 271–291. https://doi.org/10.1016/j.compedu.2013.11.002
Schmitz, M.-L., Antonietti, C., Cattaneo, A., Gonon, P., & Petko, D.
(2022). When barriers are not an issue: Tracing the relationship between
hindering factors and technology use in secondary schools across Europe.
Computers & Education, 179, Article 104411. https://doi.org/10.1016/j.
compedu.2021.104411
Shamir-Inbal, T., & Blau, I. (2021). Facilitating emergency remote K–12 teaching
in computing-enhanced virtual learning environments during COVID-19
pandemic—Blessing or curse? Journal of Educational Computing Research, 59(7),
1243–1271. https://doi.org/10.1177/0735633121992781
Sugmawati, D., & Winarni, R. (2023). The problems of using online-based indi-
vidual learning programs during the COVID-19 pandemic. SCIENTIA: Social
Sciences & Humanities, 2(1), 337–342. https://doi.org/10.51773/sssh.v2i1.171
Trust, T., & Whalen, J. (2021). K–12 teachers’ experiences and challenges with
using technology for emergency remote teaching during the COVID-19 pan-
demic. Italian Journal of Educational Technology, 29(2), 10–25. https://doi.
org/10.17471/2499-4324/1192
Tsekleves, E., Cosmas, J., & Aggoun, A. (2016). Benefits, barriers and guideline
recommendations for the implementation of serious games in education for
stakeholders and policymakers. British Journal of Educational Technology, 47(1),
164–183. https://doi.org/10.1111/bjet.12223
17 Ubuntu at a Distance
Online Assessment for Care, Justice, and Community
Sarah Elizabeth Barrett

In early 2020, when schools adopted emergency remote teaching (ERT)
due to the COVID-19 pandemic and teachers struggled to adjust their face-
to-face teaching practices to remote or online environments, teachers’
ability to assess students’ academic progress and social and psychological
well-being was negatively affected (Bozkurt et al., 2020). Using the sub-
Saharan ethic of ubuntu as a conceptual lens, described in detail below, this
chapter focuses on the experiences of 50 kindergarten to Grade 12 (K–12)
teachers interviewed in the summer of 2020. Specifically, I explore what
the teachers’ experiences indicate about the nature of online assessment
and how these experiences could inform teacher education in the future.
During ERT, students did not necessarily have equipment, devices,
or home circumstances conducive to effectively participating in online
learning, and many people, including teachers, lived in communities
where the internet was unreliable or nonexistent (Barrett, 2021). Teachers
did not have adequate time to plan for the new platforms, resulting in the
use of a narrower group of pedagogical tools than might otherwise be
present in online learning (Tartavulea et al., 2020). In Ontario, Canada
(the jurisdiction in which this study was done), as in many regions, rather
than focusing on academic progress, the priority in ERT was for teachers
to support their students emotionally (Bozkurt et al., 2020). However,
teachers in Ontario were still expected to assess their students’ academic
progress and assign grades. This chapter provides an analysis of how
teachers experienced assessing their students during ERT.

DOI: 10.4324/9781003347972-21

Conceptions of Assessment

Assessment is typically defined as any act a teacher takes to determine a
student’s level of understanding, knowledge, or skill (Unal & Unal, 2019).
In Canada, provincial policies mandate that assessment should be both
accurate and equitable (e.g., British Columbia Ministry of Education and
Training, 2022; Nunavut Department of Education, 2008; Ontario Ministry
of Education, 2010). To be accurate, assessments must give students
multiple and varied opportunities to demonstrate their knowledge,
understanding, and skills (British Columbia Ministry of Education and
Training, 2022; Nunavut Department of Education, 2008; Ontario Ministry
of Education, 2010). For assessments to be equitable, teachers must rec-
ognize students’ circumstances and accommodate their needs, constantly
adjusting the pace of and approach to lessons (Nunavut Department of
Education, 2008). In Ontario, the policy for assessment in K–12 is outlined
in the document Growing Success (Ontario Ministry of Education, 2010),
which mandates that teachers use procedures that “are fair, transparent,
and equitable for all students [and] support all students, including those
with special education needs” (Ontario Ministry of Education, 2010, p. 6).
In this chapter, however, I expand on this notion of assessment beyond
being equitable and accurate to explicitly include well-being. Ideally, as part
of their custodial function, K–12 teachers are caring adults in students’ lives
who are constantly assessing the well-being of their students (Gay, 2021).
In other words, I define assessment as any act a teacher takes to determine
a student’s level of knowledge, understanding, skill, or well-being.
Moorhouse and Tiet (2021) described how many teachers struggled
with providing a satisfactory level of care to their students during ERT
and how teachers were feeling disconnected from their students—not
knowing if their attempts to create a caring community online were
effective. Moorhouse and Tiet, however, did not connect a pedagogy of
care to assessment per se. Indeed, in my review of the literature, I saw
almost no reference to assessment as a component of caring for students
and ascertaining their well-being. And yet, I would argue that the well-
being of students is an important aspect of assessment practices.

Ubuntu as a Conceptual Lens

The starting point for this study was not theory but experience (Lindsay-
Dennis, 2015; Rose & Adams, 2014). I needed a way to make sense of what
teachers said about their experiences. I sought a theory that could encom-
pass the three aspects of assessment described earlier: accuracy, equity,
and well-being. To that end, I used ubuntu, an Indigenous sub-Saharan
African ethical framework, to examine the data. Known by various names
in African languages (ubuntu being the Xhosa term), ubuntu does not
map neatly onto any one of the four commonly known Western ethical
frameworks of deontology, virtue ethics, utilitarianism, or care ethics (Bonn, 2007) but
rather represents a conception of ethical character and conduct rooted
in community. Ubuntu is so ingrained in sub-Saharan African cultures
that it can be difficult to provide a clear definition (Bonn, 2007), but it
boils down to the idea that “a person depends on others to be a person”
(Bonn, 2007, p. 863) or “I am because we are” (Brock-Utne, 2016, p. 31).
Specifically, ubuntu espouses a balance between three domains of ethical
action: justice, care, and community (Letseka, 2012; Metz & Gaie, 2010).
I use ubuntu as a conceptual lens with the recognition that it is not a
monolithic and essentialist feature of sub-Saharan African perspectives
(Waghid & Smeyers, 2012). Rather, ubuntu is a constellation of perspectives
revolving around community and human dignity, which, through my own
African and Caribbean roots, forms the basis of the ethical framework
I brought to this research. Similar to other scholars (Letseka, 2012), I do
not argue here that ubuntu is the only ethical framework that espouses
a balance between justice, caring, and community norms (Metz & Gaie,
2010), but rather that it is the conceptual lens through which I undertook
the analysis of the data from this study.

Context

In Ontario, school buildings closed on March 12, 2020, and ERT began
on April 6, 2020, with most school districts opting for online platforms.
However, there were delays getting devices to students who did not have
them, and good quality internet access was either unavailable or finan-
cially unattainable for many. School boards did send cellular internet
hubs out to students, but this process delayed the participation of many
students who were often already marginalized (Barrett, 2021). Further,
the emotional load students were carrying during the pandemic was
exacerbated by disrupted routines and their separation from friends
(Bozkurt et al., 2020).
In response to these extraordinary circumstances, the Ontario Ministry
of Education mandated that, although teachers were still expected to
Ubuntu at a Distance 333

assess and assign grades to students’ work, final grades should not be
negatively affected by ERT (Lecce, 2020). All Ontario school boards took
this instruction to mean that grades could not drop below where they
stood when the school closures began. This policy protected students
whose circumstances prevented them from participating fully in ERT
from immediate academic penalty; however, it also meant that any stu-
dent who was satisfied with the grades they had earned up until March 12
could opt out. Many older students chose to work instead of attending
school and, as the school closures continued into the spring, many fam-
ilies chose to opt out, as well (Barrett, 2021).

Methods

Conducted in the late spring and summer of 2020, this study involved
a survey and in-depth interviews. Participants were Ontario K–12 class-
room teachers who had been teaching in person and had had to switch
to ERT. Although they ranged in experience from novice to semi-retired,
the majority had more than 10 years’ experience teaching, with an almost
even split between elementary and secondary teachers. Three hundred
of the 762 survey participants volunteered to be interviewed, and the
50 interviewees were purposefully sampled (Creswell, 2013) to approxi-
mately match the demographics of the survey participants (age, years
of experience, elementary or secondary panel, gender) and provide var-
iety in geographic location (north; southwest; east; rural, urban, or sub-
urban; and Greater Toronto and Hamilton Areas). The one-hour (on
average) interviews conducted by videoconference included questions
about how the ERT affected their ability to assess students’ academic pro-
gress and their relationships with students. I examined the full transcripts
of interviews for themes through a process of decontextualization and
recontextualization (Tesch, 1990). Decontextualization involved open
coding of the transcribed interviews and collapsing the codes into larger
themes. Recontextualization involved analyzing these larger themes
through an ubuntu lens and relating the themes to the three aspects of
ubuntu: justice, caring, and community. I related assessing students’ aca-
demic needs to justice, accurate and equitable assessment to community,
and assessment of well-being to caring.
The qualitative analysis reported in this chapter focuses on interview
data and was guided by the following questions: How did teachers experi-
ence assessment during ERT? What can these experiences tell educators
about how best to approach assessment in online environments? Validity
was established through the large number of participants and member
checking: Towards the end of each interview, I shared initial impressions
with interviewees, and the research report was shared with participants
for feedback before its release.

Findings

Teachers reported a hampered ability to respond to students’ academic
needs; expressed concerns about equity and accuracy in evaluation of aca-
demic skills, knowledge, and understanding; and struggled with monitoring
students’ well-being. All names used in this chapter are pseudonyms.

Difficulties Responding to Students’ Academic Needs

The participants in this study were concerned with the extent to which
they could respond to and account for students’ academic needs, noting
that, in normal circumstances, a lot of the assessment they did was in
the moment, with the purpose of constantly adjusting their teaching to
suit the class (e.g., changing the pace of instruction, adding clarification).
These adjustments were challenging to make in online platforms. Once
ERT was underway, as teachers attempted to engage their students and
assess the success of the lesson or how to pace instruction, they were
faced with what many described as a brick wall. For example, Stacey, a
high school social sciences teacher, said,

I don’t know what the kids are thinking. I can’t look at their faces and
say, “OK, you’re confused right now.” Because I know I can recog-
nize those faces . . . when kids either are drifting off because they’re
bored, or they’re confused, or they just don’t know what’s going on.

This feedback was integral to Stacey’s teaching; therefore, the absence of
it in the moment led to students being left behind. Lexie, a middle school
French teacher, said,

A lot of my students did not want to have their video or microphone
on, which I completely understand, . . . [but it] became hard
to tell if students were actually on the call or if they logged in and
then were doing something in another window or walked away
from the computer.

Without the information provided by body language, student engagement
was a major concern for teachers in this study. They might have
known students needed something different, but because of the brick
wall of ERT, teachers did not have the information they needed to make
the right adjustments. All reported precipitous drops in the number of
students logging in as the term progressed, but they did not have enough
information to determine the cause. For example, Blake, a high school
special education resource teacher, noted that low participation rates
became a main topic of conversation with his colleagues as, over time,
one by one, students disappeared from their online classrooms:

[I was] venting to other staff members, [saying], “Am I the only one
in this position? Am I doing something wrong? . . . Am I doing too
much? Am I inadvertently pushing my students away?” There was a
lot of questions about it. “Are they not engaging because of me? Or
are they not engaging because that’s just their choice in general?”
And there was always that question, “Am I doing enough?”

Such questions were impossible for teachers to answer. Further, they
could not know what gaps they needed to fill or account for in students’
home circumstances. Kurt, a K–5 teacher, noted,

I think some students went to their parents for clarifications and
had parents who kind of filled the teacher’s role. Some, you could
tell they were getting coaching and editing and ideas from parents.
Some, you could tell, were getting none of that.

What Kurt described is a recognition that when teaching in person,
teachers could provide equitable resources and supports to all students,
but in ERT they could not know the students’ circumstances and there-
fore could not easily compensate or account for students’ academic needs.

Concerns About Accuracy and Equity

The participants in this study aimed to enact equitable and accurate
assessment practices, but doing so was challenging given the combination
of the grading mandate, poor attendance, and the timing of school
closures. Further, many participants noted that plagiarism was ram-
pant. One high school French teacher stated that plagiarism severely
undermined the integrity of the grades he would assign because he
believed that, in ERT, there was no reliable way to ascertain student
authorship:

Losing [academic integrity] makes you feel like you’re losing every-
thing. Why should we be assessing and giving marks or grades to
the student? We could, for example, be saying, “OK, I’ll give you
a participation mark, if you participate.” But assessing or giving
marks to tests and quizzes, we don’t feel is right. [Under normal
circumstances], teachers are in control of tests and quizzes.
We know how to conduct a quiz or test in class. We take all the
precautions. . . . We want to have certain integrity in the assessment
process. So, [right now, with ERT], we are not in control of that. So,
we don’t feel it’s right. (Frank)

Frank expressed a feeling that his professional integrity was compromised
by a situation which required him to assign grades when he was certain
students were cheating but was not in a position to prove it. In normal
circumstances, cheating online is not necessarily more prevalent than
it is face-to-face (Peled et al., 2019). During the pandemic, plagiarism
did increase in some jurisdictions, but it remained less prevalent than
teachers believed it to be (Yazici et al., 2022). In this study, teachers
expressed a feeling of not being in control of academic integrity because
of the brick wall mentioned earlier.
The brick wall and the teachers’ inability to see body language got
in the way of determining students’ levels of knowledge, skills, and
understanding. Participants stated that, in person, they endeavoured
to get to know students well enough to evaluate them accurately and
equitably. David, a middle school teacher, had prided himself on these
perceptions:

In a classroom, I can tell. “OK, that kid’s having a bad day. I’ll chat
with them quietly. We’ll put this [class work] aside for now.” [With
ERT], I couldn’t get that personal feel. I couldn’t get that personal
touch. And so [I would ask myself], “Was this child really produ-
cing D work or is this child having some struggles?” “Is this student
giving me C work when normally they produce, at the very least, B?
Or is this that they’re demoralized, and they just don’t care?” As in,
they don’t see the point in all this right now.

Thus, David wanted to ascertain the meaning of a student’s performance
in order to assess it accurately and also to determine what that student
needed. However, it is more than the teachers wanting to assess equitably.
It is also the students wanting to show their best work. As Mary, an elem-
entary teacher, put it,

[During ERT, students] can’t show you whatever they wanted to
show you. They can’t grab me by the hand and pull me over. And
so, they don’t engage in the same way. . . . [Kids are] going to play
and be interested with sharing with people who are around them.
I think that the big one is that nonverbal body language, that prox-
imity. They pick up on that so strongly, and you just can’t do that
online.

Mary’s statement is a direct example of assessment being dialogical and
dependent on building relationships. Students want teachers to see their
best work and may be the ones to initiate an instance of assessment.
Further, if criteria for success are explicitly provided by the teacher or
developed within the classroom community, then students know when
they have reached them. Marco had used this strategy in his high school
mathematics classes:

[The students and I] always create a rubric for observations and
conversations. . . . It’s not an easy thing to do, but we do it
collectively. And during the course of the course, we are often
gathering evidence [through] the observations that I make for
the work or conversations we have with them in addition to
assignments.

An approach like Marco’s helps students to reflect on their own learning
and serves to build community because it creates conversations around
a shared purpose within the course (Rovai, 2002). Unfortunately, Marco
was not able to continue this ongoing conversation online. With spor-
adic attendance and lack of participation in synchronous classes, students
were not engaged enough to do so.
Concerns About Students’ Well-Being

I have discussed how teachers wanted to understand students’ mental
state in order to be equitable and accurate with assigning grades but,
here, I want to discuss assessing student well-being for its own sake. As
mentioned earlier, due to the extraordinary circumstances of the pan-
demic, care was prioritized for students, from kindergarten through
to higher education (Moorhouse & Tiet, 2021). In the case of K–12 in
Ontario, prioritizing students’ mental health over academic achievement
was an explicit policy (Lecce, 2020). None of the teachers interviewed
disagreed with this emphasis but all struggled with implementing it. As
Lexie, a middle school French teacher, said,

It was so hard to be away from them and know that they were having
their own struggles that we couldn’t really deal with, or having the
students that wouldn’t show up for anything, and you didn’t really
know what was going on.

Lexie described the disruption of established relationships through the
sudden shift to ERT. This feeling of detachment from students, where
teachers were unable to provide the care they could have in person,
was a dominant theme in interviews. Participants highlighted the cus-
todial nature of K–12 teaching, where, as noted earlier, educators are
caring adults in students’ lives. Thus, the brick wall that many of the
participants described made it difficult to assess not only students’
academic progress but also their well-being. Kate, an English teacher
in a private school, said, “The sad thing is that you just don’t know
what’s happening with those families and in those houses and you just
hope for the best—that they’re doing all right [and] that they’re coping
with it.”
This desire to care for students was not confined to the elementary
level. High school teachers were equally worried about their students.
The most vulnerable students tended to be the hardest to reach; there-
fore, their well-being was most threatened by ERT. Indeed, teachers who
worked primarily with vulnerable students were the most concerned
(Barrett, 2021) because programs specifically designed to be flex-
ible enough to accommodate the needs of vulnerable students were
dependent on teachers being able to assess those needs on a day-to-day,
moment-to-moment basis. Willis, a teacher in a program designed for
high school students with chronic absenteeism, expressed frustration
about his inability to meet his students’ emotional needs once ERT began:

There’s no way that I can do what I do or achieve the results that
I’ve achieved with the digital format. . . . The connection is severed
for many of them completely, and for others we can’t engage in the
same way. I can’t read their facial expressions as well because we’re
on the phone. They’re not doing video chats. . . . They’re not having
peer to peer conversations that I can overhear, [and] say, “Hey,
I notice you’re talking to so‐and‐so about this. . . . Let’s see if we can
figure some of this out because it sounds like you’re struggling.”
That doesn’t happen.

Here, body language was referenced again, being used not just to gauge
the pace and effectiveness of a lesson or to ascertain if a student was
performing at their optimal level academically, but also to determine the
student’s psychological and social well-being for its own sake. Further,
Willis referred to ongoing assessment of students’ social interactions and
how they indicated well-being and areas of concern.

Discussion: Expanding Conceptions of Assessment

The teachers in this study expressed profound unease with the quality of
assessment that they were able to enact during ERT. They faced difficulty
ensuring that their assessments in the moment were sufficient to gauge
student needs, that their grading was accurate (i.e., reflective of students’
actual knowledge, skills, and understanding) and equitable (students’ aca-
demic needs were being met), and that their determination of student
well-being met their standards of care. These experiences are mirrored in
the literature (Moorhouse & Tiet, 2021).
In my analysis, I draw parallels between the communitarian ethos of
ubuntu and the participants’ assessment experiences. Assessing students’
academic needs is related to justice; accurate and equitable assessment is
related to community; and assessment of well-being is related to caring.
For example, whether teachers are in a position to respond to students’
academic needs and adjust accordingly is a question of justice, where
justice is defined as everyone having what they need to thrive. Ubuntu
states that justice is a communal and positive endeavour (Viriri & Makaye,
2020). Justice is not merely a concept that applies to transgressions and
punishments within a society. It is a reflection of the ethical structure of
a social group (Le Roux, 2000). On a day-to-day basis, it represents an
agreement between individuals on how they will treat each other so that
every individual is included and given the means to contribute. Thus, from
an ubuntu perspective, the assessment situation during ERT was unjust
on several levels. First, the most vulnerable students were the most likely
to stop attending or trying (Barrett, 2021), and their vulnerability fed into
marginalization within the classroom community. Because the teachers
could not reach these students once they stepped away, the teachers could
not adjust to help them. Participants stated that numerous phone calls
and emails to students and their parents went unanswered. It was also
unjust for the teachers, whose raison d’être was severely undermined by
the sudden separation from their students and lack of ability to assess
their students’ needs and adjust to them accordingly. Through their sep-
aration from the classroom community, neither teachers nor students
could coordinate their efforts in the group project of learning together.
Indeed, the declaration “I feel like I can’t do my job” was repeated by
many of the participating teachers.
With respect to accurate and equitable assessment, the teachers iden-
tified academic integrity as a major issue. From an ubuntu perspective,
academic integrity is more than a reflection of students’ good character;
it is fundamental to community cohesion. Viriri and Makaye (2020)
described ubuntu as a solution to widespread cheating in national exams
in Zimbabwe and Nigeria. They noted that explicit teaching of ubuntu
values would teach students the deleterious effects of cheating on them-
selves and the community: When students see colleagues getting grades
they had not equitably earned or engage in plagiarism themselves, it
corrodes the classroom community (Chaminuka & Ndudzo, 2014).
When students and teachers are unable to communicate on an ongoing
basis about the learning that is occurring, relationships are strained, and
when the classroom community is no longer able to discuss shared goals,
the community begins to drift and lose its centre (Rovai, 2002).
Finally, participants had difficulty assessing their students’ well-being.
From an ubuntu perspective, caring is integral to assessment. Beets and
van Louw (2005) described ubuntu approaches to assessment well:

Humanness (warmth, tolerance, understanding, peace, humanity)
and caring (empathy, sympathy, helpfulness, and friendliness) capture
the spirit in which assessment should be conducted . . . humanness
towards and caring unconditionally for the learner constitutes the
foundation for effective teaching and learning. Learners experience
assessment as positive only when they are sure that the teacher who
guides the learning process is a humane and caring person.
(p. 157)

Thus, caring relationships are fundamental to assessment. The purpose
of relationships is to care for one another, and any circumstance which
undermines the ability to do so needs to be corrected (Mayaka & Truell,
2021).
Ubuntu treats caring, justice, and community as inseparable and inter-
dependent. Similarly, I argue that assessment practices that are equit-
able and accurate, and that incorporate an awareness of well-being, are
dependent on the quality of the classroom community and vice versa. If
teachers cannot or do not assess students’ well-being and closely monitor
students’ academic progress in the moment, the grades that are assigned
are unlikely to be reflective of students’ actual knowledge, skills, and
understanding. If teachers cannot ensure that all students have the same
opportunities to learn, even if their assessments are accurate, the
process is inequitable, which undermines not only the individual student’s
sense of belonging but the cohesion of the classroom community. Ubuntu
highlights the role of assessment in creating and maintaining a healthy
classroom community and the role of community in supporting caring,
accurate, and equitable assessment. The usefulness of ubuntu lies in its
fundamentally holistic approach. For assessment, I imagine care, justice,
and community as fundamentally integrated and designed to maintain
and enhance student and teacher learning individually and together.

Implications for Teacher Education

How can the experiences of the participants in this study inform teacher
educators and assessment approaches in online environments? The
assessment experiences of the participants in this study demonstrate
that the collapse of classroom communities during ERT undermined
teaching and learning, including assessment. The antidote to this collapse
is community building. At the core of community building is care. The
teacher must continually assess individual students’ integration into the
classroom community. Participants indicated that in an in-person
classroom, they assessed integration by reading body language and, in the
case of students who were struggling, initiating informal conversations
and keeping in touch with parents. This process was used in both
elementary and secondary classes, regardless of whether students were
marginalized because of demographic categories, abilities, or circumstances (Barrett,
2021). Online, a pedagogy of care must be enacted deliberately and
explicitly (Barrett, 2021; Rovai, 2002). Check-ins during synchronous classes,
journal entries, and one-on-one encounters designed so that teachers can
assess student well-being are possible strategies.
Teacher educators, then, can help teacher candidates to understand
that online environments can be learning environments that are as rich
as those in person. Specifically, in online environments, teachers want
to be able to determine how the class is responding to synchronous or
asynchronous lessons and assess individual students on what they need
to progress in their learning. If a caring classroom community has been
established, it may be easier to ask for feedback, determine as a class-
room community how to demonstrate reactions to lessons online, or ask
questions one-on-one with the teacher (Oviawe, 2016). Teachers can dis-
cuss with the class why establishing norms is important, and students
can become active partners in this ongoing formative assessment. The
teacher can assess the students’ academic needs and how to adjust their
approach to optimize student learning. Finally, once the caring classroom
community has been established and the teacher has designed a plan for
reenforcing norms, then the class can have an explicit conversation about
authentic and equitable assessment and what constitutes fair play—spe-
cifically the role of collaboration and conceptions of academic integrity
(Beets, 2012).
In ERT, teachers tended to focus on the content of the course rather
than creating community (Bozkurt et al., 2020), even as they tried
their best to connect with their students when they failed to log in,
turn on their cameras, or show up to online classes and events. Despite
knowing that care was fundamental, most of the participants in this
study were unable to enact that care. It would be unfair to judge these
participants because the situation was an emergency and the move to
online teaching was involuntary. Families and school systems were not
prepared for these circumstances, and most teachers were learning
to teach online for the first time. However, studying this event does
highlight the fact that the online classroom community that is con-
ducive to accurate, equitable assessment, and that pays attention to
students’ well-being, cannot happen by accident. Establishing norms
for assessment—making criteria for success explicit; talking about what
students should do if they need help; and negotiating how students can
communicate with the teacher when they need them to slow down,
reiterate, reframe, or review course content—should be part of an
ongoing conversation throughout each course. The teacher needs to
constantly endeavour to create and maintain the conditions to care for
students and for students to care for one another. Teacher educators
are in a position to help teacher candidates develop the skills to do this
while emphasizing the reasoning behind it.
In conclusion, this study occurred at a particular time and place, where
participants in the larger sample self-selected and interviewees were pur-
posefully sampled; therefore, I hesitate to generalize these findings, but the
participants’ experiences are in alignment with extant literature on ERT
(Barrett, 2021; Ferretti et al., 2021). Future research in the area of online
assessment needs to expand to include the social conditions, policies, and
skills necessary for it to be done well. For example, researchers could
ask to what extent the quality of the classroom community influences
the quality of assessment and vice versa. I also suggest exploring this
topic using different conceptual frameworks (e.g., through Indigenous
frameworks). In this chapter, the concept of ubuntu highlights the values
underlying assessment—care, justice, and community—for all learning
environments and recognizes their particular importance online, where
community building and effective moment-to-moment pedagogical
adjustments can be challenging to achieve.

References

Barrett, S. E. (2021). Maintaining equitable and inclusive classroom communities
online during the COVID-19 pandemic. Journal of Teaching and Learning, 15(2),
102–116. https://jtl.uwindsor.ca/index.php/jtl/article/view/6683/5291
Beets, P. A. D. (2012). Strengthening morality and ethics in educational assessment
through ubuntu in South Africa. Educational Philosophy and Theory, 44(Suppl.
2), 68–83. https://doi.org/10.1111/j.1469-5812.2011.00796.x
Beets, P. A. D., & van Louw, T. (2005). Education transformation, assessment
and ubuntu in South Africa. In Y. Waghid (Ed.), African(a) philosophy of educa-
tion: Reconstructions and deconstructions (pp. 126–139). Stellenbosch University.
Bonn, M. (2007). Children’s understanding of “ubuntu.” Early Child Development
and Care, 177(8), 863–873. https://doi.org/10.1080/03004430701269291
Bozkurt, A., Jung, I., Xiao, J., Vladimirschi, V., Schuwer, R., Egorov, G., Lambert,
S. R., Al-Freih, M., Pete, J., Olcott, D., Jr., Rodes, V., Aranciaga, I., Bali, M.,
Alvarez, A. V., Jr., Roberts, J., Pazurek, A., Raffaghelli, J. E., Panagiotou, N.,
de Coëtlogon, P., . . . Paskevicius, M. (2020). A global outlook to the interrup-
tion of education due to COVID-19 pandemic: Navigating in a time of uncer-
tainty and crisis. Asian Journal of Distance Education, 15(1), 1–126. https://doi.
org/10.5281/zenodo.3878572
British Columbia Ministry of Education and Training. (2022). Curriculum and
assessment. Government of British Columbia. https://www2.gov.bc.ca/gov/
content/education-training/k-12/support/curriculum-and-assessment
Brock-Utne, B. (2016). The “ubuntu” paradigm in curriculum work, language
of instruction and assessment. International Review of Education, 62(1), 29–44.
https://doi.org/10.1007/s11159-016-9540-2
Chaminuka, L., & Ndudzo, D. (2014). Students and staff perceptions on exam-
ination malpractice and fraud in higher education in Zimbabwe. Asian
Journal of Humanities and Social Sciences, 2(2), 78–90. https://ajhss.org/pdfs/
Vol2Issue2/9.pdf
Creswell, J. W. (2013). Qualitative inquiry and research design: Choosing among five
approaches (3rd ed.). SAGE.
Ferretti, F., Santi, G. R. P., Del Zozzo, A., Garzetti, M., & Bolondi, G. (2021).
Assessment practices and beliefs: Teachers’ perspectives on assessment
during long distance learning. Education Sciences, 11(6), Article 264. https://
doi.org/10.3390/educsci11060264
Gay, G. (2021). Culturally responsive teaching: Ideas, actions, and effects. In H.
R. Milner, IV, & K. Lomotey (Eds.), Handbook of urban education (2nd ed.,
pp. 212–233). Taylor & Francis Group.
Le Roux, J. (2000). The concept of “ubuntu”: Africa’s most important contribu-
tion to multicultural education? Multicultural Teaching, 18(2), 43–46.
Lecce, S. (2020). Letter to parents from the Minister—March 31, 2020. Government
of Ontario. www.ontario.ca/page/letter-ontarios-parents-minister-education#
section-11
Letseka, M. (2012). In defence of ubuntu. Studies in Philosophy and Education,
31(1), 47–60. https://doi.org/10.1007/s11217-011-9267-2
Lindsay-Dennis, L. (2015). Black feminist-womanist research paradigm: Toward
a culturally relevant research model focused on African American girls.
Journal of Black Studies, 46(5), 506–520. www.jstor.org/stable/24572888
Mayaka, B., & Truell, R. (2021). Ubuntu and its potential impact on the inter-
national social work profession. International Social Work, 64(5), 649–662.
https://doi.org/10.1177/00208728211022787
Metz, T., & Gaie, J. B. R. (2010). The African ethic of ubuntu/botho: Implications
for research on morality. Journal of Moral Education, 39(3), 273–290. https://
doi.org/10.1080/03057240.2010.497609
Moorhouse, B. L., & Tiet, M. C. (2021). Attempting to implement a pedagogy of
care during the disruptions to teacher education caused by COVID-19: A col-
laborative self-study. Studying Teacher Education, 17(2), 208. https://doi.org/
10.1080/17425964.2021.1925644
Nunavut Department of Education. (2008). Ilitaunnikuliriniq—Foundation for
dynamic assessment as learning in Nunavut schools. https://gov.nu.ca/sites/
default/files/ilitaunnikuliriniq_eng.pdf
Ontario Ministry of Education. (2010). Growing success: Assessment, evaluation,
and reporting in Ontario schools. Queen’s Printer for Ontario.
Oviawe, J. O. (2016). How to rediscover the ubuntu paradigm in education.
International Review of Education, 62(1), 1–10. https://doi.org/10.1007/
s11159-016-9545-x
Peled, Y., Eshet, Y., Barczyk, C., & Grinautski, K. (2019). Predictors of aca-
demic dishonesty among undergraduate students in online and face-to-face
courses. Computers and Education, 131, 49–59. https://doi.org/10.1016/j.
compedu.2018.05.012
Rose, E., & Adams, C. (2014). “Will I ever connect with the students?”: Online
teaching and the pedagogy of care. Phenomenology & Practice, 7(2), 5–16.
https://doi.org/10.7939/R3CJ8803K
Rovai, A. P. (2002). Building sense of community at a distance. The International
Review of Research in Open and Distance Learning, 3(1), 1–16. https://doi.
org/10.19173/irrodl.v3i1.79
Tartavulea, C. V., Albu, C. N., Albu, N., Dieaconescu, R. I., & Petre, S. (2020).
Online teaching practices and the effectiveness of the educational process in
the wake of the COVID-19 pandemic. Amfiteatru Economic, 22(55), 920–936.
https://doi.org/10.24818/EA/2020/55/920
Tesch, R. (1990). Qualitative research: Analysis types and software tools. Falmer Press.
Unal, A., & Unal, Z. (2019). An examination of K–12 teachers’ assessment beliefs
and practices in relation to years of teaching experience. Georgia Educational
Researcher, 16(1), 4–21. https://doi.org/10.20429/ger.2019.160102
Viriri, E., & Makaye, J. (2020). Unhu/Ubuntu and examination malpractice in
Zimbabwe: Perceptions of selected stakeholders from Masvingo urban sec-
ondary schools. Journal of New Vision in Educational Research, 1(2), 321–338.
http://ir.gzu.ac.zw:8080/jspui/bitstream/123456789/367/1/UnhuUbuntu
%20and%20examination%20malpractice%20in%20Zimbabwe%20
Perceptions%20of%20selected.pdf
346 Sarah Elizabeth Barrett

Waghid, Y., & Smeyers, P. (2012). Reconsidering “ubuntu”: On the educational
potential of a particular ethic of care. Educational Philosophy and Theory,
44(Suppl. 2), 6–20. https://doi.org/10.1111/j.1469-5812.2011.00792.x
Yazici, S., Yildiz Durak, H., Aksu Dünya, B., & Şentürk, B. (2022). Online versus
face‐to‐face cheating: The prevalence of cheating behaviours during the pan-
demic compared to the pre‐pandemic among Turkish university students.
Journal of Computer Assisted Learning. https://doi.org/10.1111/jcal.12743
Index

Note: Page numbers in italics indicate figures and those in bold indicate tables.

Abaci, S. 203
Adie, L. 36, 47
affirmation from others, teacher assessment identity and 47–48
affirming self-dialogue 48–49
Aguilar-Cruz, P. J. 17
AI-adapting programs 321
Alberta Assessment Consortium (AAC) 74
Ali, W. 146
Alsup, J. 37
Andrade, H. L. 130
Aristotle 249
artist’s statement 261
a/r/tography 249, 250
arts-informed inquiry 229–230
assessment, learning, and educational technology relationship see professional learning community (PLC)
assessment and evaluation concentration (AEC) course 167, 169; see also assessment literacy through technology
assessment documentation, online learning feedback 128–129
assessment/evaluation comparison 272–273
assessment for learning 273
Assessment in a Digital Age (professional learning series) 15–16
assessment knowledge 45–46
assessment literacy through technology 164–180; assessment possibilities findings 170–172; case study design method used 167–168; course activities and assignments 168, 169; defined 164; focus groups 169–170; learning curve findings 175–176; limited access to resource challenges 172–174; overview of 164–165; research context 167; study findings discussed 176–179; time commitment findings 174–175; TPACK framework explained 165, 166
assessment literate 164
assessment of learning (AfL) 209–211, 219–220, 273; centre students and 210; decentre content and 209–210; described 209; teacher professionalism and 210–211
assessment partnerships 53–70; co-construction process and 57–60, 67–69; described 55–56; future work implications for 69; group discussion participation and contributions 64–67; noticing theory described 54–55; overview of 53–54; preservice teacher noticings 62–63; previous assessment experiences and 63–64; professional noticing described 54–55; study methods used 56–60, 58, 59; teacher educator noticings 60–61
assessment practices; see also decolonizing assessment practices; preservice teachers, reconceptualizing assessment frameworks for; teacher educators, K-12 online assessment practices and; individual practices: assessment partnerships 53–70; digital age 15–29; introduction to 4–5; in online environment 94–95; online learning challenges and 73–89; teacher assessment identity 34–50; teacher candidates, online assessments of 91–106; virtual learning and 1–2
assessment principles, feedback and 75–76, 81, 81–82, 85–86
assessment(s): authentic 102; caring relationships and 340–341; conceptions of 331; defined 2, 93, 272; digital age (see digital age assessment); expanding conceptions of, ERT and 339–341; experimenting with, in online classroom 46–47; forms of 209; knowledge, gaining 45–46; as learning 273; literature on 16–18; online 275–276; partnerships 53–70; SAMR model for 276–279; side-by-side 282–284, 283; small group 280, 281–282, 282; termed 209; in 21st century 273–275
augmentation stage of SAMR model 277
authentic assessments 102, 302–304
autoethnography methodology described 150–151
Ayalon, M. 62
backward design program planning 247–248
Bahula, T. 95
Barnett, R. 70
Battiste, M. 112
BC Ministry of Education and Child Care (BCMoE) 187
BC Teachers’ Federation 190
Beets, P. A. D. 340–341
Black, P. 94, 189–190
Bloxham, S. 129
Boltz, L. O. 16–17
Bottle, R. 118
Boud, D. 56, 61, 62, 126, 132
Bovill, C. 62
Bragg, L. A. 27
Brookhart, S. M. 296
Brown, G. T. 36, 44, 45, 46
Brown, R. S. 294
Brown, S. 75
Brunner, D. 93, 104
Buck, G. A. 62
Buckworth, J. 177
burnout 43
Campbell, L. 129
Canadian Digital Learning Research Association (CDLRA) 2
Carless, D. 129, 130, 132
case study, defined 167
Catania, J. 77
Chao, C. I. 134, 135
Clandinin, D. J. 313
Coate, K. 70
co-construction 57–60, 58, 59; discomfort during process of 67–69
cognitive presence, CoI model 148
Cohen, J. 299
collaborative action research approach 116–117
community of inquiry (CoI) model 145, 147–148, 148
competencies 103
competency-based assessment practices in British Columbia 187–203; assessment findings 198–199; competency chart/proficiency scales findings 193, 194, 195; conditions supporting changes in 201–202; Google hacks findings 196–198; letter grades findings 199–201; narrative case study approach described 191–193, 193; overview of 187–188; pedagogical hacking and 189–191
competency charts 191–192
Connell, R. 215
Connelly, M. 313
consumption, described 275
content knowledge 18–19, 20, 28, 103, 166
Cook-Sather, A. 55–56
Coombs, A. 34–35
Cooper, A. 294
Cooper, D. 77, 248, 255, 267
Cotton, W. 26
Council of Chief State School Officers 294
COVID-19 pandemic: co-construction and 57; EdTech in public education and 206–207; education systems and 15–16; initial teacher education and 35–36; online learning presence at outset of 146–147; virtual learning and 1–2 (see also online learning challenges, assessment principles and)
creation, described 275
creative process 246
criteria, assessment principles and 74–75, 79, 79–80, 80–81, 84–85
Dale, E. 56
Danyluk, P. 95, 118
Daoud, R. 18
Darra, M. 130
Davies, A. 74
decolonization, described 112–113
decolonizing assessment practices 111–123; axiological considerations used in 114–115; challenges encountered in 121–122; collaborative action research approach described 116–117; overview of 111–112; purpose of 112–113; theoretical framework used 115–116; through formative/summative assessment 119–121; through task design 117–119
DeCuir-Gunby, J. 299
Deeley, S. 62
desired results 247
dialogical identity assessment 36–37
dialogue, encouraging 133–134
differentiated instruction 233–234
digital age assessment 15–29; literature on assessment and 16–18; overview of 15–16; research findings 24–26, 25; research methodology described 20–24, 21; teacher capacity and 17–18; TPACK framework described 18–20, 20
digital learning, defining 3
digital portfolios 265, 267, 268
Dillman, D. A. 77–78
discrete units 284–285, 285, 286, 286–287, 287–288, 288
distance learning, defined 316
Donald, D. 114
Doyle, E. 55–56
EdTech malfunction 206–221; assessment of learning and 209–211; centre students and 210; decentre content and 209–210; demoralized teacher participant feelings and 213–216; Freirian dialectical methodology and 217–218, 218; humanizing assessment and 216–220; online pedagogy and 208–209; overview of 206–208; regressive practices and 212–213; study context 211; study findings 211–216; teacher professionalism and 210–211
Education Endowment Foundation (EEF) 304
E-learning, defined 316
emergency remote teaching (ERT) 330–343; accuracy/equity concerns 335–337; assessment and, expanding conceptions of 339–341; context of 332–333; overview of 330; students’ academic needs and 334–335; students’ well-being and 338–339; study methods used 333–334; teacher education and, implications for 341–343
e-portfolios 131
evaluation, defined 272
evidence of learning, assessment principles and 76–77, 86, 247
exceptional students in VLE, supporting 322–324
external tensions 41–42
Fadel, C. 187
Failure to Disrupt: Why Technology Alone Can’t Transform Education (Reich) 248–249
feedback; see also online learning feedback: assessment principles and 75–76, 81, 81–82, 85–86; effective, principles of 126–127; online learning 126–141
feedback literacy 132, 132–133
Flip (web-based platform) 118
formative assessments: decolonizing assessment through 119–121; described 93–94; in online teaching practices 97–99; teacher candidates’ perceptions of online 100–101
formative classroom assessment 294–296, 300; defined 294–295
Freeman, K. 56
Freire, P. 213–214, 217
Freirian dialectical methodology 217–218, 218
Friesen, S. 295
Gallagher-Mackay, K. 294
Garrison, D. R. 145, 147, 155–156, 157, 158, 315
Gikandi, J. 156
Global Online Learning Alliance 312
Gonzalez, J. 134–135
good performance, clarifying 127–130
Google Sheets, pedagogical hacking and 192–193, 193
Graduate Student Association, Melbourne University 146
Graham, C. R. 156, 315
Grossman, P. 62
Growing Success (Ontario Ministry of Education) 225, 331
Guasch, T. 133–134
hacking, educational contexts of 190
Hanson, A. 118
Hattie, J. 77, 93, 94, 126, 248, 267
Hopper, T. 187–188
Hughes, M. 116, 117–118
identity development, tensions in 37
Indigenizing practice 113; see also decolonizing assessment practices
Indigenous Elders 114–115
Indigenous knowledge 188
individual education plans (IEPs) 324
information: delivery of high-quality 132–133; extrapolating and online learning feedback 139–140
initial teacher education (ITE) 34–35; COVID-19 pandemic and 35–36; practica and 34, 44
instructor feedback 120–121
internal tensions 40–41
Irwin, R. L. 249
“I” statements 36
John, T. E. 139–140
Johnson, N. 3
Jones, A. 131
Kay, R. 95
kiinwa 56
kiinwi 56, 70
Kimmons, R. 319–320
Koehler, M. J. 18, 19, 180
Kozma, R. B. 315
LaPointe-McEwan, D. 216
Lauricella, S. 95
learning plan 247
Levi, A. J. 129–130
Looney, L. 36–37, 39, 45
Louie, D. L. 112, 113, 115–116, 117, 119, 122
Lubart, T. I. 246
Macfarlane-Dick, D. 6, 126–127
Makaye, J. 340
Manitoba Education, Citizenship and Youth (MECY) 219
Mason, J. 54–55
McTighe, J. 126, 247, 255, 267
Medina, D. L. 17
Merriam, S. B. 191
meta-position 48
Milligan, C. 127
Mishra, P. 18, 19, 180, 316
modification stage of SAMR model 278
Molloy, E. 62
Moorhouse, B. L. 331
Morcom, L. 56
Mueller, J. 303
Namyssova, G. 167
narrative identity assessment 36–37
narrative inquiry framework 313–315
National Film Board of Canada 253
negotiation processes, tension 44–45
Nicol, D. J. 6, 126–127
niinwi 56
noticings 60–63; overview of 60; preservice teacher 62–63; teacher educator 60–61
noticing theory, described 54–55
online assessments 275–276
online learning: defined 3; theories of 315–316
online learning challenges, assessment principles and 73–89; assignment evaluation results 82, 82–83, 83; criteria results 79, 79–80, 80–81, 84–85; evidence of learning and 76–77, 86; feedback opportunities and 75–76; feedback results 81, 81–82, 85–86; future research for 88–89; learning goal and 74–75; overview of 73–74; research methods used 77–78; student experience results 83–84, 86–87; survey results 78–84
online learning feedback 126–141; assessment documentation 128–129; dialogue and 133–134; extrapolating information and 139–140; feedback literacy and 132, 132–133; good performance, clarifying 127–130; overview of 126–127; positive self-perception and 134–137; rubrics 129–130; scaffolded stages to close performance/learning goals gap 138, 139; screencasts and video feedback 128; self-assessment 130–131, 131; single point rubric 134–137, 135, 136–137
online learning presence through student narratives 145–160; autoethnography methodology described 150–151; Chi’s vignette 153–155, 154; CoI model use explained 147–148, 148; Giang’s vignette 156–158, 157; implications of study 158–159; at outset of COVID-19 146–147; overview of 145–146; researcher positionalities 148–150; Tan’s vignette 151–153, 152; Xuan’s vignette 155–156
online teaching see virtual learning
Ontario College of Teachers (OCT) 35, 47
Ontario Ministry of Education 246, 332–333
pandemic see COVID-19 pandemic
Papanthymou, A. 130
partnership: assessment as 55–56; defined 55–56
Patton, M. Q. 296
pedagogical hacking: assessment, competency-based learning and 189–191; defined 188; Google Sheets and 192–193, 193
pedagogical knowledge 19, 20, 28, 166
phantom attendance 318
phenomenology 57
Picciano, A. G. 323
PICRAT matrix 319, 319–320
Pillen, M. T. 37
PLC see professional learning community (PLC)
poetic inquiry 230
positive self-perception, online learning feedback and 134, 135, 135, 136–137, 137, 138
practica: constraints, TAI and 43–44; initial teacher education and 34, 44
preservice teachers, reconceptualizing assessment frameworks for; see also individual headings: assessment literacy through technology 164–180; decolonizing assessment practices 111–123; introduction to 5–6; online learning feedback 126–141; online learning presence through student narratives 145–160
preservice teacher(s): accessibility issues and 2; noticings 62–63; previous assessment experiences and 63–64
professional learning community (PLC) 225–239; arts-informed inquiry and 229–230; assessment with technology, learning about 236–238; community in 231; described 227–228; differentiated instruction and 233–234; discussion board postings 229; expanding assessment practices 231; feedback from 232–233; learning culture and 234–236; methodology used 228; overview of 225–226; participants in 236–238; SAMR model 226, 227, 236; student choice and 231–232; survey data used 229
professional noticing, described 54–55
progressive skill development 279
progress tracking 139
public exhibition venues 261, 262, 263, 264, 265, 266
Puentedura, R. 7, 8, 226, 239
Ramaprasad, A. 126, 141
reciprocity 56
redefinition stage of SAMR model 278–279
reflective practice 139
Reich, J. 248–249, 255, 267
remote learning, defined 316
repertoire of practice 19
rubric cocreation 257, 258–259, 260
rubrics, online learning feedback and 129–130
Saldaña, J. 78
SAMR model see substitution, augmentation, modification, and redefinition (SAMR) model
Sanford, K. 187–188, 218
scaffolded stages to close performance/learning goals gap 138, 139
Schimmer, T. 195, 196, 200
screencasts, online learning feedback and 128
Scriven, M. 93–94
Scull, J. 16–17
secondary school teacher adaptations 293–307; authentic performance assessments and 302–304; formative classroom assessment 294–296; overview of 293–294; phenomenological design method explained 296–299; student feedback/self-assessment and 304–306; successes/challenges in 300–302
self-assessment 120–121, 273; online learning feedback and 130–131; positive outcomes of, by level 131
self-checkout education 213
self-perception, positive 134, 135, 135, 136–137, 137, 138
shared brainstorming tools 322
Shepard, L. 234–235
Shulman, L. 18
side-by-side assessments 282–284, 283
single point rubric (SPR) 134–137, 135, 136–137
small group assessment 280, 281–282, 282
Smith, B. 190, 191
Smith, L. T. 115
social presence, CoI model 148
Solder, R. 56, 61
Sparkes, A. 191
station rotation 280, 281–282, 282
Stefani, L. A. 55
Stevens, D. D. 129–130
Stiggins, R. J. 164
Stommel, J. 213
Struve, M. E. 129
students’ academic needs, ERT and 334–335
students’ well-being, ERT and 338–339
Styres, D. 114–115
substitution, augmentation, modification, and redefinition (SAMR) model 226, 227, 239, 271–290; assessment/evaluation comparison 272–273; for assessments 276–279; assessments as, for, and of learning framework 273–276; augmentation stage 277; benefits/considerations for technology use 288–290, 289; discrete units and 284–285, 285, 286, 286–287, 287–288, 288; modification stage 278; overview of 271; progressive skill development and 279; redefinition stage 278–279; side-by-side assessments 282–284, 283; small group assessment and 280, 281–282, 282; substitution stage 277
substitution stage of SAMR model 277
summative assessments: decolonizing assessment through 119–121; described 94; elements of, of importance to teacher candidates 101–103; in online teaching practices 99–100; outcomes-based 121
summative assessment tasks 76
synchronous learning 147
TAI see teacher assessment identity (TAI)
Talib, M. A. 18
talking circle 118–119
task design, decolonizing assessment through 117–119
Tasks Before Apps (Burns) 276
teacher assessment identity (TAI) 34–50; affirmation from others and 47–48; affirming self-dialogue and 48–49; assessment experimenting and 46–47; assessment knowledge and 45–46; external tensions and 41–42; general practica constraints 43–44; internal tensions and 40–41; as narrative and dialogical 36–37; online classroom constraints and 42–43; overview of 34–36; study findings 40–49; study methods described 38, 38–39; tension negotiation processes 44–45; tensions in identity development and 37
teacher assessment literacy in practice (TALiP) 36
teacher candidates, online assessments of 91–106; assessment defined 93; case study methodological approach explained 96; conceptual framework described 93; context of research 92–93; elements of summative assessments of importance to 101–103; formative assessment described 93–94; formative assessment practices in 97–99; literature reviews concerning 93–96; online learning environment assessment practices 94–95; overview of 91–92; perceptions of formative 100–101; research questions used 95–96; summative assessment described 94; summative assessment practices in 99–100; transmission of content knowledge/teaching skills/competencies and 103; visible learning and 94
teacher capacity 17–18
teacher educator noticings 60–61
teacher educators, K-12 online assessment practices and; see also individual headings: competency-based assessment practices in British Columbia 187–203; EdTech malfunction 206–221; ERT and 341–343; introduction to 6–9; professional learning community (PLC) 225–239; SAMR model 271–290; secondary school teacher adaptations 293–307; ubuntu and 330–343; virtual art education assessments 244–269; virtual learning environment (VLE) 312–327
teacher knowledge of in-person classrooms 313–315
teacher’s toolbox 19
teaching presence, CoI model 148
teaching skills 103
technological, pedagogical, and content knowledge (TPACK) framework 18–20, 20, 165, 166, 316, 317
technology knowledge 19, 20, 28, 166
technology tools: benefits and considerations for 288–290, 289; in urban vs. rural schools 2–3
tensions: external 41–42; internal 40–41; negotiation processes 44–45; types of 37, 40
Textile Museum of Canada 253, 255
Thomas, B. 139–140
Tiet, M. C. 331
Timperley, H. 126
TPACK see technological, pedagogical, and content knowledge (TPACK) framework
Trilling, B. 187
Truth and Reconciliation Commission of Canada 54, 111
Tsekleves, E. 321
ubuntu 330–343; see also emergency remote teaching (ERT); assessment conceptions and 331; and caring as integral to assessment 340–341; as conceptual lens 331–332; emergency remote teaching and 330; overview of 330
understanding by design concept 247
van Louw, T. 340–341
video feedback, online learning feedback and 128
Viriri, E. 340
virtual art education assessments 244–269; access to materials and 251–252; a/r/tography described 249, 250; assessment and evaluation strategies 248; backward design program planning and 247–248; creative process and 246; digital portfolios and 265, 267, 268; engagement and 248–249; original artwork submissions and 260–261, 261, 262, 263, 264, 265; overview of 244–245; public exhibition venues and 261, 262, 263, 264, 265, 266; research methods explained 250, 250–251; student engagement and 255, 256–257, 257, 258–259, 259–260, 260; teaching techniques and 252–255, 253, 254, 256
virtual field trips 253, 255
virtual learning: accessibility issues with 2–3; assessment practices and 1–2; constraints, TAI and 42–43; COVID-19 pandemic and 1–2; EdTech malfunction and 208–209; experimenting with assessment in 46–47
virtual learning environment (VLE) 312–327; definitions concerning 316–317; exceptional students in, supporting 322–324; future educators of, implications for 326; overview of 312–313; phase 1, 2020 317–318; phase 2, spring 2021 to spring 2022 319, 319–320; stable establishment of, in publicly funded schools 320–322; teacher knowledge of in-person classrooms and 313–315; theories of 315–316; weaknesses of 324–325
visible learning 94
VLE see virtual learning environment (VLE)
Wenger, E. 46
Wiggins, G. 126, 247, 255, 267
Wiliam, D. 94, 189–190
Wilkie, K. J. 62
Wilson, J. 134
Wilson, S. 116, 117–118
World Health Organization 312
Xu, Y. 36, 44, 45, 46
Yan, Z. 129, 130
Year of the Tiger (Elle) 255, 256
Yin, R. K. 167
Zoom fatigue 43
