Assessment of Online Learners
Foundations and Applications for Teacher Education
EDITED BY
PAOLINA SEITZ AND S. LAURIE HILL
Designed cover image: © Getty Images
First published 2024
by Routledge
605 Third Avenue, New York, NY 10158
and by Routledge
4 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN
Routledge is an imprint of the Taylor & Francis Group, an informa business
© 2024 selection and editorial matter, Paolina Seitz and S. Laurie Hill; individual
chapters, the contributors
The right of Paolina Seitz and S. Laurie Hill to be identified as the authors of the
editorial material, and of the authors for their individual chapters, has been asserted in
accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.
All rights reserved. No part of this book may be reprinted or reproduced or utilised in
any form or by any electronic, mechanical, or other means, now known or hereafter
invented, including photocopying and recording, or in any information storage or
retrieval system, without permission in writing from the publishers.
Trademark notice: Product or corporate names may be trademarks or registered
trademarks, and are used only for identification and explanation without intent to infringe.
Library of Congress Cataloging-in-Publication Data
Names: Seitz, Paolina, editor. | Hill, S. Laurie, editor.
Title: Assessment of online learners : foundations and applications for teacher
education / edited by Paolina Seitz and S. Laurie Hill.
Description: First edition. | New York : Routledge, 2024. | Includes bibliographical
references and index.
Identifiers: LCCN 2023033662 (print) | LCCN 2023033663 (ebook) |
ISBN 9781032376912 (hbk) | ISBN 9781032390123 (pbk) |
ISBN 9781003347972 (ebk)
Subjects: LCSH: Educational evaluation—Canada. | Educational tests and
measurements—Canada. | Academic achievement—Canada—Evaluation. |
Distance education—Canada—Evaluation. | Educational technology—Canada. |
Web-based instruction—Canada. | Student teachers—Training of. |
Teachers—Training of.
Classification: LCC LB2822.3.G7 A87 2024 (print) | LCC LB2822.3.G7 (ebook) |
DDC 379.1/580971—dc23/eng/20230925
LC record available at https://lccn.loc.gov/2023033662
LC ebook record available at https://lccn.loc.gov/2023033663
ISBN: 978-1-032-37691-2 (hbk)
ISBN: 978-1-032-39012-3 (pbk)
ISBN: 978-1-003-34797-2 (ebk)
DOI: 10.4324/9781003347972
Typeset in Dante & Avenir
by Apex CoVantage, LLC
Contents
Acknowledgments xix
Introduction 1
S. Laurie Hill and Paolina Seitz
Part I
Assessment Shifts in Teacher Education 13
Part II
Reconceptualizing Assessment Frameworks for Preservice Teachers 109
Part III
Teacher Educators and Assessment in K–12 Contexts 185
Index 347
About the Editors and Contributors
Editors
Contributors
Caitlin Fox shares her passion for teaching and learning with preservice
teachers in the Bachelor of Education program at Red Deer
Polytechnic, Canada. As a teacher, curriculum leader, instructional
leader, post-secondary course designer, teacher educator, and mentor,
Caitlin has dedicated her practice to inspiring teachers to consider the
power of quality learning and assessment design.
seventh degree. During the pandemic, Dr. McHolm’s passion for educa-
tion led her to ask deep questions about how virtual learning environ-
ments serve all types of students.
Chi (Linh) Tran is a PhD scholar in the School of Social Science, Media,
Film, and Education at Swinburne University of Technology, Australia.
Chi has been working for Vietnam’s Ministry of Natural Resources
and Environment. She is an environmental/sustainability education
researcher with 10 years of experience collaborating and working with
Vietnamese K–12 students and in-service/preservice teachers. Chi’s
research interests include climate change education, environmental edu-
cation, public engagement, and ecologically and multigenerationally
justice-oriented education towards new materialist, posthumanist, and
feminist thinking. She recently completed her doctoral thesis entitled
The inspiration for this collection grew from our lived experience of
teaching during the early days of the pandemic when the way in which we
approached our teaching practices changed dramatically. The movement
to virtual learning necessitated an urgent shift to online and hybrid learning
environments for our preservice teachers. This shift invited a renewed
interest in assessment practices and a consideration of how we conduct
meaningful and purposeful assessment of learning in online courses. In
this edited book, we bring together contributors to share how the shift
to online teaching and learning impacted assessment practices and stu-
dent learning experiences in teacher education programs. The collection
also includes insights from research conducted on virtual learning spaces
by teacher educators in the kindergarten to Grade 12 (K–12) educational
context.
The importance of examining and understanding the pedagogical
shifts that occurred in virtual classroom contexts cannot be overstated.
“Independent of the move to online learning as a result of the COVID-19
pandemic, there is a clear upward trend to the prevalence and variety of
online learning opportunities available to students” (Caprara & Caprara,
2021, p. 3683). Prepandemic, the number of courses being offered online
was growing in postsecondary institutions; indeed, “in the 2016–2017 aca-
demic year, 18% of postsecondary students in Canada took at least one
fully online course” (Weleschuk et al., 2019, p. 4). During the pandemic,
this trend was accelerated as universities across Canada, and around
the world, quickly moved courses to an online learning environment.
DOI: 10.4324/9781003347972-1
in urban schools than in rural schools (Tadesse & Muluye, 2020), and
cramped spaces in homes were not ideal for teaching or learning (Metcalfe,
2021). “At the same time, there was a growing awareness of how the
pandemic was impacting Black people, Indigenous people, and people
of colour, who suffered disproportionately [from the negative impacts
of the pandemic]” (Danyluk et al., 2022, p. 3). Meaningful learning and
assessment of that learning cannot be fair when students do not have
the requisite devices, supplies, and support for the learning experiences
offered online.
There is a great deal of uncertainty among institutions about how
to define online learning. Terms such as “blended learning, hybrid
learning, and hyflex learning have emerged to describe the various ways
institutions can deliver learning experiences to students” (Johnson et al.,
2022, p. 92). Defining digital learning as a term that refers to all types of
learning supported by technology, Johnson (2021) developed a modes of
learning spectrum framework. She applied definitions used by various
institutions to the framework to help categorize common terms. In this
framework, Johnson defined online learning as a learning context in which
“all instruction and interaction is fully online; synchronous or asyn-
chronous” (Johnson et al., 2022, p. 94). We have adopted this definition
in our collection.
Part II offers four chapters in which the authors examine ways that
assessment strategies were reconceptualized for online teaching and
learning in teacher education programs. In Chapter 6, titled “Decolonizing
Assessment Practices in Teacher Education,” authors Joshua Hill, Christy
Thomas, and Allison Robb-Hagg present an account of their collabor-
ation in integrating decolonizing assessment practices in a fully online
teacher education course. Their work drew upon decolonizing principles
of storytelling and negotiation to support the integration of Indigenous
ways of knowing and doing in the course learning tasks, formative
assessment strategies, and determination of grades. The authors con-
clude with a discussion of the challenges they encountered and the peda-
gogical decisions they made on their journey to decolonize assessment
practices in their online classrooms.
resources, could be met with creativity. Yarmol also discusses how the
reimagining, teaching, and assessing of a visual arts curriculum in an
online environment resulted in a rich learning experience for her students.
In the next chapter, “Using the SAMR Model to Design Online K–12
Assessments,” author Sheryl MacMath describes her work with teacher
candidates when their practicum experiences were moved online.
Although many aspects of assessment strategies used in face-to-face
classrooms also worked in online classes, these strategies had to be
reimagined to be most effective in virtual spaces. MacMath introduced
Puentedura’s (2006) SAMR model into her course assessment frame-
work and modelled this approach for her teacher candidates to support
them in developing assessment literacy. Teacher candidates applied this
knowledge in carrying out online assessment of their students during
their practicum experiences. The chapter concludes with recommended
programs and apps that can be used when assessing student work.
In Chapter 15, “Adapting Classroom Assessment Practices for Online
Learning: Lessons Learned From Secondary School Teachers in the Early
Days of COVID-19,” Michael Holden, Christopher DeLuca, Stephen
MacGregor, and Amanda Cooper provide an overview of contemporary
research in formative assessment and implications for online learning.
They argue that disruptions created by the pandemic exacerbated
preexisting challenges with equitable access to education. The authors
interviewed 17 secondary school teachers to examine how formative
assessment strategies were used to meet students’ learning needs and
how systemic challenges were met during the shift to online learning.
Insights provided in this chapter will assist classroom teachers and pre-
service teachers in considering how to best adjust effective assessment
strategies to support online learning.
In the chapter “Equity in Action: Virtual Learning Environment
Considerations,” author Sharlene McHolm uses a narrative inquiry
approach to examine the conditions of learning for online learners and
shares experiences of school administrators leading remote learning
schools during a time of significant upheaval. The author questions
what constitutes an excellent teaching environment, in both face-to-face
and virtual contexts. She shares strategies that supported the learning
of exceptional students, finding that neurodiverse, medically fragile,
and marginalized students can thrive in a virtual learning environment.
Finally, McHolm discusses implications for future educators, noting the
opportunities and potential challenges the online environment offers to
educators and students.
Final Thoughts
References
Ayers, W. (2001). To teach: The journey of a teacher (2nd ed.). Teachers College Press.
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.1080/0969594X.2018.1441807
Burns, A., Danyluk, P., Kapoyannis, T., & Kendrick, A. (2020). Leading the pandemic practicum: One teacher education response to the COVID-19 crisis. International Journal of E-learning and Education, 35(2), 1–25. www.ijede.ca/index.php/jde/article/view/1173/1836
Caprara, L., & Caprara, C. (2021). Effects of virtual learning environments: A scoping review of literature. Education and Information Technologies, 27, 3683–3722. https://doi.org/10.1007/s10639-021-10768-w
Danyluk, P., Burns, A., Hill, S. L., & Crawford, K. (Eds.). (2022). Crisis and opportunity: How Canadian Bachelor of Education programs responded to the pandemic (Canadian Research in Teacher Education series, Vol. 11). Canadian Association for Teacher Education. https://doi.org/10.11575/PRISM/39534
Grumet, M. (2006). Where does the world go when schooling is about schooling? Journal of Curriculum Theorizing, 22(3), 47–54.
Hartwell, A., Brown, B., & Hanlon, P. (2021). Designing for technology-enabled learning environments. University of Calgary. http://hdl.handle.net/1880/113710
Heritage, M. (2010). Formative assessment: Making it happen in the classroom. Corwin.
Johnson, N. (2021). Evolving definitions in digital learning: A national framework for categorizing commonly used terms. Canadian Digital Learning Research Association. www.cdlra-acrfl.ca/2021-cdlra-definitions-report/
Johnson, N., Seaman, J., & Poulin, R. (2022). Defining different modes of learning: Resolving confusion and contention through consensus. Online Learning Journal, 26(3), 91–110.
Looney, A., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualizing the role of teachers as assessors: Teacher assessment identity. Assessment in Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.1080/0969594X.2016.1268090
Mason, J. (2002). Researching your own practice: The discipline of noticing. Routledge.
Metcalfe, A. S. (2021). Visualizing the COVID-19 pandemic response in Canadian higher education: An extended photo essay. Studies in Higher Education, 46(1), 5–18. https://doi.org/10.1080/03075079.2020.1843151
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for integrating technology in teachers’ knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
Puentedura, R. (2006, August 18). Transformation, technology, and education [Blog post]. http://hippasus.com/resources/tte/
Tadesse, S., & Muluye, W. (2020). The impact of COVID-19 pandemic on education system in developing countries: A review. Open Journal of Social Sciences, 8, 159–170. https://doi.org/10.4236/jss.2020.810011
Tarc, P. (2020). Education post-‘Covid-19’: Re-visioning the face-to-face classroom. Current Issues in Comparative Education, 22(1), 121–124. https://files.eric.ed.gov/fulltext/EJ1274311.pdf
Weleschuk, A., Dyjur, P., & Kelly, P. (2019). Online assessment in higher education. Taylor Institute for Teaching and Learning Guide Series, University of Calgary.
Xu, Y., & Brown, G. T. (2016). Teacher assessment literacy in practice: A reconceptualization. Teaching and Teacher Education, 58, 149–162. https://doi.org/10.1016/j.tate.2016.05.010
Part I
Assessment Shifts in Teacher Education
1 Exploring Assessment in a Digital Age
Preservice and In-service Teachers’ Professional Learning Experiences
Nadia Delanoy, Barbara Brown, and Jodie Walz
DOI: 10.4324/9781003347972-3
Literature on Assessment
Assessment has always been a challenging area for educators, and some-
times they perceive digital advancements as adding to the complexity
of assessment practices (Kurvinen et al., 2020). For innovative redesign
of assessment practices, teachers need to collectively reflect on the
challenges with current practices (Guskey, 2000; O’Connor, 2007). For
example, common challenges include integrating meaningful formative
assessment into learning and bringing students into the assessment and
grading process (O’Connor, 2007). Moreover, educators struggle with
balancing the demands of high-stakes or standardized tests and creative
means of assessing students (Wiliam, 2017). Many educators believe
they cannot keep up with the pace of advancements and consequently
feel uncertain about how to implement appropriate applications (Paulus
et al., 2020). As such, an examination of the relationship between theory
and practice can be the foundation for reflection and enhance educators’
level of comfort with digital applications, affirm utility of the tools in
the digital space, and aid in continuing to advance pedagogy and student
outcomes (Kurt, 2019; Mishra & Koehler, 2006).
As Boltz et al. (2021) and Scull et al. (2020) asserted, practitioners
must harness the social and constructivist nature of learning to continue
Theoretical Framework
areas and how the relationship between these areas supports the diverse
nature of classrooms.
Methodology
1. Which best identifies you? (We recognize you may have more than
one of the roles listed. Please select your primary role or use other to
identify a different role that is not listed.) Preservice teacher (under-
graduate student); Instructor (includes faculty of all ranks, contract
faculty, sessional); Graduate student; In-service teacher (includes
practising teacher in the field, mentor teacher, substitute teacher,
learning leader); Other.
2. What date(s) did you attend an online pedagogy series?
3. I had a good experience attending the session.
4. The session helped inform my practice.
5. I will likely use something from the session in my practice.
6. I found the session helpful with my teaching amid the COVID-19
pandemic.
7. Based on your experience, what did you find most helpful that you
plan to use in your teaching or that has informed your teaching
practice?
8. How might this session help with your teaching during the COVID-
19 pandemic?
9. What were the benefits of participating in the online pedagogy series?
10. What recommendations do you have to improve the design of the
online pedagogy series?
11. Provide any additional comments that you would like to share with
the research team regarding the online pedagogy series.
Findings
The participants’ responses about how the sessions helped with their
teaching, whether in practicum or in their own classrooms, indicated
that the professional learning was beneficial to their practice. Question 9
required participants to share their perceived benefits of participating in
an online pedagogy series. The majority of participants saw value in the
series and expressed a high level of satisfaction. Table 1.2 displays the six
most common response themes from the participants.
Participants reported an overall positive experience with the profes-
sional learning series. They found value in the content as well as the ways
in which engagement was fostered through interactive digital lessons,
tool exploration, and discussions of the connections to practice from
multiple access points. In the session on learning management systems
and portfolio use, tools were discussed for formative and summative
assessment, inquiry, and evaluation of student learning, including tool
capabilities for student voice with Mote (www.mote.com/), Google Forms
(https://forms.google.com) with extended responses, and digital lessons
Table 1.2 Top six response themes about the benefits of an online pedagogy series

Collaboration with others provides unique insights: “Collaboration with others who might have unique insights.”
Current and relevant technologies used in teaching and learning: “It was most beneficial to see what is currently being used to ensure inclusivity and meet the diverse needs of students.”
Sharing knowledge and pedagogical experiences: “Learning, finding out what others are practising, and sharing.”
Practical application: “Training and professionalism. Real answers rather than more questions.”
Flexibility of sessions: “Flexibility of participation. I’m in the field right now and was able to seamlessly transition from a staff meeting to the session.”
Easy access of online offering: “Per being online, I literally jumped on to the course. . . . I didn’t have to choose between being there for my family, and my continuing education is invaluable.”
Discussion
and technology was received well by participants as they valued the pro-
cess, found the engagement with the tools or platforms manageable, and
had time to play in a safe space where trial and error was supported by the
facilitators. Bragg et al. (2021) asserted that the meshing of technology
and content should be seamless so users can engage with technology as
an extension of learning and not as an add-on. The sessions focused on
the former to coalesce content knowledge, technological knowledge or
learning, and pedagogical knowledge.
As the focus was on assessment from a theoretical and pragmatic per-
spective, participants were able to conceptualize use of the digital tools and
platforms outside of the sessions more readily because of the professional
learning design. The mapping of technology as a means of assessment
was used throughout the sessions. From a professional learning stance,
pedagogy was the common thread interwoven throughout the series.
The sessions also focused on how-to, and participants embraced new
tools and considerations for the application to teaching and learning.
Participants shared their ideas and pedagogical considerations such as
student access points and utility. Bringing together teachers with varying
years of experience helped preservice teachers and in-service teachers
alike to contextualize their pedagogical knowledge and support others
in this process. For example, the relevance of this process during the
COVID-19 pandemic and how teachers traversed online, blended, and
hybrid approaches was widely discussed, providing, in multiple ways,
needed insight for preservice teachers as well as a sense of manageability.
In a study of preservice teachers in Australia, intermittent professional
learning supported deeper consideration of how to navigate technology
integration and the shifts brought on by the COVID-19 pandemic (Scull
et al., 2020).
Through the lens of this professional learning rooted in the digital and
with the foundation of the TPACK framework, teachers were provided
context to understand the interconnected dimensions of content,
technology, and pedagogical knowledge and the digital environment
(Koehler et al., 2014). Seeing this process as a holistic and interrelated
system can support recognition of technology as a tool for learning
and interconnected to pedagogy and content. Engaging in professional
learning that fosters sharing, collaboration, capacity building through
play, and linkages to the program of studies can help teachers recognize
the connections between tools, platforms, and content more authentic-
ally. Providing a forum for educators to share their experiences across the
career span can make the navigation of the evolving digital world more
manageable and create a level of comfort that teachers on their own may
not experience. Though this study had a small group of questionnaire
respondents, their consistent underscoring of positive experiences and
lessons learned can inform the design and implementation of profes-
sional learning to support assessment in a digital age and can provide a
foundation for future study.
Figure 1.2 highlights the interplay and embedded nature of the
dimensions of TPACK, overlaid with descriptors for the design of the
Conclusion
References
Daoud, R., Starkey, L., Eppel, E., Vo, T. D., & Sylvester, A. (2020). The educational value of internet use in the home for school children: A systematic review of literature. Journal of Research on Technology in Education, 53(4), 353–374. https://doi.org/10.1080/15391523.2020.1783402
Darling-Hammond, L., & Hyler, M. E. (2020). Preparing educators for the time of COVID . . . and beyond. European Journal of Teacher Education, 43(4), 457–465. https://doi.org/10.1080/02619768.2020.1816961
DeLuca, C., Volante, L., & Earl, L. (2015). Assessment for learning across Canada: Where we’ve been and where we’re going. Education Canada, 55(2). www.edcan.ca/articles/assessment-for-learning-across-canada/
Friesen, S., & Brown, B. (2021). Advancing knowledge creation in education through tripartite partnerships. Canadian Journal of Learning and Technology, 47(4). https://doi.org/10.21432/cjlt28052
Getenet, S. (2019). Using design-based research to bring partnership between researchers and practitioners. Educational Research, 61(4), 482–494. https://doi.org/10.1080/00131881.2019.1677168
Guskey, T. R. (2000). Grading policies that work against standards . . . and how to fix them. NASSP Bulletin, 84(620), 20–29. https://doi.org/10.1177/019263650008462003
Hill, C., Rosehart, P., St. Helene, J., & Sadhra, S. (2020). What kind of educator does the world need today? Reimagining teacher education in post-pandemic Canada. Journal of Education for Teaching, 46(4), 565–575. https://doi.org/10.1080/02607476.2020.1797439
Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020, March 27). The difference between emergency remote teaching and online learning. Educause Review. https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning
Kalloo, R. C., Mitchell, B., & Kamalodeen, V. J. (2020). Responding to the COVID-19 pandemic in Trinidad and Tobago: Challenges and opportunities for teacher education. Journal of Education for Teaching, 46(4), 452–462. https://doi.org/10.1080/02607476.2020.1800407
Koehler, M. J. (2011, May 11). Using the TPACK image. TPACK.org. http://matt-koehler.com/tpack2/using-the-tpack-image/
Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The technological pedagogical content knowledge framework. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 101–111). Springer.
Kurt, S. (2019, September 16). TPACK: Technological pedagogical content knowledge framework. International Society for Educational Technology. https://educationaltechnology.net/technological-pedagogical-content-knowledge-tpack-framework/
Kurvinen, E., Kaila, E., Laakso, M.-J., & Salakoski, T. (2020). Long term effects on technology enhanced learning: The use of weekly digital lessons in mathematics. Informatics in Education, 19(1), 51–75. https://doi.org/10.15388/infedu.2020.04
Marzano, R. J., & Pickering, D. J. (2011). Chapter one: Research and theory. In The highly engaged classroom (pp. 3–20). Marzano Research Laboratory.
McKenney, S., & Reeves, T. C. (2018). Conducting educational research. Routledge.
Miles, M. B., Huberman, A. M., & Saldaña, J. (2020). Qualitative data analysis: A methods sourcebook (4th ed.). SAGE.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for integrating technology in teachers’ knowledge. Teachers College Record, 108(6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
O’Connor, K. (2007). A repair kit for grading: 15 fixes for broken grades. Pearson.
Paulus, M. T., Villegas, S. G., & Howze-Owens, J. (2020). Professional learning communities: Bridging the technology integration gap through effective professional development. Peabody Journal of Education, 95(2), 193–202. https://doi.org/10.1080/0161956X.2020.1745610
Quezada, R. L., Talbot, C., & Quezada-Parker, K. B. (2020). From bricks and mortar to remote teaching: A teacher education program’s response to COVID-19. Journal of Education for Teaching, 46(4), 472–483. https://doi.org/10.1080/02607476.2020.1801330
Robinson, M. (2021). The virtual teaching experience with Google Classroom during COVID-19: A phenomenological study [Doctoral dissertation]. St. John’s University. ProQuest Dissertations Publishing.
Scull, J., Phillips, M., Sharma, U., & Garnier, K. (2020). Innovations in teacher education at the time of COVID-19: An Australian perspective. Journal of Education for Teaching, 46(4), 497–506. https://doi.org/10.1080/02607476.2020.1802701
Selwyn, N., Hillman, T., Eynon, R., Ferreira, G., Knox, J., Macgilchrist, F., & Sancho-Gil, J. M. (2020). What’s next for ed-tech? Critical hopes and concerns for the 2020s. Learning, Media and Technology, 45(1), 1–6. https://doi.org/10.1080/17439884.2020.1694945
Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–23. https://doi.org/10.17763/haer.57.1.j463w79r56455411
Talib, M. A., Bettayeb, A. M., & Omer, R. I. (2021). Analytical study on the impact of technology in higher education during the age of COVID-19: Systematic literature review. Education and Information Technologies, 26(6), 6719–6746. https://doi.org/10.1007/s10639-021-10507-1
descriptions from 12 ITE programs across Canada and found that course
content most commonly focused on assessment strategies (e.g., pro-
viding feedback, creating rubrics), the role of assessment in the teaching
and learning cycle, and accommodations and modifications to support
diverse learners’ needs. However, assessment literacy development
studies continue to show that teachers’ assessment beliefs and practices
are strongly influenced by personal and social factors (e.g., personal
experiences with assessment as students) beyond their learning (Xu &
Brown, 2016). These findings call for a new, more holistic approach to
assessment education.
More recently, researchers have begun to conceptualize and examine
teacher assessment identity (TAI; e.g., Adie, 2013; Looney et al., 2018;
Xu & Brown, 2016). TAI draws on teacher identity and assessment lit-
eracy research to account for the personal and social factors that mediate
teachers’ assessment learning and practice. In other words, the construct
extends teacher education’s focus on teacher identity development to
include identity development related specifically to assessment. Just as
teacher identity development can support role satisfaction, self-efficacy,
and resiliency in the face of dynamic classroom demands (e.g., Hong
et al., 2017; Kelchtermans, 2005), assessment identity development may
support the navigation of assessment dilemmas and strengthen teachers’
assessment practice.
Until the COVID-19 pandemic began in Canada in 2020, the
Ontario College of Teachers (OCT), the regulatory college for teaching
in the province, had required student teachers to complete face-to-face
practica (Ontario College of Teachers, 2022). However, as the pandemic
mandated physical distancing, many student teachers were assigned
to online classrooms. With the rise of online learning and the poten-
tial for continued online practica in the future, ITE programs, teacher
educators, associate teachers (ATs), and student teachers themselves need
to understand how student teachers are engaging with assessment—and
developing their assessment identities—on online practica. Accordingly,
this study was led by the following questions:
Findings from this study can guide how ITE programs prepare and support
student teachers in not only developing their assessment identities on
36 Jenny Ge
Methods
were enrolled to teach in the intermediate and senior divisions, and four
were enrolled to teach in the primary and junior divisions. Five were
enrolled in the concurrent stream and had completed some teacher
education courses during their undergraduate programs; the student
teachers enrolled in the consecutive stream entered the ITE program
after completing their undergraduate programs. Given their varying foci
and prior teacher education experiences, participants began the study
with varying levels of knowledge of and confidence towards assessment.
However, all participants expressed feeling as though they had limited
assessment experiences relative to teaching experiences before the start
of the program.
Participants chose their own pseudonyms. Each student teacher
participated in three semistructured interviews of 60 minutes: one during
their first practicum, one during their second practicum, and one after their
final practicum. Interviews focused on exploring their experiences with
learning about and conducting assessment online. Given that it was difficult
to predict practicum arrangements during the pandemic, four participants
ended up completing one or two practica in person. In these cases,
interviews focused on their experiences with conducting assessment more
broadly; participants were later prompted to compare their in-person and
online assessment experiences as all participants’ final practica were online.
Interviews were transcribed verbatim by Zoom (https://zoom.us),
and I reviewed them for accuracy before sharing them with participants
so they could clarify their responses as desired. Data were analyzed in
NVivo following three inductive, three-stage coding cycles (Saldaña, 2013)
guided by the principles of discourse analysis (Gee, 1999). First, I analyzed
transcripts using Looney et al.’s (2018) framework to identify the ways in
which participants narrated their own assessment identities. For example,
I assigned a code whenever a participant spoke as an assessor, indicated
by their use of “I know,” “I feel,” “I believe,” and so on (from Looney et
al.’s framework) and other related words as they described their engage-
ment in assessment. My analysis of the transcripts then shifted to iden-
tify experiences of tensions related to participants’ assessment learning
or practice during their online practica, indicated by the expression of
conflicting ideas or feelings, discomfort, and uncertainty, as well as by
moments of hedging or silence. Finally, I analyzed transcripts to iden-
tify the processes through which participants negotiated tensions. Codes
about participants’ assessment identities prior to and after negotiation
processes were compared to identify changes and developments in their
assessment identities.
Internal Tensions
External Tensions
Between their first and second practica, the student teachers completed a
foundational assessment course that appeared to significantly support the
All participants discussed, at least once during the study, feeling com-
pelled to conform to their ATs’ assessment practices while on practica,
a constraint that has been commonly reported by student teachers
(e.g., Oo et al., 2021). In some cases, participants described conforming
because they generally agreed with their ATs’ practices; in other cases,
they conformed even when they disagreed with their ATs’ practices
because they believed they ought to as a student teacher. The need to
conform discouraged some participants from engaging autonomously
with assessment. Lola, for example, shared:
I feel like it’s just because every time that I ever assess, it’s always
been within the constraints of my AT. . . . I always had to, like, do
it, and then say, “Okay, is this good? Is this not good? How do I fix
this?” . . . It wasn’t really me thinking. It was me just kind of doing
whatever they want.
Participants also described how the short time frames of practica made
it difficult to experiment with assessment and observe the outcomes of
their experiments (e.g., whether an assessment was engaging, whether
students believed the assessment was fair) to inform their future practice.
Avery, for example, noted,
Other studies have similarly reported that student teachers shift their
conceptions of assessment from largely summative to more formative
through ITE learning (Smith et al., 2014) and practica (Xu & He, 2019).
These findings also support Xu and Brown’s (2016) and Looney et al.’s
(2018) emphasis on knowledge in their TAI frameworks.
Given that participants expressed difficulty in translating their
assessment thinking and practices between in-person and online
classrooms, gaining online assessment knowledge specifically helped
ease tensions between participants’ desired assessment identities and
online classroom constraints by expanding their conceptions of how they
could conduct assessment online to navigate those constraints. Avery, for
example, shared how she used her learning to help her be an “engaging”
assessor online as desired: “The [educational technology] course taught
us how to use [Google Earth], and I realized it could be a fun way for the
kids to learn and create a presentation.”
All participants, however, noted that there had been limited, if any,
instruction or resources on online assessment within their ITE courses.
Tanya recalled only one course that had spent one hour, at the instructor’s
discretion, on providing accommodations for online assessments. Miles
and Avery stated that an educational technology course talked about tech-
nology broadly without explicit reference to its potential use in online
assessment. These participants cited their own research and conversations
with ATs as building their knowledge of online assessment.
To prepare student teachers for online assessment, ITE programs
should provide specific information and guidance on online assessment.
Xu and Brown (2016) suggested that an appropriate assessment know-
ledge base should include knowledge of disciplines and pedagogical con-
tent; assessment purposes, content, and methods; grading; feedback;
assessment interpretation and communication; student involvement in
assessment; and assessment ethics. As ITE programs develop their cur-
ricula on online assessment, they may consider these topics, as well as
other contemporary online learning topics (e.g., digital skills, educational
technology), in relation to online classrooms.
The best part was that the AT gave me the floor and was like, “It’s
your choice what you want to do with them.” . . . And I saw the
kids were pretty engaged. . . . Maybe had I been in another class-
room, the AT might say, “Okay, this is not what is accepted; . . . do
this instead.” Or maybe they would keep giving suggestions, so
I wouldn’t know if I’m doing good [according to the AT], or if I’m
actually doing fine.
Supporting Student Teacher Assessment Identity Development 47
Tanya emphasized that implementing her own ideas, given the limited
guidance on online assessment she received, was the primary process
through which she gained confidence in her online assessment abilities.
Participants noted that opportunities to assess varied significantly
between practica and ATs. Additionally, although it was not observed in
this study, Adie (2013) found that the online setting, when
cameras were off, made it easier for newer or more hesitant teachers to
be passive in assessment conversations with other teachers. At the time
of this study, neither the ITE program’s practicum guidelines for student
teachers and ATs nor the OCT’s (2022) Registration Guide outlined spe-
cific expectations for students to engage in assessment on practica. ITE
programs may consider working with ATs to ensure student teachers are
offered authentic opportunities to both practice and discuss assessment
on in-person and online practica. Further, given the value of context-
specific experience, student teachers anticipating teaching online should
be supported in completing online practica.
Tanya similarly shared how she began to feel more confident in online
assessment after her ITE professor copied one of her approaches:
If I hadn’t done the study I probably would have thought more gen-
erally in terms of my growth as a teacher, . . . but I don’t think
I would have had that same awareness of how I’ve grown as an
assessor.
Others shared that reflecting out loud helped ease their experience of
tensions by encouraging them to frame challenges as potential learning
experiences. Rose, for example, noted that recalling her experiences
during the study’s interviews “made it easier for [her] to look at some-
thing and go, ‘Oh, I could try this next time.’ ” This comment also shows
that reflecting helped Rose develop her ability to meta-position, “teachers’
ability to strategically select, blend, and shift between positions depending
on the teaching situation” (Blasco et al., 2021, p. 27).
Although reflective practice is often promoted as an individual pro-
cess (Collin & Karsenti, 2011), findings from this study suggest that ITE
programs should facilitate collaborative reflection exercises such as mock
job interviews and small group discussions to support student teachers not
Conclusion
The tensions and tension negotiation processes that emerged from this
study provide evidence that student teachers’ assessment identities may
initially develop to be context-specific. However, it is possible that over
time, through more learning, experience, and tension negotiation, begin-
ning teachers will grow their abilities to meta-position (Blasco et al.,
2021) and become better able to independently adapt their assessment
to new contexts. In the meantime, the increasing prevalence of online
learning calls for dedicated programming on online assessment in teacher
education. Specifically, ITE programs and teacher educators should con-
sider building student teachers’ knowledge of online assessment, cre-
ating authentic opportunities to practice online assessment, encouraging
and facilitating conversations on and feedback processes about online
assessment, and fostering collaborative and critical reflection of online
assessment experiences. These processes can support student teachers in
negotiating tensions arising in their online assessment learning and prac-
tice—and, in doing so, support developments in their budding assessment
identities that will guide their practice.
It is important to note that as the student teachers in this study all
volunteered to participate and expressed interest in developing as
assessors, it is possible they were more inclined to engage in the tension
negotiation processes identified in this study. Other student teachers
may experience other or additional tensions as they practice online
assessment, including those between personal and teacher or assessor
identities, reflecting more general hesitations towards assessment. This
study also took place during the pandemic, which affected the ways
online classrooms were organized (intentionally and unintentionally)
and teacher educators’ and ATs’ preparedness in supporting student
teachers on online practica. Future research should explore how teacher
educators’ planned approaches to online assessment education support
student teachers’ assessment identity development, how beginning
References
DOI: 10.4324/9781003347972-5
54 Allison Tucker and Marc Husband
Theoretical Framework
Professional Noticing
Assessment as Partnership
Methods
were invited to help create the success criteria for each assignment—
another example of our work to create partnerships purposefully.
Inviting preservice teachers to think about success criteria followed
a similar process for each assignment. Whether virtual or in person,
co-constructing success criteria involved three steps. First, preservice teachers
individually considered the assignment and generated ideas. Second, they
met in virtual breakout rooms or small groups to share and record group
ideas using either Google Jamboard (https://jamboard.google.com/;
see Figure 3.2) or whiteboard space (see Figure 3.3). Preservice teachers
indicated agreement with other groups’ ideas using checkmarks. In the
third and final step, they finalized the criteria and sought agreement among
the group, including the teacher, through whole class discussion.
Noticings
discussion to consolidate and revise the criteria and help bring about
clarity with the whole group. The group addressed the first criterion, and
I started reading the first suggestion from the list. I asked the preservice
teachers if they were okay with it or would like to delete or modify it.
The question elicited many back-and-forth conversations among them to
clarify their wording. I reworded the criterion in response to a few pre-
service teachers’ ideas. Then I said, “Is everyone okay with that?” I looked
for a few nodding heads before ending the class.
Seeking agreement by asking, “Is everyone okay with that?” seemed
fine at the time. Looking back, however, something was off about me
editing their ideas using my words and then asking for agreement in a
way that some might not have felt comfortable saying no. In terms of
developing a partnership, I wondered if this approach could have been
perceived as a unilateral decision.
The first theme that emerged deals with preservice teachers’ previous
assessment experiences. The responses revealed that in prior learning
experiences, they adopted a passive role in their learning by doing what
they were told. Although co-constructing assignment criteria offered them
opportunities to take an active role in their learning, the co-constructing
experience brought up desires to remain passive. In other words, pre-
service teachers wanted to be told what to do because they had been
successful in that paradigm.
This theme was characterized by phrases such as the following: “I had
never heard of this before” (Nancy), “I have never experienced [this]
before” (Anna), “I have not had the opportunity to partake in [assessment
as partnership] before” (Yolonda), and “It was so far from anything I’ve
experienced throughout my education journey” (Rachel). The word
“told” was also frequently used to describe their previous assessment
experiences. Preservice teachers relied heavily on their teachers telling
them what to do. As Emma’s comment illustrated, “I grew up in a system
where I was told what to do, and that was the bottom line.” Yeshi drew
on the understanding that success came from doing what one was told:
“Everything [was] always, ‘Do as you are told, and you will do well.’ ”
Anna’s response captured how the role of the teacher and the role of the
student would be enacted—the teacher would tell, and the student would
do exactly as instructed to “get the highest grade”:
There was tension for preservice teachers: They recognized the oppor-
tunity that co-constructing offered them but wanted the familiarity of
being told what to do. This friction was evident in Leo’s comment: “I felt
a bit frustrated with the [co-constructing] process, almost as if I just
wanted to be told what to do.” Rachel’s comments echoed this sentiment,
highlighting that in playing the game of school, preservice teachers
were told exactly what the teacher wanted them to do:
I’ve always been a student who knew how to play the game of
school. When a teacher or professor told me what they wanted,
I could usually deliver it. So, at times during our co-construction
process, I found myself thinking, just tell me what you want so I can
give it to you!
Our goal was to weave assessment into the process of learning. The result
was that some preservice teachers felt excluded from the process whereas
others took on active roles, eagerly participating in the whole class
discussion; the experience was uneven. For some, the experience
was positive; they felt they had a voice and were valued through
the process. For example, Yannick “appreciated having the opportunity
to have a ‘say’ in [their] learning” and Luca thought it was “beneficial to
make students feel seen and valued in the classroom.” For others, however,
Assessment as Partnership 65
The duality of this process though, is that I felt alone in enjoying the
process. It did not feel like the people closest to me agreed with the
methods of assessment as partnership. This led to many awkward
conversations and moments of uncertainty for me as a learner.
I did not feel like I personally had much input into selecting cri-
teria. . . . It did seem like there was a set of criteria that we as a
group had to uncover more than we actually created criteria, in a
true sense.
Drew’s comments speak to the idea that for some preservice teachers,
co-constructing criteria felt like a façade, and they did not have input.
Although some preservice teachers experienced assessment as partnership
as beneficial, these others believed that they did not have a voice and felt
lost in the process of co-constructing criteria.
In beginning this process of assessment as partnership, we drew upon
our own experiences of teaching in classrooms with elementary-aged
students and working with in-service teachers whose students are that
age. It somehow seemed easier to do things differently in an elemen-
tary classroom, where the rules of the game are not firmly entrenched.
It is now obvious to us that the preservice teachers have ideas of what
assessment should be, based on their past experiences. As we invite them
into this partnership, we are asking them to unlearn a great deal about
the nature of learning and classrooms and adopt new understandings
of how educators contribute ideas and establish shared learning goals.
We have underestimated how deeply embedded ideas about decision-
making in classrooms are for the preservice teachers. When we asked
for their ideas, their responses ranged from feeling liberated and glad to
have a voice to distrust (suspecting that we were asking them simply to
uncover our ideas).
In thinking about how we might reframe the process of
co-constructing criteria in the future, one obvious insight is that we
need to address trust within this partnership. We also need to think
about other ways to allow spaces for voices. Perhaps the small group
work does not need to be reported to the whole group. Perhaps we
need to make more spaces for the quieter voices. Perhaps we need to
ask preservice teachers how to structure this process. After all, if this is
truly a partnership, they can guide, and they know their own barriers.
Perhaps we need to reflect on our enactment and understanding of the
“co,” so we can follow their lead.
[It] got slowly better as time went on. . . . Once the learning
started, and everyone started to co-construct the assessment cri-
teria together, it was easier to see the benefit of how it impacted
everyone’s thinking of assessment.
This was a new process and experience for students that required us
to scaffold and support them in these partnerships. Effective teaching
involves knowing and responding to where the preservice teacher is on
their journey as the starting place—we acknowledge that we needed to
be better in this regard. We were okay with the messiness, the unfolding,
and the learning through the process. They, however, came with
expectations of how school functions. We needed to do better to bridge
the gap between where we started and their previous knowledge and
experiences.
Final Thoughts
4 Addressing Challenges of Online Learning through Quality Assessment Principles

Caitlin Fox and Julia Rheaume
DOI: 10.4324/9781003347972-6
74 Caitlin Fox and Julia Rheaume
Literature Review
and confidence that they can accomplish the tasks on their own (Frey
et al., 2018). Feedback during instruction (for both the educator and the
learner) is a vital component of quality assessment design that supports
student achievement (Black & Wiliam, 1998, 2018).
Educators can provide opportunities for students to use the feedback
they have received during lessons and instruction. Feedback used during
the learning process feeds learning forward and provides opportunities
for students to believe they are meeting the intended learning goals.
Feedback provided only at the end of task completion, with no chance to
use it, is evaluative feedback. Educators can provide evaluative feedback
in the hope that students will consider it in future work, but they are not
providing an opportunity for students to improve or enhance their work
while working towards the learning goals (Davies, 2011).
Beginning with the end in mind means that educators know what evi-
dence of learning will look like before teaching begins. Starting with a
vision of what and how the evidence will be collected allows educators
and students to focus on clear learning goals (Wiggins & McTighe, 2005).
When summative tasks are aligned to the verbs within the learning
outcomes, educators know they are measuring what they are intending
to measure: student performance of the learning outcomes. A closer
look at an assessment task should allow an educator to infer the learning
outcomes that are being assessed. If the connection to the outcomes is
not obvious, it may mean that the assessment must shift away from a
selected response to a demonstration of skill through a performance or
applied task (McTighe, 2021). Selected response and short-answer sum-
mative tasks can help to measure verbs such as “identify,” “describe,”
and “recall.” Performance tasks and extended thinking summative tasks
can measure verbs like “analyze,” “compare,” and “discuss” (Hogg et al.,
2022). Summative assessment tasks are often products that are evaluated
as evidence of learning. By creating products, students demonstrate what
they know and what they can do (Davies, 2011). Evaluating the products,
described with clear learning criteria and aligned to learning outcomes, is
one method of knowing if learning has occurred.
However, products are not the only way to measure learning. To
know that learning has occurred, observations of students performing
skills and using knowledge are essential (Davies, 2011). Conversations
Addressing Challenges of Online Learning 77
Methods
Results
Criteria
Table 4.2 Students’ feelings about assessment during online learning (Question 8).

Feeling                            n    %
Capable/confident                 22   30
Overwhelmed                       33   45
Confused                          21   29
Frustrated/challenged/stressed    16   22
Anxious/nervous/depressed          5    7
Alone                              2    3
Neutral                            1    1

Note. Data indicate students’ responses to the question, “How have you felt when beginning an assignment while learning online over the last two years?”
The frustrating aspects about this shift was the lack of preparedness
and clarity from certain professors. At times it felt like I was pulling
Feedback
Evidence
Table 4.4 Evaluation of assignment (Questions 11, 12, 13, 15, and 16).

Survey item number and text                              Frequency of each response
                                                         on the Likert scale (%)
                                                          1    2    3    4    5
11. The feedback I received after evaluation
    aligned with the assignment expectations.             0    0   16   55   29
12. I feel the feedback I received after evaluation
    was helpful for future assignments.                   1   10   32   33   25
13. I feel the assignments motivated me and helped
    me make progress towards my personal and
    professional growth.                                  3   12   23   33   29
15. I feel that the types of assessments accurately
    captured my learning.                                 0    7   32   45   16
16. I had opportunities to demonstrate my learning
    and understanding (knowledge and skills) of the
    course outcome through a variety of assignment
    tasks.                                                0   15   16   52   16

Note. Response options ranged from 1 = never (10% or less of the time) to 5 = consistently (90% or more of the time). Percentages were rounded, so totals may not equal 100%.
Student Experience
I think the one thing that stood out was the lack of clear expectations
on assignments and how they related to what we were learning or
doing in the classroom to prepare us for our careers. I felt some of
the assignments felt more like busy work without a real purpose as
educators were adapting their assignments to fit an online setting.
Discussion
Criteria
Feedback
Survey results show that students rarely sought feedback prior to sub-
mitting their assignments. The most common sources of feedback for
Evidence
Student Experience
Conclusion
and in-person learning modes, not only online learning. However, as 70%
of students were online approximately 70% of the time, the results do
pertain to online experiences for the majority of students.
Another limitation was inherent in using an online survey meth-
odology. Our interpretation of the results was limited to the partici-
pant selections on the Likert scale and two opportunities for qualitative
statements. Future qualitative research may be needed to further explore
student perceptions of online learning and the related challenges.
One avenue that we hope to explore in future research is the
importance of mindset towards online learning. We did not ask
about participants’ perceptions of online learning and wonder if there
might be a correlation between mindset and perceived ability to succeed.
For example, the participant who stated, “I enjoy online learning as the
instructors are always willing to help and the learning is in your hands,”
likely had a different experience than the one who indicated they felt
“very overwhelmed and stressed out.”
Students who articulated a positive experience with assessment likely
felt connected to their instructors. They could articulate how they were
getting to the learning goals and how they understood their learning had
occurred. Student experiences in the learning process informed how they
felt about assessments. Placing value on the learning process more than
the teaching process can lead to positive experiences for students and pro-
vide rich learning experiences for everyone involved.
References
Alberta Assessment Consortium. (2017). AAC key visual: Assessing student learning
in the classroom. https://aac.ab.ca/wp-content/uploads/2018/01/AAC-Key-
VisualAUG2017.pdf
Bennett, S., Lore, P., & Mulgrew, A. (2016). What matters most about assessment.
Alberta Assessment Consortium.
Bennett, S., & Mulgrew, A. (2010). Scaffolding for student success. Alberta Assessment
Consortium.
Bennett, S., & Mulgrew, A. (2013). Building better rubrics. Alberta Assessment
Consortium.
Bennett, S., & Mulgrew, A. (2019). Creating credible criteria. Alberta Assessment
Consortium.
Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through class-
room assessment. GL Assessment.
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.1
080/0969594X.2018.1441807
Brown, S. (2005). Assessment for learning. Learning and Teaching in Higher
Education, 1, 81–89. http://eprints.glos.ac.uk/id/eprint/3607
Conrad, D., & Openo, J. (2018). Assessment strategies for online learning: Engagement
and authenticity. AU Press.
Cooper, D., & Catania, J. (2022). Rebooting assessment: A practical guide for balan-
cing conversations, performances, and products (How to establish performance-based,
balanced assessment in the classroom). Solution Tree.
Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating
quantitative and qualitative research (4th ed.). Pearson Education Inc.
Davies, A. (2011). Making classroom assessment work (3rd ed.). Connections
Publishing.
Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and
mixed-mode surveys: The tailored design method. John Wiley & Sons.
Fisher, D., Frey, N., & Hattie, J. (2016). Visible learning for literacy: Implementing the
practices that work best to accelerate student learning. Corwin, a SAGE Company.
Fox, C. (2021, February). Aligning our values and action: Putting the focus on
learning. Canadian Assessment for Learning Network Newsletter. https://us18.
campaign-archive.com/?u=bbaef655f1d5ab9dfe083c0aa&id=f27ae33b5c
Frey, N., Hattie, J., & Fisher, D. (2018). Developing assessment-capable visible
learners, grades K–12: Maximizing skill, will, and thrill (1st ed.). Corwin Literacy.
Hattie, J. (2018). 10 mindframes for visible learning: Teaching for success. Routledge.
Hogg, R., Armstrong, D., & Jones, M. (2022, February 2). An excerpt from
“A framework for student assessment” (3rd ed., pp. 8–15). Alberta Assessment
Consortium.
McTighe, J. (2021, October 20). Leading the conversation: The pedagogy of assessment
[Webinar]. Edmonton Regional Learning Consortium.
McTighe, J., & Ferrara, S. (2021). Assessing student learning by design: Principles and
practices for teachers and school leaders. Teachers College Press.
Saldaña, J. (2009). The coding manual for qualitative researchers. Sage Publications.
Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Association
for Supervision and Curriculum Development.
Wiliam, D. (2013). Assessment: The bridge between teaching and learning. Voices
from the Middle, 21(2), 15–20.
5 Assessing the Content Knowledge, Skills, and Competencies of Teacher Candidates in an Online Learning Environment: A Case Study
Renee Bourgoin
DOI: 10.4324/9781003347972-7
Context
Conceptual Framework
Literature Review
Visible Learning
Research Questions
Methodology
Findings
Findings are described in three general themes. The first reports on the
analysis of my formative and summative assessment practices in the online
We are about to discuss a new concept. Are you ready to move on?
At this point, would it be a good idea to ask any lingering questions?
What may still be unclear to you? What has not yet been answered
for you? What would you like me to discuss before we move on?
areas where the lesson could have been improved. These types of
activities, along with the rich discussions that ensued, enabled me to model
reflective practices and notions of lifelong learning while also developing
important critical thinking skills in students. With such activities I made
my teaching processes visible to my students.
At the end of the term, students were asked to reflect on the overall
value of these periodic reflection exercises. They took stock of their
growth, their biggest takeaways, the value of reflective practices, and
the importance of being a lifelong learner.
5. Inclusive teacher showcase: Students were grouped in triads at the
beginning of the term, and every few weeks, the team would meet
during class to share their perspectives on recently covered topics. To
facilitate discussions, teacher candidates documented their group’s
learning and ideas for classroom applications by engaging in
collaborative notetaking activities. At the end of term, they transformed
their learning logs into culminating projects highlighting their
learning journey throughout the course.
For all assignments, students were provided specific success criteria and/
or assignment exemplars. Additionally, time in class was set aside for
teacher candidates to work on parts of their assignments to ensure they
felt supported. As they worked together in their small breakout rooms,
I moved from room to room and initiated formal and informal mini-
conferences, inviting group members to talk about their work. I provided
a fair amount of targeted oral feedback in terms of strengths and things
they may want to consider moving forward.
Conclusion
References
Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How
learning works: Seven research-based principles for smart teaching. Jossey-Bass.
Bachman, L. (2004). Statistical analysis for language assessment. Cambridge
University Press.
Bahula, T., & Kay, R. (2021). Video feedback in online learning. In R. Kay &
H. Williams (Eds.), Thriving online: A guide for busy educators (pp. 228–235).
Ontario Tech University.
Beilstein, S. O., Henricks, G. M., Jay, V., Perry, M., Bates, M. S., Moran, C., &
Cimpian, J. R. (2020). Teacher voices from an online elementary mathematics
community: Examining perceptions of professional learning. Journal of
Mathematics Teacher Education, 24, 283–308. https://doi.org/10.1007/s10857-
020-09459-z
Black, P. (2009). Formative assessment issues across the curriculum: The theory
and the practice. TESOL Quarterly, 43(3), 519–524. www.jstor.org/stable/
27785033
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment
in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/
10.1080/0969595980050102
Bonk, C. J., & Zhang, K. (2006). Introducing the R2D2 model: Online learning
for the diverse learners of this world. Distance Education, 27(2), 249–264.
https://doi.org/10.1080/01587910600789670
Brunner, D. (2006). The potential of the hybrid course vis-à-vis online and
traditional courses. Teaching Theology and Religion, 9(4), 229–235. https://doi.
org/10.1111/j.1467-9647.2006.00288.x
Burns, A., Danyluk, P., Kapoyannis, T., & Kendrick, A. (2020). Leading the
pandemic practicum: One teacher education response to the COVID-19 crisis.
International Journal of E-Learning & Distance Education, 35(2), 1–25. www.ijede.ca/
index.php/jde/article/view/1173/1836
Castro, M. D. B., & Tumibay, G. M. (2021). A literature review: Efficacy of online
learning courses for higher education institution using meta-analysis. Education
DOI: 10.4324/9781003347972-9
Joshua Hill, Christy Thomas, and Allison Robb-Hagg
Context
Purpose
Battiste (2013) noted that decolonization is, at the core, about asserting
the presence and humanity of Indigenous peoples. With this idea in mind,
Decolonizing Assessment Practices in Teacher Education
Theoretical Framework
Methodology
Decolonizing Assessment
Task design emerged from our reflective dialogue as a key area to change
in our current practices to support our aim of decolonization. Commonly
assigned learning tasks in the postsecondary context, such as tests that
ask students to remember and retell information or essays that prompt
students to write from a third person or depersonalized perspective,
emanate from the Eurocentric views of knowledge as fixed, discoverable,
and external from the knower (Davis et al., 2015). In contrast, Wilson and
Hughes (2019) stated that an Indigenist approach is founded on a relational
Louie et al. (2017) suggested that “stories can create an environment
in which learning emerges from individuals’ meaningful
experiences and multiple ways of knowing are honoured” (p. 28).
Reflecting on this pedagogical approach as we experienced in our talking
circle learning task, we identified that storytelling supported students to
position themselves in relation to their learning and reflect upon and
share their assumptions and biases. Along with this strength emerged
a challenge in navigating how students responded to one another’s
reflections. A tension existed: as we encouraged open, honest, and
personal reflection, some students’ reflections were perceived as
offensive by others. Through this process
we identified a need to support students to build trusting relationships
to create the conditions for the talking circle to be a safe and welcoming
place. We led our students in large group conversations about how to
frame reflections and choose vocabulary from a place of awareness
about others’ perspectives. We also established a way for students in
breakout rooms to ask for help when they identified a situation where
tensions existed and instructor presence and coaching would be helpful.
Overall, we found that these strategies to assist students in group work
helped us to build relationships with them and in turn helped them to
build relationships with each other. This finding is supported by Louie
et al., who suggested that the instructor should “respond respectfully
and critically, working relationally to identify discourses and experiences
of power, privilege, and marginalization” (2017, p. 27).
These experiences are leading us to explore other ways of creating a
space for reflective dialogue and additional strategies for how we might
support our students in building relationships and learning in groups.
This work has also inspired us to explore how we might draw on
pedagogical approaches that promote compassionate and culturally
responsive learning experiences for our students. We are seeking ways to support
our students in developing learning designs that affirm diversity and strive
for a more equitable curriculum to support antiracist pedagogies.
Challenges
Conclusion
A key conclusion is the need for this work to grow and be woven throughout
our teacher education program. We believe that continuity and collective
effort are required to build sustainable relationships with Indigenous
Elders and communities to create the safe conditions for students to
encounter, reflect on, and reframe deeply held perspectives; and to disrupt
and replace colonial structures. To this end, we plan to build on and grow
this model of collaborative faculty development through scholarship of
teaching and learning by inviting our colleagues across the program to
partner with us. We also plan to explore Métissage (Donald, 2012) as an
Indigenous research methodology to advance this work. We hope that
in making our learning journey visible, we have opened up some new
questions or possibilities for readers. In closing, we want to thank Louie
et al. (2017) for creating a grounding from which we have sought to grow
towards decolonizing our assessment practices.
References
Donald, D. (2014). Teaching and learning from aboriginal perspectives in the social
studies classroom. CAP Final Report. http://galileo.org/pl/wp-content/
uploads/CAP-Report_D.Donald.docx
Ermine, W. (2007). Ethical space of engagement. Indigenous Law Journal, 6(1),
193–203.
Hanson, A., & Danyluk, P. (2022). Talking circles as Indigenous pedagogy in
online learning. Teaching and Teacher Education, 115, Article 103715. https://
doi.org/10.1016/j.tate.2022.103715
Hendricks, C. (2017). Improving schools through action research: A reflective practice
approach (4th ed.). Pearson Education.
Kovach, M. (2009). Indigenous methodologies: Characteristics, conversations, and
contexts. University of Toronto Press.
Louie, D. L., Poitras Pratt, Y., Hanson, A. J., & Ottmann, J. (2017). Applying
indigenizing principles of decolonizing methodologies in university classrooms.
Canadian Journal of Higher Education, 47(3), 16–33. https://doi.org/10.47678/
cjhe.v47i3.187948
McKegney, S. (2008). Strategies for ethical engagement: An open letter concerning
non-native scholars of native literatures. Studies in American Indian Literatures,
20(4), 56–67.
Mertler, C. A. (2022). Introduction to educational research (3rd ed.). Sage.
Mills, G. E. (2013). Action research: A guide for the teacher researcher (5th ed.). Pearson.
Parsons, J., Hewson, K., Adrian, L., & Day, N. (2013). Engaging in action research:
A practical guide to teacher-conducted research for educators and school leaders.
Brush Education.
Smith, L. T. (2012). Decolonizing methodologies: Research and indigenous peoples
(2nd ed.). Zed Books.
Styres, S. (2019). Pathways for remembering and (re)cognizing Indigenous
thought in education. In H. Tomlins-Jahnke, S. Styres, S. Lilley, & D. Zinga
(Eds.), Indigenous education: New directions in theory and practice (pp. 39–62).
University of Alberta Press.
Styres, S., Zinga, D., Lilley, S., & Tomlins-Jahnke, H. (2019). Contested spaces
and expanding the Indigenous education agenda. In H. Tomlins-Jahnke,
S. Styres, S. Lilley, & D. Zinga (Eds.), Indigenous education: New directions in
theory and practice (pp. xiii–xxi). University of Alberta Press.
Truth and Reconciliation Commission of Canada. (2015). Honouring the truth,
reconciling for the future: Summary of the final report of the Truth and Reconciliation
Commission of Canada. James Lorimer. https://irsi.ubc.ca/sites/default/
files/inline-files/Executive_Summary_English_Web.pdf
Wiliam, D., & Leahy, S. (2015). Embedding formative assessment: Practical techniques
for K–12 classrooms. Learning Sciences International.
DOI: 10.4324/9781003347972-10
Meaningful Feedback in the Online Learning Environment
Some studies (Buck et al., 2007; Ogan-Bekiroglu & Suzuk, 2014) have
suggested that preservice teachers have a good theoretical grasp of
Maggie McDonnell
Assessment Documentation
Rubrics
Facilitating Self-assessment
Students at all levels struggle with feedback for several reasons, but the
three factors most often cited are brevity, negativity, and complexity:
feedback is too short, too negative, or too difficult to interpret (Pinheiro
Cavalcanti et al., 2019). Weaver (as cited in Pinheiro Cavalcanti et al.,
2019) reported that more than half of college students had not been
taught how to read and apply feedback effectively. If students do not
know what to do with teacher feedback, they will struggle to progress.
Delivering high-quality information about students’ learning, then,
tacitly compels teachers to train students to interpret and apply that
feedback. Carless and Boud (2018) defined this skill as feedback literacy; that
is, “the understandings, capacities and dispositions needed to make sense
of information and use it to enhance work or learning strategies” (p. 2).
Delivery necessarily requires reception: A package is not delivered if it
has not been received. The same is true of feedback: If students do not
know how to use the information teachers provide, then the teachers have
not successfully delivered it. To help students develop their feedback
literacy, teachers must focus on three stages, as outlined in Figure 7.1:
seeking information, processing the information, and finally, acting on the
information (Carless & Boud, 2018). The initial stage, seeking informa-
tion, can be as direct as the student asking for feedback on their own work
or for clarification on instructions, material, or assessment tools. As well
as directly seeking feedback, the student may get information through
monitoring; that is, drawing feedback from context (e.g., comparisons
Encouraging Dialogue
• fewer words, which means that they are quicker and easier to create
and easier to understand;
• more flexible interpretations of the criteria, which means that
teachers do not need to anticipate how students might deviate from
the expected standard;
• better quality feedback, because teachers can focus their remarks on
specific problems and on ways in which the student has surpassed
expectations; and
• clearer objectives and standards, which support more effective peer
and self-assessment.
Figure 7.2 Example of a single point rubric. Adapted from Meet the Single Point
Rubric by J. Gonzalez, 2015, Cult of Pedagogy (www.cultofpedagogy.
com/single-point-rubric/). Copyright 2015 by Cult of Pedagogy.
Criteria (or Standard): Describe the standard to be met or explain the criteria.
Rating: 1  2  3  4
Feedback: Can be used to explain the number given, offer suggestions for improvement, or give advice for pushing even further.
Scale: 1 = standard not met; 2 = standard partially met; 3 = standard met; 4 = exceeds expectations.
Figure 7.3 Modified single point rubric. Adapted from Meet the Single Point
Rubric: Another Variation (Added in 2017), by J. Gonzalez, 2017, Cult of
Pedagogy (www.cultofpedagogy.com/single-point-rubric/#:~:text=
ANOTHER%20VARIATION%20(ADDED%20IN%202017)).
Copyright 2017 by Cult of Pedagogy.
Criteria    What to look for    Your comments or suggestions
Atmosphere: How is the temperature in the room? Does
it ever feel too chilly or too warm?
Is there enough room for everyone to set up
their mat and move comfortably?
Did the music add to the experience? Was it
ever distracting, too loud, or too quiet?
How was the lighting? Could you see the
instructor? Were the lights ever too bright?
What else could your colleague do in this
area to achieve excellence?
Instructor: Was the instructor friendly and supportive?
Were the cues easy to understand and
apply? Did the instructor demonstrate new
poses? Did the instructor offer alternative
poses or incorporate props to support
students at different levels? What else could
your colleague do in this area to achieve
excellence?
Yoga poses: Were you able to follow the sequence of
poses? Was there enough challenge without
anything being completely beyond you?
Did you feel as though your whole body
was involved? Was there enough warm-up?
Were you able to cool down completely
before final relaxation? Was final relaxation
too short or too long? What else could
your colleague do in this area to achieve
excellence?
Perhaps the most effective and most obvious way to provide students
opportunities to close the gap between performance and learning goals
is through scaffolded stages of drafting and revision. In my own
practice, most major assignments involve multiple submission stages with
feedback in various forms at each stage. In my Cégep English courses,
for instance, students write three essays over five weeks. Each essay
involves planning, outlining, drafting, and revising prior to submission, as
illustrated in Figure 7.6.
As is clear from Figure 7.6, peer feedback is woven into the scaffolded
process. Peer review is valuable to teaching and learning in many ways
(Debby Ellis Writing Center, 2022): It fosters collaboration and
engagement, and students learn from one another and help one another navigate
assignments and material. The skills they develop in providing helpful
feedback to their classmates prepare them for providing and using
feedback in the future. As well, peer feedback adds a natural step to the
writing process that some students might be tempted to skip; because they
are accountable to their peers for submitting draft work and providing
feedback, they develop parallel skills and are better prepared for their final
submission.
Conclusion
References
Hamel, C., & Viau-Guay, A. (2019). Using video to support teachers’ reflective
practice: A literature review. Cogent Education, 6(1), 1–14. https://doi.org/10.
1080/2331186X.2019.1673689
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational
Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487
John, T. E., & Thomas, B. (2018). Reflective practitioner: Fostering
motivation, thinking skills and self-regulation to enhance the quality of teaching.
International Journal of Research and Analytical Reviews, 5(2), 1056–1058.
Jones, A., Bull, S., & Castellano, G. (2018). “I know that now, I’m going to learn this
next” Promoting self-regulated learning with a robotic tutor. International Journal
of Social Robotics, 10, 439–454. https://doi.org/10.1007/s12369-017-0430-y
Körkkö, M., Morales Rio, S., & Kyrö-Ämmälä, O. (2019). Using a video app as a
tool for reflective practice. Educational Research, 61(1), 22–37. https://doi.org/
10.1080/00131881.2018.1562954
Lemay, D. J., Bazelais, P., & Doleck, T. (2021). Transition to online learning
during the COVID-19 pandemic. Computers in Human Behavior Reports, 4,
Article 100130. https://doi.org/10.1016/j.chbr.2021.100130
Mehall, S. (2020). Purposeful interpersonal interaction in online learning: What
is it and how is it measured? Online Learning, 24(1), 182–204. https://doi.
org/10.24059/olj.v24i1.2002
Nagro, S. A. (2019). Reflecting on others before reflecting on self: Using video
evidence to guide teacher candidates’ reflective practices. Journal of Teacher
Education, 71(4), 420–433. https://doi.org/10.1177/0022487119872700
Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated
learning: A model and seven principles of good feedback practice. Studies in
Higher Education, 31(2), 199–218. https://doi.org/10.1080/03075070600572090
Nicol, D. J., & Milligan, C. (2006). Rethinking technology-supported assessment
in terms of the seven principles of good feedback practice. In C. Bryan & K.
Clegg (Eds.), Innovative assessment in higher education. Taylor and Francis. www.
taylorfrancis.com/chapters/edit/10.4324/9780203969670-16/rethinking-
technology-supported-assessment-practices-relation-seven-principles-good-
feedback-practice-david-nicol-colin-milligan
Ogan-Bekiroglu, F., & Suzuk, E. (2014). Pre-service teachers’ assessment literacy
and its implementation into practice. The Curriculum Journal, 25(3), 344–371.
https://doi.org/10.1080/09585176.2014.899916
Papanthymou, A., & Darra, M. (2019). The contribution of learner self-assessment
for improvement of learning and teaching process: A review. Journal of Education
and Learning, 8(1), 48–64. https://doi.org/10.5539/jel.v8n1p48
Pinheiro Cavalcanti, A., Rolim, V., André, M., Freitas, F., Ferreira, R., & Gašević,
D. (2019). An analysis of the use of good feedback practices in online learning
DOI: 10.4324/9781003347972-11
Pham, Tran, Xuan, and Le
Theoretical Framework
In this study, we used the Community of Inquiry (CoI) model (Garrison et al., 1999) to frame the
multifaceted realities of OL landscapes. This model was a useful
theoretical framework for contextualizing our lived experiences from various
learning and evaluation backgrounds during and after the pandemic.
Garrison and colleagues (1999) developed the concept of presence within
the CoI model, influenced by Dewey’s thinking about collaborative
constructivism in learning, to enable researchers to identify significant
dimensions influencing students’ OL in relation to teachers, peers, and
other stakeholders (Kim & Gurvitch, 2020). As shown in Table 8.1, the
CoI model includes three interconnected elements in the synchronous
learning mode: social presence, cognitive presence, and teaching presence
(Garrison et al., 1999; see also Anderson et al., 2001; Garrison, 2007). We
use synchronous learning to describe face-to-face communication without
physical contact, which has become an increasingly popular method of
communication on virtual platforms such as Zoom (https://zoom.us/)
or Google Meet (https://meet.google.com/).
Given that the global effectiveness of OL for graduate students is
still understudied (Bains et al., 2021; Xie et al., 2020), the CoI model
(Garrison et al., 1999) provided a theoretical framework for investigating
our OL experiences by analyzing each of the three presences in our
autoethnographic stories. We asked two questions: How does the concept
of presence in virtual platforms inform and illuminate our experiences of
Researcher Positionality
Tan
Chi
Xuan
Giang
Methodology
Narrative Vignettes
Each vignette represents one of the themes that emerged from our OL
experiences during the COVID-19 pandemic: a lack of belonging (Tan),
the value of a virtual “library” (Chi), lack of engagement and integrity in
OL (Xuan), and chaos and disconnections (Giang). In our narratives, we
link to the literature to illustrate how presence was embedded into our
experiences. We include photographs that encapsulate some of our phys-
ical, emotional, and psychological experiences during OL (all are used
with the permission of those included in the images). We invite readers
to immerse themselves in our feelings, thoughts, and courses of action,
and find resonance with them in terms of the OL challenges and benefits
they highlight.
Silence often resulted when the class shifted from lecture presentation
to collaborative work in a breakout room. I wondered whether what
I was sharing was meaningful because almost nobody turned on their
camera or responded to anything (see Figure 8.1). This was not the
discourse I had expected. The silence disrupted my iterative process of
“moving between the personal worlds and the shared worlds” (Garrison
et al., 2001, p. 10) and reduced my sense of belonging to the group. I did
not learn much in this breakout room (or others like it).
My classmates’ decision not to display their profile pictures or videos,
hindering a sense of classroom belonging, encapsulated a dilemma
(Hirsch & Smith, 2017) in terms of time and identity engendered by
demographic differences. It was not until the instructor joined our breakout room
that a sense of belonging began to form. He reconfigured our identities and
reconnected our shared worlds by asking questions back and forth,
encouraging group members to share or comment on others’ ideas, summarizing
the discourse, and (re)constructing knowledge. His course of action built
understanding in terms of social presence and was a precondition for self-
reflection in terms of cognitive presence (Garrison et al., 1999).
My experience in these group discussions emphasized to me that
group discussions could be an assessment approach in OL if the mutual
relationship between cognitive and social presence were balanced (by
the instructor). Given that OL has several drawbacks related to gaps
in content and understanding, group discussion could serve to help
learners exchange information and co-construct new knowledge. In
terms of student assessment, it could illustrate to the instructor how
much knowledge students have acquired and how they have used their
Figure 8.1 A screenshot of my online class. Three profile pictures have been
covered for anonymity.
Analyzing Presence in Online Learning Environments
OL was a challenge for me right from the first meetings with my supervisors,
as I tried to deal with the complexity of online meetings given the unstable
internet connections, calls dropping, and time differences. These negative
aspects amplified my feelings of disconnection and isolation, resulting in
me experiencing a lack of belonging and an uncertain sense of identity.
When my data generation was finished, I submitted applications to return
to campus. Unfortunately, my applications were rejected. My concern was
that the lack of physical study support and lack of social contact with staff
and other students would jeopardize my capacity to finish my program
on time. All supervision and communication activities were redesigned
and conducted online. It was not until three of my colleagues and I had a
group chat one day, coming up with the idea of creating a virtual “library”
on Zoom, that I reestablished my momentum. We held a daily Zoom
meeting that replicated a library’s quiet space. Every time I looked at the
screen and saw my friends working, it inspired me to do the same.
Our virtual library provided not only intellectual support when we
shared academic knowledge, but also significant emotional support
during our difficult and painful research journey. I was moved to tears
when one friend burst into tears after learning that her elderly parents
had contracted COVID-19 and were being rushed to hospital. Another
friend broke down after learning that her best friend had died of COVID-
19. I was away from the screen for three weeks during my mother’s
hospitalizations for a spike in liver enzymes, followed by the deaths of
two individuals who were close to me. The library added an element of
humanity by connecting us with others and making our suffering visible
to those who could support us (see Figure 8.2).
In the second year of the pandemic, I reached a critical point in finishing
my thesis. As I was new to post-qualitative inquiry approaches, I anticipated
that working closely with the supervision staff during in-person meetings,
rather than online meetings, would be more beneficial and increase my
Figure 8.2 Attendees in the virtual library. We put our hands over our mouths to
de-identify ourselves.
chances of success. However, the international travel ban made this option
impossible. Recognizing my academic challenges, my supervisor held
monthly online individual progress meetings that provided a safe space
for me to explore my feelings of insecurity and manage my academic
journey. In addition, my supervisor chaired monthly “silence and write
sessions” and “peer reading sessions,” where she encouraged her doctoral
student cohort to meet and connected us to external reading groups,
network activities, podcasts, and conferences related to our fields of study.
Using regular online connection, she created a community of robust
scholarly engagement that fostered new approaches, perspectives, skills,
knowledge, and meaning-making in online workshops and meetings. As
a result, I had an individual meeting, group reading or writing session,
or other academic forums weekly. These sessions were useful sources of
development of cognitive presence (Gibson et al., 2012).
This OL experience was an incredible form of intellectual and
cognitive support that offered participants a high level of course engagement
OL had negative impacts for me, although initially it looked
advantageous. Yes, it allows learners to learn from anywhere, yet I think
that OL has caused students to become disengaged from the classroom
(Kuo et al., 2014). I noticed that the hugs and handshakes that normally
occur when meeting others did not take place; instead, I was confronted
with black squares on the computer screen that made my emotions
gradually turn negative, leading to a loss of motivation. Although my class was
multicultural, no activities facilitated any cultural interactions. The class
was merely a monologue presentation by the lecturer. I found that I was
missing out on important elements of building a social presence in an OL
environment because my classmates and I did not have the opportunity
to express our feelings to one another (Garrison, 2009).
I also concluded that the way in which assessment was conducted in
my class did not accurately gauge student learning. I decided to drop out
from my doctoral program in economics because my teacher used free
material on the internet to make questions for the test. I was shocked: I did
not think that a professor would test graduate students with questions
sourced from Quizlet (https://quizlet.com/). Although I consulted with
other instructors and other students in the class, no one seemed to care
about the matter. This situation soured my opinion about the integrity
and quality of online teaching and learning during the pandemic.
According to Garrison (2009), to create and maintain a demanding
community, teaching presence is essential, and instructional planning must
References
Fall, R., Webb, N. M., & Chudowsky, N. (2000). Group discussion and large-
scale language arts assessment: Effects on students’ comprehension. American
Educational Research Journal, 37(4), 911–941. https://doi.org/10.2307/1163497
Firang, D., & Mensah, J. (2022). Exploring the effects of the COVID-19 pandemic
on international students and universities in Canada. Journal of International
Students, 12(1), 1–18. https://doi.org/10.32674/jis.v12i1.2881
Garrison, D. R. (2007). Online community of inquiry review: Social, cognitive,
and teaching presence issues. Journal of Asynchronous Learning Networks, 11(1),
61–72. www.learntechlib.org/p/104064/
Garrison, D. R. (2009). Communities of inquiry in online learning. In Encyclopedia
of distance learning (2nd ed., pp. 352–355). IGI Global.
Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and
practice. Routledge.
Garrison, D. R., Anderson, T., & Archer, W. (1999). Critical inquiry in a text-
based environment: Computer conferencing in higher education. The
Internet and Higher Education, 2(2–3), 87–105. https://doi.org/10.1016/
s1096-7516(00)00016-6
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive
presence, and computer conferencing in distance education. American Journal of
Distance Education, 15(1), 7–23. https://doi.org/10.1080/08923640109527071
Garrison, D. R., Anderson, T., & Archer, W. (2010). The first decade of the
community of inquiry framework: A retrospective. The Internet and Higher
Education, 13(1–2), 5–9. https://doi.org/10.1016/j.iheduc.2009.10.003
Gibson, A. M., Ice, P., Mitchell, R., & Kupczynski, L. (2012). An inquiry into
relationships between demographic factors and teaching, social, and
cognitive presence. Internet Learning, 1(1), 7–17. https://doi.org/10.18278/il.1.1.2
Gikandi, J., Morrow, D. A., & Davis, N. F. (2011). Online formative assessment
in higher education: A review of the literature. Computers & Education, 57(4),
2333–2351. https://doi.org/10.1016/j.compedu.2011.06.004
Graduate Student Association. (July 2020). Submission to University of Melbourne’s
Semester 1 2020 assessment review. University of Melbourne. https://gsa.
unimelb.edu.au/wp-content/uploads/2020/07/Online-Assessment-
Submission-FINAL-copy.pdf
Graham, C. R., Woodfield, W., & Harrison, J. B. (2013). A framework for
institutional adoption and implementation of blended learning in higher
education. The Internet and Higher Education, 18, 4–14. https://doi.org/10.1016/j.
iheduc.2012.09.003
Hernandez, K. A. C., Chang, H., & Ngunjiri, F. W. (2017). Collaborative
autoethnography as multivocal, relational, and democratic research:
162 Pham, Tran, Xuan, and Le
Majeski, R. A., Stover, M., & Valais, T. (2018). The community of inquiry and
emotional presence. Adult Learning, 29(2), 53–61. https://doi.org/10.1177/
1045159518758696
Nguyen, T. M., & Le, G. N. H. (2021). The influence of COVID-19 stress on psy-
chological well-being among Vietnamese adults: The role of self-compassion
and gratitude. Traumatology, 27(1), 86–97. https://doi.org/10.1037/
trm0000295
Omar, H. A., Ali, E. M., & Belbase, S. (2021). Graduate students’ experience and
academic achievements with online learning during COVID-19 pandemic.
Sustainability, 3, Article 13055. https://doi.org/10.3390/su132313055
Sahu, P. (2020). Closure of universities due to coronavirus disease 2019 (COVID-
19): Impact on education and mental health of students and academic staff.
Cureus, 12(4).
Tran, V., Le, G. N. H., & Thuy, T. L. (2022). Impacts of international education
shifts through transnation stories of three Vietnamese doctoral students. In
A. W. Wiseman (Ed.), Annual review of comparative and international education
2021, Vol 42A (pp. 93–105). Emerald Publishing House.
Xie, X., Siau, K., & Nah, F. F. H. (2020). COVID-19 pandemic—online education
in the new normal and the next normal. Journal of Information Technology Case
and Application Research, 22(3), 175–187. https://doi.org/10.1080/15228053.2
020.1824884
Yang, Y., & Cornelius, L. F. (2004). Students’ perceptions towards the quality of online
education: A qualitative approach. Association for Educational Communications
and Technology.
Yee, E., Jung, C., Cheriberi, D., Choi, M., & Park, W. (2022). Impacts of
transitioning to an online curriculum at a graduate school in South Korea due
to the COVID-19 pandemic. International Journal of Environmental Research and
Public Health, 19, Article 10847. https://doi.org/10.3390/ijerph191710847
9 The Unintended Influence of COVID-19: Optimizing Student Learning by Advancing Assessment Practices through Technology
Katrina Carbone, Michelle Searle,
and Saad Chahine
DOI: 10.4324/9781003347972-12
Research Context
Methodology
This qualitative study used a case study design. Yin (2018) defined a case
study as a comprehensive empirical inquiry that is positioned in context
to construct aggregate understandings of a phenomenon. The study
involved preservice teachers enrolled in the AEC. By studying this specific
group within a specific course, we aimed to facilitate a rich description
and understanding of how assessment and technology can support the
growth and development of preservice teachers.
Participants
Data were collected from one AEC cohort of preservice teachers (N = 16)
in Ontario. This cohort included primary/junior (Grades K–6) and inter-
mediate/senior (Grades 7–12) preservice teachers. Thirteen of the 16
preservice teachers enrolled in the AEC consented to their data being
included. Participation in the research was voluntary, and participants
were free to withdraw any data related to their course assignments or
activities. As we were the AEC instructors, participants were a sample of
convenience, given that they were enrolled in the course, but it was also
a purposeful sample, given the intended area of study.
Data Collection
Data were gathered from course activities, assignments, and three focus
groups held at two time points. Although the course syllabi were not
a direct data source, they informed the description of assignments and
course topics. Data collection took place both in person and online to
mirror the course delivery format. As well as consenting to participate in
the focus groups, participants also consented to the use of their course
assignments and activities for research.
Focus Groups
Data Analysis
Qualitative data were extracted from the course site and deidentified
before data analysis using a two-phase process. The initial analysis began
with us precoding course activities (e.g., Nearpod reports) and course
assignments the participants had agreed to share. Precoding involved
circling and highlighting significant passages worthy of attention
(Layder, 1998; Saldaña, 2016). The course activities and assignments
were developed to encourage personal and professional growth, and thus
included reflections on the learning process, technology integration, and
course material. We then engaged in open coding of the verbatim focus
Findings
The data extracted from the course activities and assignments were cross-
referenced with the focus group transcripts to provide insight into how
the AEC enabled learning about classroom assessment and the use of
technology. Identifying alignment between the data sources also provided
an opportunity to examine participants’ growth in knowledge and skills
needed for assessment literacy. The TPACK framework (Mishra & Koehler,
2006) was used to identify areas where technology had been integrated and
provided a lens for considering aspects of participants’ contemporaneous
experiences and future possibilities for assessment and technology. Four
overarching themes were found: infinite assessment possibilities, limited
access to resources, increased time commitment, and the learning curve.
The themes reflect preservice teachers’ perspectives about assessment edu-
cation and technology. In the discussion section, we examine our efforts to
develop assessment literacy in relation to the TPACK model.
Within the AEC course, interactive presentation tools (e.g., Nearpod, Poll
Everywhere, Pear Deck) were used to bolster student engagement and
model the use of technology with multifaceted options as part of peda-
gogy. One preservice teacher saw the use of Nearpod in their classroom
as effective for teaching but felt they “couldn’t do any more than one
lesson, because it just took so long to prepare,” and they were responsible
for three classes each day. After all, preservice teachers told us, “Teachers
are busy,” and instructional time is precious. Participants found that an
increased time commitment is required to plan for and then effectively
use technology for teaching and assessment in K–12 classrooms.
In addition to preparation time, a major hurdle in classroom work or
assessment sessions was the extra time students needed to set up or log
in. For instance, one preservice teacher shared their experience of the
amount of time needed for students to type in “a password that was lit-
erally on the board.” Despite the password being in plain sight, students
“still like to raise their hands to get help, . . . [and technology] uses a
lot of time.” The increased time commitment varied with class size,
students having their own devices, and school access to shared devices.
Where technology can streamline submission and feedback, learners’
Discussion
The study findings provide a glimpse into the ways in which preservice
teachers think about assessment education, technology, and assessment
practices (e.g., assessment design, co-constructing learning goals, test
design, scoring) using technology. Given the dynamic nature of learning,
this discussion explores the complexity of developing assessment literacy
and how the many intersections offered through the TPACK framework
(Mishra & Koehler, 2006) align with the AEC course.
Participants enrolled in the course because they had an interest in
the topics or had goals related to eventual educational leadership. Most
course participants expressed uncertainty around assessment and a desire
to feel confident using data to make decisions about instruction and
grading. Our experiences and research show a lack of teacher confidence
about assessment education and also provide evidence of how central it
is to education because a significant amount of time is spent in activities
related to assessment and evaluation (e.g., Black & Wiliam, 2018; Looney
et al., 2018).
As instructors who have codeveloped and taught the AEC course
over multiple years, we often simultaneously conduct inquiry to con-
tinually improve our teaching and learning related to assessment (e.g.,
DeLuca et al., 2021; Searle et al., 2021). Although research into classroom
assessment (e.g., McMillan, 2013; Turner, 2012) provides valuable insight,
assessment literacy remains the core goal of the course. When it comes
to promoting assessment literacy, although the placement assignments
were not our responsibility as the AEC instructors, the different options
(e.g., in-person, remote synchronous, asynchronous modules) proved
to be a barrier to preservice teachers learning to apply assessment and
technology ideas. Course participants expressed regret in not having
in-person K–12 placement opportunities due to the COVID-19 pandemic.
Often, preservice teachers were curious about how virtual placements
would be applicable to in-person classrooms and whether these modified
placements might influence their professional learning or future class-
room practices.
Conclusion
References
Koehler, M. J., Mishra, P., & Cain, W. (2013). What is technological pedagogical
content knowledge (TPACK)? Journal of Education, 193(3), 13–19. https://doi.
org/10.1177/002205741319300303
König, J., Jäger-Biela, D. J., & Glutsch, N. (2020). Adapting to online teaching
during COVID-19 school closure: Teacher education and teacher competence
effects among early career teachers in Germany. European Journal of Teacher
Education, 43(4), 608–622. https://doi.org/10.1080/02619768.2020.1809650
Layder, D. (1998). Sociological practice: Linking theory and social research. Sage.
Linder, K. E. (2017). The blended course design workbook: A practical guide. Stylus
Publishing.
Looney, A., Cumming, J., van Der Kleij, F., & Harris, K. (2018). Reconceptualising
the role of teachers as assessors: Teacher assessment identity. Assessment in
Education: Principles, Policy & Practice, 25(5), 442–467. https://doi.org/10.1080/
0969594X.2016.1268090
McMillan, J. (Ed.). (2013). SAGE handbook of research on classroom assessment. Sage.
Ministry of Education. (2010). Growing success: Assessment, evaluation, and reporting
in Ontario schools. www.edu.gov.on.ca/eng/policyfunding/growSuccess.pdf
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content know-
ledge: A framework for teacher knowledge. Teachers College Record, 108(6),
1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x
Mishra, P., Koehler, M. J., & Kereluik, K. (2009). The song remains the same:
Looking back to the future of educational technology. TechTrends, 53(5),
48–53. https://doi.org/10.1007/s11528-009-0325-3
Namyssova, G., Tussupbekova, G., Helmer, J., Malone, K., Mir, A., & Jonbekova,
D. (2019). Challenges and benefits of blended learning in higher education.
International Journal of Technology in Education, 2(1), 22–31. www.ijte.net/
index.php/ijte/article/view/6
Parra, J., Raynor, C., Osanloo, A., & Guillaume, R. O. (2019). (Re)Imagining
an undergraduate integrating technology with teaching course. TechTrends,
63(1), 68–78. https://doi.org/10.1007/s11528-018-0362-x
Patton, M. Q. (2014). Qualitative research & evaluation methods: Integrating theory
and practice. Sage.
Phillips, M. (2017). Processes of practice and identity shaping teachers’ TPACK
enactment in a community of practice. Education and Information Technologies,
22(4), 1771–1796. https://doi.org/10.1007/s10639-016-9512-y
Popham, W. J. (2011). Assessment literacy overlooked: A teacher educator’s con-
fession. The Teacher Educator, 46(4), 265–273. https://doi.org/10.1080/08878
730.2011.605048
Popham, W. J. (2013). Classroom assessment: What teachers need to know (7th ed.).
Pearson.
Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). Sage.
Searle, M., Ahn, C., Fels, L., & Carbone, K. (2021). Illuminating transformative
learning/assessment: Infusing creativity, reciprocity, and care into higher
education. Journal of Transformative Education, 19(4), 339–365. https://doi.
org/10.1177/15413446211045160
Stiggins, R. J. (1991). Assessment literacy. Phi Delta Kappan, 72(7), 534–539.
Turner, C. E. (2012). Classroom assessment. In G. Fulcher & F. Davidson (Eds.),
The Routledge handbook of language testing (pp. 65–78). Routledge.
Yin, R. K. (2018). Case study research and applications: Design and methods (6th ed.).
Sage.
Part III
Teacher Educators and
Assessment in K–12 Contexts
10 Pedagogically Hacking the System: Developing a Competency-based Digital Portfolio
Kathy Sanford, Hong Fu, Timothy Hopper, and
Thiago Hinkel
In the last decade, the British Columbia (BC) provincial curriculum has
undergone a seismic restructuring, from a prescribed learning outcomes
curriculum with end-of-school provincial exams to a competency-based
curriculum with performance measures and proficiency scales. The
intent of the restructuring by the BC Ministry of Education and Child
Care (BCMoE) was to enable student success through “a concept-based
approach to learning and a focus on the development of competencies,
to foster deeper, more transferable learning” (BCMoE, n.d.-c, p. 4). In
support of competency-based curricula, Trilling and Fadel (2009) called
for educators to shift their practices towards “a new balance” (p. 38),
which, as we discussed in relation to the BC curriculum, has “indicated a
shift from direct instruction for predetermined and more easily measured
outcomes to evolving and more interactive learning, as well as collective
problem-solving” (Fu et al., 2018, p. 265).
In their book examining the evolution of the new BC curriculum,
Sanford and Hopper (2019) noted that it was developed within the larger
context of a revolution in understanding how people learn, which is
learning about learning. They further stated that for the existing system
DOI: 10.4324/9781003347972-14
to keep pace with the rapidly changing world, many educators were
working to shift BC education from
how tools and systems work and the powers that control them, and
then enacting novel uses for them based on pedagogical objectives. To
enact a competency-based curriculum, teachers often must overcome
both systemic and technological barriers and hack their way through by
“subverting constraints that undermine authentic knowledge generation”
(Hagerman et al., 2018, p. 122). In this sense, educationally intended
hacking is a response to a system that tends to compartmentalize people
and skills and, in the context of digital technologies, seeks to produce
a workforce that is “compliant and instrumental in their thinking and
practice” (Smith et al., 2018, p. xv). The concept of pedagogical hacking
aligns with a competency-based curriculum by considering the needs and
interests of students as individuals in their specific contexts.
Methodology
This study uses a narrative case study approach (May, 1997), considering
ways in which assessment can be not only more supportive of stu-
dent learning but also attentive to equity and the diversity of learners.
Drawing on Merriam (2009) and Smith and Sparkes (2020), we used a
qualitative descriptive process focused on participant interviews and
anecdotal insights. We also focused on the narratives generated by the
participants to explain their motivations and gained insights from their
experiences.
The significant technology hack that emerged for the participants was
the use of Google Sheets. In this hack, they repurposed the freely available
spreadsheet program, designed for numerical calculations and accounting
tasks, to create a matrix that represented curricular competencies. See
Figure 10.1 for an example from the Grade 8 math curriculum.
In the matrix chart, the teacher broke the subject-area curriculum
(knowing) into curricular competencies; for example, in Figure 10.1,
one competency was thinking and communicating. Then, based on the
learning standards for that subject and grade level, the teacher divided
the curriculum into topic areas of understanding to be shown through
class activities and assignments that generated artifacts as evidence of
learning. Students then, in consultation with the teacher and based on
the evidence they generated and how they generated that evidence,
assessed themselves in relation to the topic areas as "emerging,"
“developing,” “proficient,” or “extending.” As shown in Figure 10.2,
the Google Sheets app allowed the student to shade in an area with the
comment tool to indicate they had achieved that level in a topic either
with hyperlinks to evidence or with reference to an experience witnessed
or confirmed by the teacher.
Alice described her chart and scales as the “opportunity for students
to grow and to show their learning in these specific areas or to recog-
nize that they went way above and beyond the expectations.” Alice had
been teaching at a local high school for 14 years, and she was skilled in
Figure 10.2 Example of high school Grade 10 English chart with shading and commenting focused on new media.
The challenge for Alice was to report on students at the midterm point of
her courses. Previously, with support from administration, she could put
no mark at midterm and report at the end of term when students had had
the opportunity to demonstrate their ongoing competence. However,
due to a change in administration, she had to fulfil the reporting order and
struggled with identifying a midterm number to report. Alice explained:
Other teachers in the district were interested in trying out the compe-
tency chart. After doing a 10-minute presentation at a professional devel-
opment session for 200 secondary teachers, Alice was “flooded with
emails from high school teachers that were like, ‘I’ve been really trying
to figure something out, and we really want to do something like it.’ ”
You would just put a link from the portfolio if it’s online into the
chart, and if you want to see while you’re developing in this area,
you just click and then there is the portfolio evidence right there. So,
it’s perfect in that sense.
She agreed that the competency chart was a kind of educational hacking:
Greg also found it best not to provide a number grade at midterm when
students have moved only halfway through their learning. He believed
the competency chart was “a kind of portfolio.” For Greg, “It’s really
about using technology to have ease of access to their work; just two
clicks and I can see their work.”
Using the comment function in Google was also a way for students to
upload and link their evidence to the chart. Greg started using the term
“hacking” after he went to an educational technology conference where
other teachers who were presenting were saying, “hack this, hack that”:
Greg also realized from other teachers that Google Slides could be
combined with Google Sheets to become a powerful assessment tool. As
he stated,
Like Alice, Greg believed that, unlike traditional marking, using
the competency chart could reduce students' focus on comparing marks
and put more focus on their own learning; the change of the curriculum
and Greg's assessment shift happened at the same time. The rollout of
BC's new competency-based curriculum created the conditions, along
with Greg’s knowledge of how to hack Google apps, to radically shift his
assessment practices.
“If you’re just getting three out of ten all the time, you never feel
successful.” Christine came to teaching with a different perspective on
assessment because she used to be a nurse. This experience influenced
her understanding of assessment in the best interest of students:
As a nurse, you spend your whole day assessing people from head
to toe, and you are not looking to give them a seven or a nine. . . .
This is where you are at; this is what I’m going to do as my course
of treatment; where are you going to get to? . . . When I came to
teaching after my nursing career, I looked at assessing my learner
like a patient. This is where you are at; this is where I want you to
go. . . . Everybody around me was using numbers, but it just did not
fit for me. . . . I worked with a lot of struggling learners, and I really
liked to work with struggling learners, so I know that one of the
things they really need is to experience success, and if you’re just
getting three out of ten all the time, you never feel successful.
Christine started trying alternative forms of assessment that did not focus
on giving numbers:
When I first started playing around, I also wasn’t sure where that
was gonna take me, and I didn’t see how easy it would be to not
use numbers; if you don’t want kids to ask you about numbers, you
have to not use them. It’s the only way; otherwise, it only seems
reasonable that they’ll ask you about them, and that will matter to
them.
I think for the high achievers, their struggle is that they want to
quantify usually and so they ask questions like, “What does a 90
look like on here?” That tends to be their focus. I must shift them
to think about their learning as opposed to what the number is
because there’s so much more driven by that carrot at the end. The
second time I teach them it’s no problem at all because they know
their mark is not in jeopardy; it’s their learning that we’re focused
on, and they understand that with me. . . . For the struggling kids,
I think it’s probably kind of the bigness of the chart, the language
of the chart.
“I’d ask the students, ‘Where do you think you’re at?’ So we get them to
self-reflect to decide the grade.” After completing her education degree
in Victoria, Cathy went to Calgary to teach middle school, where she
experienced an assessment change to outcome-based assessment. The
students were still getting a grade at the end of the term, but in a variety
of categories (e.g., practical skills or theory). After moving back to BC
from Calgary, she noted,
By using this approach, the teacher wanted to get students away from
working at a grade or being driven by a grade to thinking about what
they were learning, what they could do, and what they needed to work
on.
Cathy also talked about the reporting requirement to give a final grade.
With the chart, she found that the process could be more personalized:
thought they were at. And then we would have a discussion around
it, and most of the time the kids know, like they know where they’re
at based on what they’ve seen from their chart.
Students could show their level in a way of their choice. For example,
when Cathy worked at the high school, students could still prepare for
a final exam as evidence of learning if they chose to take one. The chart
also allowed for continuity in learning and assessment for students. Cathy
gave the following example:
It worked out well, too, with some of our kids who you know didn’t
pass; so [Christine] and I both taught English 9. So, I had one student
the first semester and he didn’t end up passing English 9. . . . I think
transition to high school was kind of a tough transition for him, so
he just didn’t get a lot of work done, but I just passed his chart on
to her so she could see what he’d already accomplished. She could
just go from there.
Although Cathy believed that it was important for students to learn about
technology in positive ways, her motivation for doing the competency
chart was mainly the assessment and learning pieces:
I just love the authenticity of it, if that makes sense; like for me,
I said it felt so wrong to be just choosing this percentage, but this is
like actually they’re seeing their learning. There’s the evidence of it;
it’s not just like this is a 12 out of 20. It’s, you know, actually looking
at what they’re working on and what they’re able to do, and then
where is the next step? Where do I go from there?
Cathy saw her students moving away from a focus on percentages and
grades to talking about what they could do and what they needed to
work at.
Discussion
Conclusion
During the pandemic particularly, but also more generally, educators could
easily lose sight of the significance of authentic assessment as an inte-
gral aspect of learning. Educators tend to focus on learning activities and
course content while overlooking assessment as a driver for meaningful
learning. However, the four teachers in this study utilized assessment as
dialogue around learning experiences, enabling learning to be perceived
as ongoing, connected, and developmental. They personalized assessment
and worked with students to support their individual and collective needs
rather than viewing assessment as something done to students (Fu et al.,
2018; Schimmer, 2016).
The innovative assessment approaches outlined in this paper were
important throughout the two years of COVID-19, which demanded
References
Abaci, S., Robertson, J., Linklater, H., & McNeill, F. (2021). Supporting
school teachers’ rapid engagement with online education. Educational
Technology Research and Development, 69(1), 29–34. https://doi.org/10.1007/
s11423-020-09839-5
BC Teachers’ Federation. (October 27, 2021). BCTF response to the draft K–12
reporting order. www.bctf.ca/whats-happening/news-details/2021/10/27/
bctf-response-to-the-draft-k-12-reporting-order
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment.
Educational Assessment, Evaluation and Accountability, 21(1), 5–31. https://doi.
org/10.1007/s11092-008-9068-5
Black, P., & Wiliam, D. (2018). Classroom assessment and pedagogy. Assessment
in Education: Principles, Policy & Practice, 25(6), 551–575. https://doi.org/10.1
080/0969594X.2018.1441807
Blades, D. (2019). Science education in British Columbia: A new curriculum
for the 21st century. In C. Tippett & T. Milford (Eds.), Science education in
DOI: 10.4324/9781003347972-15
Malfunction
Online Pedagogy
Sound Assessment
There are three forms of assessment: assessment of, for, and as learning
(Earl, 2003). Assessment of learning is a form of summative assessment
that often translates to marks and grades, whereas assessment for learning
centres learning over grading (Black & Wiliam, 1998) and assessment as
learning invites students into the assessment process (Earl, 2003). Current
perspectives discourage competition, judging, and ranking, and instead
recognize that assessment should support ongoing active learning and
joy (Fu et al., 2022). In this way, assessment is no longer something that
is done to students, but is done with, by, and for students (MECY, 2006).
This approach to assessment is in keeping with the etymological roots
of the term assessment—to sit beside: “Assessment seen as ‘sitting beside’
implies particular roles and relationships for learner and teacher, different
from those associated with assessment as ‘standing in front of,’ ‘looking
down on,’ or ‘peering over the shoulder” (Swaffield, 2011, p. 440).
Research has repeatedly demonstrated that AfL is a powerful prac-
tice to promote learning (Earl et al., 2011); for this reason, AfL should
represent the largest share of a teacher’s practice (Earl, 2003). The pri-
macy of AfL is evident in the policy directives, resources, and professional
development of nearly every Canadian province and territory (Earl et al.,
2011; MECY, 2006). AfL is anchored to three key elements: decentre con-
tent, centre students, and teacher professionalism.
Decentre Content
Centre Students
Teacher Professionalism
learning theory” (Earl et al., 2011, para. 17). All elements of this process
require strong, trusting relationships between teachers and students.
Because EdTech is designed within and for a neoliberal frame, it
perpetuates a culture of individualized task completion, grading, com-
petition, surveillance, and standardization that directly contradicts
assessment literature.
Findings
The educators in our study made clear that the intent of education is to
encourage critical and creative thinking, motivate student inquiry, foster
relationships, develop problem-solving skills, and practice discussion and
deliberation. With this understanding in mind, the participants could
not reconcile their participation in transmissive pedagogy that deepened
inequities and abandoned students with exceptionalities. They spoke
of the time they spent making videos, sending emails, uploading con-
tent, and tallying assignments. Most of their time was spent preparing
and uploading content and materials. These activities did not reflect the
participants’ understandings of pedagogy, assessment, or the purpose of
education. This incongruity was especially true for assessment:
Teacher participants spoke of being cut off from the bodily, the visceral,
experience of teaching and learning. These comments follow Stommel’s
(2018) critiques of the ways LMS “replace the playful work of teachers
and students with overly simplified algorithms that interface with far too
few of the dynamic variables that make learning so visceral and lively”
(p. 78). As Freire (2005) wrote, “We study, we learn, we teach, we know
with our entire body” (p. 5). Teaching is a full-blooded, social, human
process (Connell, 1993). Teaching is “fundamentally emotional work
that involves getting up close to students and drawing heavily on social,
emotional resources and energy necessary for continual improvisation”
(Smyth, 2012, p. 15). The lack of visceral and emotional connection
online impacted the teachers’ capacity to respond to the particular
needs of their students. As Riley’s quote above elucidates, they were left
searching for facial expressions, voices, a sense of who their students
were. Without these connections, teachers lack the knowledge required
to properly engage AfL. Riley could not read their students, and in turn
did not feel the connection required to guide learning. Riley’s sentiments
were echoed by other participants:
And at home it’s really hard because I can’t see their little furrowed
brow, you know. Like I don't know how to do it. . . . I don't know how
to explain it, but like you know what, teachers know what kids need
from their body language. I know who needs help with something by
how they’re sitting or how they’re looking. You don’t know that online.
(Noah)
Online teachers lack the necessary interaction with students to make the
daily professional assessments required. AfL requires teacher professional
judgement, which requires the ability to see, understand, and connect
with students.
Beyond a lack of recognition of their students, the teacher participants
often did not recognize themselves: “And I had no interest in interacting
with my students, which is the absolute opposite of what I’m normally
like” (Dakota). For Dakota, the inability to be in the same physical envir-
onment with their students, to teach with their full body—alongside the
goosebumps, smiles, furrowed eyebrows, crossed arms, and exchanged
glances—destroyed their desire to connect with students. This disem-
bodied form of teaching contributed to demoralized feelings, as the teacher
participants could not fulfil the ethical requirements of their profession.
The functionality of EdTech not only interfered with the participants’
professional obligations but also often served as a reminder of these
failures:
of my life, where I want to help each and every one of those kids,
and the tyranny of that noise, and they make it seem so efficient
and so fluid. And it’s actually so awkward and time-consuming and
restrictive.
(Jules)
as data points” (Moore & de Oliveira Jayme, 2022b, para. 23). This focus
on data fundamentally changes the relationship between students and
teachers.
Beyond our serious concerns about students’ rights and necessary
classroom relationships, we question the value and validity of assessment
that is administered through testing software. Put simply, if one can cheat
so easily (a fear fuelled by marketing), does the assessment tool really
align with the current assessment literature? Or is the teacher simply
using a fancy tool to evaluate student recall? Moreover, these techno-
logical solutions miss the foundations of assessment: to be fair and valid.
Is it fair to expect someone to be recorded while they are taking a test?
Moreover, if the research shows that students are motivated and engaged
when they feel they are being treated fairly (Tierney, 2014), how would
this nonconsensual surveillance impact student achievement? When one
is being recorded, are their test results valid? Does any of it truly reflect
what a student knows or understands?
Discussion
“You know, technology does not lead to better outcomes; better teachers
will lead to better outcomes” (Riley). As educators who have embraced
selected technology in our classrooms, we recognize certain pedagogical
advantages. Online pedagogy, however, goes beyond integrating technology; it relies on it. In this way, education risks becoming a
mere exercise of technology (Freire, 2014). In turn, teachers must actively
work against the default settings that centre teachers and content, regu-
late and surveil students, and conflate data collection with teaching
and learning. Moreover, as LaPointe-McEwan et al. (2021) reiterated,
although the principles of sound assessment remain, the way teachers
achieve these principles may look different online. When students and
teachers, rather than technology companies, propel curricula and
methods online, learning better reflects current understandings of cur-
riculum and pedagogy.
Humanizing Assessment
Conclusion
References
Sanford, K., Williams, L., Hopper, T., & McGregor, C. (2012). Indigenous
principles decolonizing teacher education: What we have learned. Education,
18(2), 18–33. https://doi.org/10.37119/ojs2012.v18i2.61
Smyth, J. (2012). Problematising teachers’ work in dangerous times. In
B. Down & J. Smyth (Eds.), Critical voices in teacher education: Teaching for
social justice in conservative times (pp. 13–25). Springer. https://doi.org/10.1007/
978-94-007-3974-1_2
Stommel, J. (2018). A user’s guide to forking education. In S. M. Morris &
J. Stommel (Eds.), An urgency of teachers: The work of critical digital pedagogy
(pp. 77–82). Hybrid Pedagogy.
Stommel, J. (2020, May 11). Love and other data assets. www.jessestommel.com/love-and-other-data-assets/
Stommel, J., Friend, C., & Morris, S. M. (Eds.). (2020). Critical digital pedagogy:
A collection. Hybrid Pedagogy.
Suurtamm, C., & Koch, M. J. (2014). Navigating dilemmas in transforming
assessment practices: Experiences of mathematics teachers in Ontario,
Canada. Educational Assessment, Evaluation and Accountability, 26(3), 263–287.
https://doi.org/10.1007/s11092-014-9195-0
Swaffield, S. (2011). Getting to the heart of authentic assessment for learning.
Assessment in Education: Principles, Policy & Practice, 18(4), 433–449. https://
doi.org/10.1080/0969594X.2011.582838
Swauger, S. (2020, April 2). Our bodies encoded: Algorithmic test proctoring in higher education. Hybrid Pedagogy. https://hybridpedagogy.org/our-bodies-encoded-algorithmic-test-proctoring-in-higher-education/
Tierney, R. D. (2014). Fairness as a multifaceted quality in classroom assessment.
Studies in Educational Evaluation, 43, 55–69. https://doi.org/10.1016/j.stueduc.2013.12.003
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.
Wiliam, D. (2011). Embedded formative assessment. Solution Tree Press.
12 Leveraging the Relationship Between Assessment, Learning, and Educational Technology
Katrina Carbone, Michelle Searle, and Lori Kirkpatrick
DOI: 10.4324/9781003347972-16
2017). Across various grade levels and subject content, technology can
also make learning more engaging and collaborative, increase motivation,
and encourage self-paced learning with student independence (Ciampa,
2014; Palaigeorgiou & Papadopoulou, 2018; Raja & Nagasubramani,
2018). However, the positive impacts of technology in learning environ-
ments remain contested by some (D’Arcy et al., 2014), and debates con-
tinue as researchers and educators caution that technological tools can be
distracting, increase opportunities for cheating, disconnect learners, and
lead to the deterioration of students’ competencies in reading, writing,
and mathematics (Alhumaid, 2019).
When the COVID-19 pandemic erupted and physical school spaces
were closed, much of education became mediated through technology.
North America and many other regions quickly transitioned to fully
remote schooling with a heavy reliance on synchronous platforms such
as Zoom (https://zoom.us), Microsoft Teams (www.microsoft.com/
microsoft-teams), and Google Classroom (https://classroom.google.
com; LaBonte et al., 2021). In keeping with this shift, educators needed
to modify their assessment methods so that assessment could be offered
through technology. Research from prior to the pandemic found that
many educators reported feeling unprepared or uncomfortable using technology for teaching and assessment (Johnson et al., 2016). As restrictions
related to the pandemic have eased, the predominance of technology in
education has remained (Willcott, 2021). For example, Ontario continues
to offer online classes as well as fully virtual schools.
Given the predominance of technology in education, it is imperative
to question how technology could support and enhance educational
practices, including assessment. Teachers need to adapt to the digital
environment and provide students with opportunities to develop the
skills required to be successful in the technological era. Figure 12.1 shows the substitution, augmentation, modification, and redefinition (SAMR) model developed by Puentedura (2006), a heuristic for examining technology in assessment, by which we mean all forms of assessment delivered or completed with a device. Because limited
research has explored the use of the SAMR model in K–12 contexts, the
distinctions between each category would benefit from empirical evi-
dence about assessment and technology (Hamilton et al., 2016). With this
model in mind, in this chapter we ask, how do classroom teachers leverage
technology to support the evolution of their classroom assessment
conceptualizations and practices, and how might these understandings
translate to online learning contexts?
Figure 12.1 The SAMR model. Adapted from The SAMR Model Explained (With
15 Practical Examples), by J. Best, 2015 (www.3plearning.com/blog/
connectingsamrmodel/) and from Learning, Technology, and the
SAMR Model: Goals, Processes, and Practice, by R. Puentedura, 2014
(www.hippasus.com/rrpweblog/archives/2014/06/29/.pdf).
Research Context
Methodology
Initially, data related to the assessment PLC were collected for district purposes: understanding how teachers leveraged technology as they developed assessment conceptualizations and practice. The PLC
was the driver for understanding how teachers integrated technology
with assessment. Now, we are analyzing the same data through quali-
tative secondary analysis (QSA) so that the findings can be shared with
academic and research audiences. QSA has the practical advantage of maximizing the use of existing data and extending original studies (Tate &
Happ, 2018).
Participants
Although all teachers in the district who received an iPad were invited to
join the PLC, only a small subset participated. In total, 61 teachers from 24 schools took part. Participants were predominantly female (74%;
n = 44), and the majority of teachers were experienced, with 85% (n = 52)
indicating they had between 5 and 25 years of experience. Representation
across divisions was fairly even, with 43% (n = 26) elementary teacher
participants and 52% (n = 32) secondary teacher participants, and 5%
(n = 3) of participants identified themselves as program support coaches
assisting with implementation. Participants represented an array of cur-
riculum areas; the three most common were English, science, and math.
Data Collection
A substantial amount of data were collected from the PLC as part of the
program evaluation. For the purposes of this chapter, data from surveys,
online discussion board posts in the PLC, and arts-based activities are
included.
Survey Data
Prior to the launch of the PLC, the inquiry team co-constructed a survey
for use at the launch and wrap-up of the PLC. At the introductory session,
participants completed the anonymous survey with a total of 11 questions:
nine closed and two open-ended. Questions included demographic infor-
mation and iPad-specific questions (e.g., comfort using the iPads). During
the concluding session, participants completed an anonymous survey
(n = 28) that included seven questions (one closed and six open-ended) to
share their experiences with the PLC and key takeaways about personal
and professional shifts. Both surveys were offered online using Qualtrics
(www.qualtrics.com); the online offering was in alignment with the
overall PLC experience.
Discussion Board Data
The discussion board was an online platform that allowed for asynchronous engagement in the PLC (Osborne et al., 2018). Participants
posted an answer to a question posed by the inquiry team (e.g., When
thinking about assessment and the devices, what are you celebrating and
what is challenging for you?), uploaded resources or exemplars, described
their experiences, reflected on their growth, and interacted with one
another using the comment feature on the forum. There were 167 posts
in total (n = 55); 64% of participants posted 10 or more times, and 30% of
participants posted 20 or more times. The length of the postings varied;
some were short comments and others were in-depth paragraphs.
Arts-informed Data
During the session to launch the PLC, image elicitation (Harper, 2002)
was used with images provided by the inquiry team to develop community
amongst the PLC members and generate data about participant perceptions
of assessment and technology. Participants were provided with prompts to guide their selection and sharing of images. Prompts included
“Thinking about assessing with technology makes me wonder . . . ,”
and “A question that lingers for me is . . .” Participants responded to the
prompt with words, word clusters, or full sentences. At the last session of
the PLC, poetic inquiry was used. Poetic inquiry is an emergent research
methodology that has no fixed definition “because the work undertaken
through the methodology is not limited solely to artistic, aesthetic, educa-
tional or research-focused spaces” (Vincent, 2018, p. 49). In the PLC con-
text, poetic inquiry was used during the concluding face-to-face session so
that participants could reflect upon their experiences. Participants were
divided into groups, and each group created folded poems, pieces of anonymous collaborative writing, by responding to a series of prompts
(e.g., “Three words that best describe my assessment practices with the
devices,” and “One wish I still have is . . .”).
Data Analysis
Data (survey data, discussion board postings, arts-informed data) from the
PLC were analyzed using an inductive three-step coding cycle (Saldaña,
2013) with sessions for the authors of this chapter to engage in collabora-
tive dialogical sensemaking. In the first cycle, precoding, which involved
circling, highlighting, bolding, underlining, or colouring rich or significant
participant quotes or passages (Layder, 1998), was used to become familiar
with the specificities of the data and key technology and assessment ideas.
Next, the first author used the initial coding method to provide analytic
indications for future exploration (Charmaz, 2014). Using findings from
the initial coding approach, focused coding allowed for categorization
based on thematic or conceptual similarities (Saldaña, 2013).
Results
Two salient themes arose from the data. The first was related to the
PLC community. The second and most dominant theme was related
to expanded assessment practices. Subthemes for expanded assessment practices included student choice, meaningful feedback, differentiated instruction, and learning culture.
PLC Community
The PLC was a place where teachers shared their excitement for learning,
and how inquiry in the form of questioning can nourish passion. The
intentional efforts to build a community through the blended model
were well received by participants because they “felt a real sense of com-
munity with everyone” and explained “it was REALLY important to see
and talk with colleagues at the beginning and end of the project.” When
asked at the end of the PLC survey if the community was a useful space
for professional collaboration, 91% of respondents indicated they agreed
(n = 23) or strongly agreed (n = 7). Teachers were asked if they would
engage in a similar online community in the future, and 88% (n = 30)
responded “yes.”
Student Choice
[We are] overcoming the myth that students are proficient at all
these neat apps/formats and are itching to use a variety of them
if we give them choice. I find that is true for a very small number
of them. Most are either overwhelmed by the choice, and/or
revert back to what they know. So mandating certain formats, and
instructing them on the effective use of each, is still a focus in my
junior classes.
Meaningful Feedback
required to digest the feedback and work on their next steps.” Comments
related to teacher feedback signalled an overall desire to provide mean-
ingful feedback to promote learning and a recognition that this process
required an investment of time, for both educators and students.
Providing meaningful feedback through technology presented a
challenge for many teachers. Statements from two teachers included
“I still have work to do with regards to kids using feedback provided,”
and “Writing feedback on a screen is difficult and can be hard to read.”
Additional challenges noted included “screen fatigue” and technology as
“tedious.” There were attitudes related to the frustration of learning a
new approach that may or may not be useful or effective in promoting
student learning. Perennial issues around getting students to read and
apply feedback persisted in the online space with additional issues around
navigating technology and streamlining across multiple applications.
One way participants addressed the use of feedback was to distribute
classroom leadership by encouraging students to engage in peer-to-peer
teaching. Teachers used the devices as a way for students to “becom[e]
great leaders both with each other in class and with other classes.” When provided with leadership opportunities, students had increased access to and investment in feedback as well as an opportunity to develop skills
that they could carry into adulthood.
Differentiated Instruction
Learning Culture
went beyond being merely a flashy reward for students; technology could
be used to make “assessments meaningful, relatable, and authentic.”
In sum, teachers supported their reconceptualization of assessment
through expanded assessment practices by providing students with choice, meaningful feedback, and opportunities for differentiated instruction, and by emphasizing a culture of learning. Although these ideas are often
valued by educators, data showed that some participants were resistant
because the impact and benefit of these strategies were classroom specific,
depending on learner perceptions, interests, and skillset. Nevertheless,
the PLC provided an opportunity to explicitly integrate assessment and
technology, and over time, devices were no longer seen as free-time activ-
ities; rather, they were perceived as powerful educational tools by both
teachers and students.
Discussion
Conclusion
References
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment
in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/
0969595980050102
Bush, M. D., & Mott, J. D. (2009). The transformation of learning with tech-
nology: Learner-centricity, content and tool malleability, and network effects.
Educational Technology, 3–20. www.jstor.org/stable/44429655
Charmaz, K. (2014). Constructing grounded theory (2nd ed.). Sage.
Ciampa, K. (2014). Learning in a mobile age: An investigation of student motivation. Journal of Computer Assisted Learning, 30(1), 82–96. https://doi.org/10.1111/jcal.12036
Cole, A., & Knowles, J. (2008). Arts-informed research. In J. G. Knowles & A. L.
Cole (Eds.), Handbook of the arts in qualitative research: Perspectives, methodolo-
gies, examples, and issues (pp. 55–71). SAGE Publications.
D’Arcy, J., Gupta, A., Tarafdar, M., & Turel, O. (2014). Reflecting on the “dark
side” of information technology use. Communications of the Association for
Information Systems, 35(5), 109–118. https://doi.org/10.17705/1CAIS.03505
DeCoito, I., & Richardson, T. (2018). Teachers and technology: Present practice
and future directions. Contemporary Issues in Technology and Teacher Education,
18(2), 362–378.
DeLuca, C. (2012). Preparing teachers for the age of accountability: Toward a
framework for assessment education. Action in Teacher Education, 34(5–6),
576–591. https://doi.org/10.1080/01626620.2012.730347
Fives, H., & Barnes, N. (2020). Navigating the complex cognitive task of class-
room assessment. Teaching and Teacher Education, 92, Article 103063. https://
doi.org/10.1016/j.tate.2020.103063
Fullan, M., & Langworthy, M. (2014). A rich seam: How new pedagogies find deep
learning. Pearson.
Hamilton, E., Rosenberg, J. M., & Akcaoglu, M. (2016). The substitution aug-
mentation modification redefinition (SAMR) model: A critical review and
suggestions for its use. TechTrends, 60(5), 433–441. https://doi.org/10.1007/
s11528-016-0091-y
Harper, D. (2002). Talking about pictures: A case for photo elicitation. Visual
Studies, 17(1), 13–26. https://doi.org/10.1080/14725860220137345
Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to
achievement. Routledge.
Herold, B. (2016). Technology in education: An overview. Education Week. www.
edweek.org/technology/technology-in-education-an-overview/2016/02
Huang, R., Spector, J. M., & Yang, J. (2019). Educational technology: A primer for the
21st century. Springer.
DOI: 10.4324/9781003347972-17
Isolation/Adaptation/Education 245
available. They pose questions and then move to their own creative art
projects. To expand students’ studio skills, they view examples where
such skills have been historically executed. To invoke critical thinking
skills, teachers also plan art field trips in the community (Greene et al.,
2014). These elements prompted the second question: How can the
diverse aspects of the art curriculum, including art history, art theory,
art critique, technical art skills, and art making, be taught to students
in virtual classrooms?
With in-person learning, art assessments occur in day-to-day formative judgements made by art teachers about students’ artwork, informal peer- and self-assessments shared with student artists privately, and summative assessments of completed work made publicly (Boughton, 2013). These assessments were hampered in online learning.
Both teachers and students expressed concerns about assessment and
evaluation methods when first pivoting to online learning, sparking the
final two questions explored in this chapter: In a virtual setting, how does
an art educator mimic informal assessments done organically through
the creative working process in an in-person art classroom? When final
artworks are completed, how can learner outcomes be evaluated and
documented?
Researcher’s Background
Literature Framework
The creative process is used by artists around the globe. Lubart (2001)
defined it as a succession of thoughts and actions that result in ori-
ginal and appropriate productions. To establish the curriculum for
online art learning, I looked to the Creative Process section of the
Ontario curriculum (Ministry of Education, 2010, pp. 14–16). The
creative process embodies the basic tenets of art creation, announ-
cing art making as a process requiring both creativity and skill. The
credo sets visual and performing arts apart from other mandatory scholastic courses; its components are essential in all stages of art
thinking and creation, and a crucial consideration in art curriculum
design.
The circular creative process is integrated with critical analysis and
thinking entailing questioning, evaluating, making rational judgements,
finding logical connections, and categorizing; critical thinking demands
openness to a plurality of world views. Reflection and feedback from
peers and teachers are at the core of the art making process, branching
outwards in a radial configuration to include challenging and inspiring;
imagining and generating; planning and focusing; exploring and
experimenting; producing preliminary work; revising and refining;
presenting and performing; and reflecting and evaluating. These stages
enable art makers to navigate the process in a studio setting (Ministry
of Education, 2010, p. 16). As students gain confidence in the pro-
cess, they can fluidly, deliberately, and consciously move between the
stages—varying the order, where appropriate, to move art projects to
their fruition.
Art educators habitually witness students actively absorbed in the cre-
ative process in brick-and-mortar environments through classroom activ-
ities like viewing exemplars of work, drawing in sketchbooks, attempting
art skills or techniques, observing peers’ creative process then witnessing
their finished artworks, sharing work with peers, conferencing with the
teacher, and engaging in class critiques. The grand challenge in a virtual
environment was to offer opportunities to exercise aspects of the Ministry
of Education’s (2010) creative process in an online delivery model where
students were working in isolated environments.
Engagement
most successful in self-paced courses are those who are already successful
in school—being self-motivated and academically well prepared because
they know how to learn and assess the quality of their own work. Reich
contended that educators working online face the issue that a majority of
students are not autodidacts but rather are dependent on their teachers
to direct their learning. Human contact with teachers for instruction,
assessment, feedback, and experience is thus essential for learning. The
goal of developing a growth mindset in students, rather than just meas-
uring learning at the end of the course, is an integral part of the learning
process (Earl, 2012).
A/r/tography
Methods
Figure 13.1 The a/r/tographic process, 2020. Christina Yarmol, ink, felt tip and
metallic markers, found book. This example is the teacher’s altered
book demonstration shown under the document camera. Artwork
was completed off camera asynchronously.
Findings
Access to Materials
The first question in tackling visual art learning online was “What art
supplies do students have in their homes and how can we ensure equit-
able access to art-making media?” Answering this question was necessary
to devise projects that met curricular expectations and could be equitably
accessed by all students. I polled students through the hand-raising feature
on Google Classroom (https://classroom.google.com/) and a Google
Form (https://docs.google.com/forms/). I learned that few students had
access to a printer, but all students had access to a cell phone to photo-
graph their artwork. Many students had minimal art supplies at the ready
and were not able to purchase them with the lockdown in effect. For
students who required supplies, I made art kits that students picked up at the parking-lot door of the art room. These kits contained many media: tempera or watercolour paints, paint brushes, charcoal, soft graphite pencils,
pencil crayons, and a range of papers, including Bristol board, water-
colour paper, cartridge paper, and a sketchbook. Other supplies typic-
ally used in the classroom, like cardboard packaging, newspaper, sewing
needles, scissors, thread, and flour or cornstarch, were readily available to
students in their homes. When they were not available, alternative media
were found. Encouraging the use of available mixed media with their art
kit materials enticed students to think creatively. Overcoming this initial
challenge set me up for achievable assignment planning, assessment, and
evaluation for students’ virtual learning (Wiggins & McTighe, 2011).
Figure 13.2 Educator’s fly on the wall, papier mâché with Roo’s centipede papier
mâché on right.
Table 13.1 Grade 9 rubric for observing and drawing still life compositions with
pencil shading.
Level 4
• Variety of pencil shades: Excellent display of a variety of tones (5+).
• Composition: All objects are extremely well grounded on the picture plane. A dynamic, formally or informally balanced composition is clearly achieved so that all objects can be viewed and recognized by the viewer.
• Likeness of drawing: Still life objects are very easily recognized due to the accuracy of the shading and the addition of shadows below objects. Shading, not line, defines all forms, giving them a three-dimensional quality.
Level 3
• Variety of pencil shades: Very good display of a variety of tones (4–5).
• Composition: All objects are well grounded on the picture plane. A somewhat dynamic, formally or informally balanced composition is achieved so that most objects can be recognized by the viewer.
• Likeness of drawing: Still life objects are easily recognized due to mostly accurate, believable shading. Shading defines almost all forms, giving them a three-dimensional quality.
Level 2
• Variety of pencil shades: Good display of a variety of tones (3–4).
• Composition: Some objects are well grounded on the picture plane. A formally or informally balanced composition is attempted so that some objects can be recognized by the viewer.
• Likeness of drawing: Still life objects are somewhat easy to recognize. Shading is accurate on some forms and defines some forms, giving them some three-dimensionality.
Level 1
• Variety of pencil shades: Greater display of a variety of tones required (1–3 shown); some contour line is used.
• Composition: Objects need to be more clearly grounded on the picture plane. Some objects can be recognized by the viewer, but it is difficult to discern what is represented.
• Likeness of drawing: Still life objects are difficult to recognize. Shading requires further attention; more attention to shading is required to give the impression of three dimensions.
The rubric was posted, and students created the artwork. They were given written feedback with the chat function based on the “look-fors”: variety
of pencil shades, composition, likeness of drawing, and workmanship.
Students could improve their work after it was assessed; they were
able to resubmit it for final evaluation. In face-to-face learning, I readily
offer this strategy, but it was more difficult during the pandemic
due to the timelines imposed by a fast-paced quadmester schedule.
Nevertheless, many students took the opportunity to better their work,
and our conversations helped to train students to develop independent
self-assessment strategies. Students communicated that without these
conversations, they often felt immobilized, unable to take next steps to adjust or change their work to become more successful.
As part of the regular assessment process, I asked students to photo-
graph multiple iterations of their work and post them online for me or
others to add written or verbal commentary. In an online platform, exer-
cising the creative process (Ministry of Education, 2010) was activated by
inviting students to turn on their cameras to share their trials with the
class or by asking students to reveal their work through screen sharing.
Some students felt comfortable with this option and frequently exercised
it, whereas others appreciated the private commentary given after posting
in the Google Classroom.
As noted earlier, Figure 13.3 is an example of technical painting skills;
it was completed before the painting project. The same student artist
completed the project in Figure 13.6, which shows the progress on her
landscape painting. Ree requested feedback during tutorial sessions and
in-class critiques, and the chat box enabled that feedback. I acknowledged
that artist-students could take my advice or discard it. It was up to them
to decide whether to alter their work.
In the large virtual class setting, students informally critiqued artwork
through the thumbs up emoji and notes in the chat box. Opening class
critiques in small group settings through breakout rooms, akin to elbow part-
ners in an in-person class, was helpful for students who were still gaining con-
fidence in their abilities. This strategy was helpful for receiving assessment
feedback about initial project ideas, rereading artists’ statements out loud
to one another, or dividing work for group art history projects. I randomly
visited these breakout rooms in session just as I would during an in-person
class setting. These interactive assessment for learning strategies helped build
relationships with my students and supported their sense of inclusion in the
online class community through a period of prolonged isolation.
that was not their own. To encourage the posting of original work created
by the student artists themselves, I made specific assessment look-fors
or guiding questions (see Figure 13.7). I requested the scaffolded process
work that I assigned, including sketchbook brainstorming, initial sketches,
completion of proposal sheets, and mounting of final progress imagery.
Preparatory work not only underlined the importance of the creative
process that I would have witnessed in person, but also taught students
that process is imperative in art production. The active recording of images
of their creative process continued throughout the online course delivery
model, and I provided guided templates for posting art processes and artists’
statements. Luke’s artistic process and artist’s statement for the Drawing
with Thread unit of the Grade 11 craft course are shown in Figure 13.8.
Figure 13.8 Luke: Drawing with thread. Luke met specific look-fors on the assigned template. Panel A: Source photograph.
about the work they have created. Exhibition supports not only students’
growth as learners, but also their development as individuals; it offers
them an additional opportunity to engage in self-assessment and self-
reflection about their artwork that serves to support their self-regulation
and mental health.
the yearbook, and (b) presenting work at the school board level. Grade 11
and 12 students had the opportunity to work virtually with a street artist
in a school board show that was eventually mounted on the web as a still
show and compilation video on the board’s website. Students interpreted
the theme of adaptation in diverse ways, and we mounted the work
as a cohesive whole by using the backdrop of our reality—the Google
Classroom screen. Students expressed their ideas with acrylic paint on
Figure 13.8 (Continued)
wood panels (see Figure 13.9) and wrote truncated artist statements that
scrolled through the chat box on the right-hand side of the screen as a
1-minute GIF.
A second still frame of the artwork could be opened on the school
board website so that interested viewers could enlarge specific artworks.
The virtual graphic murals were created by a Grade 12 student, Nor. Nor’s
individual work with her artist’s statement is displayed in Figure 13.10.
Digital Portfolios
Figure 13.9 Still of adaptation project, digital file. The exhibition submission was
a PowerPoint of artwork accompanied by a GIF file showing one-
line artists’ statements.
Figure 13.10 Nor’s For Wheelchair Access, Please Go Through This Maze, acrylic on
wood panel.
Figure 13.11 Sample of digital portfolio. This sample depicts four out of 15 items
from Elizabeth’s 2020 Grade 10 culminating digital portfolio.
to in-person art learning, but it did result in a rich experience for stu-
dent artists that culminated in the acquisition of skills, the develop-
ment of art appreciation, and the creation of successful art projects
by students who made a concerted effort to remain engrossed in
their learning.
This a/r/tographic self-study presents an experienced art educator’s
pivot to deliver a hands-on, in-person art curriculum on a virtual learning
platform, offering innovative ways to mobilize knowledge. Art teachers
need models of ways to service students in both in-person and virtual
school environments as online learning becomes more prevalent. The
theoretical practices of assessment and evaluation of visual arts, and the
practical, inventive ways to deliver the curriculum to meet provincial
standards offered in this chapter, can be used to support in-service or pre-
service teacher programs for visual art, technology, home economics, or
English/media classrooms.
270 Christina Yarmol
This chapter outlines how to use the SAMR model (Puentedura, 2006),
referring to substitution, augmentation, modification, and redefin-
ition to redesign assessments and assignments for K–12 classrooms
for the online environment. The examples and descriptions emerged
out of my literature review and in reflection on my work with teacher
candidates having to complete the remainder of their certifying prac-
ticum in a remote setting given the onset of COVID-19 in the spring of
2020. Although not a formal research project, a great deal of learning
occurred as teacher candidates and teacher educators had to figure out
new ways of teaching and assessing. This chapter aims to share those
lessons learned that I found most valuable when working with teacher
candidates to support them in adapting to remote instruction during
their field experiences.
I look first at the criteria used to implement effective assessment
practices that align with 21st century learning regardless of the envir-
onment: face-to-face, online, or hybrid. Second, I look at how these cri-
teria impact planning and assessment when teaching online or remotely.
Then I explore the SAMR model (Puentedura, 2006) and its implications
for planning and assessment. I also look at how the online environment
impacts both progressive skill development and discrete content know-
ledge attainment. I end with a section on benefits and considerations for
technology use.
DOI: 10.4324/9781003347972-18
272 Sheryl MacMath
It is important that the terms assessment and evaluation are clearly under-
stood. Assessment is the gathering of evidence to determine what
students do or do not know or can or cannot do so that teachers can plan
activities that meet students where they are at (Black & Wiliam, 1998;
Dueck, 2014; O’Connor, 2011; Schimmer, 2011; Stiggins & Chappuis,
2012; Venables, 2020). Teachers teach, assess, reteach, or move on to the
next concept in a continual cycle designed to maximize student success.
In contrast, evaluation is about making a judgement about students’
learning to date. Does it meet expectations? Is it proficient? What per-
centage does it reflect? In essence, evaluation is summative (Brookhart
et al., 2020; Dueck, 2014; Feldman, 2020; Schimmer, 2011; Venables,
2020). It represents a moment in time rather than a progression over time.
Conversely, assessment is formative. Recognizing this key difference, it
is important to consider where teachers spend their time: Is most time
spent assessing or evaluating student work? Classrooms that maximize
student success are focused more on assessment activities than evaluative
activities (Black & Wiliam, 1998; Earl, 2003; Himmele & Himmele, 2011).
I often get asked, “What does that look like, really?” It means that there
are usually at least three times as many assessments that are not for marks
as there are ones for marks. Instead of giving an assessment a grade or
score, teachers either look at it and make decisions about the next teaching activity
or provide descriptive feedback for students: There is no grade, score, or
percentage. It also means a focus on observing and listening to students
to adjust instruction; assessment becomes intertwined with the teaching
process (Cooper & Catania, 2022).
This distribution of assessment and evaluation time has an interesting
impact on the classroom and the assignments used in class (Feldman,
2020). In an assessment-focused classroom, the focus on learning tends to
increase students’ intrinsic motivation, so they soon stop asking, “Is this
for marks?” In addition, the amount of marking time tends to decrease.
When teachers are not focused on gathering scores, they can minimize
what is collected from students (Dueck, 2014). Rather than a worksheet of
20 questions, one could have students complete three exit ticket questions
to see if they understood the concepts taught that day. Assessment-
focused teachers are more selective about what they collect, as assessment
is about gauging student understanding rather than collecting home-
work or checking completion. An important caveat is that students may
Using the SAMR Model to Design Online K–12 Assessments 273
the most critical component is being skill driven rather than content
driven; this shift is a huge one for teachers. Traditionally, school has been
about acquiring content: learning dates, times, events, concepts, and so
on. However, with the advent of the internet and search engines, content
is easy to acquire at any moment (Cooper & Catania, 2022; Dumont et al., 2010).
Students do not need to memorize facts and concepts when they can be
looked up on the spot. What they need are the critical thinking skills to
evaluate which sources are credible. They need to know how to collab-
orate effectively with others to analyze ideas. They need to be able to
apply concepts to new problems. They need to be able to build, construct,
analyze, evaluate, organize, and present. Teachers have always wanted
students to develop these skills, but with recent technology explosions,
they have become critical. Content still has a place; students need con-
tent to analyze, but there is no need to acquire content that is not used to
develop skills.
How does this consideration impact assessments and evaluation? To
start, teachers do not need the traditional long tests that require students
to regurgitate memorized content. They need more projects where
students are applying what they know (Fogleman et al., 2011; Halvorsen
et al., 2012; Han et al., 2015). Tests have a place (especially if they are
more skill focused), but they should not be the primary source of evi-
dence. Some practical applications include the following:
• Science assessments are less about tests of content and more about lab
reports and analyses.
• Social studies assessments focus on students critically thinking through
historical and current events and how these events inform the types of
citizens students will become, rather than tests of dates, names, and
places.
• Math assessments have more real-life problems and applications than
procedural questions.
• Language literacy is about writing for purpose and audience and
reading to analyze and critique.
Online Assessments
Can they go outside? Can they visit areas in their community? Can
they be building or experimenting with their hands?
One of the big considerations when working online (or anytime when
choosing to work with digital technologies) is the selection of programs,
apps, and tools (Burns, 2017). When making their selections, teacher
candidates should maintain focus on the types of evidence they are
gathering. They want to select a program, app, or tool that best aligns
with their evidence, not one that seems cool. Teacher candidates should
therefore not start by looking at a program or app. Instead, they should
focus on the type of evidence they want to gather and then look for an
option that enables them to get that evidence in an easy-to-use format
that also (one hopes) enables some variation for multimodalities. Teacher
candidates may also want to think about how they will organize or track
that evidence and remain focused on assessments as and for learning.
A fantastic resource for this is the book Tasks Before Apps (Burns, 2017).
A conceptual framework that can help teacher candidates focus on the
evidence they need is the SAMR model (Puentedura, 2006).
When my teacher candidates had to move online with their teaching, they
often asked me, “How do I take what I usually do and move it online?”
First, it is important to remember that the online environment required
some alterations to tasks. Teacher candidates were not able to move
through a classroom to observe, so consideration needed to be given to
how they would ensure regular check-ins as well as time for collaboration
and peer work. They needed to think about how they could use side-by-
side assessments and station rotations with their assessment tasks. Second,
technology enabled teacher candidates to engage students in multimodal
assessments. As such, rather than seeking to simply move an assessment
online, they could think about how they could engage students in more
creative ways. For this purpose, the SAMR model became very useful
(Puentedura, 2006).
The SAMR model’s (Puentedura, 2006) elements of substitution,
augmentation, modification, and redefinition represent a progres-
sion for classroom teachers as they move from assigning paper and
pencil tasks to using technology: Each step moves further from the
Substitution
Substitution is the first stage in the model and involves just a replacement
of tools (Puentedura, 2006). The actual task does not change. For example,
rather than creating a poster in class, students could use a program like
Adobe Express (www.adobe.com/express/) or Poster Creator by Crested
Developers (bit.ly/3HnXFKC) to make an online poster. Other than
completing the assignment using technology, no additional multimodal
opportunities are integrated compared to the original task.
Augmentation
Modification
Redefinition
The last stage of the SAMR model involves the creation of a task that
does not exist without the chosen digital technology (Puentedura, 2006).
At this stage, teachers are not simply altering a previous assignment; they
are creating a new one. For example, students could create a blog, a social
media post, a website, or a mock iPhone screen (e.g., “Show what was learned in
an author study by creating the iPhone of that individual. What music
and apps would the author have on their phone, who would be in their
contacts, and what pictures would be in their photos?”). These tasks,
which were not used in schools until those digital technologies came
into existence, have students think about what they are learning in
different ways. Students may find these tasks engaging given the novelty
and potential creativity involved. They also require students to work at
a much higher level of critical thinking: Rather than just summarizing,
they are having to apply, evaluate, and critique in a variety of multimodal
ways. Redefinition tasks can exist within a work plan or as placemat tasks.
I found that as teacher candidates moved through the SAMR model
(Puentedura, 2006), opportunities increased for students to collaborate
and show their learning in multimodal ways. Whether using technology
in the face-to-face classroom or online, digital technologies can help
I define progressive skills as those that are developed over time—and the
order in which they are learned matters. For example, when teaching
addition, teachers may start with using manipulatives to add, then draw
pictures of addition sentences, and then use symbols. After that, teachers
may introduce regrouping using 10 frames, manipulatives or pictures,
and finally symbols on a place value chart. The skill progression scaffolds
over time. Another progression example would be teaching students how
to write an essay. They learn how to write a thesis statement, they create
an outline for their paragraphs, they research for their paragraphs, they
write the body, they add in connecting statements and imagery, and then
they edit and publish. Math and language literacies tend to have a great
deal of progressive skill development. There are many different learning
progressions; the two above are just examples.
Beyond being a progression, it is important to note that here I am
talking about skills, not just knowledge. In these instances, how a stu-
dent completes the task is the focus, not just what a student knows.
To evaluate a skill, teacher candidates should focus on observing and
talking with students. In the literature, this was most evidenced in math
interviews, such as having a student solve a problem and talking out loud
while they did it (Forbringer & Fuchs, 2014), and using guided reading
groups (Fountas & Pinnell, 2016). As such, evidence is usually gathered
in small group settings or individually to enable feedback (Tucker, 2020).
The focus is on connecting with students: Assessments need to occur on a
regular basis to maintain the focus on student creation of evidence rather
than how the teacher candidates will teach.
Figure 14.2 (Continued). A class tracking sheet with student names listed down the left side, a column to record the date each skill was tested, and addition and subtraction skills across the top (e.g., ways to make 9 and 17, skip counting by 2s, 5s, and 10s starting at 3, reading left to right, building 7 − 3, 13 − 8, 20 − 6, and 34 − 18, regrouping and un-grouping 10s, using correct symbols and order, solving 5 + ? = 11, 6 + 3, 13 + 8, and 24 + 37, and making a story problem).
Side-by-Side Assessments
of smaller tasks). It was critical for the teacher candidate and student to
connect on a regular basis to check progress and discuss next steps. When
teaching face-to-face, it is easy to walk around the room and check on
students’ progress. When teaching remotely, teacher candidates needed
an alternative to that movement through a room.
These check-ins were accomplished through individual meetings
when students presented what they had done, received brief and
descriptive feedback on what they had completed, and discussed next
steps (see Brookhart, 2017). These meetings were usually kept short:
no more than 5 to 10 minutes. This timeline was achieved only if both
the teacher candidate and the student were clear on what they were
reviewing, with specific things to look for. As a result, the recording
sheet for the side-by-side assessment needed to clearly outline the steps
needed so that the teacher candidate and the student could focus on only
one or two things at a time. Figure 14.3 illustrates a recording sheet for
Discrete Units
noticed that a student was not adding to their workflow plan, the former
contacted the latter to check on them. This responsiveness was important
during remote schooling given that students were often working on their
own. An added value of these one-on-one check-ins was that there was no
additional marking after class was done: It took place during the meeting.
In addition, the workflow plan included a specific step for peer feedback.
This was one way to incorporate a variety of peer-to-peer interactions
that also represented an assessment-as-learning opportunity. Given the
isolation of remote schooling, it was important to plan specific opportun-
ities for peer-to-peer interaction.
Figure 14.6 (Continued)
References
Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through
classroom assessment. Phi Delta Kappan, 92(1). https://doi.org/10.1177/
003172171009200119
British Columbia Ministry of Education. (n.d.). English language arts: Get
started. Government of British Columbia. https://curriculum.gov.bc.ca/
curriculum/english-language-arts
Brookhart, S. (2017). How to give effective feedback to your students (2nd ed.). ASCD.
Brookhart, S., Guskey, T. R., McTighe, J., & Wiliam, D. (2020). Eight essen-
tial principles for improving grading. Educational Leadership, 78(1). https://
uknowledge.uky.edu/cgi/viewcontent.cgi?article=1050
Burns, M. (2017). Tasks before apps: Designing rigorous learning in a tech-rich class-
room. ASCD.
Clark, S. (2014). Outstanding formative assessment: Culture and practice. Hodder
Education.
Cooper, D., & Catania, J. (2022). Rebooting assessment: A practical guide for balan-
cing conversations, performances, and products. Solution Tree.
Dueck, M. (2014). Grading smarter, not harder: Assessment strategies that motivate
kids and help them learn. ASCD.
Dumont, H., Istance, D., & Benavides, F. (2010). The nature of learning: Using
research to inspire practice: Practitioner guide to the innovative learning environ-
ments project. Centre for Educational Research and Innovation. www.oecd.
org/education/ceri/50300814.pdf
Earl, L. M. (2003). Assessment as learning: Using classroom assessment to maximize
student learning. Corwin.
Farmer, H. M., & Ramsdale, J. (2016). Teaching competencies for the online
environment. Canadian Journal of Learning and Technology, 42(3), 1–17. https://
files.eric.ed.gov/fulltext/EJ1110313.pdf
Feldman, J. (2020). Taking the stress out of grading. Educational Leadership, 78(1).
www.ascd.org/el/articles/taking-the-stress-out-of-grading
Feldman, J., & Reeves, D. (2020). Grading during the pandemic: A conversation.
Educational Leadership, 78(1). www.ascd.org/el/articles/grading-during-
the-pandemic-a-conversation
Fogleman, J., McNeill, K. L., & Krajcik, J. (2011). Examining the effect of teachers’
adaptations of a middle school science inquiry-oriented curriculum unit on
student learning. Journal of Research in Science Teaching, 48(2), 149–169.
Forbringer, L., & Fuchs, W. (2014). Response to instruction (RtI) in math:
Evidence-based interventions for struggling students. Routledge. https://doi.
org/10.4324/9781315852270
Fountas, I. C., & Pinnell, G. S. (2016). Guided reading: Responsive teaching across the
grades (2nd ed.). Heinemann.
Halvorsen, A.-L., Duke, N. K., Brugar, K. A., Block, M. K., Strachan, S. L., Berka,
M. B., & Brown, J. M. (2012). Narrowing the achievement gap in second-
grade social studies and content area literacy: The promise of a project-based
approach. Theory and Research in Social Education, 40(3), 198–229. https://doi.
org/10.1080/00933104.2012.705954
Han, S., Capraro, R., & Capraro, M. M. (2015). How science, technology, engin-
eering, and mathematics (STEM) project-based learning (PBL) affects high,
middle, and low achievers differently: The impact of student factors on
achievement. International Journal of Science and Mathematics Education, 13,
1089–1113. https://doi.org/10.1007/s10763-014-9526-0
Himmele, P., & Himmele, W. (2011). Total participation techniques: Making every
student an active learner. ASCD.
International Society for Technology in Education. (2019). ISTE standards:
Coaches. www.iste.org/standards/iste-standards-for-coaches
MacMath, S. (2017). Assessment driven planning. Pearson Canada.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content know-
ledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–
1054. http://one2oneheights.pbworks.com/f/MISHRA_PUNYA.pdf
O’Connor, K. (2011). A repair kit for grading: 15 fixes for broken grades (2nd ed.).
Pearson.
Puentedura, R. R. (2006, August). Transformation, technology, and education
[PowerPoint slides]. Maine Department of Education. http://hippasus.com/
resources/tte/puentedura_tte.pdf
Schimmer, T. (2011). 10 things that matter from assessment to grading. Pearson
Canada.
Selwyn, N. (2012). School 2.0: Rethinking the future of schools in the digital age. In
A. Jimoyiannis (Ed.), Research on e-learning and ICT in education (pp. 3–16). Springer.
Stiggins, R. J., & Chappuis, J. (2012). An introduction to student-involved assessment
for learning (6th ed.). Pearson.
DOI: 10.4324/9781003347972-19
294 Holden, DeLuca, MacGregor, and Cooper
(e.g., Ministry of Education [MOE], 2022). Rather than viewing the pan-
demic as a temporary issue, this context suggests that lasting research
into online and classroom assessment is critical.
As we have written elsewhere (Cooper et al., 2022), changes to
assessment and evaluation practices during the pandemic contributed
to widespread concern about the quality of teaching, learning, and
assessment opportunities available to students in online environments
(Education Endowment Foundation [EEF], 2020). We refer interested
readers to our synthesis of these changes in Cooper and colleagues (2022).
Likewise, Gallagher-Mackay et al. (2021) have offered an especially useful
description of the online learning landscape in Canada during COVID-19.
Gallagher-Mackay and Brown (2021) provided a similar Ontario-specific
account. This chapter examines how 17 Ontario secondary school
teachers used formative classroom assessment to meet their students’
learning needs and grapple with systemic challenges during an especially
difficult time for educators. Our findings describe emergency remote
teaching and emergency assessment (Cooper et al., 2022; Hodges et al.,
2020), and offer direct insights into how teachers, teacher educators, and
other stakeholders can adapt classroom assessment practices to diverse
online learning environments.
(Black & Wiliam, 2018; Cowie & Harrison, 2016). The validity of
those decisions depends upon a teacher’s ability to modify their view
of students’ learning and adapt that understanding into new contexts
(Kane, 2006).
Formative assessment, or its contemporary designation, assessment
for learning, necessarily leads to a view of assessment that is teacher- and
student-led. Research has consistently shown that teacher-led efforts,
rather than top-down initiatives, are some of the most effective means
to advance sustainable change in education (Hargreaves & Fullan,
2012; McNamara & O’Hara, 2004; Tschannen-Moran, 2009). Moreover,
research in classroom assessment suggests that students must be actively
involved in monitoring, regulating, and assessing their own learning in
addition to the feedback they receive from teachers and peers (Andrade,
2013; Shepard et al., 2018). Brookhart (2013) argued that “self-regulation
strategies and capabilities, or the lack of them, may be the defining
feature that separates successful and unsuccessful students” (p. 43).
Classroom assessment is therefore a contextualized endeavour, one
rooted in the local, changing needs of teachers and students (McMillan,
2013). Our firm belief in formative classroom assessment underpins this
chapter. By examining how individual teachers adapted their classroom
assessment practices for online learning, we contend that researchers
and policymakers will be better positioned to support and enhance the
assessment practices taking place both online and in brick-and-mortar
schools.
Methods
meant to generalize across all teachers in all classrooms (Stake, 2010). For
example, the potential biases of sampling using social media (e.g., demo-
graphic tendencies, see Morstatter & Liu, 2017) mean that the sample
should not be read as representative of all Ontario secondary school
teachers. Nor do we suggest that the extant data allow for generalizations
about assessment variations within specific subject areas or grade levels.
Rather, participants offered insights into their individual practices during
the initial weeks of the pandemic—insights which we contend will be of
interest to researchers, teacher educators, and policymakers.
Semistructured interviews (Merriam & Tisdell, 2015; Patton, 2002)
lasting upwards of an hour were conducted with each participant using
Zoom, focusing on topics related to planning, instruction, and assessment
during the remote learning period of the pandemic (see Table 15.2).
Data Analysis
and practices that reflected their individual expertise and contexts. As Jodi
recalled, “There was a lot of concern that we were going to be forced to
teach a certain way online. And I’m so thankful that they didn’t.” Raphael
and Stuart echoed Jodi’s sentiments: Both appreciated not being forced
into modes of assessment that did not align with their skill sets and com-
fort levels. Yet, on the other hand, participants recognized that varying
practices from teacher to teacher introduced significant challenges for
students who were learning online and for their parents. Kiley observed,
The last 8 weeks has kind of felt like I’m talking to a wall because
there’s no return from the students. I have no sense that they’re
getting it. As much as I can call them and ask them, and say “reach
out if you’ve got any questions,” they’re not going to. They say that
they’re doing fine.
Final Thoughts
there is a hot iron here that we need to strike to figure out as much
as we can about what worked and what did not work in this situ-
ation, especially because we know the [Ontario] government is
moving forward with furthering eLearning.
References
Cooper, A., DeLuca, C., Holden, M., & MacGregor, S. (2022). Emergency
assessment: Rethinking classroom practices and priorities amid remote
teaching. Assessment in Education: Principles, Policy & Practice, 29(5), 534–554.
https://doi.org/10.1080/0969594X.2022.2069084
Cooper, A., Timmons, K., & MacGregor, S. (2021). Exploring how Ontario
teachers adapted to learn-at-home initiatives during COVID-19: Blending
technological and pedagogical expertise in a time of growing inequities.
Journal of Teaching and Learning, 15(2), 81–101. https://doi.org/10.22329/jtl.
v15i2.6726
Corbett, M., & Gereluk, D. (2020). Rural teacher education: Connecting land and
people. Springer. https://doi.org/10.1007/978-981-15-2560-5
Corwin, Z. B., & Clemens, R. F. (2020). Analyzing fieldnotes: A practical guide. In
M. R. M. Ward & S. Delamont (Eds.), Handbook of qualitative research in education
(pp. 409–419). Edward Elgar. https://doi.org/10.4337/9781788977159.00047
Council of Chief State School Officers. (2023, April 20). Revising the definition
of formative assessment. https://ccsso.org/resource-library/revising-definition-
formative-assessment
Cowie, B., & Harrison, C. (2016). Classroom processes that support effective
assessment. In G. T. L. Brown & L. R. Harris (Eds.), Handbook of human and
social conditions in assessment (pp. 335–350). Routledge. https://doi.org/10.4324/
9781315749136
Crossouard, B. (2011). Using formative assessment to support complex learning
in conditions of social adversity. Assessment in Education: Principles, Policy and
Practice, 18(1), 59–72. https://doi.org/10.1080/0969594X.2011.536034
Cuban, L. (2013). Inside the black box of classroom practice: Change without reform in
American education. Harvard Education Press.
DeCuir-Gunby, J., Marshall, P. L., & McCulloch, A. W. (2011). Developing and
using a codebook for the analysis of interview data: An example from a pro-
fessional development research project. Field Methods, 23(2), 136–155. https://
doi.org/10.1177/1525822X10388468
DeLuca, C., & Johnson, S. (2017). Developing assessment capable teachers in
this age of accountability. Assessment in Education: Principles, Policy and Practice,
24(2), 121–126. https://doi.org/10.1080/0969594X.2017.1297010
DeLuca, C., & Volante, L. (2016). Assessment for learning in teacher education
16 Equity in Action: Virtual Learning Environment Considerations
Sharlene McHolm
DOI: 10.4324/9781003347972-20
Thinking Narratively
My starting point for this chapter was my interest in how teacher know-
ledge of in-person classrooms translated to the online environment with
the sudden transition into the VLE. Using Clandinin and Connelly’s
(2004) framework of narrative inquiry, I stepped back to consider and
contemplate the sudden shift. Their framework “entails thinking within
the three commonplaces of narrative inquiry—temporality, sociality,
and place” (Clandinin, 2013, p. 38). In this chapter, I draw on the notion
that narrative inquiry is a way of understanding experience across time,
places, and relationships (Clandinin, 2013).
The initial weeks of the pandemic were ones of trauma, confusion,
and discovery for both students and teachers. As a seasoned principal of
17 years, I found myself in the unexpected position of leading teachers
in a domain that I had relatively little experience with. I had taught in
a virtual high school, and I had completed three degrees through dis-
tance learning, but all those environments used asynchronous models.
The direct transposition of in-person teaching to the synchronous world
of online learning left me searching for best practices in the Wild West
of newly established VLEs. As did other educational leaders, I grappled
with questions about who these classrooms were serving and who they
were not. The historical patterns of underserved groups and the wide
variety of technological skills of teachers, students, and their families
began to illuminate a disparate effect on underserved groups. I focused
on themes of equity for neurodivergent and marginalized learners.
This chapter chronicles my personal reflections and the rich
discussions I shared with three fellow administrators of stand-alone
VLE elementary schools, whose enrolments ranged from over 15,000
students at their peaks to 500 students currently. Our joint inquiry included interviews, pro-
fessional sharing of documents, and emails from 2020 to 2022, with a
culminating semistructured interview I conducted with them in the fall
of 2022 to consolidate my reflections through our shared experiences.
The questions that shaped my inquiry into the VLE were framed around
what constituted excellent teaching environments (in person and vir-
tually), how teachers selected digital tools that supported all types of
learners, and what insights from these practices might improve educa-
tional outcomes for students. Discussions and written communication
documented our journey as teachers struggled to become more compe-
tent virtual learning educators.
As a researcher and practising administrator, I am positioned as a
non-Indigenous individual with expertise in special education and educa-
tional leadership. I have worked in two publicly funded school districts in
Ontario. I acknowledge that my lived experiences are and were different
than many of the students and families that I was trying to support. Thus,
as a part of the dominant structures within the educational landscape,
it was critical to reflectively account for my roles (both intentional and
unintentional) within the current systems.
In-person learning and VLEs share fundamental principles of learning.
These principles include establishing a rich learning environment, cultivating
strong relationships amongst students and teachers (social constructivism
and connectivism), and using evidence-based pedagogical understandings
(cognitive processes) for specific content learning (Hollingshead & Carr-
Chellman, 2019). Superseding these conditions for learning, educators
must focus on the equity and human rights process of removing
systemic barriers experienced by Indigenous, racialized, 2SLGBTQ+,
and neurodivergent learners (Masta, 2018). All students must have edu-
cational opportunities in public education that lead to equal life oppor-
tunities after graduation (Masta, 2018). My shared experiences with the
Definitions
Figure 16.1 The TPACK framework. From “Using the TPACK Image,” by M.
J. Koehler, 2011 (http://matt-koehler.com/tpack2/using-the-tpack-
image/). Copyright 2011 by tpack.org. Reproduced with permission.
During this second phase, educators and students settled into the work
of learning again. The families that selected the VLE demanded a
higher quality education. School leaders engaged with staff to reflect on
the practices during Phase 1 and to consider what could be improved.
Kimmons et al. (2020) provided the PICRAT matrix (see Figure 16.2)
for this analysis. The PICRAT matrix focuses on the student, requiring
Figure 16.2 The PICRAT model. Reproduced from “The PICRAT Model for
Technology Integration in Teacher Preparation” by R. Kimmons,
C. Graham, & R. West, 2020, Contemporary Issues in Technology
and Teacher Education, 20(1) (https://citejournal.org/volume-20/
issue-1-20/general/the-picrat-model-for-technology-integration-in-
teacher-preparation/). Copyright 2020 by the Society for Information
Technology and Teacher Education. Reproduced by permission of
Creative Commons Attribution 3.0 United States (CC BY 3.0 US).
selected matched the intended purposes. Only in Phase 3 did the staff
know enough about why they were selecting a particular digital tool
and how they would use it. Having district-approved tools ensured that
teachers had pragmatic choices to advance learning.
The VLE has great options for diagnostics. In Phase 3, I saw
conversations, observations, video conference platform polls, and the use
of shared brainstorming tools, including Jamboard (https://jamboard.
google.com/), Prezi (https://prezi.com/), and Kidspiration Maps (www.
diagrammingapps.com/kidspirationmaps) to invite conversation and
collaboration. I also saw how students used the applications to solidify
their thinking, making their metacognition visible.
The power of the VLE is its potential to differentiate the various needs
in a classroom. One area I pondered was why universal design was not
happening to the degree that I would have hoped: In this third phase,
I thought that it would be automatic. One administrator remarked,
“The majority of teachers, about 85%, still did not differentiate in the
fall of 2022. They had their slide decks and they just continued.” But
through strategic scaffolding of professional development, strides were
made across schools. Early adopters were paired with each other, and
more resistant staff were paired with moderate adopters who
shared personality traits.
Although still early in this work, all three administrators using this
approach noted positive gains. One stated, “In classrooms, I am seeing
differences in practices. I can talk with teachers, and they can articulate
why they are doing something in a particular way.” Another adminis-
trator shared, “I am proud of the progress they are making. We are doing
things like moderated marking to create benchmarks for VLE teachers.”
This practice, although simple, was beginning to show promise. A third
administrator said, “We are beginning to see a change and our student
survey supports our thinking that student voice and differentiation makes
all the difference for engagement.” As in all schools, some people are nat-
ural leaders and some need support to move towards better pedagogical
and assessment practices.
for long periods of time. VLE students must also have sufficient internet
bandwidth, skills using technology, and self-motivation. In the earliest
stages of province-wide distance learning, these assumptions seemed
to hold, but as my colleagues and I took the time to develop the structures,
improve teachers' technological knowledge, and support them in peda-
gogically sound practices in the VLE, I saw that they were
flawed.
Many students with special education needs benefitted academically
from VLEs, particularly during Phases 2 and 3. This, at first glance, seems
illogical, as providing hands-on assessments was found to be more diffi-
cult. One administrator noted that teachers felt it was “difficult to triangu-
late data, as there were too many students to speak to one on one,” but in
that circumstance, it was evident that tailored professional development
was required. When asked whether these academic improvements were
related to the learning platform or a family member redirecting the stu-
dent to stay on task, an administrator responded, “It worked really well
for the ones whose parents could do that one-to-one.” With sensitivities,
distractions, and the mental taxation of unstructured social landscapes
removed, students with behavioural needs could focus their energies
solely on the learning happening in the classroom.
Readily available instructional tools assisted multilingual and
neurodivergent learners. Closed captioning with real-time translation
and transcripts (for hearing-impaired students and those with verbal
processing disorders) was an effective accommodation. The use of pacing
options available through flipped classroom methods, slowing recordings
down when viewing, screen readers, speech-to-text plugins, online
summarizers, repetitious viewing, and noise-cancelling headphones
all assisted students with learning differences. Digitally chunking
assignments helped those with focus and attentional issues. Teacher cre-
ativity shone through different information delivery approaches while
keeping student interest high (Hollingshead & Carr-Chellman, 2019).
Picciano (2021) reinforced the importance of student engagement
within the VLE, and student-centred collaborative projects with process-
oriented guided inquiry naturally included students with disabilities in
the learning.
Mental health issues such as anxiety were also mitigated through the
VLE. One administrator suggested that
VLE for kids with anxiety was actually really, really good for a lot of
them because they didn’t have to face the daily going to school. . . .
In general, kids with anxiety were thriving because they knew what
to expect, [and those anxiety provoking situations were] removed.
All the additional social situations, such as facing bullies or being in large
groups, were controlled for, allowing students to focus on their learning.
The four of us involved in this inquiry agreed that the reduction in stress
allowed many students to attend more consistently and to receive their
necessary accommodations in an almost invisible way.
One remarkable finding was that many of the students with individual
education plans (IEPs)—that is, those who needed formal extra supports—
could be served in the VLE without needing those IEPs. One adminis-
trator found that 46.7% of their IEPs were based on accommodations
used by all students in the VLE. The administrator recalled the excite-
ment of families when their child could come off an IEP. One remote
learning school pledged to try using the VLE without IEPs for one year,
and “every single parent agreed to have their child taken off their IEPs,”
according to the school’s principal. The empowerment of some students
previously thought of as disadvantaged has been amazing to see.
One of the greatest strengths that the VLE offers is confidentiality.
Students with special learning requirements can access the VLE
seamlessly with their peers and receive real-time accommodations.
Simple tools like voice to text, real-time lan-
guage translation, viewing videos at a slower speed, or viewing videos
repeatedly can be used with complete discretion. Products, success cri-
teria, and modified programs can be provided without classroom peers
seeing accommodations or students being withdrawn to work with a spe-
cial education or resource teacher. This is, by far, the greatest outcome of
the VLE for students with special needs.
Weaknesses of VLE
The biggest problem with the VLE during Phases 1 and 2 was related to its
context. In short, people were selecting the VLE for its safety, rather than
for its educational power. The wrong people flocked to the platform ini-
tially. The transitions back and forth between the learning environments
of Phases 1 and 2 resulted in low commitment to building competency
and applying tools judiciously. Engagement (of both staff and students)
also varied. In Phase 3, a stabilization occurred, which addressed many of
these weaknesses.
The VLE represents a vast opportunity for both educators and students.
As the world shifts to more personalized options in many facets of life,
the VLE is an expression of this trend. It is not a one-size-fits-all type of
educational system (Sugmawati & Winarni, 2023). It allows learners who
need repetition to have it. It facilitates accommodations seamlessly and
invisibly, leveraging technological functionality for privacy (Chen et al.,
2023). It has the potential to move hundreds of students off IEPs because
these needs are met through the modality. As communities look to the
future of work, with increasing virtual working environments, students
who have been learning in a VLE may have advantages.
VLEs have challenges: Teachers must be precise with lesson planning,
success criteria, and assessment tools (Means et al., 2014). Teachers in
the VLE have fewer opportunities to organically hear misconceptions
than they do when teaching in person. Interactions in the VLE need to
be orchestrated to maximize purposeful student-student interactions
(Hodges et al., 2020). The VLE requires educators to be explicit about
what they are teaching and reflective as to why they are teaching it in
a particular way (Mukhtar et al., 2020). Assessment within the VLE
holds the same challenges and possibilities as in a brick-and-mortar
school.
Just as it is critical for a teacher to have strong mathematical skills
or an understanding of literacy development, a teacher in the
VLE needs specific skills in designing precise activities (Hodges et al.,
2020; Mukhtar et al., 2020). In the VLE, educators can track and reflect
on what works and what could be improved. The permanency of digital
resources, recordings, and multimedia creates accountability that is not
available in the in-person classroom. Anyone can see the activities and
content and review them: Although differentiation can be invisible, all
the educators’ decisions are more visible than in the brick-and-mortar
school.
Conclusion
Throughout the pandemic, Ontario, like all provinces and territories, was
forced to invest in the infrastructure, technological skills, and educator
capacity needed to shift VLE from a temporary educational necessity to a
powerful learning environment. Reflection on shifts in the VLE landscape
has been uplifting for me. I can see the possibilities within the virtual
space. Learning in the VLE can be a rich and ideal place for students to
attend school. The evolution of the platforms used in virtual classrooms
and the educators leading them accelerated exponentially in the pan-
demic. This tragic impetus does have its silver lining. Now, a space that
traditionally excluded neurodiverse, medically fragile, and marginalized
students is acting as a haven for some. It is not the space for everyone, but
with great attention to building relationships, critically thinking about
pedagogy, and implementing strong assessment practices, the VLE has
become more than a viable option for many children.
References
Mukhtar, K., Javed, K., Arooj, M., & Sethi, A. (2020). Advantages, limitations
and recommendations for online learning during COVID-19 pandemic.
Pakistan Journal of Medical Sciences, 36(COVID19-S4), S27–S31. https://doi.
org/10.12669/pjms.36.covid19-s4.2785
Picciano, A. G. (2017). Theories and frameworks for online education: Seeking an
integrated model. Online Learning, 21(3), 166–190. https://doi.org/10.24059/
olj.v21i3.1225
Picciano, A. G. (2021). Seeking an integrated model. In C. Fuentes (Ed.), A
guide to administering distance learning (pp. 79–103). Brill. https://doi.
org/10.1163/9789004471382_005
Porter, S. G., Greene, K., & Esposito, M. C. (2021). Access and inclusion of
students with disabilities in virtual learning environments: Implications for
post-pandemic teaching. International Journal of Multicultural Education, 23(3),
43–61. https://doi.org/10.18251/ijme.v23i3.3011
Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes,
M. A., Wade, C. A., & Woods, J. (2014). The effects of technology use in
postsecondary education: A meta-analysis of classroom applications. Computers &
Education, 72, 271–291. https://doi.org/10.1016/j.compedu.2013.11.002
Schmitz, M.-L., Antonietti, C., Cattaneo, A., Gonon, P., & Petko, D.
(2022). When barriers are not an issue: Tracing the relationship between
hindering factors and technology use in secondary schools across Europe.
Computers & Education, 179, Article 104411. https://doi.org/10.1016/j.
compedu.2021.104411
Shamir-Inbal, T., & Blau, I. (2021). Facilitating emergency remote K–12 teaching
in computing-enhanced virtual learning environments during COVID-19
pandemic—Blessing or curse? Journal of Educational Computing Research, 59(7),
1243–1271. https://doi.org/10.1177/0735633121992781
Sugmawati, D., & Winarni, R. (2023). The problems of using online-based indi-
vidual learning programs during the COVID-19 pandemic. SCIENTIA: Social
Sciences & Humanities, 2(1), 337–342. https://doi.org/10.51773/sssh.v2i1.171
Trust, T., & Whalen, J. (2021). K–12 teachers’ experiences and challenges with
using technology for emergency remote teaching during the COVID-19 pan-
demic. Italian Journal of Educational Technology, 29(2), 10–25. https://doi.
org/10.17471/2499-4324/1192
Tsekleves, E., Cosmas, J., & Aggoun, A. (2016). Benefits, barriers and guideline
recommendations for the implementation of serious games in education for
stakeholders and policymakers. British Journal of Educational Technology, 47(1),
164–183. https://doi.org/10.1111/bjet.12223
17 Ubuntu at a Distance: Online Assessment for Care, Justice, and Community
Sarah Elizabeth Barrett
DOI: 10.4324/9781003347972-21
Conceptions of Assessment
The starting point for this study was not theory but experience (Lindsay-
Dennis, 2015; Rose & Adams, 2014). I needed a way to make sense of what
teachers said about their experiences. I sought a theory that could encom-
pass the three aspects of assessment described earlier: accuracy, equity,
and well-being. To that end, I used ubuntu, an Indigenous sub-Saharan
African ethical framework, to examine the data. Known by various names
in African languages (ubuntu being the Xhosa term), ubuntu does not
map neatly onto any one of the four commonly known Western ethical
frameworks of deontology, virtue ethics, utilitarianism, or care ethics (Bonn, 2007) but
rather represents a conception of ethical character and conduct rooted
in community. Ubuntu is so ingrained in sub-Saharan African cultures
that it can be difficult to provide a clear definition (Bonn, 2007), but it
boils down to the idea that “a person depends on others to be a person”
(Bonn, 2007, p. 863) or “I am because we are” (Brock-Utne, 2016, p. 31).
Specifically, ubuntu espouses a balance between three domains of ethical
action: justice, care, and community (Letseka, 2012; Metz & Gaie, 2010).
I use ubuntu as a conceptual lens with the recognition that it is not a
monolithic and essentialist feature of sub-Saharan African perspectives
(Waghid & Smeyers, 2012). Rather, ubuntu is a constellation of perspectives
revolving around community and human dignity which, through my own
African and Caribbean roots, forms the basis of the ethical framework
I brought to this research. Similar to other scholars (Letseka, 2012), I do
not argue here that ubuntu is the only ethical framework that espouses
a balance between justice, caring, and community norms (Metz & Gaie,
2010), but rather that it is the conceptual lens through which I undertook
the analysis of the data from this study.
Context
In Ontario, school buildings closed on March 12, 2020, and ERT began
on April 6, 2020, with most school districts opting for online platforms.
However, there were delays getting devices to students who did not have
them, and good quality internet access was either unavailable or finan-
cially unattainable for many. School boards did send cellular internet
hubs out to students, but this process delayed the participation of many
students who were often already marginalized (Barrett, 2021). Further,
the emotional load students were carrying during the pandemic was
exacerbated by disrupted routines and their separation from friends
(Bozkurt et al., 2020).
In response to these extraordinary circumstances, the Ontario Ministry
of Education mandated that, although teachers were still expected to
assess and assign grades to students’ work, final grades should not be
negatively affected by ERT (Lecce, 2020). All Ontario school boards took
this instruction to mean that grades could not drop below where they
stood when the school closures began. This policy protected students
whose circumstances prevented them from participating fully in ERT
from immediate academic penalty; however, it also meant that any stu-
dent who was satisfied with the grades they had earned up until March 12
could opt out. Many older students chose to work instead of attending
school and, as the school closures continued into the spring, many fam-
ilies chose to opt out, as well (Barrett, 2021).
Methods
Conducted in the late spring and summer of 2020, this study involved
a survey and in-depth interviews. Participants were Ontario K–12 class-
room teachers who had been teaching in person and had had to switch
to ERT. Although they ranged in experience from novice to semi-retired,
the majority had more than 10 years’ experience teaching, with an almost
even split between elementary and secondary teachers. Three hundred
of the 762 survey participants volunteered to be interviewed, and the
50 interviewees were purposefully sampled (Creswell, 2013) to approxi-
mately match the demographics of the survey participants (age, years
of experience, elementary or secondary panel, gender) and provide var-
iety in geographic location (north; southwest; east; rural, urban, or sub-
urban; and Greater Toronto and Hamilton Areas). The one-hour (on
average) interviews conducted by videoconference included questions
about how the ERT affected their ability to assess students’ academic pro-
gress and their relationships with students. I examined the full transcripts
of interviews for themes through a process of decontextualization and
recontextualization (Tesch, 1990). Decontextualization involved open
coding of the transcribed interviews and collapsing the codes into larger
themes. Recontextualization involved analyzing these larger themes
through an ubuntu lens and relating the themes to the three aspects of
ubuntu: justice, caring, and community. I related assessing students’ aca-
demic needs to justice, accurate and equitable assessment to community,
and assessment of well-being to caring.
The qualitative analysis reported in this chapter focuses on interview
data and was guided by the following questions: How did teachers experi-
ence assessment during ERT? What can these experiences tell educators
Findings
The participants in this study were concerned with the extent to which
they could respond to and account for students’ academic needs, noting
that, in normal circumstances, a lot of the assessment they did was in
the moment, with the purpose of constantly adjusting their teaching to
suit the class (e.g., changing the pace of instruction, adding clarification).
These adjustments were challenging to make in online platforms. Once
ERT was underway, as teachers attempted to engage their students and
assess the success of the lesson or how to pace instruction, they were
faced with what many described as a brick wall. For example, Stacey, a
high school social sciences teacher, said,
I don’t know what the kids are thinking. I can’t look at their faces and
say, “OK, you’re confused right now.” Because I know I can recog-
nize those faces . . . when kids either are drifting off because they’re
bored, or they’re confused, or they just don’t know what’s going on.
[I was] venting to other staff members, [saying], “Am I the only one
in this position? Am I doing something wrong? . . . Am I doing too
much? Am I inadvertently pushing my students away?” There was a
lot of questions about it. “Are they not engaging because of me? Or
are they not engaging because that’s just their choice in general?”
And there was always that question, “Am I doing enough?”
Losing [academic integrity] makes you feel like you’re losing every-
thing. Why should we be assessing and giving marks or grades to
the student? We could, for example, be saying, “OK, I’ll give you
a participation mark, if you participate.” But assessing or giving
marks to tests and quizzes, we don’t feel is right. [Under normal
circumstances], teachers are in control of tests and quizzes.
We know how to conduct a quiz or test in class. We take all the
precautions. . . . We want to have certain integrity in the assessment
process. So, [right now, with ERT], we are not in control of that. So,
we don’t feel it’s right. (Frank)
In a classroom, I can tell. “OK, that kid’s having a bad day. I’ll chat
with them quietly. We’ll put this [class work] aside for now.” [With
ERT], I couldn’t get that personal feel. I couldn’t get that personal
touch. And so [I would ask myself], “Was this child really produ-
cing D work or is this child having some struggles?” “Is this student
It was so hard to be away from them and know that they were having
their own struggles that we couldn’t really deal with, or having the
students that wouldn’t show up for anything, and you didn’t really
know what was going on.
Here, body language was referenced again, being used not just to gauge
the pace and effectiveness of a lesson or to ascertain if a student was
performing at their optimal level academically, but also to determine the
student’s psychological and social well-being for its own sake. Further,
Willis referred to ongoing assessment of students’ social interactions and
how they indicated well-being and areas of concern.
The teachers in this study expressed profound unease with the quality of
assessment that they were able to enact during ERT. They faced difficulty
ensuring that their assessments in the moment were sufficient to gauge
student needs, that their grading was accurate (i.e., reflective of students’
actual knowledge, skills, and understanding) and equitable (students’ aca-
demic needs were being met), and that their determination of student
well-being met their standards of care. These experiences are mirrored in
the literature (Moorhouse & Tiet, 2021).
In my analysis, I draw parallels between the communitarian ethos of
ubuntu and the participants’ assessment experiences. Assessing students’
academic needs is related to justice; accurate and equitable assessment is
related to community; and assessment of well-being is related to caring.
For example, whether teachers are in a position to respond to students’
academic needs and adjust accordingly is a question of justice, where
justice is defined as everyone having what they need to thrive. Ubuntu
states that justice is a communal and positive endeavour (Viriri & Makaye, 2020).
How can the experiences of the participants in this study inform teacher
educators and assessment approaches in online environments? The
assessment experiences of the participants in this study demonstrate
that the collapse of classroom communities during ERT undermined
teaching and learning, including assessment. The antidote to this collapse
is community building. At the core of community building is care. The
teacher must continually assess individual students’ integration into the
References
Bozkurt, A., Jung, I., Xiao, J., Vladimirschi, V., Schuwer, R., Egorov, G., Lambert,
S. R., Al-Freih, M., Pete, J., Olcott, D., Jr., Rodes, V., Aranciaga, I., Bali, M.,
Alvarez, A. V., Jr., Roberts, J., Pazurek, A., Raffaghelli, J. E., Panagiotou, N.,
de Coëtlogon, P., . . . Paskevicius, M. (2020). A global outlook to the interrup-
tion of education due to COVID-19 pandemic: Navigating in a time of uncer-
tainty and crisis. Asian Journal of Distance Education, 15(1), 1–126. https://doi.
org/10.5281/zenodo.3878572
British Columbia Ministry of Education and Training. (2022). Curriculum and
assessment. Government of British Columbia. https://www2.gov.bc.ca/gov/
content/education-training/k-12/support/curriculum-and-assessment
Brock-Utne, B. (2016). The “ubuntu” paradigm in curriculum work, language
of instruction and assessment. International Review of Education, 62(1), 29–44.
https://doi.org/10.1007/s11159-016-9540-2
Chaminuka, L., & Ndudzo, D. (2014). Students and staff perceptions on exam-
ination malpractice and fraud in higher education in Zimbabwe. Asian
Journal of Humanities and Social Sciences, 2(2), 78–90. https://ajhss.org/pdfs/
Vol2Issue2/9.pdf
Creswell, J. W. (2013). Qualitative inquiry and research design: Choosing among five
approaches (3rd ed.). SAGE.
Ferretti, F., Santi, G. R. P., Del Zozzo, A., Garzetti, M., & Bolondi, G. (2021).
Assessment practices and beliefs: Teachers’ perspectives on assessment
during long distance learning. Education Sciences, 11(6), Article 264. https://
doi.org/10.3390/educsci11060264
Gay, G. (2021). Culturally responsive teaching: Ideas, actions, and effects. In H.
R. Milner, IV, & K. Lomotey (Eds.), Handbook of urban education (2nd ed.,
pp. 212–233). Taylor & Francis Group.
Le Roux, J. (2000). The concept of “ubuntu”: Africa’s most important contribu-
tion to multicultural education? Multicultural Teaching, 18(2), 43–46.
Lecce, S. (2020). Letter to parents from the Minister—March 31, 2020. Government
of Ontario. www.ontario.ca/page/letter-ontarios-parents-minister-education#
section-11
Letseka, M. (2012). In defence of ubuntu. Studies in Philosophy and Education,
31(1), 47–60. https://doi.org/10.1007/s11217-011-9267-2
Lindsay-Dennis, L. (2015). Black feminist-womanist research paradigm: Toward
a culturally relevant research model focused on African American girls.
Journal of Black Studies, 46(5), 506–520. www.jstor.org/stable/24572888
Mayaka, B., & Truell, R. (2021). Ubuntu and its potential impact on the inter-
national social work profession. International Social Work, 64(5), 649–662.
https://doi.org/10.1177/00208728211022787
Metz, T., & Gaie, J. B. R. (2010). The African ethic of ubuntu/botho: Implications for research on morality. Journal of Moral Education, 39(3), 273–290. https://doi.org/10.1080/03057240.2010.497609
Moorhouse, B. L., & Tiet, M. C. (2021). Attempting to implement a pedagogy of care during the disruptions to teacher education caused by COVID-19: A collaborative self-study. Studying Teacher Education, 17(2), 208. https://doi.org/10.1080/17425964.2021.1925644
Nunavut Department of Education. (2008). Ilitaunnikuliriniq—Foundation for dynamic assessment as learning in Nunavut schools. https://gov.nu.ca/sites/default/files/ilitaunnikuliriniq_eng.pdf
Ontario Ministry of Education. (2010). Growing success: Assessment, evaluation, and reporting in Ontario schools. Queen’s Printer for Ontario.
Oviawe, J. O. (2016). How to rediscover the ubuntu paradigm in education. International Review of Education, 62(1), 1–10. https://doi.org/10.1007/s11159-016-9545-x
Peled, Y., Eshet, Y., Barczyk, C., & Grinautski, K. (2019). Predictors of academic dishonesty among undergraduate students in online and face-to-face courses. Computers and Education, 131, 49–59. https://doi.org/10.1016/j.compedu.2018.05.012
Rose, E., & Adams, C. (2014). “Will I ever connect with the students?”: Online teaching and the pedagogy of care. Phenomenology & Practice, 7(2), 5–16. https://doi.org/10.7939/R3CJ8803K
Rovai, A. P. (2002). Building sense of community at a distance. The International Review of Research in Open and Distance Learning, 3(1), 1–16. https://doi.org/10.19173/irrodl.v3i1.79
Tartavulea, C. V., Albu, C. N., Albu, N., Dieaconescu, R. I., & Petre, S. (2020). Online teaching practices and the effectiveness of the educational process in the wake of the COVID-19 pandemic. Amfiteatru Economic, 22(55), 920–936. https://doi.org/10.24818/EA/2020/55/920
Tesch, R. (1990). Qualitative research: Analysis types and software tools. Falmer Press.
Unal, A., & Unal, Z. (2019). An examination of K–12 teachers’ assessment beliefs and practices in relation to years of teaching experience. Georgia Educational Researcher, 16(1), 4–21. https://doi.org/10.20429/ger.2019.160102
Viriri, E., & Makaye, J. (2020). Unhu/Ubuntu and examination malpractice in Zimbabwe: Perceptions of selected stakeholders from Masvingo urban secondary schools. Journal of New Vision in Educational Research, 1(2), 321–338. http://ir.gzu.ac.zw:8080/jspui/bitstream/123456789/367/1/UnhuUbuntu%20and%20examination%20malpractice%20in%20Zimbabwe%20Perceptions%20of%20selected.pdf
346 Sarah Elizabeth Barrett
Note: Page numbers in italics indicate figures and those in bold indicate tables.
digital portfolios 265, 267, 268
Dillman, D. A. 77–78
discrete units 284–285, 285, 286, 286–287, 287–288, 288
distance learning, defined 316
Donald, D. 114
Doyle, E. 55–56
EdTech malfunction 206–221; assessment of learning and 209–211; centre students and 210; decentre content and 209–210; demoralized teacher participant feelings and 213–216; Freirian dialectical methodology and 217–218, 218; humanizing assessment and 216–220; online pedagogy and 208–209; overview of 206–208; regressive practices and 212–213; study context 211; study findings 211–216; teacher professionalism and 210–211
Education Endowment Foundation (EEF) 304
E-learning, defined 316
emergency remote teaching (ERT) 330–343; accuracy/equity concerns 335–337; assessment and, expanding conceptions of 339–341; context of 332–333; overview of 330; students’ academic needs and 334–335; students’ well-being and 338–339; study methods used 333–334; teacher education and, implications for 341–343
e-portfolios 131
evaluation, defined 272
evidence of learning, assessment principles and 76–77, 86, 247
exceptional students in VLE, supporting 322–324
external tensions 41–42
Fadel, C. 187
Failure to Disrupt: Why Technology Alone Can’t Transform Education (Reich) 248–249
feedback: assessment principles and 75–76, 81, 81–82, 85–86; effective, principles of 126–127; online learning 126–141; see also online learning feedback
feedback literacy 132, 132–133
Flip (web-based platform) 118
formative assessments: decolonizing assessment through 119–121; described 93–94; in online teaching practices 97–99; teacher candidates’ perceptions of online 100–101
formative classroom assessment 294–296, 300; defined 294–295
Freeman, K. 56
Freire, P. 213–214, 217
Freirian dialectical methodology 217–218, 218
Friesen, S. 295
Gallagher-Mackay, K. 294
Garrison, D. R. 145, 147, 155–156, 157, 158, 315
Gikandi, J. 156
Global Online Learning Alliance 312
Gonzalez, J. 134–135
good performance, clarifying 127–130
Google Sheets, pedagogical hacking and 192–193, 193
Graduate Student Association, Melbourne University 146
Graham, C. R. 156, 315
Grossman, P. 62
Growing Success (Ontario Ministry of Education) 225, 331
Guasch, T. 133–134
hacking, educational contexts of 190
Hanson, A. 118
Hattie, J. 77, 93, 94, 126, 248, 267
Hopper, T. 187–188
Hughes, M. 116, 117–118
identity development, tensions in 37
Indigenizing practice 113; see also decolonizing assessment practices
Index 351
redefinition stage of SAMR model 278–279
reflective practice 139
Reich, J. 248–249, 255, 267
remote learning, defined 316
repertoire of practice 19
rubric cocreation 257, 258–259, 260
rubrics, online learning feedback and 129–130
Saldaña, J. 78
SAMR model see substitution, augmentation, modification, and redefinition (SAMR) model
Sanford, K. 187–188, 218
scaffolded stages to close performance/learning goals gap 138, 139
Schimmer, T. 195, 196, 200
screencasts, online learning feedback and 128
Scriven, M. 93–94
Scull, J. 16–17
secondary school teacher adaptations 293–307; authentic performance assessments and 302–304; formative classroom assessment 294–296; overview of 293–294; phenomenological design method explained 296–299; student feedback/self-assessment and 304–306; successes/challenges in 300–302
self-assessment 120–121, 273; online learning feedback and 130–131; positive outcomes of, by level 131
self-checkout education 213
self-perception, positive 134, 135, 135, 136–137, 137, 138
shared brainstorming tools 322
Shepard, L. 234–235
Shulman, L. 18
side-by-side assessments 282–284, 283
single point rubric (SPR) 134–137, 135, 136–137
small group assessment 280, 281–282, 282
Smith, B. 190, 191
Smith, L. T. 115
social presence, CoI model 148
Solder, R. 56, 61
Sparkes, A. 191
station rotation 280, 281–282, 282
Stefani, L. A. 55
Stevens, D. D. 129–130
Stiggins, R. J. 164
Stommel, J. 213
Struve, M. E. 129
students’ academic needs, ERT and 334–335
students’ well-being, ERT and 338–339
Styres, D. 114–115
substitution, augmentation, modification, and redefinition (SAMR) model 226, 227, 239, 271–290; assessment/evaluation comparison 272–273; for assessments 276–279; assessments as, for, and of learning framework 273–276; augmentation stage 277; benefits/considerations for technology use 288–290, 289; discrete units and 284–285, 285, 286, 286–287, 287–288, 288; modification stage 278; overview of 271; progressive skill development and 279; redefinition stage 278–279; side-by-side assessments 282–284, 283; small group assessment and 280, 281–282, 282; substitution stage 277
substitution stage of SAMR model 277
summative assessments: decolonizing assessment through 119–121; described 94; elements of, of importance to teacher