
Running head: Peer-mentoring model of technology training for teachers

Educational Design Research (EDR) project


Title: Peer-mentoring model of technology training for teachers

Submitted By:
Mashiur Rahaman

For the Class:


EIPT 5970 Conducting Educational Design Research
Semester: Fall 2020

Abstract:

Demands for the inclusion of instructional technology in classrooms have grown rapidly over the last couple of decades. While few deny the importance of such initiatives to improve the learning environment, far more attention has been given to studying student experiences with newly introduced technological tools than to the very group of people we rely on for meaningful integration. Driven by the never-ending innovation of instructional technologies, we have continued to put increasing pressure on teachers who are already overworked, underpaid, and largely unrecognized. Recognizing the centrality of teachers' role in effective technology integration, this educational design research project takes a generic professional development model and modifies it by changing the learning environment into an interactive peer-mentoring training model. The project introduces Learners-Tech-Support (LTS), an online model for technology training, as an intervention to study what impact controlled changes in the learning environment have on learning. The project will generate a training module specific to the Canvas Learning Management System (LMS) that could be generalized to other technology training of this kind.

Keywords: teacher training, technology training, peer-mentoring, professional development, online training

List of Contents
Introduction
Problem Analysis
Literature Review
Research Questions
Proposed Intervention
Field-based Investigation
Design
Prototype
Timeline
Evaluation & Reflection
Determine Methods
Conclusion
References

Introduction
Expectations of our schoolteachers have grown exponentially over the years. In already resource-starved schools, teachers receive little (if any) training on how to use an instructional tool, yet are expected to satisfy technically sophisticated students. Such training is mostly not demand driven and is often imposed on teachers, ignoring the fact that one-stop training is rarely effective in technology skills development (Ruggiero & Mong, 2015). The push for new technology integration also ignores the fact that not all teachers (learners) are equally receptive to new technologies (Graves & Bowers, 2018). Moreover, the existing 'external resource (trainer)-to-teacher' training method has proven largely ineffective (Davis, Preston, & Sahin, 2009) and yields a poor rate of knowledge transfer (Ibrahim & Al-Shara, 2007).
It is time to think outside the box and develop a participant-centered, peer-mentoring-based interactive training method. This Educational Design Research (EDR) project is one such initiative. The target audience for this study is schoolteachers of all levels who recognize the need for technology integration in their classrooms.

Problem analysis
McKenney and Reeves (2019) described a problem as "the discrepancy between the existing and the desired situation" (p. 93). They also used the term 'solution' to describe the educational intervention created in response to the problem at hand. In reality, knowing how to use a technology as a tool is different from knowing its instructional uses. Our existing training protocols and modules for instructional technology lack content on instructional use. This is the problem at hand: in a given school district, only a handful of teachers use a newly introduced technology in class in ways that are transformative with respect to how instruction is planned, implemented, and evaluated. As McKenney and Reeves (2019) suggest, our solution should offer an educational intervention in response. Assuming that the existing training protocols lack subject relevancy and are not problem driven, this EDR project proposes an intervention offering a continuous, demand-driven, and community-based professional training model for teachers.

There are two perspectives from which to approach the problem at hand: the reductionist and systems approaches (McKenney & Reeves, 2019). From the reductionist perspective, we seek to understand the problem and its direct and indirect causes, and to analyze each of its components. To understand the problem in more depth (the systems approach), we also need a holistic understanding of the problem and of how the components in the system interact. Using both the reductionist and systems perspectives during problem analysis, our aim is to:
- first, portray the situation as it is and provide explanations for it;
- second, assess what is desired; and
- third, distinguish between potentially changeable and unchangeable elements in the target setting.
We recognize the importance of understanding a problem's 'jurisdiction' in design research (McKenney & Reeves, 2019), so this EDR project will address only the limitations of existing teachers' professional development training strategies in terms of technology infusion.
Daniel C. Edelson (2006) explained the role of design research as follows: "it begins with the basic assumption that existing practices are inadequate or can, at least, be improved upon, so that new practices are necessary" (p. 103). Following this assumption, the training models currently in place are inadequate and have failed to improve teachers' technology skills for instruction as intended; accordingly, we see a need for a new training model to change the existing situation.
According to McKenney and Reeves (2019), three fundamental issues need to be probed during the informed exploration of a problem (as part of the initial orientation):
- What is the current situation?
- What is the desired situation?
- What is already known or suspected about the causes of this discrepancy?
In the current situation, existing technology training methods are failing to make teachers willing and enthusiastic participants in integrating new technologies in their classrooms. The desired situation would be that, as a result of the training, all teachers see the newly introduced technology as a useful tool that makes their instruction easier, more effective, and more engaging. But we all know that is not the case. The causes of this discrepancy between the desired and existing situations may be manifold. One such cause is the absence of post-training support. Another is the apparent disconnect between the trainee and trainer knowledge bases: training modules designed and offered by external resources (technology experts) lack issue and subject relevancy, which obstructs successful transfer of knowledge. Third, the existing one-stop training model ignores the established fact that not all teachers are equally receptive to technological change. This EDR project aims to explore these problems and find solutions through a controlled intervention.

What do we want to know about the context?


One-stop technology trainings are mostly not demand driven and are often imposed on teachers. School systems are already under heavy financial pressure, so ineffective training is not only a waste of school resources and time but also a wasted opportunity to integrate the latest technologies into effective instruction. We need to change the way we train our teachers: instead of imposing a technology on them, we need to focus on what teachers are asking for and train them for a specific problem rather than through a whole 'how to use' module.

What do we want to know about the stakeholders' needs and wishes?


To understand stakeholders' needs and wishes, we first need to identify who is directly and indirectly affected by the problem under discussion. In this EDR project, we consider schoolteachers the primary stakeholders. Non-teaching members of the school, such as administrators and the principal, can be considered indirect stakeholders. The roles of these indirect stakeholders are no less important, as they help create the desired learning environment; conducting the research without their cooperation would be futile.
There is another indirect stakeholder group for this project: external IT resources who are often brought into the school for training at the inception of any new technology. These external resources are important in designing the initial 'how to use' trainings and in assessing performance. They can be considered technology Subject Matter Experts (SMEs) to whom teachers can turn for assistance whenever they feel the need.

Literature Review
In the literature investigation for this EDR project, we attempt to identify what issues other researchers have located so far regarding technology infusion in instruction, what limits educators' ability to incorporate instructional technologies in classrooms, what solutions researchers have proposed to counter the problem at hand, and possible gaps in the theoretical and practical understanding of the problem.

Need for technology training and more:


It is beyond doubt that the integration of instructional technology in classrooms is not a luxury but a necessity. There seems, however, to be a misunderstanding about the concept of 'integration.' Introducing a tool such as a computer or laptop does not ensure technology integration in classrooms, because technology alone does not lead to change; rather, it is a teacher's ability to use the technology that drives change in education (Carr, Jonassen, Litzinger, & Marra, 1998). Therein lies the problem: we have introduced technologies as tools in schools but paid little attention to those who will use them. This gap was evident in the 1995 Office of Technology Assessment report sponsored by the US Congress (U.S. Congress, Office of Technology Assessment, 1995). The assessment identified the lack of teacher training as the greatest roadblock to technology integration into the school curriculum. The report also revealed that most school districts spend less than 15% of their technology budget on teacher training.
Technology can be seen as a knowledge system (Hickman, 1990) that comes with its own biases and affordances (Bromley, 1998; Bruce, 1993), making some technologies more applicable in a given situation than others. For a technology tool to be effectively integrated into the education system, a teacher needs to feel comfortable and competent enough to develop an understanding of the complex web of relationships among users, technologies, practices, and tools (Koehler & Mishra, 2005). True technology integration, as described by Koehler and Mishra (2005), is the understanding and ability to negotiate the relationships among three components of knowledge, i.e., "Technological, Pedagogical, and Content Knowledge (TPCK)" (p. 2).
In reality, the lack of technology use in American classrooms is a major concern in education today. A nationwide survey conducted by the Gates Foundation (Abbott, 2003) found that more than half (53%) of teachers do not use technology in classrooms on a regular basis. The trend continued in surveys conducted a few years later, which found that over 80% of K-12 teachers were using computers mainly for administrative functions (National Teacher Survey, 2005). Lack of teacher willingness was not the problem, as the survey also revealed that slightly more than half of teachers had attempted to integrate computers into routine instruction. What the National Teacher Survey (2005) revealed about technology training was even more intriguing: participating teachers reported that the technology training they received at a technology's introduction focused on administrative rather than instructional applications. On top of that, one third of teachers reported receiving little to no training on ways to integrate computers into their lessons, or any training on instructional software (National Teacher Survey, 2005).
There is ample evidence in the literature that teachers display negative attitudes toward instructional technology integration in classrooms largely because of inadequate technology training (U.S. Congress, Office of Technology Assessment, 1995; Reynolds & Morgan, 2001; Yildirim & Kiraz, 1999; Yildirim, 2000). The basic 'how to use' technology training is obviously unlikely to ensure the successful infusion of technology in instruction. Effective technology integration requires "teacher's participation in intensive curriculum-based technology training" (Zhao & Bryant, 2006, p. 53). Such training is required to move teachers past the attainment of basic operating skills toward seamless integration of technology into the curriculum (Baylor & Ritchie, 2002; Becker, 2001; Redish, 1997; Reynolds & Morgan, 2001; Roberts, 2003; VanFossen, 2001; Wenglinsky, 1998).
In 2006, Yali Zhao and Frances LeAnna Bryant conducted a study with one question in mind: "can teacher technology integration training alone lead to high levels of technology integration?" They addressed the issue by examining two qualitative datasets related to technology integration training. One dataset was collected from social studies teachers and their post-training technology integration levels; the second was collected from elementary teachers and focused on the role of mentors. Four questions guided the research:
(a) How do these teachers perceive the technology integration training they received?
(b) What impact does technology training have on their use of technology in the classroom?
(c) What are the barriers that still exist inhibiting these teachers from more frequent and effective use of technology?
(d) What effect does peer coaching/mentoring after the basic training have on these teachers' use of technology in the classroom? (Zhao & Bryant, 2006, p. 54)
The results revealed that the social studies teachers expressed the need for one-on-one follow-up support to training, and the elementary teachers reported that post-training mentorship was the most effective technology-related staff development they had experienced. This study validates the argument that 'how to use' technology training is helpful but must be accompanied by continuous post-training technical support or one-to-one mentoring to ensure useful integration of instructional technologies in classrooms (Zhao & Bryant, 2006).

Preconditions for effective tech-training:


In an article published in the Journal of Staff Development, Glenn A. Brand, an Ontario-based computer teacher, presented a list of preconditions that must be considered for an effective technology integration plan.
Free time: Teachers must be given free time to learn about a technology and eventually transfer the acquired knowledge to their classrooms.
Varying needs taken into account: A technology training model must take the individual differences among teachers into account and build on individual strengths in the process.
No 'magic-bullet' model exists: A teacher training program must not expect a one-size-fits-all strategy to work; rather, it should accept that not every teacher will leave the training with the same level of skills.
Relevancy matters: The best way to encourage teachers to infuse technologies into instruction is to invest in someone with experience in both teaching and technology.
Collaborative learning environment: Because teachers come to training with varied levels of technology skills and experience, the training must provide a non-threatening, collaborative learning environment that is sensitive to individual differences.
Remuneration and recognition: When a teacher makes the effort to set aside time to learn a new technology, that effort must be recognized, if not rewarded with incentives or remuneration.
Maintaining focus on instruction: The technology training must have a clear focus on instructional rather than administrative uses of a tool, not merely 'how to use' instruction.
Make training learner centric: The technology training needs to stimulate learning by engaging learners in reflecting on the benefits and limitations of teaching with technology.
Ensure administrative support: Active involvement of school administrators (the non-teaching components of the school) is critical to encouraging teachers to seek out training and find uses for technology in instruction (Brand, 1998).

Not all teachers are equal:


A Columbia University study conducted in 2018 categorized technology-using teachers into four types (Graves & Bowers, 2018):
Dexterous: Flexible and the first to accept any new technology introduced to them. These are wide-ranging teachers who are eager to integrate technology for different modes and purposes of instruction.
Evaders: The opposite of the Dexterous category. These teachers are naturally resistant to any technology inclusion, in every respect, including sending emails to students and taking daily attendance. When they must, they use technology only under external pressure.
Assessors: These teachers are selective adopters. They are most comfortable using technology as drill-and-practice software, directing students to use it to practice basic skills in content areas such as mathematics and reading. Teachers in this category have limited technological pedagogical content knowledge.
Presenters: Presenters are also teachers with low technological pedagogical content knowledge. They prefer using technology to aid with lectures, while also guiding students to use presentation software to produce written texts and presentations.

Way forward:
After an initial training, the lack of ongoing technical and curriculum support can hinder effective technology integration in classrooms. Zhao and Bryant (2006) argue that the technology training our teachers receive is mostly short-term, with little or no follow-up support after the initial training, making it less effective. This idea was put to the test by Marilyn K. May (2000) in her study 'Mentoring for Technology Success.' She evaluated the possible effects of mentoring following a basic technology training and found that when one teacher served as a follow-up peer-mentor to other teachers, the likelihood of successful integration was three times higher than in the group without follow-up peer-mentoring support (May, 2000). In addition, teachers with follow-up mentoring support reported confidence in using technology, which reportedly helped them work through complex technical issues, and they expressed a desire to continue integrating technologies in their classrooms (May, 2000).
A similar outcome emerged from a study by Susan E. Davis (2002), "The effect of one-on-one follow-up sessions after technology staff development classes on transfer of knowledge to the classroom." As part of her investigation, she studied teachers from the Georgia Technology Integration (InTech) training and found that participants who received one-on-one follow-up assistance showed higher levels of technology integration compared with participants without follow-up assistance. Thus, post-training follow-up support and a peer-mentoring system are vital for successful integration of technology in instruction. Such added support fosters collaboration, helps address daily challenges, and ultimately results in more frequent and effective use of technology in the classroom (Carlson & Gadio, 2002; Di Benedetto, 2005; May, 2000; O'Dwyer, Russell, & Bebell, 2004).
Additionally, Koehler and Mishra (2005) argue that the common basic (workshop-style) technology training focuses on developing software and hardware skills but does not help teachers understand how technology interacts with particular pedagogies or specific subject matter. In response, they proposed 'the learning by design approach,' in which teachers participate in designing technological artifacts: "By participating in design, teachers are confronted with building a technological artifact while being sensitive to the particular requirements of the subject matter to be taught, the instructional goals to be achieved, and what is possible with the technology" (Koehler & Mishra, 2005).

Research Questions and Justifications

"Computers aren't magic, teachers are," as Price (2015, p. 9) put it, pointing to our misdirected approach to integrating technology in instruction. There is a wealth of evidence in the literature that improving teachers' ability to use technology, rather than dumping technology tools on them, should be the prime focus of technology training. Accordingly, many see the biggest challenges to effective integration of instructional technologies in schools in the areas of institutional capacity and teacher professional development rather than in the availability of instructional technologies (Lopes, 2003). The existing one-stop technology trainings are mostly not demand driven and are often imposed on teachers. School systems are already under heavy financial pressure, so these ineffective trainings have been not only a waste of school resources and time but also a wasted opportunity to integrate the latest technologies into effective instruction. We need to change the way we train our teachers: instead of imposing a technology on them, we need to focus on what our teachers are asking for and train them for one specific problem at a time in a learning environment conducive to them.
The literature investigation reveals the need for unconventional approaches to teacher training. In this project, I am designing a participant-centered interactive training platform for teachers to improve their working knowledge of the Canvas LMS. The target audience for this project is schoolteachers of all levels.
Running head: Peer-mentoring model of technology training for teachers 13

Research goals/purpose
The literature review established that existing teacher training models for technology integration have largely been ineffective. The main problem is the lack of sustainability of the current model, in which an external training resource is appointed to train teachers on 'how to use' a technology at its launch. This one-stop training model leaves teachers to figure out their own problems with minimal to no follow-up support. Considering this, the purpose of the research project is to develop a sustainable technology training model for teachers with the following characteristics:
- comes with a support platform alongside the training resource
- demand driven
- continuous
- interactive
- participant-centered
- based on a peer-mentoring, community learning model

Research questions:
RQ: How do changes in the training/learning environment impact technology integration among teachers? This question can be broken down into the following sub-questions:
- How does the freedom to choose the 'training time' impact adoption of a new technology?
- How does the freedom to choose a 'mentor' impact learning about a new technology?
- How does the freedom to make 'content-specific inquiries' impact learning?
- How does inquiry 'anonymity' impact the process of learning about a new technology?
As part of this design-based research project, I propose to introduce Learners-Tech-Support (LTS) as the intervention. This interactive peer-mentoring support platform will be offered to learners (teachers) who show initial interest in using the Canvas LMS in instruction. LTS will take shape as a website housing 'unidirectional' training content as the basic training module. Participants will have the freedom to pick one specific training component at a time and to ask peers for content-specific assistance if they feel the need. Each new inquiry will generate an 'inquiry chain' concerning the specific training content.
Proposed intervention
The literature investigation reveals the need for unconventional approaches to teacher training; it is time to develop a participant-centered interactive training platform. The target audience for such an intervention is schoolteachers of all levels.
As part of this design-based research project, we propose to introduce Learners-Tech-Support (LTS) as the intervention. This post-training community-support platform will be offered to learners (teachers who participated in the basic training) and will be organized, moderated, and sustained by the teachers of the same training group.
LTS will take shape as a website containing 'unidirectional' training content as the basic training module, such as an instructional video or PowerPoint slides on 'how to use' the technology being introduced. As the designed intervention, the research project will ensure the following:
- The basic training content will be followed by a survey to determine initial knowledge transfer.
- A safe, secure communication channel will be introduced for participants of the training module, where they will be encouraged to register anonymously and seek help from the community. Anonymity will ensure freedom of expression and safeguard participants from possible peer-shaming.
- The frequency and quality of inquiries and calls for assistance will be recorded for analysis. Each participant will be cataloged in three comparative columns based on initial survey performance, posts requesting technical assistance, and posts helping others in response to requests for assistance (a data-tracking sketch follows this list).
- Different sets of the 'transfer of learning assessment survey' will be administered at regular intervals to determine participants' efficiency in learning and in using the new technology in the workplace.
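
To make the cataloging concrete, the following minimal Python sketch shows one possible way the three comparative columns and the periodic survey scores could be recorded per participant. The structure and field names are illustrative assumptions, not a finalized data model for the intervention.

# Illustrative sketch only: field names and structure are assumptions for this
# proposal, not part of the study's final design.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ParticipantRecord:
    pseudonym: str                      # anonymous registration name
    initial_survey_score: float         # baseline knowledge-transfer survey
    assistance_requests: int = 0        # posts asking peers for technical help
    assistance_responses: int = 0       # posts answering other participants
    transfer_scores: List[float] = field(default_factory=list)  # periodic surveys

    def record_transfer_survey(self, score: float) -> None:
        """Append a score from a 'transfer of learning assessment survey'."""
        self.transfer_scores.append(score)

# Example: one anonymous participant tracked across the intervention
p = ParticipantRecord(pseudonym="teacher_07", initial_survey_score=62.0)
p.assistance_requests += 1
p.record_transfer_survey(74.5)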

Field-based investigation
Field-based investigation, as part of this design research project, will allow the researcher to assess the extent of the problem in its real context and to be in personal contact with the stakeholders. Because the proposed design research project targets school educators, it is important for the researchers to be familiar with the learning environment of the schools and with the school district's administrative policy regarding teachers' professional development.
The field-based investigation will address the components suggested by McKenney and Reeves (2018):
Planning:
Refine focus: The research focuses on assessing teachers' professional development. This focus is broad at the early stage and is expected to narrow as the project progresses.
Frame questions: The analysis questions at this stage of the field-based investigation will center on the limitations of teacher training in terms of technology infusion and on what could be done to help.
Select strategy: Selecting a workable strategy for conducting the field-based investigation is vital. The strategy should be based on the research question and on the constraints of the study, such as time, personnel, costs, and access to respondents. Considering these, this field-based investigation will adopt SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) as its strategy to address problem-, context-, and needs-related questions. For the problem-related question, the study will investigate how a change to, or addition of, community learning and/or peer-mentoring mitigates the gap between technology knowledge and its use in instruction.
For the context-related question, the SWOT analysis will investigate how peer-mentoring and/or community learning encourages less tech-savvy teachers to use technology in instruction.
For the needs-related question, the SWOT analysis will investigate participating teachers' responses to requests for technical help from others and will examine how the quality and quantity of requests and responses change over time.
Determine methods: This educational design research project will apply a quasi-experimental research method to assess the causal impact of the designed intervention on the target population. Participating teachers will be randomly divided into two groups. Both groups will receive the same technology training through the website, but only one group will get access to the 'communication platform.' The group with access to the communication platform is our treatment (conditioned) group, and the group without access is the control group. Both groups will receive the 'transfer of learning assessment survey' at the same points throughout the study so that any change in their efficiency of instructional technology use can be compared (an illustrative comparison follows). Surveys, observation, and comparative data analysis will be used as tools in this research project.
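
As a hedged illustration of the planned group comparison (assuming survey scores are collected as simple numeric values and that an independent-samples t-test is an acceptable analysis; the data below are placeholders, not study results), the comparison might look like this in Python:

# Placeholder comparison of 'transfer of learning' survey scores between the
# group with platform access (treatment) and the group without it (control).
from statistics import mean
from scipy import stats  # assumes SciPy is available in the analysis environment

treatment = [78, 85, 74, 90, 82]   # placeholder scores: communication platform access
control = [70, 72, 68, 75, 71]     # placeholder scores: no platform access

t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
print(f"mean difference: {mean(treatment) - mean(control):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
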
Document plan: The field-based investigation is expected to take about a month to create the instruments, engage participants, and collect and analyze data for reporting. Since this project is about efficient infusion of instructional technologies in classrooms, sincere attempts will be made to arrange financial contributions from related technology companies before the field-based investigation begins. To maintain project deadlines, policy-related formalities such as IRB approval and administrative permission from the school districts will be applied for and arranged before the field-based investigation is initiated.

Fieldwork:
Prepare instruments: As the project's intervention tool, a training website containing the basic training module and contents will be built. Data will be collected through a 'transfer of learning assessment survey' and through observation of participants' activities on the discussion platform. The survey questionnaires will be prepared ahead of time to obtain IRB approval.
Engage participants: This EDR project will include teachers who show interest in learning about instructional technologies and their use in classrooms. These participants will be approached as part of the field-based investigation, and consent will be collected from willing participants.
Collect data: The main sources of data for this field-based investigation will be the survey responses and discussion participation. Each participant will be cataloged in three comparative columns: scores on the initial survey, posts requesting technical assistance, and posts helping others in response to calls for assistance.

Meaning Making:
At the end of the field-based investigation, the collected data will be compared and analyzed to determine the accuracy of the iteration. If a modification of strategy appears necessary, it will be accommodated in the research design, and the IRB will be notified or updated in the case of major changes.

Design
As argued above, the prime focus of technology training should be improving teachers' ability to use technology rather than dumping technology tools on them, and training should address one specific problem at a time in a learning environment conducive to teachers. The design below follows from that premise.

Technology in focus: Canvas LMS System


Instead of investigating technology in general, this project looks specifically at Canvas, a popular Learning Management System (LMS). To concentrate attention during prototype design, the project will design a modified training module as the intervention, focused on only one Canvas feature: designing and managing assessments. The time is right for this study, as many schools have recently adopted Canvas as an instructional tool while moving into virtual or blended instruction (Marachi & Quill, 2020).

Following the trend in higher education institutions, schools across the country are also under pressure to integrate an LMS amid the growing trend toward blended instruction. Towne (2018) investigated teachers' motivational and attitudinal factors for integrating the Canvas LMS into their blended-learning courses and found that schools that have chosen to implement an LMS face multiple challenges in motivating teachers and students to accept and integrate the new technology into their course curriculum (Towne, 2018).
In my professional experience in course design, I have also witnessed how faculty members struggle with understanding and using Canvas in instruction. Designing and managing assessments seems to generate the greatest number of inquiries and calls for assistance. This project takes that experience and attempts to modify the way we train schoolteachers, one learning aspect at a time.
Afshari et al. (2009) investigated what prevents teachers from integrating instructional technologies in classrooms and compiled a list of twelve aspects with a proven impact on technology integration. This list can be used to explain the main characteristics or strategies of the learning environment modification this EDR project proposes (to conserve time and keep the project in focus, only the aspects relevant to this project's goal are explained here; the complete list is discussed in the literature review).
- Availability of Vision and Plan about the Contribution of ICT to Education:
It is important that teachers are involved in the planning process for technology integration. The demand for a proposed technology should originate with teachers, not with the administration or the technology industry. For this, teachers must have opportunities to study, observe, reflect on, and discuss their practice, including their use of technology in instruction, in order to develop a sound pedagogy that incorporates technology (Kearsley & Lynch, 1992). This EDR project aims to create a learning environment in schools where teachers are not only part of planning a technology integration but also have a voice in deciding how they want to be trained in its instructional use.

- Level of and Accessibility to the ICT Infrastructure:
Unsupported technology infrastructure is one of the main causes of frustration among teachers integrating instructional technologies in classrooms. Using up-to-date hardware and software resources is a key factor in the diffusion of technology (Gulbahar, 2005). Teachers need access to the latest versions of software, supported by a solid and reliable hardware system. The continuous training mode proposed by this EDR project thus helps ensure timely upgrading of instructional technology and keeps learners informed about any changes.
- Availability of Time to Experiment, Reflect, and Interact:
There is no better way to learn a new technology than putting it into practice, but this approach costs a significant amount of free time. According to Mumtaz (2000), lack of free time to practice a new technology is one of the factors that hinders technology integration in schools. Similarly, a study by the National Center for Education Statistics (2000) revealed that 82% of participating teachers considered lack of release time the most significant factor preventing them from using technology in instruction; with their regularly scheduled classes, they did not have enough opportunities to practice using technologies in their classes. Hence, this EDR project aims to give teachers the freedom to choose when and where they learn about the technology in question.
- Available Support to the Technology-Using Teacher in the Workplace:
A study by the National Council for Accreditation of Teacher Education (NCATE, 1997) found that lack of technology support contributes significantly to demotivating teachers from integrating technology in instruction. Teachers did not want to use computers because they were not sure where to turn for help when something went wrong. Lack of technical support is therefore very stressful for teachers and may affect their willingness to adopt ICT (Tong & Trinidad, 2005). This concern is addressed in the proposed EDR project, as the peer-mentoring system ensures the availability of technical help at all times, and the community-based learning system means teachers are not limited to receiving help from one person or group.

- Level and Quality of Training for Teachers and School Principals:
According to Schaffer and Richardson (2004), when technology is introduced into teacher education programs, the emphasis is often on 'teaching about technology' instead of 'teaching with technology.' This lack of understanding of how to use technology is one of the reasons teachers do not systematically use it in their classes. Teachers need opportunities to practice using technology during training so that they can see ways in which it can augment their classroom activities (Rosenthal, 1999). Teachers are more likely to integrate technology into their courses when training modules provide them time to practice and to learn, share, and collaborate with colleagues. Addressing this particular need is the foundation of the proposed EDR project.
- Effective Training Program:
In planning a professional development program for teachers, it is ironic that teachers' needs are given the least attention. Although teachers play the most important role in the learning environment, they are often not consulted about changes to teaching and learning procedures (Bangkok, 2004). Teacher development should be created collaboratively; in fact, teachers' needs under changing conditions have to be continuously assessed. According to Spillane (1999), teachers who are strongly engaged in their own professional development are more motivated to undertake activities that lead to a better understanding of the goals of an innovation. Similarly, Fullan (1992) pointed out that teachers who are actively involved in their own professional development are better able to implement changes in their teaching. These characteristics of successful professional development training are the basis on which the proposed EDR project is founded.

This EDR project will follow in the footsteps of Kortecamp and Croninger (1996), who proposed a model that was successfully implemented in a teacher education program at New England University (UNE). This model consisted of five interrelated components:
a. Familiarization with hardware and software
b. Partnering with mentors
c. Developing personal projects
d. Becoming mentors
e. Keeping current

The EDR project will follow each of these components and will contribute to the theoretical understanding of the model. As outlined in the Proposed Intervention section, the project introduces Learners-Tech-Support (LTS): a website housing the 'unidirectional' basic training content (such as an instructional video or PowerPoint slides on how to use the technology) together with an anonymous peer-communication channel, an initial knowledge-transfer survey, and periodic 'transfer of learning assessment surveys.' In addition to the three comparative columns described earlier (initial survey performance, posts requesting technical assistance, and posts helping others), a further column will track each participant's use of the technology in instruction.

Prototype

In my attempt to address the need for personalized mentor- and peer-based training, it will be important to involve teachers who are already part of the education system. Because we are dealing with teachers who already understand the need for technology integration in classrooms, motivation is expected to be relatively high. Using the resources collected during the field-based investigation, the research team will meet with teachers during a designated break time to brainstorm what they want the solution to look like. A professional with experience in online course building will accompany the research team during the meeting, and the related terminology to be used will also be decided there.

Design: Mapping Solutions

The online mentor-support platform is meant to be an extension of the existing basic training content. Presenting the training module together with the proposed peer-mentoring platform as the intervention is expected to result in better learner engagement and gradual improvement in each individual's ability to use and integrate the targeted technology in instruction. A preliminary set of design principles, identified from the literature review and the field-based investigation, supports this content design:

• Use the basic training content to create a foundation for skill development.
• Break the basic training content down into several learning steps.
• Use each step as an 'anchor point' for posting questions and learning suggestions.
• Identify the role of the subject matter expert (SME) in the training module.
• Identify the role of the 'mentor teachers' who are expected to offer assistance on demand.
• Identify and communicate the role of the research facilitator in designing and maintaining the online training platform.

These preliminary design principles will be used to create a prototype of the proposed learning platform.

Concept Construction:

A basic website will be created to house all the artifacts and components of the project. The prototype itself will be constructed using an open-source learning management system such as Moodle. This type of learning management system (LMS) enables users to create, track, and manage eLearning programs, and the open-source tool makes its source code available for anyone to inspect, modify, and enhance according to their educational needs.

• Using the basic training as its base, the prototype will offer a peer-communication platform where learners can raise a 'flag' at any step of the basic training and anonymously seek content-specific assistance (elaborations).
• Respondents (mentors) will offer content-specific assistance of their own free will, initiating an 'inquiry chain' of assistance for each step.
• Each inquiry chain will remain available in an 'archive' and will 'pop open' whenever a new request is posted relating to that specific step of the basic training content (a data-model sketch of this inquiry-chain idea follows the list).
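
The following short Python sketch illustrates the 'flag and inquiry chain' idea described above; the class and method names are hypothetical and simply model one training stage holding a thread of anonymous posts.

# Sketch of the 'flag -> inquiry chain' mechanism. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Post:
    author_pseudonym: str
    text: str

@dataclass
class InquiryChain:
    stage_id: str                              # e.g. "Quizzes / Stage A.1"
    posts: List[Post] = field(default_factory=list)

    def raise_flag(self, pseudonym: str, question: str) -> None:
        """A learner anonymously flags this stage with a content-specific question."""
        self.posts.append(Post(pseudonym, question))

    def respond(self, pseudonym: str, answer: str) -> None:
        """A peer mentor voluntarily adds a response to the archived chain."""
        self.posts.append(Post(pseudonym, answer))

chain = InquiryChain(stage_id="Quizzes / Stage A.1")
chain.raise_flag("teacher_03", "Where do I set the quiz time limit?")
chain.respond("teacher_11", "Open Quiz Options and enter the minutes there.")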

Artifact: Website
The website is a purpose-built online platform that will be made available to all teachers/learners who register for the professional development program. The website will carry the following contents.
The homepage will contain FOUR tabs: 'Welcome Note, Training Module, Assistance Tracker, Help & Suggestion Box.' (Each tab is described briefly below.)

TAB 1: Welcome Page (introduces the website and the training modules in brief).
Contents: Welcome to the Canvas training program. This program is designed to improve your understanding of and ability to use the Canvas LMS in your daily instructional activities, with the help of your fellow teachers.

To channel your content-specific needs, we have broken the entire Canvas LMS down into several learning modules, and each module is further broken down into stages. As a participant in this specialized training course, you can stop and 'flag' any stage during the training to seek specific clarification or assistance from your peers.

TAB 2: To introduce the individual training modules.

(Design Note: The name of each module will be proposed and verified by the teachers during the field-based investigation to ensure easy understanding.)
Canvas training is broken down into the following learning modules:
• How to build a new course into ‘empty course-shell’
• How to send ‘announcements’ to all students
• How to set-up ‘weekly class module’
• How to set up ‘Quizzes’
• How to set up ‘Assignments’
• How to set up ‘Exams’
• How to put components (announcements, lectures, quiz/assignment/exam) into ‘weekly
module’
• How to make a module (or each component within the module) visible/public to/for all
students
• How to assess/grade ‘Quizzes’
• How to assess/grade ‘Assignments’
• How to assess/grade ‘Exams’
• How to ‘Post Grades’

Designing a prototype for a learning module.

For this prototype design, we will pick one specific learning module and break it down into its learning components as 'stages'.

Module Name: How to set up ‘Quizzes’


Stage A: Copy the empty quiz shell
Stage A.1: Set up quiz options
Stage A.1.1: Set answer options to shuffle
Stage A.2: Set the due date and availability date
Stage B.1: Add quiz questions
Stage B.2: Select the answer format (multiple choice / fill-in-the-blank / true or false / matching)
Stage B.3: Mark the correct answer
Stage B.4: Use comment boxes to explain correct/incorrect answers
Stage C: Save and make the quiz public (available to students)
Stage D: Edit/update a published quiz
Design Note (how the content-specific inquiry works): During any stage of the specific training module, participating learners will be able to 'raise a flag' and leave content-specific comments, inquiries, or suggestions. Comments and inquiries at any stage will generate an 'inquiry chain' for that particular component of the module.
Design Note: Participants will be allowed to register anonymously. Only the research moderators will have access to the database containing learners' actual (identifiable) identities (a pseudonymization sketch follows below).
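
A minimal sketch of this anonymity arrangement follows, assuming a simple pseudonym generator and a moderator-only lookup table; no real storage or security layer is shown, and all names are illustrative.

# Illustrative pseudonymization: peers see only pseudonyms; research moderators
# hold the separate mapping to real identities.
import secrets

identity_map = {}   # moderator-only store: pseudonym -> real identity

def register(real_name: str) -> str:
    """Create a pseudonym for a participant and record the mapping privately."""
    pseudonym = "teacher_" + secrets.token_hex(3)
    identity_map[pseudonym] = real_name
    return pseudonym   # only the pseudonym is ever displayed on the platform

alias = register("Jane Doe")
print(alias)   # e.g. teacher_a1b2c3 (the name visible to peers)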

TAB 3: Assistance Tracker


This tab will be linked to the specific learning modules in TAB 2. Here, learners can access what others have commented, inquired, or suggested about a particular module and its stages. It is a quick-access tab for anyone who wants to see whether their learning issue has already been raised or addressed by others. Participants will also be allowed to add comments, inquiries, or suggestions to a thread.
Design Note: For record keeping, all activity in this tab will be documented.

TAB 4: Help & Suggestion Box


This tab will let learners contact the website moderators and the research team if they face any complications with the website, access to the training modules, or the inquiry chains. It will also have a sub-tab where learners can leave suggestions and comments on how they think the training process could be made more effective.

Artifact: Survey Tools


The EDR project incorporates three surveys in its design. The first survey will be given to learners immediately after the training modules are first introduced; it will establish their baseline level of understanding in learning a new technology. The second 'transfer of learning assessment survey' will be administered in the middle of the implementation to observe the rate of knowledge transfer in progress. The last 'transfer of learning assessment survey' will be launched at the end of the third month, when the implementation officially ends; its outcome will determine the final rate of knowledge transfer for each individual participant (an illustrative calculation follows below).
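
As a simple illustration of how the three survey scores could be turned into a knowledge-transfer figure for each participant, the sketch below uses a gain-over-baseline formula; this formula is an assumption for illustration, not the study's official metric.

# Illustrative per-participant calculation from the baseline, mid-point, and
# final 'transfer of learning assessment survey' scores.
def transfer_rate(baseline: float, midpoint: float, final: float) -> dict:
    return {
        "mid_gain": midpoint - baseline,
        "final_gain": final - baseline,
        "relative_gain": (final - baseline) / baseline if baseline else None,
    }

print(transfer_rate(baseline=60.0, midpoint=72.0, final=85.0))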

Artifact: Checklist
The checklist will be a self-reporting tool in which participants mark and report their own learning progress at least once a week (a simple record-keeping sketch follows).
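
A record-keeping sketch for the weekly self-report is shown below, assuming entries are appended to a simple CSV log; the field names are illustrative assumptions rather than a finalized instrument.

# Illustrative weekly checklist entry stored as a CSV row.
import csv
from datetime import date
from pathlib import Path

FIELDS = ["week_ending", "pseudonym", "modules_completed", "flags_raised",
          "responses_given", "used_canvas_in_class", "notes"]

def append_checklist_row(path: str, row: dict) -> None:
    """Append one self-reported checklist entry, writing a header on first use."""
    new_file = not Path(path).exists()
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

append_checklist_row("checklist_log.csv", {
    "week_ending": date.today().isoformat(),
    "pseudonym": "teacher_07",
    "modules_completed": 2,
    "flags_raised": 1,
    "responses_given": 3,
    "used_canvas_in_class": True,
    "notes": "Built the first weekly module and one quiz.",
})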

Implementation planning:
The prototype will be made available to participants at least two weeks before the school is scheduled to start a new semester. The timing is important, as this comparatively free time should encourage teachers to pay more attention to new ideas and to ways of using technology in instruction. The project implementation will run for three months following the start of the semester. After implementation ends, the research team will have around two months to analyze data, modify training components if needed, and then re-introduce the professional development training before the next semester begins.

Timeline:

EDR project Timeline
- Preparation (first 3 months): Needs Assessment, Field-Based Investigation, Artifact Generation
- Launch of the website: two weeks before the semester begins
- Alpha Testing (Developer Screening): 1st month into the program
- Beta Testing (Pilot): 2nd month into the program
- Gamma Testing (Tryout): end of the 3rd month of the program
- Post-intervention (last 2 months): Evaluation, Design Modification, and Re-launch of the EDR Project

The entire project is expected to take eight months to complete its first application: the first three months for preparation and artifact production, the middle three months for implementation, and the last two months for evaluation and modification, after which the revised professional development training will be re-introduced before the next semester begins.

Evaluation & Reflection

This section describes how the participant-centered interactive training platform will be evaluated. Artifacts and evaluation tools will be available for pre-assessment prior to the scheduled implementation. Following the end of implementation, the research team will have around two months to analyze data, modify training components if needed, and then re-introduce the professional development training before the next semester begins.

Evaluation Plan

(Intervention forms considered in the plan: intended, implemented, and attained.)

Alpha testing: Test the design evaluation for the 'Modified Training Module'; determine whether the intervention (LTS) is feasible and conducive to improving working knowledge.
- Survey: The first survey will be given to learners immediately after the training modules are first introduced; it establishes their baseline level of understanding in learning a new technology.
- Checklist: Self-reported, obtained each week to track learning and implementation progress.

Beta testing: Explore whether the 'Modified Training Module' adjusts to localized needs and limitations, and whether the school infrastructure supports its implementation.
- Survey: The second 'transfer of learning assessment survey' will be administered in the middle of the implementation to observe the rate of knowledge transfer in progress.
- Checklist: Self-reported, obtained each week to track learning and implementation progress.

Gamma testing: Measure whether implementation of the 'Modified Training Module' changes learning behavior for teachers/learners; the focus is on assessing whether the changes in training have a positive or negative impact on learning behavior.
- Survey: The last 'transfer of learning assessment survey' will be launched at the end of the third month, when the implementation officially ends; its outcome will determine the final rate of knowledge transfer for each participant.
- Checklist: Self-reported, obtained at the end of the intervention to determine learning and implementation.
The process of evaluation follows an empirical cycle consisting of three phases: Planning, Fieldwork, and Meaning Making. The reflection process, however, can take place organically or through more structured techniques.
NOTE: See Appendix A for a complete picture of the evaluation plan in table format.

Evaluation: Planning
Establishing focus:
Voluntary learner participation is the fundamental aspect of a peer-mentoring model of professional skills development. My focus in the planning phase is therefore to generate free-flowing interactions among learners relating to working knowledge of the specific technology in training. This will be 'research on an intervention' in which we seek to understand how to improve the training design (the interactive training module) as a formative goal, while also assessing how well it works to engender the desired phenomena (forming mentor-learner relationships) as a summative goal.
Three stages of intervention testing will be included in the planning.
Alpha testing: This stage will assess whether the intervention is applicable to the target audience and whether it can engage participants in a voluntary exchange of inquiries and responses.
Beta testing: Recognizing the localized challenges our learners (teachers) face in learning and implementing a new instructional technology, this stage will check that participants have accepted the changes brought by the intervention and that institutional sentiment is conducive to the new approach.
Gamma testing: This stage will help the researchers determine the effect the intervention has had on learners' changing behaviors (such as willingness to seek mentor feedback and implementation of the newly learned instructional skills in practice).

- Framing question:
From a practical perspective, the primary concern is how and to what extent the problem is being addressed by the intervention. From a theoretical perspective, the main concern is understanding how an intervention does or does not work, and more importantly, 'why.' Considering these, I will form the following questions, separately for each stage of intervention testing:

Questions for Alpha testing:
How well does the designed intervention invoke 'content-specific inquiry chains'?
And/or, what changes to the construction of the intervention prototype need to be incorporated to generate voluntary 'content-specific inquiry chains'?

Questions for Beta testing:
How relevant and usable do participants perceive the intervention to be in the context of their localized learning environment?
And/or, what challenges do participants encounter in engaging with the newly enacted training module (e.g., lack of free time, lack of internet access, or lack of administrative support)?
And/or, what modifications seem important to help participants overcome these challenges (if any)?
Questions for Gamma testing:
How effective is the intervention in encouraging teachers (learners) to enhance their working knowledge?
And/or, what is the long-term impact of the modified training modules enacted by the intervention?

- Selecting strategy:
Following Matrix 6.2, I am considering using 'Developer Screening' as my strategy at the 'Alpha testing' stage of my intervention. This particular strategy is especially helpful for studying the internal structure of a design or constructed prototype. It will also help me understand how the intervention will likely work in the targeted setting. At the 'Beta testing' stage, I will deploy the 'Pilot' evaluation strategy. This strategy will help me get a sense of how the intervention will perform in localized contexts, and determine what kinds of real-world realities need to be addressed for the design to have a chance of success under representative conditions. At the 'Gamma testing' stage, I intend to use the 'Tryout' evaluation strategy so that the intervention and all of its evaluation components are field-tested.

- Determine Methods:
Guided by Matrix 6.3, I see 'questionnaires/checklists' as the appropriate method for the 'Developer Screening' and 'Pilot' evaluation strategies during the alpha and beta testing stages, respectively. This method is suitable for tracking progress from the beginning to the end of the intervention. At the last stage of the intervention (for the tryout strategy), I intend to include 'logs/journals' along with 'questionnaires/checklists' as my preferred methods.

- Document Planning:
Every step of this design research intervention will be planned and documented to keep track of time and other expenses. A Google spreadsheet will be created and maintained throughout the planning and implementation process. For use with the tryout strategy, a rubric will be generated to evaluate participants' log/journal entries.
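
To illustrate one way such a planning log could be kept, the short sketch below appends dated time-and-expense records to a CSV file that could later be imported into the shared Google spreadsheet. The file name and column headings are illustrative assumptions, not part of the project plan.

```python
# Minimal sketch of the planning/expense log described above. The file name and
# column headings are hypothetical placeholders; in practice the same columns
# could simply live in the shared Google spreadsheet.
import csv
import os
from datetime import date

LOG_FILE = "lts_planning_log.csv"
COLUMNS = ["date", "phase", "activity", "hours_spent", "expense_usd", "notes"]


def log_entry(phase: str, activity: str, hours: float, expense: float, notes: str = "") -> None:
    """Append one planning/expense record, writing the header row on first use."""
    new_file = not os.path.exists(LOG_FILE)
    with open(LOG_FILE, "a", newline="") as fh:
        writer = csv.writer(fh)
        if new_file:
            writer.writerow(COLUMNS)
        writer.writerow([date.today().isoformat(), phase, activity, hours, expense, notes])


log_entry("Alpha testing", "Draft the first assessment survey", 3.5, 0.0, "Qualtrics draft")
```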

Field Work:
- Prepare instruments
Three sets of assessment surveys will be created to assess changes in the participants' working knowledge. One set of checklists will also be created to track participants' performance and levels of understanding related to the technology in training. Participants' logs/journals will be introduced to all participants at the very beginning of the intervention, and participants will be encouraged to maintain them.
- Engage participants
A database of willing teachers (as participants) will be constructed during the field-based investigation. The same group of participants will be addressed and involved in the different stages of intervention testing.
- Collect Data
Using Qualtrics (an OU-sponsored survey and data analysis tool), most data will be collected automatically and analyzed multiple times during the intervention process.
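
As one illustration of how this automated collection could be scripted between analysis cycles, the sketch below pulls survey responses through Qualtrics's response-export web API. The data-center host, API token, and survey ID are placeholders, and the endpoint paths follow Qualtrics's publicly documented export workflow; they should be verified against the current API reference before use.

```python
# Minimal sketch of pulling survey responses from Qualtrics for periodic analysis.
# Assumptions: the token, survey ID, and data-center host are placeholders; the
# endpoint layout follows Qualtrics's public response-export workflow and should
# be checked against the current API documentation.
import io
import time
import zipfile

import requests

BASE = "https://yourdatacenter.qualtrics.com/API/v3"   # hypothetical host
HEADERS = {"X-API-TOKEN": "QUALTRICS_TOKEN"}           # placeholder credential
SURVEY_ID = "SV_xxxxxxxxxxxx"                          # placeholder survey id


def download_responses(survey_id: str) -> bytes:
    """Start a CSV export, poll until it finishes, and return the raw CSV bytes."""
    start = requests.post(
        f"{BASE}/surveys/{survey_id}/export-responses",
        headers=HEADERS,
        json={"format": "csv"},
    ).json()
    progress_id = start["result"]["progressId"]

    # Poll the export job until Qualtrics reports it complete.
    while True:
        status = requests.get(
            f"{BASE}/surveys/{survey_id}/export-responses/{progress_id}",
            headers=HEADERS,
        ).json()["result"]
        if status["status"] == "complete":
            file_id = status["fileId"]
            break
        time.sleep(2)

    # The finished export is delivered as a zip archive containing one CSV file.
    archive = requests.get(
        f"{BASE}/surveys/{survey_id}/export-responses/{file_id}/file",
        headers=HEADERS,
    ).content
    with zipfile.ZipFile(io.BytesIO(archive)) as zf:
        return zf.read(zf.namelist()[0])


if __name__ == "__main__":
    csv_bytes = download_responses(SURVEY_ID)
    with open("lts_survey_responses.csv", "wb") as fh:
        fh.write(csv_bytes)
```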

Meaning Making:
- Analyze Data
With the help of Qualtrics (an OU-sponsored survey and data analysis tool), the collected data will be analyzed to find patterns and changes in learning behaviors among participants.
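
A minimal sketch of that analysis step is shown below: it merges the three assessment-survey waves and computes a simple per-participant gain. The file and column names are hypothetical, and the gain calculation is only an illustrative stand-in for however the project finally operationalizes the rate of knowledge transfer.

```python
# Minimal sketch of comparing the three 'Transfer of learning assessment' waves.
# Assumptions: the exported files and column names (participant_id, score, wave)
# are hypothetical, and the gain calculation is an illustrative stand-in for the
# project's eventual definition of 'rate of knowledge transfer'.
import pandas as pd

# One CSV per survey wave (alpha = baseline, beta = mid-point, gamma = final).
waves = {
    "alpha": "survey_wave1.csv",
    "beta": "survey_wave2.csv",
    "gamma": "survey_wave3.csv",
}

frames = []
for wave, path in waves.items():
    df = pd.read_csv(path, usecols=["participant_id", "score"])
    df["wave"] = wave
    frames.append(df)

scores = pd.concat(frames)

# Pivot to one row per participant, one column per wave.
wide = scores.pivot(index="participant_id", columns="wave", values="score")

# Illustrative per-participant gain from baseline to the final survey.
wide["gain"] = wide["gamma"] - wide["alpha"]
print(wide[["alpha", "beta", "gamma", "gain"]].describe())
```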
- Consider Findings
The research team will meet periodically throughout the intervention period to analyze the collected data and sense its trajectory. At the end of the intervention, researchers will have about two months to analyze and consider the data in order to determine whether the research goals have been addressed and met.
- Report Study
At the end of data analysis, the findings will be shared with all stakeholders, including sponsors. The study report will also include the researchers' feedback, suggestions, and recommendations about possible modifications to consider before the next scheduled implementation.

Appendix A: Evaluation Plan

(Each phase of intervention testing is evaluated across the intended, implemented, and attained forms of the intervention.)

Alpha testing
- Survey: The first survey will be introduced to the learners immediately after the first introduction of the training modules. This will establish their basic level of understanding in learning a new technology.
- Checklist: Self-reported, obtained each week to track learning and implementation progress.

Beta testing
- Survey: The second set of the 'Transfer of learning assessment survey' will be introduced in the middle of the implementation to observe the rate of knowledge transfer in the process.
- Checklist: Self-reported, obtained each week to track learning and implementation progress.

Gamma testing
- Survey: The last set of the 'Transfer of learning assessment survey' will be launched at the end of the third month, when the implementation officially ends. The outcome of this survey will determine the final rate of knowledge transfer for each individual participant.
- Checklist: Self-reported, obtained at the end of the intervention to determine learning and implementation progress.

Questions for testing

Alpha testing
- How well does the designed intervention invoke 'content-specific inquiry chains'?
- And/or, what changes to the construction of the intervention prototype need to be incorporated to generate voluntary 'content-specific inquiry chains'?

Beta testing
- How relevant and usable do participants perceive the intervention to be in the context of their localized learning environment?
- And/or, what challenges do participants encounter in engaging with the newly enacted training module?
- And/or, what modifications seem important to help participants overcome these challenges (if any)?

Gamma testing
- How effective is the intervention in encouraging teachers (learners) to enhance their working knowledge?
- And/or, what is the long-term impact of the modified training modules enacted by the intervention?

Selecting strategy
- Alpha testing: Developer Screening
- Beta testing: Pilot
- Gamma testing: Tryout

Determine Methods
- Alpha testing: questionnaires/checklists
- Beta testing: questionnaires/checklists
- Gamma testing: questionnaires/checklists + logs/journals

Reflection Plan

(Each reflection strategy moves from preparation, to image forming, to conclusion drawing.)

Point
- Preparation: 1. Does unanticipated behavior like peer shaming play a role in learning behavior? 2. Does the availability of practice time during the online training impact learning a new technology skill?
- Image forming: 1. What restricts a participant from seeking help from peers? 2. When learners get adequate time to practice what they are learning in real life, their working knowledge improves.
- Conclusion drawing: 1. Does 'anonymous inquiry' protect learners from possible peer shaming? 2. Learners-Tech-Support (LTS) as an intervention launched at different times of the academic year could yield different levels of effectiveness.

Line
- Preparation: During summer break, teachers/learners (actor) devote more time to learning and implementing the new skills in practice (process) to create effective instructional materials (product) for the next semester's class lessons.
- Image forming: What role does free time to practice play in making the instruction meaningful and effective?
- Conclusion drawing: To what extent does launching/implementing the modified training program (Learners-Tech-Support) during summer break (a longer break) versus winter break (a shorter break) determine the outcome (effective use of the new skills in instruction)?

Triangle
- Preparation: One group of teachers/learners refused to help others or respond to others' inquiries despite showing solid mastery of the tech skills in training.
- Image forming: Does the absence of recognition/praise within the learner group discourage these well-versed teachers from participating in peer-mentoring activities?
- Conclusion drawing: If mentors (well-versed teachers) are given the option to identify themselves, would that encourage this group of teachers to break out and assist others?

Circle
- Preparation: This study uses survey data, checklists, and activity logs to analyze how mentor-learner interactions proceed in the modified learning environment.
- Image forming: Monitoring the activity logs would become a challenge, as it would be difficult to assess how many participants found solutions without posting any inquiry, simply by going through the archived inquiry chains.
- Conclusion drawing: The activity logs need to incorporate a reaction portal (such as 'helpful' and 'not helpful' buttons) for each inquiry chain. If a participant finds an inquiry chain helpful in addressing the issue they are seeking answers to, they should mark it 'helpful,' or vice versa. (A minimal sketch of this reaction-portal idea follows the table.)
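
To make the reaction-portal idea from the Circle row concrete, the sketch below shows one possible data structure for an inquiry chain that tallies helpful and not-helpful reactions per participant. All names here are hypothetical illustrations rather than part of the LTS implementation.

```python
# Minimal sketch of the 'reaction portal' idea: each archived inquiry chain keeps
# simple helpful / not-helpful tallies so the activity logs can capture
# participants who benefit without posting. All names are hypothetical.
from dataclasses import dataclass, field


@dataclass
class InquiryChain:
    chain_id: str
    topic: str
    helpful: int = 0
    not_helpful: int = 0
    reactions: dict = field(default_factory=dict)  # participant_id -> "helpful" / "not_helpful"

    def react(self, participant_id: str, found_helpful: bool) -> None:
        """Record (or overwrite) a participant's reaction to this inquiry chain."""
        previous = self.reactions.get(participant_id)
        if previous == "helpful":
            self.helpful -= 1
        elif previous == "not_helpful":
            self.not_helpful -= 1
        label = "helpful" if found_helpful else "not_helpful"
        self.reactions[participant_id] = label
        if found_helpful:
            self.helpful += 1
        else:
            self.not_helpful += 1


chain = InquiryChain(chain_id="canvas-gradebook-01", topic="Setting up the Canvas gradebook")
chain.react("teacher_17", found_helpful=True)
print(chain.helpful, chain.not_helpful)
```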
Conclusion
This educational design research project is intended to be a work in progress, subject to modification after each application. In the first iteration, designing the artifacts will take most of the resources and time, but they will require minimal maintenance in subsequent iterations. Artifacts like the survey tools and checklists will, however, need constant modification (even within each iteration) to keep them relevant. This EDR project presents an idea of professional development for teachers, which is recommended to be assessed and modified following each application. Once its effectiveness is field-tested, this model of professional skill development could be tailored to skills development in any field.
References

Abbot, M. L. (2003). State challenge grants TAGLIT data analysis: A report prepared for the Bill
& Melinda Gates Foundation. Retrieved November, 25, 2010.
Afshari, M., Bakar, K. A., Luan, W. S., Samah, B. A., & Fooi, F. S. (2009). Factors affecting
teachers' use of information and communication technology. Online Submission, 2(1),
77-104.
UNESCO Bangkok. (2004). Integrating ICTs into education: Lessons learned. Bangkok: UNESCO. Retrieved March 21, 2011.
Becker, H. J. (2001, April). How are teachers using computers in instruction. In annual meeting
of the American Educational Research Association, Seattle, WA.
Baylor, A. L., & Ritchie, D. (2002). What factors facilitate teacher skill, teacher morale, and
perceived student learning in technology-using classrooms?. Computers &
education, 39(4), 395-414.
Brand, G. A. (1998). What research says: Training teachers for using
technology. Journal of staff development, 19, 10-13.
Bromley, H. (1998). Introduction: Data-driven democracy? Social assessment of educational
computing. Education, technology, power, 1-28.
Bruce, B. C. (1993). Innovation and social change. Cambridge University Press.
Carlson, S., & Gadio, C. T. (2002). Teacher professional development in the use of
technology. Technologies for education, 118-132.
Carr, A. A., Jonassen, D. H., Litzinger, M. E., & Marra, R. M. (1998). Good ideas to foment
educational revolution: The role of systemic change in advancing situated learning,
constructivism, and feminist pedagogy. Educational Technology, 5-15.
U.S. Congress, Office of Technology Assessment. (1995). Teachers and technology: Making the connection. Report summary. Washington, DC: US Government Printing Office. OTA-EHR-616.
Davis, S. E. (2002). The effect of one-on-one follow-up sessions after technology staff development classes on transfer of knowledge to the classroom: An action research study (Doctoral dissertation, Valdosta State University).
Davis, N., Preston, C., & Sahin, I. (2009). Training teachers to use new technologies impacts
multiple ecologies: Evidence from a national initiative. British journal of educational
technology, 40(5), 861-878.
Di Benedetto, O. (2005, June). Does technology influence teaching practices in the classroom.
In National Educational Computing Conference 2005 Conference Philadelphia, PA.
Retrieved June (Vol. 1, p. 2006).
Edelson, D. C. (2006). Balancing innovation and risk. Educational design research, 100-106.
Fullan, M. (1992). Successful School Improvement: The Implementation Perspective and
Beyond. Open University Press, Philadelphia, USA.
Graves, K. E., & Bowers, A. J. (2018). Toward a Typology of Technology-Using Teachers: A Latent
Class Analysis (LCA) of the NCES Fast Response Survey System Teachers’ Use of
Educational Technology in US Public Schools, 2009 (FRSS 95).
Gülbahar, Y. (2007). Technology planning: A roadmap to successful technology integration in
schools. Computers & Education, 49(4), 943-956.
Hickman, L. A. (1990). John Dewey's pragmatic technology. Indiana University Press.
Ibrahim, M., & Al-Shara, O. (2007, July). Impact of interactive learning on knowledge retention.
In Symposium on Human Interface and the Management of Information (pp. 347-355).
Springer, Berlin, Heidelberg.
Kearsley, G., & Lynch, W. (1992). Educational leadership in the age of technology: The new
skills. Journal of research on computing in education, 25(1), 50-60.
Koehler, M. J., & Mishra, P. (2005). Teachers learning technology by design. Journal of
computing in teacher education, 21(3), 94-102.
Kortecamp, K., & Croninger, W. R. (1996). Addressing barriers to technology diffusion. Journal
of Information Technology for Teacher Education, 5(1-2), 71-82.
Lopes, M. (2003). Incorporation of information and communication technologies in schools: The
“Internet for Everyone” Project in Panama. Ministry of Education, Panama.
May, M. K. (2000). Mentoring for Technology Success.
McKenney, S., & Reeves, T. C. (2018). Conducting educational design research. Routledge.
National Council for Accreditation of Teacher Education, Washington, DC. (1997). Technology
and the New Professional Teacher. Preparing for the 21st Century Classroom. ERIC
Clearinghouse.
National Teacher Survey. (2005). This independent national survey was commissioned by CDW
G. Retrieved October 1, 2020 from website
http://newsroom.cdwg.com/features/2005NatlTeacherSurvey.pdf
O'Dwyer, L. M., Russell, M., & Bebell, D. J. (2004). Identifying teacher, school and district
characteristics associated with elementary teachers' use of technology: A multilevel
perspective. education policy analysis archives, 12, 48.
Price, J. K. (2015). Transforming learning for the smart learning environment: lessons learned
from the Intel education initiatives. Smart Learning Environments, 2(1), 16.
Redish, T. C. (1997). An evaluation of a one-year technology professional development program:
The InTech project (pp. 1-200). Georgia State University.
Reynolds, C., & Morgan, B. (2001). Teachers' perceptions of technology in-service: A case study. In Society for Information Technology & Teacher Education International Conference (pp. 982-986). Association for the Advancement of Computing in Education (AACE).
Roberts, B. S. (2002). Using computers and technology in the social studies classroom: A study of
practical pedagogy (pp. 1-284). Georgia State University.
Rosenthal, I. G. (1999). New Teachers and Technology: Are They Prepared?. Technology &
Learning, 19(8), 22-24.
Ruggiero, D., & Mong, C. J. (2015). The teacher technology integration experience: Practice and
reflection in the classroom. Journal of Information Technology Education, 14.
Schaffer, S. P., & Richardson, J. C. (2004). Supporting technology integration within a teacher
education system. Journal of Educational Computing Research, 31(4), 423-435.
Spillane, J. P. (1999). External reform initiatives and teachers' efforts to reconstruct their
practice: The mediating role of teachers' zones of enactment. Journal of curriculum
Studies, 31(2), 143-175.
Tong, K. P., & Trinidad, S. G. (2005). Conditions and Constraints of Sustainable Innovative
Pedagogical Practices Using Technology. International Electronic Journal for Leadership
in Learning, 9(3), n3.
Towne, T. (2018). Exploring the Phenomenon of Secondary Teachers Integrating the LMS
Canvas in a Blended-Learning Course.
Vanfossen, P. J. (2001). Degree of Internet/WWW use and barriers to use among secondary
social studies teachers. International Journal of Instructional Media, 28(1), 57.
Wenglinsky, H. (1998). Does it compute? The relationship between educational technology and
student achievement in mathematics.
Yildirim, S. (2000). Effects of an educational computing course on preservice and inservice
teachers: A discussion and analysis of attitudes and use. Journal of Research on
computing in Education, 32(4), 479-495.
Yildirim, S., & Kiraz, E. (1999). Obstacles in integrating online communications tools into
preservice teacher education: A case study. Journal of Computing in Teacher
Education, 15(3), 23-28.
Zaritsky, R., Kelly, A. E., Flowers, W., Rogers, E., & O’Neill, P. (2003). Clinical design
sciences: A view from sister design efforts. Educational Researcher, 32(1), 32-34.
Zhao, Y., & Bryant, F. L. (2006). Can teacher technology integration training alone lead to high
levels of technology integration? A qualitative look at teachers’ technology integration
after state mandated technology training. Electronic Journal for the Integration of
Technology in Education, 5(1), 53-62.
