
ABSTRACT

SUPPORTING TEACHERS’ INTEGRATION OF

TECHNOLOGY WITH E-LEARNING

By

Andrew T. Fitzgerald

December 2015

Teachers need training to integrate technology into classroom curriculum,

activities, and pedagogy. The adoption of the Common Core State Standards and

statewide computer-based assessments, coupled with technology’s rapid rate of

innovation and change, has only increased the need to support teachers’

development of these necessary skills. The purpose of this project was to create an

online-based e-learning professional development training module for teachers to develop

their technological, pedagogical, and content knowledge (TPACK) and skills. The design

of the training module incorporated e-learning design principles, adult learning

principles, and current research on developing teachers’ TPACK. To provide feedback

on the design, teachers from two middle schools in Southern California were invited to

use the training module, and were surveyed regarding their experiences. Results of the

survey indicate participants gained knowledge and skills for using their school computer

lab and integrating technology into their classroom instruction, and, overall, were pleased

with the e-learning training module.


SUPPORTING TEACHERS’ INTEGRATION OF

TECHNOLOGY WITH E-LEARNING

A PROJECT REPORT

Presented to the Department of Advanced Studies in Education and Counseling

California State University, Long Beach

In Partial Fulfillment

of the Requirements for the Degree

Master of Arts in Education

Committee Members:

Stephen Adams, Ph.D. (Chair)


Lesley Farmer, Ed.D.
Vanitha Chandrasekhar, Ph.D.

College Designee:

Marquita Grenot-Scheyer, Ph.D.

By Andrew T. Fitzgerald

B.M.E., 2003, Florida State University

December 2015
Copyright 2015
Andrew T. Fitzgerald
ALL RIGHTS RESERVED
ACKNOWLEDGEMENTS

First and foremost, I would like to thank my wife, Roxanna. I could not have

completed this project without her incredible support and patience. I would like to thank

Dr. Lesley Farmer for giving me the opportunity to explore the idea of using the e-

learning format to provide a professional development experience for public school

teachers, and for pointing me in the direction of TPACK research. Thank you to Dr.

Vanitha Chandrasekhar and the Long Beach Unified School District for giving me the

opportunity to test my project with two middle schools. A big thank you to my

committee chair, Dr. Stephen Adams, for guiding and directing me through the research,

development, and writing of this project and paper. Thank you to Marnelle Leonard for

being the muse of the module’s scenarios. I would also like to thank my aunt, Dr. Julie

Judd, for always providing a fresh perspective on the project’s development, direction,

and goals.

TABLE OF CONTENTS

Page

ACKNOWLEDGEMENTS ......................................................................................... iii

LIST OF TABLES ....................................................................................................... vi

LIST OF FIGURES ..................................................................................................... vii

CHAPTER

1. INTRODUCTION ............................................................................................ 1

Background ................................................................................................ 1
Statement of the Problem ........................................................................... 3
Proposed Solution ...................................................................................... 5
Importance of Project ................................................................................. 6
Overview of the Project Report ................................................................. 8
Definition of Terms.................................................................................... 8

2. LITERATURE REVIEW ................................................................................. 10

Introduction ................................................................................................ 10
Technological Pedagogical Content Knowledge ....................................... 11
Adult Learning ........................................................................................... 16
Lowe and Holton’s Framework for Effective Computer Based
Instruction for Adults ........................................................................... 19
Content, Interface, Design and Strategy .................................................... 24
Summary .................................................................................................... 41

3. METHODS OF DESIGN AND DEVELOPMENT ......................................... 43

Overview .................................................................................................... 43
Design of the Project.................................................................................. 43
Development Process ................................................................................. 51
Preliminary Testing of the Module ............................................................ 53
Summative Evaluation ............................................................................... 56

CHAPTER Page

4. RESULTS, DISCUSSION, AND CONCLUSION .......................................... 60

Results from the Survey ............................................................................. 60


Summary of Findings ................................................................................. 67
Limitations ................................................................................................. 69
Recommendations ...................................................................................... 70
Conclusion ................................................................................................. 72

APPENDICES ........................................................................................................ 73

A. PRELIMINARY TEST RECRUITMENT EMAIL ......................................... 74

B. PRELIMINARY TEST INFORMED CONSENT FORM .............................. 77

C. PRELIMINARY TEST ONLINE SURVEY QUESTIONS ............................ 81

D. PRELIMINARY TEST PHONE CALL SURVEY QUESTIONS .................. 83

E. RECRUITMENT MESSAGE FOR EMAIL, LMS, AND FLYER ................. 85

F. INFORMED CONSENT FORM ...................................................................... 87

G. PARTICIPANT ONLINE SURVEY QUESTIONS........................................ 90

H. E-LEARNING TRAINING MODULE WEBSITE ......................................... 95

REFERENCES ............................................................................................................ 97

LIST OF TABLES

TABLE Page

1. Application of the CBI Framework, Adult Learning Principles, and


E-Learning Design Principles into the Module’s Design .......................... 61

2. Effectiveness of the Module’s Design to Produce the Desired Learning


Outcome ..................................................................................................... 63

3. Module Design’s Facilitation of Communication and Collaboration Among


Participants................................................................................................. 64

LIST OF FIGURES

FIGURE Page

1. TPACK framework ........................................................................................... 12

2. Development process of TPACK ...................................................................... 16

3. Conceptual framework for effective CBI ......................................................... 20

4. Flow model of practice ..................................................................................... 41

5. TPACK knowledge and skill gaps .................................................................... 45

6. Organizational strategy of the e-Learning module ........................................... 46

7. Screen design and navigational control ............................................................ 47

8. Instructional content area .................................................................................. 49

9. Screenshot of website hosting the e-Learning module ..................................... 50

10. Home screen graphic illustrating the progression through the module’s
learning content.......................................................................................... 5

CHAPTER 1

INTRODUCTION

Background

The 2004 report, Ready or Not: Creating a High School Diploma That Counts,

reported that high school graduates' basic English and math skills fall

short of the skills required by employers (Achieve, Inc., 2004). The report called

for states to update and align their academic standards with the demands of the real world

workforce. In 2009, the National Governors Association (NGA) created the Common

Core State Standards (CCSS) to provide a new set of academic standards for the English

language arts (ELA) and mathematics. These standards were created to overcome

graduating high school seniors’ ill-preparedness for college in a growing globalized

competitive marketplace, and to align the academic expectations of high school graduates

across the United States. Since their inception, the CCSS have been adopted by 43 states

(National Governors Association Center for Best Practices, Council of Chief State School

Officers, 2014). In conjunction with CCSS adoption, states have also adopted a new

student achievement assessment model, eschewing the traditional paper and pencil

method for a new computer-based assessment system. Implementation of the new

technology-infused ELA and math CCSS, along with using new statewide computer-

based assessments to measure student achievement, has amplified the role technology

will need to serve for both students and teachers.

Many of the new ELA and math standards require students to use technology,

beginning as early as kindergarten. The CCSS require students to develop

digital skills such as keyboarding, searching for digital sources on the internet, using

software for writing and publishing, creating digital media, and collaborating on digital

projects (National Governors Association Center for Best Practices, Council of Chief

State School Officers, 2010). The CCSS for the ELA and mathematics subjects were

created with the intention that their design be adapted into other curriculum areas such as

history, science, and the arts (National Governors Association Center for Best Practices,

Council of Chief State School Officers, 2010). Ultimately, the technological skills

developed through the ELA and math standards will be reinforced as other subject areas

integrate their usage. In her commentary, The Light Ahead, Nancy Doorey cites

technological skills as necessary for the future citizen and valued employee (Doorey,

2012). The CCSS embrace this idea, integrating the use of technology throughout every

child's educational experience (The K-12 Center, 2012).

States adopting the CCSS are also required to implement a system for students to

complete online computer based assessments. Currently, there are two major consortia

for providing computer-based assessments to students: the Smarter Balanced Assessment

Consortium (SBAC) and the Partnership for Assessment of Readiness for College and

Careers (PARCC). Beginning in third grade, the digital assessments are administered to

students throughout the school year, measuring their English language arts and

mathematics skills. The estimated time to complete these assessments ranges from

7 to 8.5 hours (SBAC, 2012). For school districts to comply with computer-

based assessment requirements, up-to-date technology equipment is needed to give the

enormous number of students the opportunity to complete the test within a specified

testing window. Test taking scenarios include students using the school site computer

lab, or using a mobile lab of laptops or tablets. Students will require the technical skills

necessary to effectively operate these machines in order to provide an accurate

assessment of their knowledge.

Statement of the Problem

When students lack basic computer skills, it ultimately becomes the

teacher's responsibility to educate them, but who will teach the teachers? Research

shows teachers have basic digital skills such as checking and sending email, accessing the

web, and word processing (Paraskeva, Bouta, & Papagianni, 2008), but lack the

pedagogical knowledge and technology skills necessary for integrating technology into

curriculum (Blackwell & Yost, 2013; Gong, Chen, Cheng, Yang, & Huang, 2013). This

can be attributed to a myriad of reasons, such as lack of proper training (Konan, 2010;

Uzunboylu & Ozdamli, 2011; Yucel & Kocak, 2010), reluctance (Prasertsilp & Olfman,

2014), self-efficacy, and teacher beliefs (Voogt, Fisser, Roblin, Tondeur, & Braak, 2013).

Seasoned teachers have the basic skills for routine computer tasks, but their knowledge

of software often fails to extend beyond word processing and general

productivity skills. New teachers, although more accustomed to using technology, lack

the pedagogical skills to design lessons that utilize technology (Blackwell & Yost, 2013;

Tillery, Varjas, Meyers, & Collins, 2010; Uzunboylu & Ozdamli, 2011). Teachers not

only need to be aware of the dynamically changing field of technology tools (Blackwell

& Yost, 2013; Mishra & Koehler, 2006), but also be competent using them, able to create

engaging digital instructional materials, make documents available online, use office

applications, know the tools in the applications they use, and possess troubleshooting

skills for maintenance and repair (Mishra & Koehler, 2006; Yucel & Kocak, 2010).

Integrating technology into the classroom curriculum creates an unfamiliar,

dynamic learning environment (Mishra & Koehler, 2006). Management routines and

procedures such as student seating arrangements, answering questions, handling

interactions with students, and troubleshooting students’ behaviors need to be adjusted to

accommodate for technology use (Gong et al., 2013; Voogt et al., 2013; Voyiatzaki &

Avouris, 2014). Students will have varying degrees of digital literacy skills, requiring

teachers to differentiate instruction to avoid boredom and misbehaviors (Yucel & Kocak,

2010). Teachers will need to possess greater multitasking skills in order to troubleshoot

hardware- or software-related issues during a lesson (Voyiatzaki & Avouris, 2014). The

change in the teachers’ role from instructors of students’ learning to facilitators of the

learning environment and activities is necessary, and can be a daunting challenge for

teachers (Mishra & Koehler, 2006; Pareskeva et al., 2008; Voyiatzaki & Avouris, 2014).

Teachers need to possess the knowledge and skills to operate classroom hardware

and software, pedagogical skills to integrate technology into curriculum, and understand

how specific technologies are used for specific content subjects (Koehler, Mishra, &

Cain, 2013). However, research shows teachers lack these skills (Koehler et al., 2013),

are not properly trained in their pre-service programs (Uzunboylu & Ozdamli, 2011) and

are not supported with professional development opportunities (Voyiatzaki & Avouris,

2014). Teachers must acquire the necessary knowledge and skills through trial

and error (Tillery et al., 2010; Uzunboylu & Ozdamli, 2011), which creates

unsuitable learning experiences for students (Gong et al., 2013). This issue is not being

ignored by teachers. Recent survey data cites "learning how to differentiate instruction

using technology" as a professional development priority for 50% of teachers surveyed

(The Center For Digital Education, 2014).

Proposed Solution

The knowledge teachers need to integrate technology into the classroom is

represented by their Technological, Pedagogical, and Content Knowledge (TPACK)

(Koehler et al., 2013). Based on a framework of pedagogical and content knowledge

domains (PCK) proposed by Shulman (1986), TPACK introduces the technology

knowledge domain as a necessary body of knowledge for technology integration into

school curriculum (Mishra & Koehler, 2006). The framework is used to describe skills

necessary for teachers, such as operational skills, procedural knowledge, and the ability

to identify the affordances technology provides for specific content subjects (Voogt et al.,

2013).

Training in digital skills and computer lab behavior management is necessary for

teachers to prepare them for integrating technology into the classroom curriculum and

managing students taking their statewide computer-based assessment (Donnelly, McGarr,

& O'Reilly, 2011; Ertmer & Ottenbreit-Leftwich, 2010). For years, school districts have

been preparing for CCSS implementation: upgrading their network infrastructures to

handle the SBAC and PARCC online tests, investing in new computer labs, mobile labs,

and tablets to provide enough devices for students to test at each school location, and

providing professional development for core subject teachers to prepare them for the

instructional shifts necessary to align their instruction with the CCSS. Unfortunately, the

training for technology integration into classroom instruction is absent (Blackwell &

Yost, 2013), as is the training of classroom management techniques for the computer

labs. CCSS adoption is expensive, and although the issues presented have not been

ignored by school district technology leaders, in many cases the trainings to prepare

teachers for them have not been prioritized. Face-to-face professional development is

costly for school districts. Funds must cover the costs of the teachers’ learning time for

the training and for the substitute teachers covering the students in the classroom.

A more viable solution, one that allows the vast number of affected teachers to receive the

professional development they need, is to provide an online e-Learning training module

to begin and foster the development of their TPACK.

An e-Learning training module provides many benefits over traditional face-to-

face professional development. The monetary costs are considerably less, and the

training is presented in a more convenient manner. Learners have greater control over

the learning process, such as the speed at which they learn, the time allotted, and the

location. Learners have access to the training at all hours of the day, and are not confined

to specific locations. Support is available when they need it, as it is already integrated

into the training experience. Traditional face-to-face training provides a one-time

experience to the learners, whereas the training in an e-Learning module can be accessed

as often as needed. E-Learning improves the efficiency of training, decreases errors, and

can improve morale among the learners (Allen, 2003).

Importance of Project

CCSS adoption and computer-based assessment of the CCSS began

nationwide during the 2014-2015 school year. The CCSS have created an opportunity for

students to learn important digital skills in the classroom (Achieve, Inc., 2004); however,

teachers must learn what is required in order to create a positive, efficient digital learning

experience for their students. Providing teachers training on how to manage the new

instructional environment of digital devices and to integrate technology can generate a

positive experience of this complex paradigm shift. Research already shows that a

reluctance among teachers to implement technology, based on their attitudes and self-

efficacy (Lee & Lee, 2014), can lead to lowered expectations, confidence, and frequency

of usage (Donnelly et al., 2011; Paraskeva et al., 2008). Providing teachers with the

valuable knowledge and skills through a TPACK influenced e-Learning training

beforehand will create an easier, friendlier transition. A thorough multimedia-rich

professional development experience can improve attitudes, build basic skills, and can

establish good practices for lesson planning and software application usage.

Although the TPACK framework is intuitive and an appropriate solution for

explaining the body of knowledge necessary for technology integration (Ansyari, 2013;

Blackwell & Yost, 2013), TPACK theory and its implementation are still complicated

(Voogt et al., 2013). Professional development studies designed with a holistic approach

to the framework have provided clues to successful design strategies to use with teachers

such as collaboration, modeling, lesson plan design, and contextual learning activities

(Mishra & Koehler, 2006; Voogt et al., 2013). However, these previous strategies

required face-to-face professional development, which in turn can create obstacles for

teachers and districts such as time and funding. Providing an online-based training

experience circumvents these issues. Previous online-based professional development

research showed e-Learning to be a sufficient method for developing TPACK, provided

it is not a "one-shot" experience (Doering, Veletsianos, Scharber, & Miller, 2009). This

project aims to further research on instructional design and development of online-based

professional development for developing teachers’ TPACK by creating a continuously

available and accessible online resource that combines research-backed strategies for

TPACK development, adult learning, and e-Learning design principles.

Overview of the Project Report

This report describes the design of an e-Learning training module developed to

target the development of teachers’ TPACK knowledge and skills. Chapter 2 presents a

review of literature including the TPACK framework, adult learning principles, a

framework for computer-based instruction, and e-Learning design principles used to

design and develop the training module. Chapter 3 describes the methods used for the

design of the module, site and participant selection, formative evaluations, and a

summative evaluation. Lastly, Chapter 4 discusses findings from this evaluation and

gives recommendations.

Definition of Terms

Andragogy: A theoretical approach to adult education, consisting of six

principles of how adults are motivated to learn.

Common Core State Standards (CCSS): A set of math and English language arts

standards recently developed for kindergarten through 12th grade education. Adopted by

43 states, they provide the goals of the knowledge and skills students will acquire in

school.

e-Learning: Digital instruction accessible through a computer device. It uses

multimedia such as videos, audio, and graphics to support the content delivery.

Instructional content can be stored locally on the device or accessed from another

location such as a server connected to the internet, and allows for learning to be

synchronous or asynchronous.

Practice Strategy: An element of computer-based instruction referring to the

design of the e-Learning content and interactivity presented to the learner.

Screen Design: An element of computer-based instruction referring to the

layout of what learners see on the computer device screen. The interface that provides

the content, activities, and navigational controls for the learner.

Successive Approximation Model: An instructional design model that uses three

iterations of an evaluate, design, and develop cycle. Throughout the process, prototypes

are created and tested, and feedback is solicited from stakeholders.

Technological, Pedagogical and Content Knowledge (TPACK): A framework

describing the relationship between three bodies of knowledge teachers need to integrate

technology into their instructional practices.

CHAPTER 2

LITERATURE REVIEW

Introduction

The project design builds on several bodies of literature pertinent to creating and

designing an effective e-Learning module for TPACK development. A search of relevant

literature was conducted using California State University Long Beach’s OneSearch

academic database tool and the Academic Search Complete database. Sources include

journal research articles and books on the TPACK framework, TPACK-influenced

professional development, e-Learning module design and development principles, and

andragogy. The first body of literature is a review of the TPACK framework and current

findings relating to its use for educator professional development. The second section is

a review of andragogy. Andragogy, developed by Malcolm Knowles, describes his

principles of adult learning, and is considered the art and science of educating adults

(Forrest & Peterson, 2006). The principles outline the different needs adults have when

acquiring knowledge, and the influences the principles have on the creation of a

meaningful, relatable professional development experience. The third body of literature relevant to

the project is a theoretical framework for computer-based instruction, describing the

different constructs within an effective e-Learning program, and how the constructs

influence one another within the overall design. Lastly, e-Learning design principles are

presented in the review, outlining the concepts for effective interface design, presentation

of content, interactivity, practice flow, feedback, usage of multimedia objects, program

control, and motivating programming techniques.

Technological Pedagogical Content Knowledge

Mishra and Koehler (2006) developed the TPACK framework as a response to the

lack of theoretical grounding of educators' usage of educational technology. TPACK

describes the relationships between three bodies of knowledge educators need to possess:

content, pedagogy, and technology (see Figure 1). Content knowledge (CK) refers to

educators' mastery of the subject knowledge they teach. Pedagogical knowledge (PK)

pertains to educators' mastery of how individuals learn as well as managing learning

environments. Educators' mastery of standard and digital technologies is described as

their technology knowledge (TK). According to the framework, these bodies of

knowledge converge upon each other, creating a blending of the knowledge domains.

Pedagogical Content Knowledge (PCK), first introduced by Shulman (1986), represents

educators' abilities to create the proper learning experiences for the knowledge transfer of

their particular subject matter to learners. Mishra and Koehler's addition of TK

presented additional blended bodies of knowledge in the framework: technological

pedagogical knowledge (TPK), technological content knowledge (TCK), and

technological pedagogical content knowledge (TPACK), the blending of all three bodies

of knowledge. TPK describes educators' abilities using their TK to select and apply the

proper technology to support and improve their teaching practices. TCK describes how

technology and content influence one another in a mutual relationship. Choosing the

right technology to use is directly influenced by the particular subject content to be

learned, whereas the choice of subject matter to be presented to learners is directly

influenced by what technologies are available to the educator. TPACK is regarded as the

"Total PACKage,” the convergence of all three knowledge domains. TPACK represents

the utilization of all three knowledge domains to effectively integrate technology into the

classroom environment, instruction, and curriculum.

FIGURE 1. TPACK framework (Reproduced by permission of the publisher, © 2012 by


tpack.org). The relationships between the three knowledge domains for effective
teaching with technology.
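
As a supplement to Figure 1, the following minimal Python sketch offers one concrete way to picture the seven constructs; representing each construct as the combination of base domains a teaching decision draws upon is an illustration added here, not a formalization from Mishra and Koehler and not a component of the project's training module.

```python
# Illustrative sketch: each TPACK construct corresponds to the combination of
# base knowledge domains that a teaching decision draws upon.
from typing import Iterable

CONTENT, PEDAGOGY, TECHNOLOGY = "content", "pedagogy", "technology"

TPACK_CONSTRUCTS = {
    frozenset({CONTENT}): "CK",
    frozenset({PEDAGOGY}): "PK",
    frozenset({TECHNOLOGY}): "TK",
    frozenset({CONTENT, PEDAGOGY}): "PCK",
    frozenset({TECHNOLOGY, PEDAGOGY}): "TPK",
    frozenset({TECHNOLOGY, CONTENT}): "TCK",
    frozenset({CONTENT, PEDAGOGY, TECHNOLOGY}): "TPACK",
}


def classify(domains: Iterable[str]) -> str:
    """Return the construct named by the set of base domains a decision draws on."""
    return TPACK_CONSTRUCTS[frozenset(domains)]


# Example: choosing a graphing app to teach slope to eighth graders draws on all three domains.
print(classify({CONTENT, PEDAGOGY, TECHNOLOGY}))  # -> TPACK
```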

Many reasons for the initial development of the TPACK framework still exist

today. Teachers' technology knowledge, an important characteristic of overall teacher


knowledge (Mishra & Koehler, 2006), has to keep up with the rapidly changing

field of digital technologies. This includes learning newer operational skills for both

hardware and software, their affordances, and the ability to troubleshoot issues (Mishra &

Koehler, 2006; Voogt et al., 2013). Deficiencies in TK are a reliable predictor of

teachers' technology-related self-efficacy and beliefs (Paraskeva et al., 2008;

Voogt et al., 2013), which in turn may affect their reluctance, acceptance, and usage of

digital technologies in the classroom (Prasertsilp & Olfman, 2014; Voogt et al., 2013).

Just like the dynamic nature of technology innovation, teachers' TK must also be

dynamic, changing as older technologies become outdated and unnecessary.

Teachers' lack of TCK and TPK is not limited to the rapidly evolving innovations

of technology. Research shows that many in-service teachers never received educational

technology training during their pre-service education (Blackwell & Yost, 2013),

underwent previous trainings for older technologies, or had prior experiences that were

inadequate (Mishra & Koehler, 2006). Regardless, in today's educational settings

teachers must be able to recognize the affordances of specific technologies and their

ability to change the presentation, representation, and instructional delivery of content to

learners (Voogt et al., 2013). They must be aware of the necessary time needed for

careful lesson preparation, such as lesson planning, comprehensive learning outcomes,

and well-designed activities and assessments (Gong et al., 2013). Teachers need to

possess the knowledge of how technology will influence the management of classroom

instructional procedures, routines, and interactions within the learning environment

(Voyiatzaki & Avouris, 2014).

Research studies have used the TPACK framework to design professional

development learning experiences for teachers, aimed at improving their skills integrating

technology into classroom instruction (Voogt et al., 2013). Mishra and Koehler's (2006)

original study emphasized a learning-technology-by-design approach, where users

participate in authentic, contextual learning activities. The activities are learner-

controlled, and require teachers to problem solve technology-related issues such as lesson

design and choosing the best hardware or software solutions. Since the original study, a

multitude of other researchers have applied the framework in their own studies,

furthering investigation on the effectiveness of teacher professional development

activities influenced by the TPACK framework (Voogt et al., 2013). Findings include

ideal characteristics for developing TPACK in teachers, such as modeling, mentoring,

face-to-face interactions, and collaboration. Authentic activities that teachers can relate

to, such as developing technology-enhanced lessons and redesigning previous lessons to

use technology, have proved to be successful at developing TPACK skills (Ansyari,

2013; Voogt et al., 2013). In congruence with Mishra and Koehler's original design,

research suggests that TPACK professional development be built around a holistic

approach to the framework, with learning activities incorporating the three

knowledge domains of content, pedagogy, and technology into the activity design (Voogt

et al., 2013). Overall, the TPACK framework is regarded as a sound theory for further

research and development of teacher professional development activities for technology

integration (Ansyari, 2013; Koehler et al., 2013; Voogt et al., 2013).

After studying math teachers' use of spreadsheets in instruction, Niess, Sadri, and

Lee (2007) proposed a model for the development process of teachers’ TPACK skills,

stemming from Rogers's (2010) diffusion of innovations theory (see Figure 2). The

linear process has five stages.

1. Recognizing: Teachers understand the affordances of a particular technology,

yet do not integrate it into the curriculum.

2. Accepting: Teachers develop an attitude on whether or not to use the

particular technology in their instruction.

3. Adapting: Teachers begin to experiment with the technology, developing

strategies and resources that would enable its instructional integration.

4. Exploring: Teachers actively use the technology in their instruction.

5. Advancing: Teachers reflect on their experience integrating the technology

into their classroom instruction.

Özgün‐Koca, Meagher, and Edwards (2011) used this development model, along

with the TPACK framework, in their study of a math teacher's usage of graphing

calculators with the curriculum. The study showed both models to be beneficial for

professional development and reinforced the importance of teachers' personal reflection

relating to their TPACK development. Although the teacher's TPACK development

followed the linear progression of the proposed model (Niess et al., 2007), the study

showed it did not develop in a hierarchical order (Özgün‐Koca et al., 2011). This

outcome suggests that although teachers go through the five stages of TPACK

development, they may move forward or backward through the process based on the

experiences they have.
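
One way to make this forward-and-backward movement concrete is the short Python sketch below; the rule linking stage changes to positive or negative classroom experiences is an assumption added purely for illustration and is not part of Niess et al.'s model or of this project's module.

```python
# Illustrative sketch of movement through Niess et al.'s (2007) five TPACK stages.
from enum import IntEnum


class TPACKStage(IntEnum):
    RECOGNIZING = 1
    ACCEPTING = 2
    ADAPTING = 3
    EXPLORING = 4
    ADVANCING = 5


def next_stage(current: TPACKStage, positive_experience: bool) -> TPACKStage:
    """Hypothetical rule: a positive experience with the technology moves a teacher
    forward one stage, a negative one moves them back (movement is not strictly
    one-directional, per Ozgun-Koca et al., 2011)."""
    step = 1 if positive_experience else -1
    bounded = min(max(int(current) + step, 1), 5)
    return TPACKStage(bounded)


stage = next_stage(TPACKStage.ADAPTING, positive_experience=False)
print(stage.name)  # -> ACCEPTING
```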

FIGURE 2. Development process of TPACK (Niess et al., 2007. Used with permission).
The linear process in which a teacher develops their TPACK.

Adult Learning

Adults do not acquire knowledge in the same manner as children. Pedagogical

practices of teaching subjects to children in school, where the teacher is in charge of

content instruction and delivery, do not apply to adult learning and can even be

harmful to the adult learning process (Knowles, Holton, & Swanson, 2011). Andragogy,

the science of adult learning, focuses on instruction presented in a relatable situational

experience significant to the learner where the needs and interests of the learner are a

catalyst for motivation. The andragogy model of adult learning, developed by Malcolm

Knowles, describes the basic principles for adult learning: learner’s need to know, self-

concept of the learner, prior experience of the learner, readiness to learn, orientation to

learning, and motivation to learn (Knowles et al., 2011). Knowles’s model is based upon

his research and the research and theories of other learning experts such as Lindeman,

Rogers, and Maslow (Knowles et al., 2011).

Principles of Andragogy

Need to know. Before adults put in the time and effort necessary to learn new

knowledge and skills, they require a purpose for the learning. Adults put in the energy

required to complete the learning process only after they realize the rewards of learning.

Explaining the need to know is the first task and responsibility of the facilitator of the

learning process. Providing real or simulated experiences can help the learner realize the

knowledge and skill gaps they may currently possess (Knowles et al., 2011).

Learner’s self-concept. Adults take the responsibility of making decisions that

relate to their own life experiences. They are self-directing. The pedagogical model of

instructor-controlled learning, where the learner is dependent upon the teacher, conflicts

with the self-directing psychological nature of adults. When learning is forced by the

will of others, adults will resent, resist, and flee the situation (Knowles et al., 2011).

Role of learner’s experiences. Adults’ life experiences shape their biases,

predispositions, mental habits, practices, and presumptions, causing them to block out

new ideas, insights, and alternative ways of thinking. Adult learners are of varying ages

with different interests and motivations, and the scope of their life-shaping personal

experiences will differ greatly from their peers and the facilitator of an instructional

activity. Adult instruction has to take into account the learners’ experiences, being

careful not to devalue or overlook it, as this can lead to the dismissal and rejection of new

content presented. Instructional facilitators can incorporate the adult learners’

experiences through activities such as group discussions, problem solving, and

simulations (Knowles et al., 2011).

Readiness to learn. Adults need to be mentally ready to learn in order to

be able to incorporate their learning into their daily activities and situations. Content

must be relevant and applicable to the learner, and the facilitator of learning must be able

to contextualize it for them (Forrest & Peterson, 2006). Lack of relevance can lead to

poorly engaged learners, so the importance of the subject should be demonstrated if

necessary. Activities such as role playing and mentoring can be utilized to increase

readiness (Holton, Swanson, & Naquin, 2001).

Orientation to learning. Adults learn in order to solve problems. They are motivated to learn

when the subject matter will increase their effectiveness with relatable, life-centered

problems and tasks. They seek to know how they can apply the subject matter to solve

their personal and work-centered issues immediately, rather than for future use. Subject

matter must be presented in a context that is relatable to their field of work, and must be

seen as applicable (Knowles et al., 2011).

Motivation. The more adults are motivated, the better they will learn. External

motivating factors such as promotions or salary increases can be used; however, adults

learn best when they are motivated intrinsically. Internal ideas such as self-esteem, job

satisfaction, quality of life, and personal growth development have a more permanent

influence on motivating adults to learn. Factors that can block intrinsic motivators should

be identified and avoided (Knowles et al., 2011).

Lowe and Holton’s Framework for Effective Computer Based Instruction for Adults

Past research studies have sought to identify the key variables and components of

quality e-Learning and the learning characteristics of the users. Lowe and Holton (2005)

created a theoretical framework for effective computer based instruction (CBI). By

analyzing previous research of computer learning frameworks and the integration of

multiple learning theories such as behavioral, cognitive, and social, they developed a

model highlighting the critical components needed to have an effective CBI framework.

Figure 3 shows Lowe and Holton’s critical components of e-Learning within a

framework. Their model uses a systems approach to identify the building

blocks or units needed for effective CBI, their relationships, and how they affect the

learning process and outcome. The units are divided horizontally to show their role as an

input, process, or output in the model, and vertically to represent their influence as a

support or design part of the model. Laws of interaction were developed to describe how

units affect each other in the learning process.

Inputs

The self-directedness construct describes the learners’ motivated personal

responsibility for the processes involved in their learning. It describes their ability to

independently plan, conduct, and evaluate their learning (Lowe & Holton, 2005). E-

Learning provides users the ability to learn at their own speed, yet requires them to show

a degree of personal responsibility for completing the instruction. The learners’ self-

directedness is influenced by their locus of control, metacognitive skills, and motivation

to learn. Locus of control is the learners’ belief that control over events in instruction is

FIGURE 3. Conceptual framework for effective CBI (Lowe & Holton, 2005). The
critical components of CBI and their roles and relationships within the framework.

attributed internally to themselves or externally to the environment. Learners with an

internal locus of control will require more freedom and control over the pacing and

processes of instruction in an E-Learning module, whereas learners with an external locus

of control will need the e-Learning environment and interface to provide more guidance

over their process of learning. Metacognition describes people’s ability to recognize how

they learn and control that process. Whether or not they utilize that skill with their

learning can affect their self-directedness as learners. As stated in the andragogy model

of adult learning, motivation is a critical component of adult learning (Knowles et al.,

2011). The learners’ perception of the value of the training, such as increasing

productivity and work efficiency, affects how intrinsically motivated they will be during

instruction.

Learners’ belief in their ability to successfully use a computer is a measure of

their self-efficacy. Self-efficacy can affect learners’ motivation, behavior, persistence,

and effort. This unit is particularly important, since adult teachers show a deficiency in

computer skills and familiarity (Tondeur et al., 2012). A low self-efficacy in relation to

technology can affect their desire or reluctance to complete an e-Learning training. The

learning goal level describes the learning-domain-specific activities and performance

necessary for the learners to achieve the preferred learning outcome. Comprehension of

the learning goal influences the overall instructional design and strategy,

such as the behaviors learners must demonstrate to achieve the learning outcome.

Process

External support. Adults prefer a method of support in their learning. External

support is provided in many forms: peers, technology assistance, efficient computer

hardware, and appropriate lengths of time for instruction. The effective CBI model

emphasizes the importance of external support, labeling it a necessity.

CBI design. Governed by four individual units: instructional control,

instructional support, screen design, and practice strategy, the CBI design addresses the

needs of learners to present the best possible learning outcome. Three variations of the

instructional control construct can be used. First, a program control design guides the

learners through instruction, controlling elements such as pacing and sequence. Second,

learner-controlled instruction gives control of the instructional program to the learners.

Research studies attribute success in e-Learning to a learner-controlled

environment (Clark & Mayer, 2008). Lastly, an adaptive control design combines

both forms of control, and adapts to the learner, based on their interactions and responses

to the CBI. The instructional support construct provides support for the content during

the learning process. This can be achieved through hints, visual aids, and evaluative or

corrective feedback. Instructional support can also be used as a motivational tool for

learners. The screen design construct influences learners’ motivation and controls the

transfer of information. Screen design pertains to the visual layout, the navigational

features, and use of visual graphics. The practice strategy construct can vary, based upon

learners’ characteristics of learning as well as upon the subject matter. Research has

shown higher achievement related to greater amounts of practice.

Instructional strategy design. The components of the instructional strategy design

construct determine the overall presentation of the instruction to the learners.

Organizational strategy, influenced by the learning objectives, determines the proper

sequencing of the content, interaction between learners and interface, and the methods

used for application of learned knowledge. Delivery strategy describes the medium used

for instruction and its environment, ranging from a solo learning environment to group

settings. The unit governing the organization and scheduling of CBI is the management

strategy. Instruction has to fit within the learners’ schedule, resources need to be

available, and records need to be kept.

Output

Lowe and Holton have only one output for their CBI model, the learning

outcome. It comprises the measurable, performance-based behaviors of the learners after

completion of the CBI.

Laws of Interaction

Based on their framework theory, Lowe and Holton suggest each building block

unit influences others through the laws of interaction. For instance, learners with high

levels of self-directedness will need less instructional control, whereas learners with

lower attributes of self-directedness, such as low self-efficacy, a more external locus of control, or

lower metacognition skills, will need a higher level of instructional control from the CBI.

Levels of self-directedness are described as four stages, ranging from dependent learners

to self-directed learners (Lowe & Holton, 2005). The learners’ level of computer

self-efficacy affects the levels of instructional control and support needed.

Low confidence requires more coaching and feedback from the system. To meet the

needs of all levels of self-efficacy, implementing an adaptive mix of program and learner

control is the ideal solution; however, this requires more time and effort from the

designer to create (Clark & Mayer, 2008). Both self-directedness and computer self-

efficacy have a direct impact on the level of external support. Learners with lower levels

of the former will seek out and be dependent on various forms of external support such as

supervision, immediate feedback, direction, interaction, motivation, technical support,

and time away from work. The CBI design and external support balance each other out.

A weak CBI design requires more external support and vice versa. Also influencing the

CBI design are components of the instructional strategy design construct. The

organizational strategy influences the layouts of the screen design and practice strategies

used. The learning goal has influence over the instructional strategy design construct, as it

will determine the development and presentation of content and events in the instruction.

CBI design is similarly influenced by the learning goal level construct. The taxonomy

level of the learning goal will determine the amount of instructional support, as well as

the amount of program control needed for the learner to acquire the knowledge presented

throughout the CBI.

Lowe & Holton’s theoretical framework for effective computer based instruction

is based on previous research on the central variables of e-Learning instruction, as well as

current behavioral and cognitive theories of learning, and functions within the domain of

adult learning (Lowe & Holton, 2005). The framework lays out the foundational

constructs of computer-based instruction, and explores the pertinent behavioral

relationships. Described as the laws of interaction, these relationships link the constructs

to each other using a systems approach. Based on how well aligned the units of theory

are, the model can take on various system state conditions. An effective system state

occurs when both the support and design are well aligned and complement one another.

A moderately effective system state occurs when there is a partial alignment, such as a

strong support but weak design, leading to a reasonably effective program. Programs

with an ineffective system state have a strong misalignment within the top and bottom

portions of the CBI design. As stated by the laws of interaction, deficiencies in particular

units of theory will affect the soundness of other units under their influence, weakening the

strength of the support and design constructs, leading to an ineffective program.
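
To show how these laws of interaction could inform concrete design choices, the sketch below maps two of the learner-side input constructs to an instructional control recommendation; the numeric scale, thresholds, and wording of the recommendations are invented for illustration and are not prescribed by Lowe and Holton or used in this project's module.

```python
# Illustrative sketch of the laws of interaction: weaker learner-side inputs
# call for more program control, instructional support, and external support.
from dataclasses import dataclass


@dataclass
class LearnerInputs:
    self_directedness: float       # assumed scale: 0.0 (low) to 1.0 (high)
    computer_self_efficacy: float  # assumed scale: 0.0 (low) to 1.0 (high)


def recommend_control(learner: LearnerInputs) -> str:
    """Map learner inputs to a control strategy (thresholds are assumptions)."""
    score = (learner.self_directedness + learner.computer_self_efficacy) / 2
    if score < 0.34:
        return "program control with frequent coaching, feedback, and external support"
    if score < 0.67:
        return "adaptive control mixing program and learner control"
    return "learner control with minimal instructional support"


print(recommend_control(LearnerInputs(self_directedness=0.2, computer_self_efficacy=0.4)))
```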

Content, Interface, Design and Strategy

Effective delivery of an e-Learning module's instruction and content is contingent

on various design factors. A module's interface, navigational tools, presentation of

information and feedback, and choice of instructional media can all impact learners.

Well-designed interactivity and practice activities can stimulate and maintain learners'

interest throughout the entire course. Conversely, a poor design of these important

factors can lead to a dull instructional experience. In e-Learning and the

Science of Instruction, the authors Clark and Mayer (2008) present their view of key

concepts and principles of e-Learning instructional design. Their work is a culmination

of the authors’ study of e-Learning research and personal experiences.

Principles of E-Learning

Cognitive theory of multimedia learning. Clark and Mayer base their learning

theory on knowledge construction, where learners are actively engaged in the content,

leading to increased knowledge acquisition. Their model is composed of four principles.

1. Dual channels: People have separate channels for processing visual/pictorial

material and auditory/verbal material.

2. Limited capacity: People can actively process only a few pieces of

information in each channel at one time.

3. Active processing: Learning occurs when people engage in appropriate

cognitive processing during learning, such as attending to relevant material, organizing

the material into coherent structure, and integrating it with what they already know.

4. Transfer: New knowledge and skills must be retrieved from long-term

memory during performance. (Clark & Mayer, 2008)

Managing the learners’ cognitive processing is vital for e-Learning instruction.

Extraneous cognitive processing, created by poorly designed instructional layouts that do

not support the learning objective, needs to be minimized by the e-Learning designer.

Essential processing, during which the learners focus on the main instructional material

relevant to the learning objective, is crucial to the design. Incorporating motivation into

the instruction, prompting learners to develop a deeper understanding of the content, is

generative processing, which is also a desirable cognitive process for instructional

designers.

Cognitive learning theory, in which learners process information into knowledge

and skills, is also foundational for the multimedia learning theory. Designers must

develop instruction that fosters the transfer of information into the learner’s working

memory, relates it to prior knowledge stored in long-term memory, and creates instances of

information retrieval from long-term memory back into the working memory. Julie

Dirksen’s 2012 book Design for How People Learn describes long-term memory as a

“closet full of shelves.” The goal for instructional designers is to make e-Learning

content relatable to the learners through various means, enabling learners to store

information on the various “shelves” in their “closet.” This will increase the probability

of learners’ information retrieval from long-term memory for later use. Too much

information presented at once can overload the working memory. E-

Learning design must use a “less is more” approach in presentation of information to

reduce cognitive load. Following the dual channel principle, presenting information

through both visual and audio methods at the same time can reduce the cognitive load.

Active processing of the information, by means of examples and practice in the e-

Learning instruction, will help advance it into the “shelves” of the learner’s long-term

memory.

Multimedia principle. To take advantage of the dual channel cognitive process,

Clark and Mayer (2008) recommend using words and graphics, allowing for learners to

create visual and verbal mental representations of the content. Words can be presented as

printed text as well as spoken word. Graphics can be static and dynamic, the latter

including animations and video. Graphic usage is categorized as decorative,

representational, relational, organizational, transformational, and interpretive. Decorative

graphics should be avoided, as their function does not enhance learning. Organizational

graphics show relationships between elements, while transformational graphics depict changes in

an element over time, interpretive graphics convey invisible connections between elements, and

relational explain quantitative relationships and processes. Representational graphics that

demonstrate the appearance of an object should never be used more than once, as that can

lead to extraneous processing in the working memory. Except in the demonstration of a

particular motor skill, Clark and Mayer’s research studies find that static images are more

effective than animation-based graphics.

Contiguity principle. Contiguity relates to the placement of corresponding printed

text and graphics. To decrease cognitive load, these corresponding elements should be

placed near each other within the design layout on the screen. In the case of large

amounts of text, using a rollover technique is recommended. Clark and Mayer provide

several examples of violating this principle: separating text and graphics on a scrolling

screen, displaying feedback on a separate screen, locating the directions of an exercise on

a separate screen from the exercise, placing text at the bottom of the screen away from

the graphics, and isolating the printed description of a key element from its correlating

visual graphic. When using narration, the spoken words should be presented at the same

time as the correlating graphics, especially when a video is showing steps to perform a

particular performance task. Narration of text not corresponding with the presented

graphic violates this principle. Designers should avoid mistakes such as embedding

narration and video into separate links.

Modality principle. To lessen the burden on the visual channel of cognitive

processing, text should be presented as spoken word rather than printed. Otherwise,

printed text and visual graphics could create too much extraneous processing on the same

cognitive channel. This is especially relevant in the case of animated graphics and video,

in which learners cannot focus on both presentations of visual information at the same

time. The authors recommend violating this principle only in special circumstances when

keywords in the instruction are presented. These include steps in a procedure, technical

terms, or specific directions for an activity.

Redundancy principle. Explanations of visuals should occur in either spoken

word or printed text. Both should not be used at the same time. Using narration is better,

as it uses both the audio and visual channel of cognitive processing. Combining both

narration and printed text with a visual graphic would overload the visual channel.

However, the authors present circumstances in which the use of printed text and narration

could be necessary, such as when no visual graphic is present, when enough time is allotted

for its cognitive processing, or when the vocabulary is technical. Technical terms

presented aurally can create a difficulty in the learner’s mental processing.
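
During storyboarding, the modality and redundancy principles can be treated as simple screen-level checks, as in the sketch below; the ScreenSpec fields and the two rules are a loose paraphrase added for illustration, not a tool from Clark and Mayer or a component of the project's module.

```python
# Illustrative screen-level check against the modality and redundancy principles.
from dataclasses import dataclass
from typing import List


@dataclass
class ScreenSpec:
    has_graphic: bool
    has_printed_text: bool
    has_narration: bool
    narration_repeats_text: bool


def check_screen(screen: ScreenSpec) -> List[str]:
    """Return warnings for likely conflicts with the two principles."""
    issues = []
    if screen.has_graphic and screen.has_printed_text and not screen.has_narration:
        issues.append("Modality: explain the graphic with narration rather than printed text.")
    if screen.has_narration and screen.has_printed_text and screen.narration_repeats_text:
        issues.append("Redundancy: drop either the narration or the duplicated printed text.")
    return issues


print(check_screen(ScreenSpec(True, True, True, True)))
```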

Coherence principle. The coherence principle states “less is more.” Extraneous

use of graphics, text, and sounds that are irrelevant to the instructional goal should be

avoided. These are considered seductive details: interesting but irrelevant material that

clutters up the learning process. Background sounds such as sound effects and music,

although considered interesting, should be avoided. Extraneous graphics are

unnecessary, and can distract the learner away from relevant information or disrupt the

process of applying the relevant information to prior knowledge. Seductive graphics

cause the learner to cognitively process and organize the wrong information. Words used

for extraneous descriptions, interest, and details should be avoided. Important printed

text such as key terms should be indicated by a signaling technique, whereby the layout

of the signaled text contrasts with other text, drawing the learner’s attention towards it.

This can be achieved by changing the font, font size, or text color.

Personalization principle. Research has shown learners achieve better

understanding and performance of the learning objective when the content of the e-

Learning instruction is presented in a more conversational tone of voice, rather than in a

formal tone of voice (Clark & Mayer, 2008). By using a first- or second-person tone of

voice with the words “you” and “I”, a relationship is built between the learner and the

program. Research shows this technique is advantageous when humans try to make sense

of the presented content. Using a conversational tone prepares the learners’ cognitive

processes, makes them work harder, and reinforces a deeper understanding of the content.

Politeness of the conversational voice is important, and learners should feel like they are

interacting with another person. Overuse should be avoided, as it can become a

distraction.

The second part of the personalization principle recommends the use of an

onscreen coach. Described as a “pedagogical agent,” the onscreen character is used in

the delivery of content process as a guide to the learners. Functioning examples of a

pedagogical agent include a cartoon character or avatar, who uses conversational spoken

words in a human voice rather than a machine-generated voice. Human likeness of the

agent is not necessary. The coach can also be represented as an object relating to the

content of the instruction. Examples of proper use of an onscreen coach include having

the coach providing hints, examples, and acting as a guide for demonstrations and

explanations.

Clark and Mayer state the relationship between the learner and the program can

be reinforced by the visibility of the program author. By revealing information about the

author such as his/her personality and perspectives, learners can develop a more human-

to-human relationship with the e-Learning program, reinforcing the learning process and

maintaining their levels of interest. A fine balance must be maintained, however, as

too much author visibility can become a seductive detail and distract the learner.

Segmenting and pre-training. Lengthy or complex instructional content presented

to the learner at once can cause cognitive overload, and can weaken the integrity of the

lesson. Overcoming this issue requires breaking down the content into smaller sections,

thus segmenting the instructional delivery to the learners. Pre-training is the introduction

of the names and characteristics of key concepts before the actual instruction begins.

This technique reduces the cognitive load by redistributing the key concepts to the

beginning of the lesson.

Worked examples and practice. E-Learning instruction presents information to

users in order to build new knowledge and skills into long-term memory. Acquisition of

the skills is reinforced by providing worked examples and practice throughout the e-

Learning setting. Worked examples are activities used in e-Learning to help learners

build workplace skills by providing step-by-step demonstrations related to the

performance of particular tasks or problem solving. Clark and Mayer highlight several

principles of correctly incorporating worked examples into the design of e-Learning.

Over the course of the e-Learning instruction, learners will gain more experience and

knowledge of the content. As the new knowledge is stored in long term memory,

continuously providing worked examples can have an expertise reversal effect. Learners

will no longer pay attention to the worked examples, and this can create extraneous

cognitive processing. Transitioning worked examples to problem solving exercises using

the fading technique can circumvent the expertise reversal effect. Fading accommodates

the learners’ acquisition of knowledge and skills by gradually eliminating the sequential

steps of a worked example. As demonstrated steps are removed, learners are required to

complete the missing steps on their own. Fading proceeds gradually until the learners are

solving every step of the procedure themselves.
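
A minimal sketch of fading is shown below; the data structure, step text, and function names are assumptions made for illustration, not the module's actual implementation. Each pass through the example hides one more demonstrated step, so the learner gradually completes the entire procedure alone.

```typescript
// Sketch of the fading technique: a worked example begins fully demonstrated,
// and each pass removes one more demonstrated step from the end, leaving the
// learner to perform it. When no steps remain demonstrated, the example has
// become a full problem-solving exercise.
interface WorkedExample {
  title: string;
  steps: string[];      // the full, ordered procedure
  demonstrated: number; // how many steps are still shown to the learner
}

function fade(example: WorkedExample): WorkedExample {
  return { ...example, demonstrated: Math.max(0, example.demonstrated - 1) };
}

function render(example: WorkedExample): void {
  example.steps.forEach((step, i) => {
    const shown = i < example.demonstrated;
    console.log(shown ? `Shown:  ${step}` : `You do: ${step}`);
  });
}

// Hypothetical content, invented for this example:
let connectProjector: WorkedExample = {
  title: "Connect the classroom projector",
  steps: ["Locate the VGA port", "Attach the cable", "Select the input source"],
  demonstrated: 3,
};

render(connectProjector);                  // every step demonstrated
connectProjector = fade(connectProjector); // the last step now falls to the learner
render(connectProjector);
```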

Other design techniques for worked examples include self-explanation, where

learners are asked to identify the principles and concepts used in a worked example.

Worked examples can be accompanied by explanations to provide rationale for the steps.

Explanations can be an on-demand feature, controlled by the learners, or a response to an

error controlled by the program. Previously stated multimedia principles should be

applied to all worked examples. Examples can be chunked into meaningful segments,

labeled, and made available to the learners at any point in the learning process.

Worked examples should be designed to support the learning of near transfer and

far transfer goals. Near transfer learning describes step-based procedures that can be

applied in the workplace. Embedding contextual cues into the learning will help learners

retrieve the necessary information as they experience those same cues in the workplace.

Far transfer learning design is used to reinforce the learning of judgment and problem-

solving skills that can be applied in various situations of the workplace. More than one

worked example is needed to implement this process. Worked examples will present

different situations to the learners in which the underlying principles and concepts are

still the same. Self-explanation of the principles can be required for the learners to

complete the worked example.

Practice provides opportunities within the e-Learning setting for learners to build

and develop their new skills. Practice presents challenging tasks to them; the tasks are

difficult at first, but can be mastered over the course of the instruction. By targeting skill

gaps, learners build new skills that can be transferred to the work environment. Practice

environments should be distraction free, and provide explanatory feedback to the

learners. Clark and Mayer identify the following as core principles applicable to

developing quality practice in e-Learning instruction.

Practice should mirror the job. Users should be interacting with the instruction in

the same manner in which they would be performing tasks in the work environment.

Feedback should go beyond the simple response of “right” and “wrong” and provide

short explanations. Placement of text should follow the contiguity principle. The amount

of practice needed should be based upon the needs of the work environment. A less

critical skill requires less practice. The time allotted for practice decreases as the amount

of practice increases. Use of text is important. Instructions needed to complete tasks

should remain on the screen while learners are in the process of responding. Feedback

should be provided as text, allowing learners to control the pace of instruction. Audio

narration should not be used for questions or feedback. Response areas should be easily

identifiable and near the question. Areas within the layout should be designated for

feedback, ideally close to the learners’ response area. As the coherence principle states,

practice should be free of any extraneous design elements, such as text, sound, and

visuals.

Learner control vs. program control. Instructional control is dependent on a

learner's self-directedness, which will vary from high to low. Users with low self-

directedness will need the e-Learning module to control the pacing and delivery for them.

The design, however, must also accommodate the opposite end of the spectrum: learners experienced

in the content, who require more control over the program. Learners prefer to

have learner control, and the design can enhance this preference by providing control

over the sequence and pacing of the content, as well as access to worked examples and

practice. Pacing control is provided by inserting cues such as titles, headings, and

introductory statements into the design, as well as navigational options such as forward,

back, exit, and menu.

An ideal instructional program integrates an adaptive control into the overall

design. This can be accomplished through several design techniques. The static

branching method discriminates the content based on the assessments of a pretest given

before the instruction begins. Differentiating the instruction through an ongoing

assessment process, by contrast, is dynamic branching. Advising the learners, another

technique, can be either generic or adaptive. Generic advising provides general tips to

the users for proceeding through the training, creating a more structured learning

experience. Adaptive advising provides recommendations based on the learners’

continuous responses throughout the design. The shared control method gives partial

control decisions to the learner, leaving the other decisions to the program.
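
The difference between these methods can be pictured with a short sketch; the path names, score thresholds, and advice strings below are assumptions made for illustration rather than anything prescribed by Clark and Mayer. Static branching fixes a path from a single pretest score, dynamic branching recomputes the path from ongoing assessment results, and adaptive advising turns the chosen path into a recommendation.

```typescript
// Sketch of adaptive control: static branching chooses the content path once
// from a pretest, while dynamic branching re-chooses it from ongoing scores.
// The thresholds and messages are illustrative assumptions.
type Path = "beginner" | "intermediate" | "expert";

function staticBranch(pretestScore: number): Path {
  if (pretestScore >= 80) return "expert";
  if (pretestScore >= 50) return "intermediate";
  return "beginner";
}

function dynamicBranch(recentScores: number[]): Path {
  const avg = recentScores.reduce((sum, s) => sum + s, 0) / recentScores.length;
  return staticBranch(avg); // re-evaluated after each new assessment
}

function adaptiveAdvice(path: Path): string {
  // Adaptive advising: the recommendation follows the learner's responses.
  switch (path) {
    case "beginner":
      return "Review the worked examples before the next activity.";
    case "intermediate":
      return "Try the next scenario with fewer hints.";
    case "expert":
      return "Skip ahead to the practice problems.";
  }
}

// Average of the recent scores is about 74, so the intermediate path is advised.
console.log(adaptiveAdvice(dynamicBranch([65, 72, 84])));
```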

Motivation, Interactivity, and Navigation

Michael Allen’s Guide to E-Learning: Building Interactive, Fun, and Effective

Learning Programs (Allen, 2003) contains the author’s opinion and recommendations on

techniques and strategies to use in the design of e-Learning. Allen's recommendations

are backed by research and his extensive experience in the field

of e-Learning design and development, and they cover important constructs found in Lowe &

Holton’s theoretical framework for effective computer based instruction (Lowe &

Holton, 2005). Allen offers seven “magic keys” for integrating motivation into

instruction, concepts of navigation, and the components for interactivity.

Motivation is an andragogy principle of adult learning, a unit of theory in the CBI

framework, and is critical for effective e-Learning. Its absence can render a great instructional

design ineffective, while its presence can make a weak design useful and functional. Allen's seven “magic keys”

are effective techniques for building motivation, to make the learner “want to learn”

(Allen, 2003).

Magic key 1: Build on anticipated outcomes. Learning objectives are important

in the design process for e-Learning. They are comparable to the learning goal level in

the CBI framework, in that they influence the instructional strategy design and

instructional support constructs (Lowe & Holton, 2005). The direction and content of e-

Learning design begins with learning objectives. Whereas it is natural for educators to

state the learning objective at the beginning of a lesson, Allen finds this process to be

boring, unmotivating, and ineffective, and recommends avoiding it. Instead, he

suggests using memorable and motivational experiences (Allen, 2003), such as providing a task

before instruction begins, as another method of introducing learners to the instructional

objective. This method presents an opportunity to incorporate the need to know

andragogy principle (Knowles et al., 2011), and other instructional design principles

(Clark & Mayer, 2008; Lowe & Holton, 2005). An onscreen coach can be introduced,

conversational voice used to create a relationship with the learners, or a pre-instruction

task as a pretest to help determine the level of program control needed for the learners.

Allen recommends using scenarios that are dramatic or which possess game-like

qualities.

Magic key 2: Put the learner at risk. Putting the learners at a measure of risk is in

itself risky for the instructional designer. Risk has a very strong motivational influence

on learners. It focuses, energizes, and builds confidence in them. However, this method

can create drawbacks. Learners may rush through the instruction, not allowing enough

time for cognitive processing. Anxiety can rise, and self-confidence can be weakened,

which could lower their self-efficacy and metacognition in the self-directedness construct

(Lowe & Holton, 2005). Allen believes this method is worth the risk, and the negatives

can be avoided. The instructional design should allow the learners to get corrective

feedback, exit the scenario, set the level of difficulty, compliment the learners’ attempts,

scaffold the difficulty of the challenges, and provide different levels of assistance.

Allen’s ideas illustrate different variants of providing forms of Lowe and Holton’s CBI

construct, external support (Lowe & Holton, 2005).

Magic key 3: Select the right content for each learner. Allen’s third magic key

stems from previous e-Learning concepts of instructional strategy and learner control

(Lowe & Holton, 2005). Allen proposes individualization of a sequence of instruction

can motivate learners, rather than common, selective, or remedial forms of instruction,

such as “tell first then test” (Allen, 2003). The author recommends testing first, then

telling, a method allowing for both variable time limits and variable content sequencing.

By testing first, learners are able to immediately experience what they need to learn,

become active and engaged learners, and control the level of assistance needed.

Magic key 4: Use an appealing context. Much of the idea behind Allen’s

recommendation of appealing context relates to the andragogy principle of orientation to

learning (Knowles et al., 2011), and Clark and Mayer’s principle of segmenting (Clark &

Mayer, 2008). Instruction should be delivered in a problem, case, or activity based

meaningful situation. Adults are motivated to learn by solving problems and tasks, and

instructional delivery should be designed around this concept. During the design process,

as content is segmented into a sequence to reduce cognitive load on the learner, Allen

recommends that the training not begin at the bottom of the skills hierarchy. The basic skills

are generally boring to learn, and confronting the learners with more challenging skills

from the middle or top of the hierarchy tree provides more challenge and risk. Another

method of creating meaningful, appealing instruction is to engage the learners through

the use of novelty as a context. Although effective, novelty should be used carefully as it

has a short life.

Magic key 5: Have the learner perform multistep tasks. Just as Malcolm

Knowles believes adult instruction should be life related (Knowles et al., 2011), Allen

believes e-Learning tasks should be authentic to the learner, and relatable to specific tasks

found within the workplace. E-Learning tasks should also not follow the common

procedure of question and answer, but should involve multiple steps before reaching

conclusion. This allows the learners to control their pacing and correct mistakes, and

allows for external support opportunities to provide meaningful feedback and intrinsic

clues related to the task. Rather than typical feedback such as “right” or “wrong,” helpful

feedback will engage the learners.

Magic key 6: Provide intrinsic feedback. Intrinsic feedback allows learners to

see how their correct performance is empowering to them and how, step by step, they are

becoming more capable, powerful, valuable persons (Allen, 2003). This notion of

intrinsic feedback relates to the self-directedness construct in the CBI framework.

Empowering learners regarding their ability to learn creates an internal locus of control,

boosting their confidence and their motivation to learn, and reducing their need for program

control within the e-Learning design. Intrinsic feedback can be built into the design by

avoiding the typical “right” or “wrong” forms of feedback, instead incorporating the

feedback into the activity’s outcomes. Learners will see how the consequences of the

decisions they make during the instructional tasks will affect the overall end result of the

activity.

Magic key 7: Delay judgment. Allen believes learners should be able to make

mistakes and learn from them. Immediate judgment of mistakes must be delayed, to

allow the learners opportunities to evaluate the decisions they have made in the learning

activities. Allowing the learners to evaluate their decisions creates opportunities for

deeper understanding and retrieval of information from their long-term memories.

Delaying judgment allows the learners to discover and correct their mistakes, which in

turn can boost their metacognition, confidence, and motivation to continue.

Navigation. Well-planned navigational controls and design of an e-Learning

training program incorporate various principles and constructs. The CBI framework

model units of instructional control, screen design, practice strategy, and principles of

segmenting and learner control all influence the design and nature of the e-Learning’s

navigational controls (Lowe & Holton, 2005). Allen describes a navigation control

layout as effective when it follows design rules imperative to its success (Allen,

2003). Learners should be able to see the overall size of the program to develop a

perspective of how much time and effort is involved. The sequential nature of the

content’s segments, whether hierarchical, linear, or recommended, should be visually

represented and accessible to the learners. The learner’s progression through the

instruction should be visually represented in the screen layout. Navigational controls

should include accessible means for the learners to move forward and backward through

the contextual elements of the program. Means for allowing learners the opportunity to

correct themselves within tasks and practice should be available within the design.

Interactivity. Instructional interactivity allows the learners to engage with the

instruction, improving the cognitive processes of knowledge acquisition, and building of

skills through worked examples, meaningful tasks, and practice in order to improve

performance of newly acquired skills. Essential components of instructional interactivity

include context, challenge, activity, and feedback. Context, which relates to the

andragogy principle of readiness to learn (Knowles et al., 2011), describes the framework

and conditions of the instructional interactivity that are relatable and meaningful to the

learner. Challenge refers to the catalyst for learner engagement with the interactive

instructional element. Both context and challenge prompt a physical response, through a

particular activity, between the learners and computer. The e-Learning activity’s

response to a learner's actions is provided through feedback. Well-designed feedback

goes beyond stating “right” or “wrong” to the learners. Instead, good feedback presents

an opportunity for the learners to determine the consequences of their actions and

decisions while they interact with an e-Learning activity.
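
These four components can be summarized in a small sketch; the scenario text, names, and structure below are invented for illustration and model the idea rather than the module's code. Each interaction couples a relatable context and challenge with a learner activity whose feedback reveals the consequences of the choice instead of a bare “right” or “wrong.”

```typescript
// Sketch of the four components of instructional interactivity: a relatable
// context, a challenge that prompts action, the learner's activity (response),
// and feedback that shows the consequences of the choice. Content is invented.
interface Interaction {
  context: string;                      // meaningful situation for the learner
  challenge: string;                    // what prompts the learner to act
  activity: (choice: string) => string; // evaluates the response and returns feedback
}

const labSignUp: Interaction = {
  context: "Your class visits the computer lab tomorrow for a research task.",
  challenge: "The lab calendar shows a conflict with another class.",
  activity: (choice) =>
    choice === "reschedule"
      ? "Feedback: your class keeps a full period and the research task stays intact."
      : "Feedback: your class shares the lab and only half the students finish.",
};

console.log(labSignUp.activity("reschedule"));
```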

Gaps and flow. In Design for How People Learn (2012), Julie Dirksen digs

deeper into the principles and constructs of e-Learning. Learning goals and objectives

are a crucial part of instructional design, as they influence the instructional and practice

strategies. Dirksen expands upon this idea, stating that learning goals need to be clearly

defined so that the correct instructional path is mapped out for the learners. To do so,

problems need to be identified, destinations set, gaps found and targeted, and it needs to

be determined how far up the taxonomy ladder the instruction will go. Understanding

who the learners are, and their particular learning styles is important, although targeting

every style of learning is a challenge. Intrinsic and extrinsic motivational factors of the

learners should also be noted, as they can be incorporated to create a more robust learning

program.

Dirksen identifies several gaps that influence and impede people from performing

their work objectives. Knowledge gaps occur when workers do not have the necessary

information to perform an objective correctly. Skill gaps occur when the knowledge and

information are present and available to workers, but the task still can’t be performed due

to a lack of ability. Overcoming a skill gap requires practice of the particular task in

order to master it. Motivational gaps arise when workers know what to do, but choose

not to do it for reasons such as anxiety, uncertainty, confusion, distractions, or general

disinterest. Environmental gaps are created by a lack of resources or incentives in the

workplace. Communication gaps are a result of bad directions, misconceptions, and a

lack of communication skills. Dirksen's gaps are her explanation for what prevents

workers from completing their tasks. Work sites and personnel will have different gaps

based on their particular situations, but the gaps can be corrected through e-Learning

training when e-Learning designers correctly identify them and know how to design the

instruction to overcome these obstacles. Correcting a knowledge gap is the least

challenging, as it is the result of a lack of information. Eliminating skills gaps, however,

requires obtaining mastery of the skill through an effective practice strategy.

The practice strategy is a construct within the CBI design framework (Lowe &

Holton, 2005). Clark and Mayer state many principles for the design of effective practice

(Clark & Mayer, 2008). Dirksen believes that like content delivery, e-Learning practice

activities should be scaffolded and staggered using Csikszentmihalyi’s flow model

(Dirksen, 2012). In this model, the level of difficulty changes over time. Some tasks are

slightly difficult, challenging learners, creating a sense of risk. Placed between

challenging tasks are easier tasks. Not only does this reduce the cognitive load on the

learners, allowing the mind to rest, it also provides some motivating satisfaction to them.

Figure 4 shows how the level of difficulty for e-Learning activities should fluctuate over

time.
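
A minimal sketch of this staggering appears below; the task names, difficulty ratings, and ordering rule are invented for illustration and represent just one way to realize the fluctuation Dirksen describes. Practice tasks are sequenced so difficulty trends upward overall while an easier task follows each more challenging one.

```typescript
// Sketch of staggering practice difficulty per the flow model: the trend is
// upward, but each more challenging task is followed by a slightly easier one
// so the learner gets a cognitive rest. Tasks and ratings are illustrative.
interface PracticeTask {
  name: string;
  difficulty: number; // 1 (easy) .. 5 (hard)
}

function stagger(tasks: PracticeTask[]): PracticeTask[] {
  const sorted = [...tasks].sort((a, b) => a.difficulty - b.difficulty);
  const ordered: PracticeTask[] = [];
  for (let i = 0; i < sorted.length; i += 2) {
    const pair = sorted.slice(i, i + 2); // two adjacent difficulty levels
    ordered.push(...pair.reverse());     // harder one first, easier one after
  }
  return ordered;
}

const schedule = stagger([
  { name: "Label the projector ports", difficulty: 1 },
  { name: "Connect the document camera", difficulty: 2 },
  { name: "Plan a computer lab lesson", difficulty: 3 },
  { name: "Troubleshoot a failed login", difficulty: 4 },
  { name: "Run a full lab session", difficulty: 5 },
]);

console.log(schedule.map(t => t.difficulty)); // [2, 1, 4, 3, 5]
```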

Summary

The e-Learning training module was designed around the theories, principles, and

frameworks discussed in the literature review. Instructional content was presented from a

holistic view of the TPACK framework, tapping into teachers’ content and pedagogical

knowledge in order to further develop their TK, TPK, and TCK. The design

incorporated Knowles' principles of andragogy (Knowles et al., 2011) into the

FIGURE 4. Flow model of practice (Dirksen, 2012. Used with permission). The
difficulty of e-Learning practice activities increases along with the users' ability;
however, the rate of increase is not constant, in order to reduce the chance of cognitive
overload.

presentation of content and structure of the module. Upon the start of instruction,

teachers were informed about the importance of the training and how it related to their

lives. Content was meaningful and relatable to their experiences. Tasks were designed to

be life and problem-centered. The module’s structure was designed using Lowe &

Holton’s (Lowe & Holton, 2005) framework for effective CBI, incorporating all the units

of theory, and sensitive to the laws of interaction in order to achieve an effective system

state. The design of the screen layouts, content presentations and delivery, worked

examples, modeling, practice, navigational controls, and sequencing followed the design

principles, theories, and strategies provided by Clark and Mayer (2008), Allen (2003),

and Dirksen (2012).

CHAPTER 3

METHODS OF DESIGN AND DEVELOPMENT

Overview

The development of the module followed the Successive Approximation Model

(SAM) instructional design process (Allen & Sites, 2012). A needs assessment with

educational technology leaders was conducted to design the instructional content. E-

Learning design principles and theories were researched to design an efficient and

effective instructional process that maximized learning. Principles of andragogy were

incorporated into the design to target the needs of adult learners. The final version of the

module was developed over the course of ten months using graphic software tools and

web based e-Learning programming tools. In the spring semester of 2015, teachers at

two middle schools in the Long Beach Unified School District were offered the module

as a resource to use. Following the SAM model, formative evaluations of the module

occurred throughout the development process, and participants of the module completed

a summative evaluation.

Design of the Project

The project was created to address the needs brought about by California's

adoption of the Common Core State Standards. These include math and ELA standards

requiring student usage of technology and multi-hour computer-based standardized tests,

coupled with teachers' lack of TPACK skills and experience integrating technology into

their curriculum. Conversations with technology coordinators (K. Anderson, personal

communication, Spring 2014), leaders within the Long Beach Unified (V.

Chandrasekhar, personal communication, Fall 2012) and Ventura Unified School

Districts (J. Judd, personal communication, Fall 2012) reinforced the apparent lack of

technology knowledge by teachers, and the challenges they would face. Specific skill and

knowledge gaps were presented that fell within the PK, TK, TPK, and TCK domains of

the TPACK framework, and were used as the foundation for the module's instructional

content (see Figure 5).

Organizational Strategy

The instructional strategy's design targets the specific knowledge and skill gaps

previously identified using a holistic view of the TPACK framework. Instructional

content is presented through the story of four characters. Each character is a teacher of

one of the four core subjects: math, English language arts, history, and science. Users of

the training module follow and interact with the four characters as they go through

different scenarios that follow Niess, Sadri, and Lee’s development process of TPACK:

accepting, adapting, exploring, and advancing. Three scenarios are provided within each

core subject area, and are differentiated by the assumed TPACK level of the user:

beginner, intermediate, and expert. A fifth character is incorporated into the design to

serve several purposes. The fifth character, an elective teacher, is the narrator and

“pedagogical agent” (Clark & Mayer, 2008), and represents the “recognizing” stage of the

TPACK development process (Niess et al., 2007).

FIGURE 5. TPACK knowledge and skill gaps. Knowledge and skill deficiencies identified through
conversations with educational technology leaders. The basis for the instructional content.
FIGURE 6. Organizational strategy of the e-Learning module. Instructional flow and navigation of
the training module.
Using the storytelling e-Learning design method suggested by Dirksen (2012)

allows the instructional design to follow Knowles’s principles of andragogy (2011) and

research-proven methods of TPACK development, such as modeling and contextual

learning activities (Ansyari, 2013; Mishra & Koehler, 2006; Voogt et al., 2013). Using

five characters associated with different subject content areas allows users of the module

to identify themselves with the characters, supporting Lowe & Holton's metacognitive

and motivation constructs (2005). Providing three different TPACK level-based

scenarios for each of the four core subject areas allows the instructional content to be

scaffolded, segmented, and adjusted to the users' locus of control (see Figure 6).
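
The organizational strategy can also be pictured as a small data model. The sketch below uses assumed type names and is only an illustration of the structure, not the module's ZebraZapps implementation: four subject characters, each with three level-differentiated scenarios, plus the narrator character who serves as the pedagogical agent.

```typescript
// Illustrative model of the organizational strategy: one scenario track per
// core-subject character, each differentiated by assumed TPACK level, plus a
// narrator character acting as the pedagogical agent.
type Subject = "Math" | "English Language Arts" | "History" | "Science";
type Level = "beginner" | "intermediate" | "expert";

interface CharacterTrack {
  subject: Subject;
  scenarios: Level[]; // three scenarios per subject, by assumed TPACK level
}

const levels: Level[] = ["beginner", "intermediate", "expert"];
const subjects: Subject[] = ["Math", "English Language Arts", "History", "Science"];

const tracks: CharacterTrack[] = subjects.map(subject => ({
  subject,
  scenarios: levels,
}));

const narrator = {
  role: "pedagogical agent",
  stage: "recognizing", // the TPACK development stage the fifth character represents
  subject: "elective",
};

console.log(`${tracks.length} subject tracks plus the ${narrator.role}`);
```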

FIGURE 7. Screen design and navigational control.

Screen Design

The screen design uses a modern flat aesthetic; follows Robin Williams' principles

of graphic design: contrast, repetition, alignment, and proximity (Williams, 2008); and

is divided into areas where the program's various elements are nested. The main

navigational elements for the instructional content are centered in the screen and provide

visual assistance, helping the user decide how to proceed with the module (see Figure 7).

The main navigational area is divided into five different sections. Four are featured

prominently and correspond to the four core subjects and their associated characters.

The fifth section is located to the side, and introduces the user to the pedagogical agent

character and the current educational technology theories of TPACK and SAMR

(Puentedura, 2013). In accordance with research findings on the benefits of TPACK

awareness (Doering et al, 2009), a navigational link to a section within the module

devoted to creating TPACK awareness is available on the left of the screen design.

Below the main content area are controls for general help and support for navigation

throughout the module. Also located at bottom is a link to a resource section. This

section contains links to resources outside of the module that help further TPACK

development.

After the teacher chooses a particular character, the main navigational area of the

screen changes to an instructional area, where users interact with the module's story and

activities relating to the fictional character. Users’ section progress and scores are

displayed to the left of the instructional area (see Figure 8). Above the main navigation

and instructional content area are interactive buttons allowing users to return to the main

screen and to gain access to the library of instructional videos shown throughout the

module’s instruction.

FIGURE 8. Instructional content area. Section progress and scores are displayed on the
left.

Delivery and Management Strategy

Research shows TPACK professional development success occurs when teachers

are able to collaborate and mentor each other (Ansyari, 2013; Voogt et al., 2013).

Although the module is online-based to take advantage of features such as location, time,

and speed of learning, a collaborative feature was built into the delivery and management

strategy. The website hosting the module included an online chat room (see Figure 9).

Using this feature, users of the module had the ability to chat with each other, asking

questions or seeking help with the module, as well as using the chat room's history to find

answers.

FIGURE 9. Screenshot of website hosting the e-Learning module.

One of the benefits of e-Learning includes the ability for users to save their

progress, so when they revisit the module they can pick up where they left off. Due to

the high cost of providing this ability, the project was unable to incorporate this feature.

To circumvent users’ frustration of not being able to save their progress, the module was

designed so returning users could easily access the backend instructional content without

having to revisit the frontend instructional content again. Nested at the top of the

module’s screen is a set of navigational buttons that link to backend content, and are

programmed to automatically load as non-functional. The buttons function after users

input a specific code into a text input field nested within each button. The codes used to

unlock the navigational buttons are obtained after users complete the expert level math

scenario and the computer lab prep section. All instructional videos and resources

created for the module are available in areas of the screen separate from the instructional

learning content so returning users can easily access them on demand.
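
A minimal sketch of this unlock mechanism is shown below. The section names and codes are placeholders invented for illustration, and the module's actual implementation was built within ZebraZapps rather than in TypeScript; the sketch only models the idea of buttons that stay non-functional until the earned code is entered.

```typescript
// Sketch of code-gated navigation used in place of saved progress: backend
// buttons load as non-functional and are enabled only after the learner types
// the unlock code earned by completing the prerequisite content.
// Section names and codes are placeholders, not the real ones.
interface GatedSection {
  name: string;
  unlockCode: string; // earned after completing the prerequisite content
  unlocked: boolean;
}

const backendSections: GatedSection[] = [
  { name: "Backend instructional content", unlockCode: "CODE-A", unlocked: false },
  { name: "Computer lab prep follow-up", unlockCode: "CODE-B", unlocked: false },
];

function tryUnlock(section: GatedSection, enteredCode: string): boolean {
  if (enteredCode.trim().toUpperCase() === section.unlockCode) {
    section.unlocked = true; // the navigation button now functions
  }
  return section.unlocked;
}

// A returning user enters the code earned in an earlier session:
console.log(tryUnlock(backendSections[0], "code-a")); // true
console.log(tryUnlock(backendSections[1], "wrong"));  // false
```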

Development Process

Successive Approximation Model

The module was developed using the SAM process of instructional design (Allen

& Sites, 2012). The process was chosen for its continuous iterative cycles of evaluation,

design, and development. Using SAM, various prototypes of the module’s design and

activities were tested and evaluated throughout its creation. Select teachers and family

members were asked to give feedback on the module’s interface, navigation design,

activities, video controls, and fictional scenarios. The feedback was used to eliminate

unnecessary components of the module’s design, strengthen the graphic design, improve

its interface, and to generalize the module's activities to increase their relevance to

teachers.

Tools and Resources

A variety of tools were used to create the e-Learning module. The Adobe tools

Photoshop, Illustrator, and Premier Pro were used to create visual elements such as static

images, animations, and videos. Character images were created using the Bitstrips

website. Visual graphics such as computer screens, keyboards, silhouettes, and student

faces were purchased from online stock vector image websites. Conceptual designs of

the module processes were created in Microsoft Visio. Programming the module was

achieved using ZebraZapps, a web-based e-Learning programming tool. Due to its

sandbox approach to design, ease of use, learning curve, and price, ZebraZapps was

chosen over other popular programming tools such as Adobe Captivate and Articulate.

Using the ZebraZapps platform had many advantages for hosting the module at a low

monthly cost. Its development environment provided the creative freedom to design an

original screen design and practice strategy, design original interactive elements and

controls, import videos hosted on YouTube, and to create and program original logic-

based activities. A website created through Weebly hosts the training module as an

embedded object, and incorporates a chat room widget created with the tlk.io web service

(see Figure 9). Since the project's inception, the module's screen design and

practice strategy have been redesigned twice. The final version used for the project took

ten months to develop, and has three hours of instructional content and activities. The

content includes three different story-based teacher scenarios, interactive activities,

gamification elements, an end-game computer lab management simulation, links to

outside resources, and twenty-one instructional videos created specifically for the project.

IRB Approval

Two school districts, Ventura Unified and Long Beach Unified, were solicited

regarding the process for permission to conduct research. Long Beach Unified

responded, and permission to conduct research was obtained through an application

process. Permission was granted for the project to be used by teachers from two middle

schools within the district. The school approvals, survey instruments, and consent forms

were submitted to the California State University Long Beach Institutional Review Board for

permission to conduct research, and the project was approved.

Preliminary Testing of the Module

Process

Formative evaluations of the module occurred at various stages throughout the

development process. As elements of the design such as the instructional segments,

examples, practice activities, simulations, and screen designs were developed,

evaluations of them were sought from teachers, colleagues, and family. A

preliminary test was conducted before the module’s final release to find overlooked

programming bugs, instructional concerns, or usability issues. A recruitment email was

sent to five potential participants, inviting them to test the training module (see Appendix

A). The potential participants were chosen for their experience of being a teacher, their

knowledge of using technology in a classroom, or for their knowledge of developing

professional development. On the website hosting the module, the participant consent

form was displayed outlining the terms for participation (see Appendix B). Only by

clicking on the “I agree” button on the consent form were users able to gain access to the

module. Feedback was gathered through an online survey linked within the module (see

Appendix C). The survey questions were designed to gather general information on the

participants’ likes and dislikes, technical issues they experienced, and comments and

suggestions. Participants were also given the option to contact the investigator over the

phone to elaborate on their experience (see Appendix D).

User Feedback

Of the five participants invited to test the module, two responded to the online

survey with feedback. The tutorials and story line were praised, and according to one

respondent the project design was commended for “the way I was challenged to think

critically about the process of completing the training.” One particular part of the screen

design was reported as an issue from both respondents. The font size of the narrating

character’s text was too small to read, and providing the text in an audio format was

recommended. One participant reported being confused by how to proceed through the

training areas of the module, and also quit the module after getting stuck on a difficult

question that required the user to correctly identify ports on a projector and document

camera. The slow loading time of the module within the browser was also cited as

testing both participants’ patience. One respondent was contacted through a

telephone call for further elaboration on the module. Several concerns were brought to

attention, such as text size, a poorly worded question, and the module’s reliance on users’

intuition to navigate through it.

A third beta test participant responded directly via email on her experience with

the module. An issue was noted regarding the Firefox web browser’s lack of the Adobe

Flash plugin. The size of the module was also noted as being too small. The module was

designed using a 1440 x 900 pixel resolution, however the hosting website was

automatically reducing it to 900 x 562 pixels. The practice strategy design, in particular

the user’s natural advancement through the module, was also identified as being unclear.

Changes to the Design

Parts of the project were changed or fixed after receiving feedback. To eliminate

confusion, a flow map describing the common progression through the module was

created and incorporated into the module’s home screen and help section (see Figure 10).

The cascading style sheet properties of the website hosting the module were changed to

allow the module to be embedded at its full 1440 x 900 pixel resolution. At the top of the

project’s webpage, a message clarifying the behavior of different web browsers and the

requirements of the Adobe Flash plugin was posted to alleviate confusion. A link to

download the plugin was posted for Firefox users that did not have the plugin already. A

button allowing users to open the module in a separate browser window at its full

resolution was created and embedded at the top of the web page. A humorous graphic

image was embedded into the module’s first loading screen. The image was a message to

users describing the loading time as being a bit slower than expected, and was used to

help prepare users for the module’s loading times. To improve readability, the text

size throughout the module was increased. The question requiring users to identify

projector and document camera ports was modified to allow users to continue even

if they did not input the answers.
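
The resolution fix and the full-size button can be pictured with a short sketch. This is illustrative only: the actual site was a Weebly page whose stylesheet was edited to host the Flash-based module as an embedded object, and the URL, element IDs, and use of an iframe below are assumptions made for the example.

```typescript
// Sketch of the two presentation fixes: embed the module at its designed
// 1440 x 900 resolution instead of letting the host page shrink it, and give
// users a button that opens the module in its own full-size browser window.
// The URL and element IDs are placeholders, not the project's actual address.
const MODULE_URL = "https://example.com/elearning-module";

function embedAtFullResolution(containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;
  const frame = document.createElement("iframe");
  frame.src = MODULE_URL;
  frame.width = "1440"; // match the module's designed resolution
  frame.height = "900";
  frame.style.border = "none";
  container.appendChild(frame);
}

function addOpenFullSizeButton(containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) return;
  const button = document.createElement("button");
  button.textContent = "Open the module at full size";
  button.addEventListener("click", () => {
    window.open(MODULE_URL, "_blank", "width=1440,height=900");
  });
  container.appendChild(button);
}

embedAtFullResolution("module-container");
addOpenFullSizeButton("module-container");
```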

FIGURE 10. Home screen graphic illustrating the progression through the module’s
learning content. Also available in the module’s help section.

Summative Evaluation

Site Selection

In the spring semester of 2015, the module was introduced as a professional

development resource for participants. This school year marked the beginning of full

implementation of the Common Core State Standards in 43 states, and the use of

computer-based assessments to measure student achievement. Participants invited to use

the e-Learning module included 76 K-12 teachers and administrators at two middle

schools located within the Long Beach Unified School District (LBUSD). LBUSD was

chosen after having conversations with its technology education leader on the subject of

technology skill deficiencies of teachers relating to the Common Core Standards.

Recruitment of Participants

A recruitment message was sent to 76 teachers at the two participating sites via

email, learning management portal, and a flyer placed in their mailboxes (see Appendix

E). The recruitment message included a link to a website hosting the training module.

Upon accessing the website, a consent form was introduced outlining the terms for

participation (see Appendix F). Participants had the option to agree or disagree with the

terms, as well as an option to print out the consent form. Users were given access to the

webpage hosting the module only by clicking on the “I agree” button. Participants were

not expected to complete the entire module, and were encouraged to interact with it on

their own time.

Both schools consisted of teachers teaching core subject areas and electives in the

6th through 8th grades. Twenty-six teachers were invited from one site and

50 from the other. Both schools have access to technology through their computer labs

and a school set of iPad tablets. Skill levels of participants were expected to range from

teachers with no computer lab experience to full time computer lab elective instructors.

Teachers had access to the training module and survey for 20 days.

Data Collection Methods

As the e-Learning module was being completed and released for general use

among the LBUSD participants, an interactive button linking to a survey was embedded

within the module. By clicking the button, users were directed to an anonymous online

questionnaire (see Appendix G). The questionnaire consisted of 33 questions used to

gather information on general user data, the effectiveness of the module design’s

application of adult learning and e-Learning design principles, the degree to which

participants gained knowledge and skills for integrating technology into instruction, the

effectiveness of the module’s facilitation of communication and collaboration, and open-

ended general feedback from the participants. Of the 76 teachers invited to use the

module, 20 completed the survey.

General information from the participants’ experience with the training module

was gathered through seven questions. Questions 2 through 5 asked which areas of the

module were explored and how many scenarios were completed. Questions 28 and

29 asked participants where and how they accessed the training module. This

information was used to understand how much content of the module was accessed, the

location preferences of the participants, and what devices were being used to access the

module.

Twelve questions used a Likert scale to measure the project design’s effective

application of Lowe and Holton’s framework for computer-based instruction (2005),

Knowles' principles of andragogy (2011), and the e-Learning design principles of Clark

and Mayer (2008), and Allen (2003). Questions 6 through 8 reflected on the module

topic’s importance, relevancy, and benefits to the users’ profession. Questions 9 through

17 documented the participants’ opinion on the presence of CBI design elements and e-

Learning design principles relating to instructional support, screen design, practice

strategy, interactivity, and motivation. These questions verified whether the module's

design implemented the principles and strategies researched for the project.

The effectiveness of the project’s learning objective, developing knowledge and

skills for integrating technology into the classroom, and using the school computer lab

was measured by six Likert scale questions. Participants were asked if the training

module had increased their knowledge and skills relating to the integration of technology

into classroom instruction and activities, classroom and computer lab management, and

using the computer lab for instruction. Data gathered from these questions was used to

verify the module design’s ability to produce the project’s desired learning outcome.

Four Likert scale questions sought to find evidence of users' communication and

collaboration with their peers while using the module. Participants were asked whether

any communication or collaboration about the module occurred with their peers, with or

without the use of the website chat room. These questions verified whether the project's

online chat room feature was effective in facilitating communication and collaboration

among the participants.

The last four questions were open-ended questions, giving participants the

opportunity to provide feedback and comments regarding the project. Two questions

asked for the participants to reflect on their likes and dislikes of the training module, and

one question offered space for participants to submit suggestions for improving the user

experience. The last question gave the participants an opportunity to add any other

comments and suggestions.

Data Analysis Methods

The strengths and weaknesses of the project's CBI design units, such as practice

strategy, screen design, instructional strategy design, instructional support, instructional

control, and external support, were identified and tabulated from the feedback.

The participants' reactions to using an e-Learning training module and to

learning about technology integration were also analyzed from the feedback.
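
As an illustration of this tabulation, the sketch below counts one Likert item's responses and converts them to percentages of the respondents; the response data shown are invented, not the study's, and the code is only a model of the arithmetic behind the tables in Chapter 4 (for example, 11 of 20 respondents corresponds to 55%).

```typescript
// Sketch of tabulating one Likert item: count each response option and convert
// the counts to percentages of all respondents. The sample responses below are
// invented for illustration.
type LikertResponse =
  | "Strongly Agree" | "Agree" | "Neutral"
  | "Somewhat Disagree" | "Strongly Disagree" | "Decline to Answer";

const OPTIONS: LikertResponse[] = [
  "Strongly Agree", "Agree", "Neutral",
  "Somewhat Disagree", "Strongly Disagree", "Decline to Answer",
];

function tabulate(responses: LikertResponse[]): Record<LikertResponse, number> {
  const percentages = {} as Record<LikertResponse, number>;
  for (const option of OPTIONS) {
    const count = responses.filter(r => r === option).length;
    percentages[option] = Math.round((count / responses.length) * 100);
  }
  return percentages;
}

// Hypothetical item answered by 20 respondents: 11 Strongly Agree, 9 Agree.
const sample: LikertResponse[] = [
  ...Array<LikertResponse>(11).fill("Strongly Agree"),
  ...Array<LikertResponse>(9).fill("Agree"),
];

console.log(tabulate(sample)); // { "Strongly Agree": 55, "Agree": 45, ... 0 for the rest }
```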

CHAPTER 4

RESULTS, DISCUSSION, AND CONCLUSION

Results from the Survey

Twenty participants completed the online survey after using the module. Of

these, 55% reported accessing the module from home, 30% from work, and 15% from

both home and work. The majority of participants used only one device to access the

module. Specifically, 65% of participants reported using only a laptop, 25% used a

desktop computer, and 10% reported using both a laptop and desktop computer. The

math section of the e-Learning module was explored by 70% of the participants. There

were six available instructional sections. Of the participants, 65% reported completing

two sections, 15% completed four sections, and one reported completing all six.

According to the survey, the videos in the video library were viewed by 85% of participants.

Only 45% reported accessing the external resources in the resource library section.

Application of the CBI Framework, Adult Learning, and Design Principles

Table 1 shows participants overwhelmingly agreed the training module was

important, relevant, and beneficial to the users’ profession, with agreement on these

questions ranging from 95% to 100%. Users reported they were

in control of their learning (80%) and their learning was supported by the training module

(95%). Participants agreed the module’s instructional content was easy to find (75%),

relevant (95%), and the graphics and text were easy to read and understand (95%). Sixty

percent of users agreed the module’s interactive activities were challenging; however, the

activities were not a detriment to the learning experience, as only 25% of users believed

they were difficult to complete. A majority of participants (85%) reported the module

helped them complete the interactive activities. Nineteen of the 20 respondents agreed

the module motivated them to learn about integrating technology into classroom

curriculum. None of the respondents disagreed, and one response was neutral. This

finding is consistent with research indicating that motivation is a key component of adult

learning and e-Learning design (Allen, 2003; Dirksen, 2012; Knowles et al., 2011).

TABLE 1. Application of the CBI Framework, Adult Learning Principles, and E-Learning Design Principles into the Module's Design (N = 20; n = 20 for every item)

Percentages are listed as Strongly Agree / Agree / Neutral / Somewhat Disagree / Strongly Disagree / Decline to Answer.

The training module's content is important for my profession.
    55% / 45% / 0% / 0% / 0% / 0%

The training module's content is relevant to my profession.
    60% / 35% / 5% / 0% / 0% / 0%

The training module's content is beneficial to my profession.
    60% / 40% / 0% / 0% / 0% / 0%

I was in control of my learning while using the training module.
    30% / 50% / 10% / 10% / 0% / 0%

The training module supported my learning of integrating technology into classroom instruction and activities.
    50% / 45% / 5% / 0% / 0% / 0%

The training module's instructional content was easy to find.
    10% / 65% / 10% / 15% / 0% / 0%

The training module's graphics and text were clear and understandable.
    60% / 35% / 0% / 5% / 0% / 0%

The training module's instructional content was relevant to me.
    45% / 50% / 5% / 0% / 0% / 0%

The training module's interactive activities were challenging.
    15% / 45% / 10% / 15% / 5% / 10%

The training module's interactive activities were difficult to complete.
    0% / 25% / 25% / 30% / 15% / 5%

The training module helped me complete the interactive activities.
    20% / 65% / 10% / 0% / 0% / 5%

The training module motivated me to learn about integrating technology into classroom instruction.
    30% / 60% / 10% / 0% / 0% / 0%

Learning Goal Outcome

Table 2 shows the majority of the participants agreed the instructional content

helped them acquire knowledge and skills in using the computer lab for instruction (80%),

managing the computer lab (75%), managing technology in the classroom (85%),

integrating technology into classroom instruction activities (75%), integrating technology

into student learning activities (80%), and integrating technology into student learning

assignments (80%). None of the respondents disagreed with any of the above statements.

TABLE 2. Effectiveness of the Module's Design to Produce the Desired Learning Outcome (N = 20; n = 20 for every item)

Percentages are listed as Strongly Agree / Agree / Neutral / Somewhat Disagree / Strongly Disagree / Decline to Answer.

Using this training module, I gained knowledge and skills on how to use the computer lab for instruction.
    30% / 50% / 15% / 0% / 0% / 5%

Using this training module, I gained knowledge and skills on how to manage the computer lab.
    25% / 50% / 15% / 0% / 0% / 10%

Using this training module, I gained knowledge and skills on how to manage technology in my classroom.
    15% / 70% / 10% / 0% / 0% / 5%

Using this training module, I gained knowledge and skills on how to integrate technology into my classroom instruction activities.
    10% / 65% / 25% / 0% / 0% / 0%

Using this training module, I gained knowledge and skills on how to integrate technology into my student learning activities.
    20% / 60% / 20% / 0% / 0% / 0%

Using this training module, I gained knowledge and skills on how to integrate technology into my student learning assignments.
    10% / 70% / 20% / 0% / 0% / 0%

Communication and Collaboration

The degree of communication and collaboration occurring between participants

was measured by the last set of Likert scale questions. Table 3 shows the chat room

embedded within the module’s website was neither used for communication (80%) nor

collaboration (65%). None of the participants reported using the website’s chat room

feature. A small group of participants reported using other means for peer

communication (n = 3) and collaboration (n = 2).

Participant Feedback

The training module’s practice strategy design and screen design were well

received by the respondents. Eleven users praised elements related to the module’s

practice strategy, such as its videos (n = 5), interactivity (n = 5), storyline (n = 3), and

gamification (n = 1). Respondents reported the module having “good writing and

organization of content,” “great explanations,” “learner friendly language,” “relative

content,” humor, and information that was clear, concise, and “made me feel at ease.”

Eight respondents liked aspects relating to the module’s screen design, such as its

graphics (n = 4). Responses also included liking how the videos were embedded into the

module (n = 1), the use of a video library (n = 1), visual aids (n = 1), its text, and “easy to

find features” (n = 1). The narrator’s voice was praised for being soothing, having “good

semantics diction,” and for being “very clear.” Other individual comments praised the

module for its acknowledgement of the current technology knowledge and skill

deficiencies among teachers (n = 2), the instructional content’s relation to the teaching

profession (n = 2), the participants’ freedom to learn on their own time (n = 1), and for

giving specific examples of technology use in instruction (n = 2).

TABLE 3. Module Design's Facilitation of Communication and Collaboration Among Participants (N = 20; n = 20 for every item)

Percentages are listed as Strongly Agree / Agree / Neutral / Somewhat Disagree / Strongly Disagree / Decline to Answer.

I used the chat room feature on the module's website to communicate with other participants regarding the training module.
    0% / 0% / 15% / 10% / 55% / 20%

I used other means of communication (other than the chat room) to communicate with other participants regarding the training module.
    0% / 15% / 5% / 10% / 50% / 20%

I used the chat room feature on the module's website to collaborate with other participants regarding the training module.
    0% / 0% / 5% / 5% / 55% / 35%

I used other means of communication (other than the chat room) to collaborate with other participants regarding the training module.
    0% / 10% / 0% / 10% / 45% / 35%

Negative feedback was less frequent in the survey data. The feedback focused

specifically on elements of the module’s screen design and practice strategy. Four

comments related to the module’s screen design. They mentioned confusing graphics,

difficult navigation, too much content on the screen, and an issue regarding its full

resolution size (1440 x 900) extending beyond the user’s screen. Individual comments

relating to the module’s practice strategy (n = 3) mentioned the lack of saving ability,

lack of history subject content, and getting stuck in the module. The most frequently

cited negative aspect of the module was the loading time (n = 4).

Suggestions for improving the user experience and use of time included four

recommendations for an interface that is easier to navigate and understand. One comment

specifically recommended using a drop down menu to navigate to the different sections

of the module. One respondent recommended integrating mini quizzes into the ending of

each section. Two respondents commented on fixing technical issues, such as broken

icons and broken video playback.

Twelve participants responded to the last open-ended question with comments or

suggestions for the training module. Responses included suggestions for improvement (n

= 2), confusion with the survey (n = 1), and an eagerness to see history content in the

future (n = 1). Nine comments praised the module. One such comment related to the

module’s relevancy,

Overall I feel like this is a step in the right direction. Teachers incessantly say

that their districts don't offer a training module to properly prepare them to

integrate technology in their classrooms. This module allows teachers to move at

their own pace which alleviates their affective filter to try new things. The

degrees ranging from beginner to expert are great, allowing for teachers to truly

learn things on their own [like setting up their own tech] rather than asking a

colleague to do it for them year after year.

Another comment indicated praise for the module’s practice strategy,

The information provided by the tool is, to me, basic. Yet this information was

presented in such an appealing manner that I found myself listening to the

explanations because of the eloquent scripting. A welcome sense of good humor

is prominent throughout the modules. Hence, I found this tool to be inspiring. It

inspires me to explore tech options that have come on line since I last tried to

integrate web-based tech in my lessons (four years ago).

The module’s benefits were also mentioned. As one respondent wrote,

A lot of teachers are afraid to take their students to the computer lab as well

because they don't have a classroom management system in place so this module

would give teachers the confidence to integrate technology more in their

classrooms.

Summary of Findings

The participants’ feedback provides evidence that the project’s design was

successful. Users reported that the module was relevant, important, and beneficial to their

profession. The module’s design included instructional control, instructional support, a

comprehensive screen design, and a comprehensive practice strategy: key units of Lowe

& Holton’s CBI design framework (2005). Users were able to effectively navigate

through the interface, and the content on the screen was clear and intelligible. Presenting

the instructional content within a story-based teacher scenario setting was praised and

created relevancy to the user. The activities were challenging, but were not too difficult

for users to complete.

The project’s design created a motivating environment for users to learn about

technology integration by incorporating Allen’s magic keys to enhance users’ motivation.

Anticipated outcomes were presented at the beginning of the module through its

introductory video. The scenarios’ learning content was separated into three scaffolded

levels, giving learners the opportunity to choose the right content for themselves. The

module used humor, music, and a cartoonish graphic design theme to create an appealing

context. Several of the activities within the fictional scenarios were authentic tasks

involving multiple steps for completion. Intrinsic feedback was embedded within the

fictional storyline after users completed activities and tasks, and was also provided

through an overall user score and a hidden reward system. Results of users' answers to questions and activities were withheld until the end of each fictional scenario.
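To illustrate this delayed-feedback pattern, the following TypeScript sketch (an illustration only, not the module's actual Flash-based implementation; the activity names are invented for the example) records a learner's answers silently as the storyline progresses and reveals the score only once the scenario is finished.

interface AnswerRecord {
  activityId: string;
  correct: boolean;
}

class ScenarioSession {
  // Answers are collected quietly; no right/wrong message interrupts the story.
  private answers: AnswerRecord[] = [];

  record(activityId: string, correct: boolean): void {
    this.answers.push({ activityId, correct });
  }

  // Results are disclosed only when the fictional scenario is complete.
  finish(): { score: number; total: number } {
    const score = this.answers.filter((a) => a.correct).length;
    return { score, total: this.answers.length };
  }
}

// Example: the score is reported only after both activities are done.
const session = new ScenarioSession();
session.record("connect-projector-cable", true);
session.record("choose-document-camera-input", false);
console.log(session.finish()); // { score: 1, total: 2 }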

Users reported gaining knowledge and skills for integrating technology and using

the computer lab. The first learning scenario level specifically focused on technology

knowledge (TK). Users followed the story of a fictional teacher placed in the awkward

situation of needing to plug in the cables and wires for her computer, projector, and

document camera. Through this scenario, users learned about the names and functions of

the various cables used to make the technology equipment work together. The second

and third fictional scenarios presented the same character going through the phases of Niess, Sadri, and Lee's (2007) TPACK development process, learning how to incorporate several technologies into her instruction. Fourteen videos were

created to help instruct learners on computer lab preparation, management, and use. The

videos’ content was based on the TPACK knowledge and skills gaps (see Figure 5) found

in the initial needs assessment interviews with educational technology leaders. The

videos were available to users on demand through the video library section and were

presented to learners after the expert-level scenario was completed. Although the e-Learning module was unable to facilitate communication and collaboration during the learning process, it was able to facilitate the participants' acquisition of knowledge and

skills useful for using the computer lab and integrating technology into instruction.

Using the SAM process for the module’s development proved to be beneficial,

allowing the module’s design to be evaluated and refined continuously. Throughout the

module’s development, the feedback received from multiple sources helped shape its

practice strategy, screen design, and activities. Many issues found during preliminary testing were fixed in time for the module's release to the two participating

middle schools.

Limitations

The project has limitations relating to the sample of participants, the survey

instrument, and the design of the training module. The sample included only 20 teachers from two middle schools in one school district. This sample, while adequate for evaluation purposes, cannot be generalized to all teachers, especially those teaching at the elementary and high school levels. The module was also introduced to teachers at the end of the school year, and the sample size may have been affected by teachers' reluctance to participate in voluntary professional development during that busy period. In addition, one of the selected middle schools is where the researcher teaches, and participants' responses could have been influenced by their professional relationship with the researcher.

Another limitation of the project concerns the validity of the survey instrument

used to gather feedback from the participants. The questionnaire used a limited set of

questions to gather data on the participants’ acquisition of integrating technology and

using the computer lab. The questions relate to acquiring general knowledge and skills

68
that fall within a broad scope of the TPACK framework, however, they were not

designed to gather specific evidence of TPACK acquisition. Therefore, although the

respondents reported acquiring knowledge and skills through the training module, it

cannot be assumed from the data that the module directly increased the participants’ level

of TPACK.

The module used a large number of graphic image files, requiring considerable bandwidth to download the module's content in a timely manner. Slower internet connections would have lengthened the loading process, and users may have become impatient waiting for the module to load. One common e-Learning feature, the

ability to save a user’s progress, was not available. Although the module was designed to

accommodate this missing feature, users may have become frustrated with the inability to

save their progress and return to the module at a later time.

Recommendations

The module should be made available and tested with a larger audience of teachers working in all grade levels to capture more generalizable data. It should be

introduced at the beginning of the school year so teachers have more time to spend

interacting with the lesson scenarios, the video library, and the resource library. Teachers

should also be allotted a period of time within the school year to use what they have

learned and apply it to their instruction. A validated survey instrument designed to measure TPACK should be used before and after participants use the module to assess their development of TPACK (Schmidt et al., 2009).

This e-Learning module is hosted on the web. Although this is a workable

method of delivery, creating a downloadable application and making it available for mobile devices should also be considered. A downloadable application may increase learner participation by eliminating the delivery inconsistencies created by the behaviors of different web browsers and by making the module available on more types of devices, such as tablets. A downloadable application also eliminates the need to

redownload the instructional content every time a learner uses the e-Learning module.
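A lighter-weight alternative, not part of the original design, would keep the module on the web but cache its assets in the browser with a service worker so they are not re-downloaded on every visit. The TypeScript sketch below assumes hypothetical asset paths and a generic cache name.

// sw.ts: a minimal cache-first service worker sketch; asset paths are hypothetical.
const CACHE_NAME = "techready-pd-v1";
const ASSETS = ["/index.html", "/module.js", "/images/scenario1.png"];

self.addEventListener("install", (event: any) => {
  // Download and store the module's assets once, when the worker is installed.
  event.waitUntil(caches.open(CACHE_NAME).then((cache) => cache.addAll(ASSETS)));
});

self.addEventListener("fetch", (event: any) => {
  // Serve cached copies when available, falling back to the network otherwise.
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});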

Future e-Learning module designs should also include the ability to save a user's progress, allowing learners to continue at a later time and allowing their scores to be recorded as evidence of learning.
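In a future HTML5-based version, a minimal sketch of saving progress on the learner's device might look like the following; the ModuleProgress shape and storage key are assumptions for illustration, not features of the existing module.

// Save and restore a learner's progress in the browser; data shape is illustrative.
interface ModuleProgress {
  scenarioLevel: "beginner" | "intermediate" | "expert";
  completedActivities: string[];
  score: number;
}

const STORAGE_KEY = "techready-pd-progress";

function saveProgress(progress: ModuleProgress): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(progress));
}

function loadProgress(): ModuleProgress | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ModuleProgress) : null;
}

// Resume where the learner left off, or start fresh on first use.
const progress: ModuleProgress = loadProgress() ?? {
  scenarioLevel: "beginner",
  completedActivities: [],
  score: 0,
};

Recording scores as evidence of learning would additionally require sending this data to a server or learning management system, for example through a standard such as SCORM or xAPI, rather than keeping it only in the learner's browser.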

The use of a chatroom to engage users in communication and collaboration should

be further explored. Future e-Learning designs should consider embedding the chatroom

feature within the e-Learning module's screen design to increase users' awareness of it.

Future designs of e-Learning modules for teacher professional development

should include content relevant to all four core subjects. Including English

language arts, history, math, and science content in the practice strategy may increase the

intrinsic motivation and participation of teachers by providing instructional content

relevant to a broader audience.

Conclusion

The purpose of this project was to create an online e-Learning module that would

help develop teachers’ ability to use their school computer lab and their ability to

integrate technology into their instruction. The project’s design was based on the

research of the TPACK framework, adult learning principles, a framework for computer-

based instruction, e-Learning principles, and design strategies. The module’s practice

strategy design went through several revisions. The final version integrated the various frameworks, design principles, and design strategies described in this report. The

design was refined using initial feedback from teachers and then tested more extensively.

The e-Learning module was well-received by its users, and it was commended for its

presentation, content, interactivity, videos, narration, relevance, and purpose. Overall,

this module is a promising and cost-effective strategy for helping teachers develop their

knowledge of how to use and teach with technology.

APPENDICES

APPENDIX A

PRELIMINARY TEST RECRUITMENT EMAIL

Dear Potential Beta Tester,

I’m recruiting participants to beta test my master’s project. The project is an interactive
online training for teachers about using technology in your classroom instruction & in the
computer lab.

As a beta tester, you are asked to complete as much of the training module as possible,
and take note of any issues or problems relating to your experience. Afterwards, you are
asked to complete an online survey about your experience. You may also choose to be
contacted over the phone to discuss the training module at a time of your convenience.

Payment for your participation includes a $20 Amazon gift card, sent to an email address
of your choice. Payment can only be received upon completion of the online survey.

The online training is available 24 hours a day and can be accessed from any computer
device with a flash enabled browser. Completion of the training is expected to take 60-90
minutes, but is not required.

Please note that all survey answers and phone discussions will remain anonymous &
confidential.
Thank you for your time.

Andrew Fitzgerald
Cal State University Long Beach

Link to training module project & survey: http://j.mp/TechReady-Beta

Password for access: technology (case sensitive)

APPENDIX B

PRELIMINARY TEST INFORMED CONSENT FORM

Supporting Teachers’ Integration of Technology with e-Learning

You are asked to participate in a research study conducted by Andrew Fitzgerald, M.A. in
Educational Technology and Media Leadership, from the Advanced Studies in Education
and Counseling at California State University, Long Beach. The results will contribute
towards a Masters project. You were selected as a possible participant in this study
because you are a teacher working in the Long Beach Unified or Ventura Unified School
Districts.

PURPOSE OF THE PROJECT

The purpose of this project is to create an effective online professional development


experience to prepare teachers for technology integration and computer lab instruction.

PROCEDURES

If you volunteer to participate in this study, you will be asked to do the following things:

1) Participate in a beta test of an online e-Learning training module.


2) Complete a short survey following the training module.
3) Participate in a voluntary phone interview to provide feedback on your experience
with the module.

Length of beta test: 60 minutes to 90 minutes

Location for beta test: Computer device with a web browser running Adobe Flash Player
and internet access.

Length of phone feedback interview: 10 minutes to 30 minutes

POTENTIAL RISKS AND DISCOMFORTS

You may experience some discomfort such as mild anxiety, boredom, mental fatigue, or
embarrassment of poor performance while interacting with the module. You may also
experience discomfort from the time needed to complete all the elements of the e-
Learning module, as well as the collection of identifiable personal data, such as your
name, phone number, and an email address.

POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY

You will be provided a professional development experience on computer lab and


technology integration into classroom instruction.

PAYMENT FOR PARTICIPATION

Upon completion of the phone interview, you will be emailed an Amazon gift card
valued at $20 to an email address of your choice.

CONFIDENTIALITY

Any information that is obtained in connection with this study and that can be identified
with you will remain confidential and will be disclosed only with your permission or as
required by law.

Non-identifiable survey results and information will be used to complete the Master’s
project, and may be released to the Long Beach and Ventura Unified School Districts.

PARTICIPATION AND WITHDRAWAL

You can choose whether to be in this study or not. If you volunteer to be in this study,
you may withdraw at any time without consequences of any kind. Participation or non-
participation will not affect your employment status, or any other personal consideration
or right you usually expect. You may also refuse to answer any questions you don't want
to answer and still remain in the study. The investigator may withdraw you from this
research if circumstances arise which in the opinion of the researcher warrant doing so.

IDENTIFICATION OF INVESTIGATORS

If you have any questions or concerns about the research, please feel free to contact:

Principal Investigator: Andrew Fitzgerald 562.841.6379

Committee Chair: Dr. Stephen Adams 562.985.5498

RIGHTS OF RESEARCH SUBJECTS

You may withdraw your consent at any time and discontinue participation without
penalty. You are not waiving any legal claims, rights or remedies because of your
participation in this research study. If you have questions regarding your rights as a
research subject, contact the Office of University Research, CSU Long Beach, 1250
Bellflower Blvd., Long Beach, CA 90840; Telephone: (562) 985-5314. email: ORSP-
Compliance@csulb.edu

SIGNATURE OF RESEARCH SUBJECT (AND) OR LEGAL REPRESENTATIVE

By clicking “Agree,” I understand the procedures and conditions of my participation


described above. My questions have been answered to my satisfaction, and I agree to
participate in this study. I may get a copy of this form by clicking “Print Consent Form.”

[Print Consent Form]    [I Agree]    [I Disagree]

APPENDIX C

PRELIMINARY TEST ONLINE SURVEY QUESTIONS

Thank you for participating in the test of this project! Please take a few minutes to answer
some questions relating to your experience with this training module.

1) What did you like about the training module?


a. Free Response
2) What did you dislike about the training module?
a. Free Response
3) Please list issues you discovered in the training module.
a. Free Response
4) Is it okay for the researcher to contact you by phone to elaborate on or discuss
your experience with the training module?
a. Yes
b. No
5) If yes, please enter your name and phone number, along with the best time to
reach you.

Name

Phone Number

Best time to reach you by phone

Your payment of a $20 Amazon gift card for testing the project will be sent to an email
address of your choice. Please enter which email address you would like the gift card sent
to in the box below.

Email Address

Thank you for your time and effort. Please click the submit button below to submit your
answers. If you have any other comments, you may leave them below.
a. Free response

APPENDIX D

PRELIMINARY TEST PHONE CALL SURVEY QUESTIONS

Phone Survey Questions
1) What did you like about the training module?

2) What did you dislike about the training module?

3) Did you discover any issues or bugs with the training module?

4) What were they?

5) Any other comments or suggestions?

APPENDIX E

RECRUITMENT MESSAGE FOR EMAIL, LMS, AND FLYER

Dear Teacher,

I am conducting research as part of my master's project at CSULB. The project is an interactive online training and resource for teachers on using technology in your classroom instruction & in the computer lab.

The online training is available 24 hours a day and can be accessed from any computer
device with a flash enabled browser. Completion of the full training is not required. You
are asked to complete a short online survey about your experience with the training. The
survey is available through the training website, and is not required for you to complete.

Please note that all answers will remain anonymous & confidential.

Thank you for your time.

Andrew Fitzgerald
Cal State University Long Beach

Link to online training & survey: http://j.mp/TechReadyPD

Password for access: technology (case sensitive)

APPENDIX F

INFORMED CONSENT FORM

Supporting Teachers’ Integration of Technology with e-Learning

You are asked to participate in a research study conducted by Andrew Fitzgerald, M.A. in
Educational Technology and Media Leadership, from the Advanced Studies in Education
and Counseling at California State University, Long Beach. The results will contribute
towards a Masters project. You were selected as a possible participant in this study
because you are a teacher working in the Long Beach Unified or Ventura Unified School
Districts.

PURPOSE OF THE PROJECT

The purpose of this project is to create an effective online professional development


experience to prepare teachers for technology integration and computer lab instruction.

PROCEDURES

If you volunteer to participate in this study, you will be asked to do the following things:

1) Interact with an online e-Learning training module by watching instructional videos and completing interactive practice activities.
2) Complete a survey.
Length of time: 60 minutes to 90 minutes

Location: Computer device with a web browser running Adobe Flash Player and internet
access.

POTENTIAL RISKS AND DISCOMFORTS

You may experience some discomfort such as mild anxiety, boredom, mental fatigue, or
embarrassment of poor performance while interacting with the module. You may also
experience discomfort from the time needed to complete all the elements of the e-
Learning module.

POTENTIAL BENEFITS TO SUBJECTS AND/OR TO SOCIETY

You will be provided a professional development experience on computer lab and


technology integration into classroom instruction.

PAYMENT FOR PARTICIPATION

There is no monetary compensation offered for your participation.

CONFIDENTIALITY

Any information that is obtained in connection with this study and that can be identified
with you will remain confidential and will be disclosed only with your permission or as
required by law.

Non-identifiable survey results and information will be used to complete the Master’s
project, and may be released to the Long Beach and Ventura Unified School Districts.

PARTICIPATION AND WITHDRAWAL

You can choose whether to be in this study or not. If you volunteer to be in this study,
you may withdraw at any time without consequences of any kind. Participation or non-
participation will not affect your employment status, or any other personal consideration
or right you usually expect. You may also refuse to answer any questions you don't want
to answer and still remain in the study. The investigator may withdraw you from this
research if circumstances arise which in the opinion of the researcher warrant doing so.

IDENTIFICATION OF INVESTIGATORS

If you have any questions or concerns about the research, please feel free to contact:

Principal Investigator: Andrew Fitzgerald 562.841.6379

Committee Chair: Dr. Stephen Adams 562.985.5498

RIGHTS OF RESEARCH SUBJECTS

You may withdraw your consent at any time and discontinue participation without
penalty. You are not waiving any legal claims, rights or remedies because of your
participation in this research study. If you have questions regarding your rights as a
research subject, contact the Office of University Research, CSU Long Beach, 1250
Bellflower Blvd., Long Beach, CA 90840; Telephone: (562) 985-5314. email: ORSP-
Compliance@csulb.edu

SIGNATURE OF RESEARCH SUBJECT (AND) OR LEGAL REPRESENTATIVE

By clicking “I Agree,” I understand the procedures and conditions of my participation


described above. My questions have been answered to my satisfaction, and I agree to
participate in this study. I may get a copy of this form by clicking “Print Consent Form.”

[Print Consent Form]    [I Agree]    [I Disagree]
APPENDIX G

PARTICIPANT ONLINE SURVEY QUESTIONS

Thank you for participating with this project! Please take a few minutes to answer some
questions relating to your experience with this training module.

1) I accessed the module again after the first introductory session.


a. Yes
b. No
c. Decline to answer
2) Where was the training module accessed from?
- Choose from the following:
a. Work
b. Home
c. Other: Please specify ____________
d. Decline to answer
3) What devices were used to complete the training module?
- Choose from the following:
a. Desktop
b. Laptop
c. Tablet
d. Decline to answer
4) I felt motivated to use the training module.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
5) I felt in control of my learning.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
6) The training module supported my learning when I needed it.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer

7) The training module’s interface was easy to navigate.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
8) The module’s content and activities were easy to find.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
9) The instructional content was clear and understandable.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
10) The instructional content was relevant to me.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
11) The instructional content was engaging.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
12) This professional development is beneficial for using the computer lab with my
students.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
13) This professional development is beneficial for integrating technology into my
classroom instruction.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
14) This professional development is beneficial for integrating technology into
classroom activities.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
15) This professional development is beneficial for integrating technology into
classroom assignments.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
16) This professional development is beneficial for helping me understand the
TPACK framework.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
17) I collaborated with my peers while using the module in the introductory session.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer

18) I used the chat room function on the module’s website to collaborate with my
peers.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
19) I prefer this online method of professional development over traditional face-to-
face professional development.
a. Strongly Agree
b. Agree
c. Neutral
d. Somewhat Disagree
e. Strongly Disagree
f. Decline to answer
20) What aspects of the training module did you like?
a. Free Response

21) What aspects of the training module did you dislike?


a. Free Response

22) What suggestions do you have to make this a better experience and use of your
time?
a. Free response

23) Do you have any other comments or suggestions?


a. Free response

APPENDIX H

E-LEARNING TRAINING MODULE WEBSITE

The e-Learning module can be viewed and used by visiting
http://www.andrewfitz.net/techreadypd.html

[Screenshots of the website's Training Module and Chat Room sections]

REFERENCES


Achieve, Inc. (2004). Ready or not: Creating a high school diploma that works.
Washington DC: The American Diploma Project.
Allen, M. (2003). Michael Allen's guide to e-Learning: Building interactive, fun, and
effective learning programs for any company. Hoboken, NJ: Wiley.
Allen, M., & Sites, R. (2012). Leaving ADDIE for SAM. American Society for Training
and Development. Retrieved from https://www.td.org/
Ansyari, M. F. (2013). In-service teacher professional development arrangements for
technology integration: Some critical considerations. International Journal of e-
Education, e-Business, e-Management, and e-Learning, 3(4), 340-343.
Blackwell, J., & Yost, N. (2013). Technology in the classroom: teacher education
programs and technology: Preparing teacher candidates for working with P-8
students. Childhood Education, 89(5), 325-327.
California Department of Education. (2013). Schedule of the 2013-14 final entitlements.
Retrieved from Funding Results website: http://www.cde.ca.gov/fg/fo/r14/
documents/commoncore13ent.xls
California Department of Education. (2014). Common core state standards
implementation faq. Retrieved from California Department of Education:
http://www.cde.ca.gov/fg/aa/ca/commoncorefaq.asp#q06
California Department of Education. (2014). Common core state standards systems
implementation plan for California. Retrieved from Common Core State
Standards: http://www.cde.ca.gov/re/cc/documents/ccsssimplementationplan.doc
California Department of Education & Education Technology Task Force. (2014).
Blueprint for California education technology. Sacramento, CA: California
Department of Education.
The Center For Digital Education. (2014). Classroom management. Folsom, CA:
e.Republic.
Clark, R. C., & Mayer, R. E. (2008). e-Learning and the science of instruction: Proven
guidelines for consumers and designers of multimedia learning. San Francisco,
CA: Pfeiffer.
Dick, W., & Carey, L. (1996). The systematic design of instruction (4th ed.). New York,
NY: Harper Collins College Publishers.

Dirksen, J. (2012). Design for how people learn. Berkeley, CA: New Riders.
Doering, A., Veletsianos, G., Scharber, C., & Miller, C. (2009). Using the technological,
pedagogical, and content knowledge framework to design online learning
environments and professional development. Journal of Educational Computing
Research, 41(3), 319-346.
Donnelly, D., McGarr, O., & O'Reilly, J. (2011). A framework for teachers’ integration
of ICT into their classroom practice. Computers & Education, 57, 1469-1483.
Doorey, N. A. (2012). The light ahead. In Coming together to raise achievement: New assessments for the common core state standards (Opportunities Ahead). Retrieved from http://www.smarterbalanced.org/faq/17-how-is-the-smarter-balanced-assessment-consortium-different-than-the-partnership-for-the-assessment-of-readiness-for-college-and-careers-parcc/
Ertmer, P., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How
knowledge, confidence, beliefs, and culture intersect. Journal of Research on
Technology in Education, 42(3), 255-284.
Forrest, S. P., III & Peterson, T. O. (2006). It's called andragogy. Academy of
Management Learning & Education, 5(1), 113-122.
Gong, C., Chen, G., Cheng, W., Yang, X., & Huang, R. (2013). Potential issues on
initiatively utilizing eTextbooks in K-12 classrooms. In Proceedings of the 2013
IEEE 13th International Conference on Advanced Learning Technologies
(ICALT) (pp. 314-318). Washington, DC: IEEE Computer Society.
Harris, J., Mishra, P., & Koehler, M. (2009). Teachers’ technological pedagogical content
knowledge and learning activity types: Curriculum-based technology integration
reframed. Journal of Research on Technology in Education, 41(4), 393-416.
Hew, K. F., & Brush, T. (2007). Integrating technology into K-12 teaching and learning:
current knowledge gaps and recommendations for future research. Educational
Technology Research and Development, 55(3), 223-252.
Holden, H., & Rada, R. (2011). Understanding the influence of perceived usability and
technology self-efficacy on teachers’ technology acceptance. Journal of Research
on Technology in Education, 43(4), 343-367.
Holton, E. F., III, Swanson, R. A., & Naquin, S. S. (2001). Andragogy in practice: Clarifying the andragogical model of adult learning. Performance Improvement Quarterly, 14(1), 118-143.
Johnson, L., Adams, S. B., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H.
(2013). NMC horizon report: 2013 K-12 edition. Austin, TX: The New Media
Consortium.

The K-12 Center. (2012). Coming together to raise achievement: New assessments for the common core state standards. Princeton, NJ: Center for K–12 Assessment & Performance Management at ETS. Retrieved from http://www.smarterbalanced.org/faq/17-how-is-the-smarter-balanced-assessment-consortium-different-than-the-partnership-for-the-assessment-of-readiness-for-college-and-careers-parcc/
Knowles, M. S., Holton, E. F., & Swanson, R. A. (2011). The adult learner: The
definitive classic in adult education and human resource development (Seventh
ed.). Burlington, MA: Elsevier.
Koehler, M., Mishra, P., & Cain, W. (2013). What Is technological pedagogical content
knowledge (TPACK)? Journal of Education, 193(3), 13-19.
Konan, N. (2010). Computer literacy levels of teachers. Procedia Social and Behavioral
Sciences, 2, 2567-2571.
Kristen, K., Wendt, J., Wendt, S., & Beach, J. (2014). Teachers’ experiences and
perspectives on the integration of technology. World Conference on Educational
Multimedia, Hypermedia and Telecommunications, 2014(1), 1592-1599.
Lee, Y., & Lee, J. (2014). Enhancing pre-service teachers’ self-efficacy beliefs for
technology integration through lesson planning practice. Computers & Education,
73, 121-128.
Lowe, J. S., & Holton, E. F., III (2005). A theory of effective computer-based instruction for adults. Human Resource Development Review, 4(2), 159-188.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A
framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.
Mouza, C. (2011). Promoting urban teachers’ understanding of technology, content, and
pedagogy in the context of case development. Journal of Research on Technology
in Education, 44(1), 1-29.
National Governors Association Center for Best Practices, Council of Chief State School
Officers. (2010). Common core state standard for English language arts &
literacy in history/social studies, science, and technical subjects. Washington DC:
Author.
National Governors Association Center for Best Practices, Council of Chief State School
Officers. (2014). Standards in your state. Retrieved from Common Core State
Standards Initiative website: http://www.corestandards.org/standards-in-your-state
National Governors Association Center for Best Practices, Council of Chief State School
Officers. (n.d.). Common core state standards for mathematics. Washington DC:
Author.

Niess, M. L., Ronau, R. N., Shafer, K. G., Driskell, S. O., Harper, S. R., Johnston, C., . . .
Kersaint, G. (2009). Mathematics teacher TPACK standards and development
model. Contemporary Issues in Technology and Teacher Education, 9(1), 4-24.
Niess, M. L., Sadri, P., & Lee, K. (2007). Dynamic spreadsheets as learning technology
tools: Developing teachers’ technology pedagogical content knowledge (TPCK).
Paper presented at the meeting of the American Educational Research Association
Annual Conference, Chicago, IL.
Özgun-Koca, S. A., Meagher, M., & Edwards, M. T. (2010). Preservice teachers’
emerging TPACK in a technology-rich methods class. The Mathematics
Educator, 19(2), 10-20.
Özgün-Koca, S. A., Meagher, M., & Edwards, M. T. (2011). A teacher's journey with a
new generation handheld: Decisions, struggles, and accomplishments. School
Science and Mathematics, 11(5), 209-224.
Paraskeva, F., Bouta, H., & Papagianni, A. (2008). Individual characteristics and
computer self-efficacy in secondary education teachers to integrate technology in
educational practice. Computers & Education, 50, 1084-1091.
Potter, S. L., & Rockinson-Szapkiw, A. J. (2012). Technology integration for
instructional improvement: The impact of professional development. Performance
Improvement, 51(2), 22-27.
Prasertsilp, P., & Olfman, L. (2014). Effective teacher training for tablet integration in K-
12 classrooms. System Sciences (HICSS), 2014 47th Hawaii International
Conference on, 52-61.
Puentedura, R. R. (2013, October 25). SAMR: A contextualized introduction. Retrieved
2015, from Ruben R. Puentedura's blog: http://www.hippasus.com/rrpweblog/archives/000112.html
Reiser, R. A., & Dempsey, J. V. (2012). Trends and issues in instructional design and
technology (3rd ed.). Boston, MA: Pearson Education.
Ritzhaupt, A. D., Feng, L., Dawson, K., & Barron, A. E. (2013). Differences in student
information and communication technology literacy based on socio-economic
status, ethnicity, and gender: Evidence of a digital divide in Florida schools.
Journal of Research on Technology in Education, 45(4), 291-307.
Rogers, E. M. (2010). Diffusion of innovations. New York, NY. Simon and Schuster.
Schmidt, D. A., Baran, E., Thompson, A. D., Koehler, M. J., Mishra, P., & Shin, T.
(2009). Technological pedagogical content knowledge (TPACK): The
development and validation of an assessment instrument for preservice teachers.
Journal of Research on Technology in Education, 42(2), 123-149.

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching.
Educational Researcher, 15(2), 4-14.
Smarter Balanced Assessment Consortium. (2012, November). Scoring reporting and estimated testing times. California. Retrieved from http://www.smarterbalanced.org/resources-events/faqs/
Smith, A., & Zickuhr, K. (2012). Digital differences. Washington, DC: Pew Research
Center.
Student Achievement Partners. (2014). Homepage. Retrieved from website:
http://achievethecore.org/
Tillery, A. D., Varjas, K., Meyers, J., & Collins, A. S. (2010). General education
teachers’ perceptions of behavior management and intervention strategies.
Journal of Positive Behavior Interventions, 12(2), 86-102.
Tondeur, J., Braak, J. V., Sang, G., Voogt, J., Fisser, P., & Ottenbreit-Leftwich, A.
(2012). Preparing pre-service teachers to integrate technology in education: A
synthesis of qualitative evidence. Computers & Education, 59, 134-144.
Uzunboylu, H., & Ozdamli, F. (2011). Teacher perception for m-learning: Scale
development and teachers’ perceptions. Journal of Computer Assisted Learning,
27(6). 544-556.
Voogt, J., Fisser, P., Roblin, N. P., Tondeur, J., & Braak, J. V. (2013). Technological
pedagogical content knowledge – A review of the literature. Journal of Computer
Assisted Learning, 29, 109-121.
Voyiatzaki, E., & Avouris, N. (2014). Support for the teacher in technology-enhanced
collaborative classroom. Education and Information Technologies, 19(1), 129-
154.
Williams, R. (2008). The non-designer's design book (3rd ed.). Berkeley, CA: Peachpit
Press.
Yucel, A. S., & Kocak, C. (2010). Evaluation of the basic technology competency of the
teachers candidate according to the various variables. Procedia Social and
Behavioral Sciences, 2, 1310-1315.

