
Quantitative Comparison of Traditional to Combined Online Instruction

for Simple Linear Regression

Dissertation

Submitted to Northcentral University

Graduate Faculty of the School of Business and Technology


in Partial Fulfillment of the
Requirements for the Degree of

DOCTOR OF BUSINESS ADMINISTRATION

by

JOSEPH A. SNIDER

Prescott Valley, Arizona


November 2011
Copyright 2011

Joseph A. Snider
APPROVAL PAGE

Quantitative Comparison of Traditional to Combined Online Instruction


for Simple Linear Regression

by

Joseph A. Snider

Approved by:

Chair Date

Member: Roger Holt, D.B.A.

Member: David Bouvin, D.B.A.

Certified by:

School Dean: Lee Smith, Ph.D. Date
Abstract

Businesses require statistically literate workers: people who are capable of critical thinking, can analyze real-world data, and can communicate their

findings. In 2005, the American Statistical Association endorsed the Guidelines for

Assessment and Instruction in Statistics Education (GAISE) college report (College

Report, 2010), outlining ways to improve statistical literacy. The purpose of this

exploratory study was to research a possible method of improving statistical literacy

under the recommendations of the 2005 GAISE college report by comparing a combined

approach of four online instructional methods to traditional methods of lecture and

textbook for the topic of simple linear regression. The participants in the study were

from MBA cohorts at a private Midwestern university. The research design was a pretest

and posttest model with a control group (traditional methods, N = 11) and quasi-

experimental group (combined online methods, N = 9). Using a Mann-Whitney U test,

pretest score differences between groups were not statistically significant (z = -.699, p = .485), while posttest score differences between groups were significant (z = -2.265, p = .024). Using a Wilcoxon signed rank test, the control group pretest and posttest difference was not statistically significant (z = -.960, p = .337), while the quasi-experimental group difference was statistically significant (z = -2.263, p = .024). These results indicate that the two groups began at comparable levels but differed after instruction. The chi-square analyses for demographic variables showed no statistically significant difference

between the two groups. The results provide evidence that additional research is worth

pursuing and that effective and inexpensive online materials are achievable. The online

materials are applicable to both corporate and university settings.

ACKNOWLEDGEMENTS

Undoubtedly, just behind surviving cancer, achieving a doctorate is the toughest

and most grueling process I have ever experienced. It is a testimony to having patience

and perseverance. The Lord has seen fit to grant me the opportunity to apply what I have

learned and to teach others. For that, I am eternally grateful.

I would like to thank my wife Julie for supporting me throughout the years to

achieve this milestone. Without her love, help, feedback, and encouragement, this goal

would not have been achievable. Special thanks go out to Kathy Gosser, a mentor who

encouraged me to attempt attainment of a doctorate, and Nina Moliver, editor

extraordinaire.

I would like to thank my chair, Dr. Flegle, for his help in navigating through the

process and providing leadership for the committee members. His insights have made the

products much stronger. I would like to thank all members of the research university and

Northcentral University, for providing the mechanisms to achieve this goal.

Table of Contents
List of Tables viii
List of Figures ix
Chapter 1: Introduction 1
Background 2
Problem Statement 5
Purpose 6
Theoretical Framework 7
Research Questions 8
Hypotheses 9
Nature of the Study 10
Significance of the Study 13
Definitions 15
Summary 17
Chapter 2: Literature Review 18
Statistical Literacy 19
Outcome Factors in Statistics Courses 21
Incorporating Technology in a Course 22
Nature of the Student 23
Engaging Format and Content 25
Assessments 27
Teaching Modalities in Business Statistics Courses 28
GAISE College Report Recommendations 30
Revised Bloom's Taxonomy 36
Advantages and Disadvantages of New Teaching Methods 38
Learning Objects 43
Learning Objects Advantages 46
WebQuests 53
Cognitive Flexibility Theory Hypertext 55
A Combination Approach 59
Summary 61
Chapter 3: Research Method 63
Research Questions 64
Hypotheses 64
Research Method and Design 65
Participants 66
Materials/Instruments 68
Operational Definition of Variables 74
Data Collection, Processing, and Analysis 75
Methodological Assumptions, Limitations, and Delimitations 77
Ethical Assurances 78
Summary 79

Chapter 4: Findings 81
Results 82
Descriptive Statistics 83
Nonparametric Statistics on Research Questions 86
Production Actual Costs for the Content in the Study 89
Evaluation of Findings 90
Summary 93
Chapter 5: Implications, Recommendations, and Conclusions 95
Implications 97
Limitations of the Study 99
Recommendations 100
Conclusions 103
References 106
Appendix 118
Appendix A: Graphical Organizer for Business Statistics Course 119
Appendix B: WebQuest Sample Pages 120
Appendix C: Learning Object Sample Screens 121
Appendix D: Cognitive Flexibility Theory Hypertext (CFTH) 124
Appendix E: Cognitive Flexibility Theory Hypertext (CFTH) Flow Diagrams 128
Appendix F: Pretest and Posttest Survey Instrument 130
Appendix G: Sign-Up Demographics Questions 137
Appendix H: Informed Consent Form for Control Group 138
Appendix I: Informed Consent Form for Experimental Group 139
Appendix J: IRB Approval Email from Northcentral University 140

List of Tables
Table 1 Teaching Methods - Advantages and Disadvantages 39
Table 2 Expected Costs for a Simple Linear Regression Lesson 73
Table 3 Frequencies for Gender 83
Table 4 Frequencies for Age 84
Table 5 Frequencies for Computer Hours per Week 84
Table 6 Frequencies for Microsoft Excel ™ Skill Level 85
Table 7 Frequencies for Learning Preference 85
Table 8 Median Values for Pretest and Posttest Scores 88
Table 9 Mean Scores for Pretest and Posttest 88
Table 10 Actual Costs for a Simple Linear Regression Lesson 90

List of Figures
Figure 1. Conceptual Drawing of Expectation-Delivery Gap 18
Figure 2. Revised Bloom's Taxonomy 36


Chapter 1: Introduction

An important skill for educated employees is statistical literacy, defined as an

ability "to read and interpret statistics, and think critically about arguments that use

statistics as evidence" (United National Development Programme [UNDP], 2010). To

improve statistical literacy, students need to be effectively and efficiently educated in

college-level statistics courses (Ben-Zvi & Garfield, 2008). On a practical level,

businesses require employees who can collect data, synthesize and analyze data into

information, and present findings to colleagues and associates (Seifer, 2009).

Business statistics courses contain material applicable to any industry. The

application of business statistics is widespread and practical in nature. Knowledge of

business statistics is beneficial in areas such as Six Sigma projects for manufacturing

(Shah, Chandrasekaran, & Linderman, 2008), health care (Gigerenzer, Gaissmaier, Kurz-

Milcke, Schwartz, & Woloshin, 2007; Monahan, 2007), and marketing analysis (Albaum,

Roster, Wiley, Rossiter, & Smith, 2010). When statistical literacy is inadequate, workers

and managers can misuse and misinterpret data, make faulty decisions without facts, and

fail to maintain productivity goals (Monahan, 2007).

To understand how to improve statistical literacy, one must understand the

learning environment. The nature of pedagogy is changing. Instead of using only

traditional lecture and book format in the classroom, teachers are increasingly

incorporating electronic teaching techniques employing constructivist tools (Halat, 2008).

Constructivist learning is active, problem-based (Carter, 2002), and student-centered

(Bush, 2006; Connolly & Begg, 2006; Mykytyn, 2007).


Introducing information and communication technology (ICT) into the classroom

is another change. Incorporating ICT has many benefits, including enabling problem

solving, communication, and creativity (Neo & Neo, 2009) and is often valuable in

instructing younger students. Tsolakidis and Fokiali (2010) studied incorporating ICT in distance learning and found that its cost was lower than that of a traditional course.

Students themselves may be different. Younger students have been referred to as

digital natives (Prensky, 2001), or people who have used ICT all of their lives. In

contrast, digital immigrants are people who have not used ICT all their lives. Digital

natives have low tolerance for lectures, require active instead of passive learning, and

rely heavily on communication technologies for information (Prensky, 2001).

This chapter introduces a study of the effect of introducing a combination of

online methods for teaching simple linear regression, a topic in introductory statistics, to

college students. The background section explains how educational trends and rising

costs create new demands for higher education. The problem section defines a need for

updated pedagogy, and the purpose section explains how combining four relatively new online teaching techniques may improve statistical literacy or reduce the cost of achieving equivalent student results. In this study, four new teaching methods introduced since 1990 were combined into one online teaching environment.

Background

Representatives of the American Statistical Association (ASA) recognized the

need for changing the teaching of college-level introductory statistics courses. In 2005,

an ASA report called the Guidelines for Assessment and Instruction in Statistics

Education (GAISE) was produced (Everson & Garfield, 2008). The GAISE report

(College Report, 2010) provided the following guidelines and recommendations for

introductory statistics courses at the college level: (1) Emphasize statistical literacy and

develop statistical thinking, (2) use real data, (3) stress conceptual understanding rather

than mere knowledge of procedures, (4) foster active learning in the classroom, (5) use

technology for developing conceptual understanding and analyzing data, and (6) use

assessments to improve and evaluate student learning.

Researchers (Everson, Zieffler, & Garfield, 2008) implemented the GAISE

guidelines over a 2-year period in both on-ground and online education courses. The

effort took 2 years because the researchers made only small adjustments at a time.

Qualitative findings showed that statistical reasoning skills increased, as evidenced by

class discussions. In addition, both evaluation ratings and demand for the courses

increased (Everson, Zieffler, & Garfield, 2008).

Tuition costs are increasing (Archibald & Feldman, 2008; Martin, 2002). Costs

for tuition, room and board in 1980 were $2,373 for public institutions and $5,470 for

private institutions (U.S. Department of Education [USDOE], 2009). By the 2007-2008 school year, public tuition, room, and board had increased to $11,578, and private costs had risen to $29,915. Smaller budgets (College Budgets, 2006) present a challenge to college administrators to improve the efficiency as well as the effectiveness of higher education.

College educators are adding online formats to new and existing courses. More

than 3.9 million students were taking at least one online course during the fall 2007 term,

representing a 12% increase compared to the previous year (Sloan Survey, 2008).

According to Schweizer, Whipp, and Hayslett (2002), advantages of online education include lower cost, quick remote access, communication speed, ability for timely

feedback, potential for interactivity, and an ability to communicate with large audiences.

New online teaching methods include WebQuests (Halat, 2008; Lahaie, 2008;

Zheng, Stucky, McAlack, Menchana, & Stoddart, 2005), learning objects (Farha, 2007;

Stamey, 2006), electronic mind maps as graphical organizers (Ruffini, 2004; Schau

& Mattern, 1997), and cognitive flexibility theory hypertext (CFTH; Jang, 2000; Papastergiou, 2008). WebQuests are Web-

oriented content and have a structured format. The electronic organizer is a Web page

providing links to the WebQuest and CFTH. The CFTH in this study is a collection of

original Web pages with pictures and links on them to provide an immersive role-playing

case study environment.

In isolation, each of the new online methods has met with success, but also has

deficiencies when used alone. In a study of 327 students, participants taught with

learning objects achieved higher test scores than did students taught with traditional

methods of instruction (Farha, 2007). Lahaie (2008) used WebQuests successfully for

nursing programs. Single-page electronic mind maps called e-Coursemaps have been

recommended as a way of presenting graphically organized material to students in a

course, because of low cost and ease of production (Ruffini, 2004). When 312 Korean

high school seniors learning history online used CFTH, students showed improvements in

the complex areas of synthesis, comparison, and analysis (Jang, 2000). Testing of each

of these learning tools in isolation was successful, but an exhaustive literature search

found nothing on combining the four tools for teaching a single topic or course.

Integrating various solutions together is a common strategy in information

technology. For example, in a health care company, a solution to store and process

medical records is separate from a solution to provide accounting capabilities. However,

through integrating "best of breed" technologies, an entire enterprise platform is

attainable (Ford, Menachemi, Huerta, & Yu, 2010). A strategy of combining four

previously researched teaching methods, each shown as positive individually, may prove

to be a "best of breed" strategy similar to the health care example above.

A lesson on simple linear regression was the focus of the online materials in this

study. Levine, Stephan, Krehbiel, and Brenesen (2011) defined simple linear regression

as using "a single numerical independent variable, X, to predict a single numerical

dependent variable Y" (p. 472). Simple linear regression using Microsoft Excel ™

involves obtaining a set of raw data, entering the data in a spreadsheet, and creating a scatter plot with a regression line. Simple linear regression also involves many complex concepts, such as the underlying assumptions, when it is appropriate to use the technique, what to do when the assumptions are violated, and hypothesis testing of the coefficients.

The case-study role-playing environment, called the CFTH, is where conceptual

understanding theoretically takes place.
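For reference, the model described above can be written in its conventional textbook form (this is standard notation, not reproduced from the study materials):

Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i, \qquad i = 1, \ldots, n

where \beta_0 is the intercept, \beta_1 is the slope, and \varepsilon_i is the random error term. The least-squares estimates that underlie a spreadsheet regression line are

\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_{i=1}^{n} (X_i - \bar{X})^2}, \qquad \hat{\beta}_0 = \bar{Y} - \hat{\beta}_1 \bar{X}.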

Problem Statement

The general problem is a lack of statistical literacy among students, which translates into the same deficiency among corporate employees. The specific problem is how to improve

pedagogy in an introductory statistics course. Representatives of the American Statistical

Association (ASA) recognized the lack of statistical literacy and the need for changing

the teaching of college-level introductory statistics courses. In 2005, an ASA report

called the Guidelines for Assessment and Instruction in Statistics Education (GAISE) was

produced (Everson & Garfield, 2008). Qualitative research has taken place concerning

the GAISE recommendations (Everson, Zieffler, & Garfield, 2008), but no quantitative or

qualitative research was found that studied combining four online teaching methods for

complex material.

According to the GAISE report (College Report, 2010), the current level of

pedagogy does not meet the needs of current student populations for college introductory

statistics courses. The GAISE report (College Report, 2010) provided guidelines and

recommendations for introductory statistics courses at the college level. One piece of

evidence of a lack of statistical literacy is that since 1997, more than 40% of all students taking the Advanced Placement (AP) Statistics examination (College Board, 2009) have failed to achieve a score of 3 out of 5 on the examination, the score needed to gain college credit for a course in introductory statistics.

Purpose

The purpose of the quantitative study was to further research into compliance with

the GAISE recommendations to improve statistical literacy. The study compared the

effectiveness of combined online teaching methods to traditional methods of teaching a

topic in a business statistics course. The topic of the study was simple linear regression

in a 1-week segment of a required introductory college-level statistics course.

The exploratory pilot study was quasi-experimental in design (Isaac & Michael,

1997), comparing results of participants assigned non-randomly into a control group

receiving traditional lecture plus textbook format, against a quasi-experimental group

receiving combined online training materials. Participants were a sample of Master of Business Administration (MBA) students on the Indiana, Ohio, or Kentucky campuses of a private Midwestern university. The control and quasi-experimental groups each included 10 or more participants.

All participants took an identical pretest and posttest to determine achievement.

A within-group comparison consisted of test administration (pretest versus posttest) as the

independent variable, and a course assessment score as the dependent variable. A

between-group comparison consisted of type of instruction (online versus traditional)

defined as the independent variable, and a course assessment score as the dependent

variable.

Theoretical Framework

New learning pedagogies which have emerged since the 1990s include cognitive

flexibility theory (Rossner-Merrill, Parker, Mamchur, & Chu, 1998; Wiley, 2002),

constructivism (Bush, 2006; Connolly & Begg, 2006), and problem-based learning

(Abramovich & Cho, 2006). According to cognitive flexibility theory, people learn

through exploration and experience, choosing their own path to knowledge instead of a

prescribed lesson plan. Constructivism complements cognitive flexibility theory in that

according to constructivism, people bring their own experiences into solving new

problems, thus gaining knowledge in participatory activities. In problem-based learning,

people learn through solving real-world problems in an activity or case study format.

Cognitive flexibility theory, constructivism, and problem-based learning form the

theoretical framework for this research.

Cognitive flexibility theory emphasizes presenting information from multiple perspectives (Jacobson & Spiro, 1995). Cognitive flexibility is the ability to take

different conceptual and case perspectives in order to represent knowledge (Jang, 2000).

In this research, the topic of simple linear regression appeared in the WebQuest, the

learning object multimedia tutorial, and in multiple areas of the CFTH. Cognitive

flexibility shows that a person can bring the knowledge gained to bear on problem

solving.

Constructivism is a method in which the learner builds knowledge through

acquiring experiences (Bush, 2006). In the research study, a student attempts to acquire

basic skills through the WebQuest and learning object and then uses that set of skills to

solve real-world problems in the CFTH. By participating in applied activities, students

potentially gain a deep understanding through problem solving and creation of solutions.

Problem-based learning asks the student to apply what they have learned to solve

situations they face. The problem to work on can be a case study, a statistical problem to

solve, or figuring out which statistical technique to use on a set of data (Budé, Imbos, Wiel, Broers, & Berger, 2009). In this study, the CFTH used the case

study approach, allowing students to role-play as a consultant and apply what they have

learned in business situations.

Research Questions

To investigate the effects of combining online teaching methods with traditional

delivery of instruction on learning simple linear regression in an introductory college-

level statistics course, the following research questions were addressed.

Q1. To what extent will test scores differ for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and CFTH for

learning simple linear regression from the beginning to the end of a 1-week segment of a

college-level statistics course?

Q2. To what extent will test scores differ between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and CFTH for

learning simple linear regression and students receiving traditional methods for learning

simple linear regression in a 1-week segment of a college-level statistics course?

Hypotheses

Following are the null and alternative hypotheses used to test the research

questions.

H10. Test scores will not differ significantly for college students receiving a combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H1a. Test scores will differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H20. Test scores will not differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

H2a. Test scores will differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving


traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

Nature of the Study

The research was a quantitative study, quasi-experimental in design due to non-

random assignment of participants (Vogt, 2007), with a pretest and posttest given to

business school participants from a Midwestern university over a one-week period. Non-

random assignment makes the research design quasi-experimental (Black, 1999). Black

(1999) defined a quasi-experimental group as one "when samples are not completely

random and subject to practical considerations that possibly reduce the generalization of

the results" (p, 47). A control group and a quasi-experimental treatment group facilitated

a comparison between the two groups. The control group received traditional education

from a lecture and textbook, while the quasi-experimental group received combined

online training methods and materials.

Members of the quasi-experimental treatment group received an intervention

consisting of a graphical organizer, a WebQuest, a learning object, and CFTH. The

members of the quasi-experimental group took the pretest, training, and posttest prior to

the simple linear regression lesson in the traditional course, making it an extra workload

for participating students, not a replacement workload. The control group only had

the extra workload of the pretest and posttest.

Part of the results of the study was a within-group comparison, with the test

administration (pretest versus posttest) defined as the independent variable, and a course

assessment score as the dependent variable. A between-group comparison was also

conducted, with type of instruction (online versus traditional) defined as the independent

variable, and a course assessment score as the dependent variable. Pretest and posttest

assessments were in the form of an identical online survey instrument with 12 multiple-

choice questions.

Black (1999) stated that a test instrument "should be validated by experts to

evaluate its consistency with desired constructs" (p. 199). As encouraged by Black

(1999), at least one statistics expert and one teacher of the current MBA course have

validated the test instrument to have internal validity, measuring the understanding level

on the topic of simple linear regression. The validation with faculty experts required

several revisions and feedback steps.

By using exactly the same online test instrument, reliability of the test instrument

would only have been an issue if the Web site acted differently for each individual. By

using a professional Web site (www.proprofs.com), with thousands of people connected

to it, the reliability concern was minimal. If a large number of people did not complete

the test, then more investigation of speed or other Web technical issues would have been

in order.

Evaluating the learning outcomes required nonparametric statistics because of the

small number of participants in this pilot study (Black, 1999; Isaac & Michael, 1997).

Therefore, Mann-Whitney U tests for interval/ratio data and Chi-square tests for nominal

data (Black, 1999, Table 19.1) were appropriate for between-groups comparisons, and the

Wilcoxon signed-rank test was appropriate for within-group matched-pair comparisons.
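As an illustration of this analysis plan only (the score arrays below are hypothetical placeholders, not the study's data), the three tests could be run in Python with SciPy as follows:

# Hypothetical sketch of the analysis plan; the scores are placeholders, not study data.
from scipy.stats import mannwhitneyu, wilcoxon, chi2_contingency

control_post = [5, 6, 4, 7, 5, 6, 5, 4, 6, 5, 7]    # posttest scores, control group (0-12 scale)
treatment_pre = [5, 6, 4, 7, 5, 6, 4, 5, 6]         # pretest scores, quasi-experimental group
treatment_post = [8, 9, 7, 10, 8, 9, 7, 8, 9]       # posttest scores, quasi-experimental group

# Between-groups comparison of posttest scores (Research Question Q2)
u_stat, p_between = mannwhitneyu(control_post, treatment_post, alternative='two-sided')

# Within-group comparison of matched pretest/posttest pairs (Research Question Q1)
w_stat, p_within = wilcoxon(treatment_pre, treatment_post)

# Chi-square test of independence for a nominal demographic variable (hypothetical counts)
gender_by_group = [[6, 5],   # control group: male, female
                   [4, 5]]   # quasi-experimental group: male, female
chi2, p_demo, dof, expected = chi2_contingency(gender_by_group)

print(p_between, p_within, p_demo)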

One or more cohorts were required to achieve pilot sample sizes of 10 or more for

both the control and quasi-experimental groups because a typical cohort has 9 to 15

students each, and it was expected that some students would not participate.

Students did not have the choice of joining either the treatment group or the control

group, but had the choice of whether to participate. Participants within a cohort were all

part of the control group or all a part of the quasi-experimental group. This assignment

made it easier on the faculty member because they did not have to track individual

student's participation in one group or the other.

The researcher was not the faculty member teaching the course. A single faculty member teaching both the traditional material and the combined online training would have been optimal for this study, although the faculty member would not actively participate in the combined online training. Using a single faculty member was not practical in terms of timeframe and resources, and this was a limitation of the study.

The independent variable for Research Question Q1 was the test administration

(pretest versus posttest). Test administration was a nominal, dichotomous variable, with

a value of 0 for pretest and a value of 1 for posttest. The independent variable for

Research Question Q2 was the type of instruction (online versus traditional). The type of

instruction was a nominal, dichotomous variable, with a value of 0 for traditional

instruction and a value of 1 for combined online instruction. The dependent variable for

both research questions was the test score. The test score was a ratio variable because it

has a possible absolute zero, with possible values ranging from 0 to 12. Demographic

variables included gender, age, and level of Microsoft Excel™ experience (see Appendix

A). Although demographic data assisted in understanding whether the two groups were similar,

demographic variables were not covariates in this study.

In addition to the GAISE college report, another framework for guidance and

evaluation is the Revised Bloom's Taxonomy of Cognitive Learning (RBTCL). The

RBTCL is a classification of levels of intellectual behavior important in learning

(Krathwohl, 2002). Benjamin Bloom developed Bloom's Taxonomy by leading a group

of educational psychologists in 1956. Cognitive psychologists led by Lorin Anderson

created the revised version. To maintain quality of education, the RBTCL provides a

good source for comparison with the combined online teaching methods. The research

study materials and results were compared to the RBTCL.

Significance of the Study

Although the power of the study was low and the sample size was insufficient for

generalizing the findings, there may be future implications in the realms of both business

and education. This study offers opportunities for new lines of inquiry in business

statistics courses and other courses with conceptually complex material, such as

information technology courses (including database design and object-oriented design),

science, psychology, and organizational leadership. In addition, these findings may

encourage educators to research or create increasingly diverse hybrid educational models

(Alonso, Lopez, Manrique, & Vines, 2008; Richardson & Tim, 2007). A longitudinal

study on retention of the information is another avenue of potential research. Corporate

learning management systems could also explore the potential of combining these four

methods.

Not all of the study results require further research; low-cost training materials are already a practical matter in corporations. The use of low-cost multimedia training

materials is directly applicable to businesses of today. For instance, training on the use of

a new software feature could include a learning object and the content placed on a

corporate Web portal. PowerPoint ™ slides merged with audio are another option for

business applications.

All of the online teaching methods in this study are applicable for students or

employees with hearing loss. Even the learning object in the form of an online tutorial

has both an audio portion and captions at the bottom of the screen, similar to film

subtitles. Therefore, findings from this study may be beneficial for students or

employees with hearing impairments. Individuals with hearing loss have 51% less

chance of obtaining a good college degree compared to students with normal hearing

(Richardson, 2009).

This study has implications for further research on broader sets of participants.

Follow-up research could include a study involving a comparison of the teaching

methods for the duration of an entire course, rather than 1 week. The introduction of a

learning management system and facilitator interaction for the online environment would

be a realistic addition to the research. Further research would also be desirable in

comparing the value of the different forms of online software. Although comparisons of

software are common in online technical magazines, there is little peer-reviewed

quantitative research focusing on combining online methods.

Another extension of studying the combination of online materials would be to

introduce new ways of using online materials for supplemental resources in studies of

hybrid education (combining traditional and online instruction). A study of hybrid

education could involve the enhancement of a traditional course with online activities. In

this way, it would be possible to evaluate the effects of introducing technology to the

traditional classroom.

Definitions

The following key terms were pertinent to this study.

Cognitive flexibility. Cognitive flexibility is the ability to take different conceptual and case perspectives in order to represent knowledge (Jang, 2000). Cognitive flexibility shows that a person can bring the knowledge gained to bear on problem solving.

Cognitive Flexibility Hypertext (CFH). Cognitive Flexibility Hypertext, also

called CFTH, is a Web environment allowing multiple ways to maneuver through content

in a complex structure (Wiley, 2002). Using CFH, the learner may view different

example cases to explore concepts, using multiple points of navigation. An example of

CFH or CFTH is the Hypercase® at Rutgers University (Kendall & Kendall, 1999).

Constructivism. Constructivism is a method according to which the learner

builds knowledge through acquiring experiences (Bush, 2006). In the study,

constructivism is the approach in which a student acquires basic skills and then uses that

set of skills to solve real-world problems (Hakeem, 2001).

e-Coursemap (originally called an e-Map). An e-Coursemap™ is an electronic

mind map helpful for organizing material on a single page and for sequencing the content

of courses (Ruffini, 2004).

Learning objects. Learning objects are reusable digital resources to support

learning (Wiley, 2002). There is much controversy around the definition of learning

objects. The Learning Technology Standards Committee defines learning objects as

digital or non-digital objects used in technology-supported learning (Wiley, 2002).

Intelligent learning objects display responses to learner input to facilitate assessment or

further learning (Farha, 2007).

Mind map. A mind map is a graphical picture with branches like a tree. With a

mind map, a larger topic is broken into smaller topics (Ruffini, 2004). Small branches

then lead to other interactive material by using hyperlinks to Web sites or electronic files.

Repurposing. The act of copying and editing existing Web pages for another

purpose constitutes repurposing. When developing new Web pages, existing Web pages

are useful references. This minimizes effort and cost while providing a consistent

experience for the person viewing the Web pages.

Revised Bloom's Taxonomy of Cognitive Learning (RBTCL). The RBTCL is

a classification of levels of intellectual behavior important in learning. The RBTCL was

originally developed in 1956 by a group of educational psychologists led by Benjamin

Bloom and was later revised by cognitive psychologists led by Lorin Anderson. A

pyramid is the usual depiction of the RBTCL. The bottom level of the pyramid is

remembering, followed by understanding, then applying, analyzing, evaluating, and

finally creating (Krathwohl, 2002).

Statistical literacy. Statistical literacy involves reading and interpreting statistics

as evidence of arguments. Statistical literacy refers to "critical thinking about arguments

that use statistics as evidence" (Schield, 2004, p. 16). According to the UNDP (2010),

statistical literacy is "the ability to read and interpret statistics, and think critically about

arguments that use statistics as evidence".

WebQuests. WebQuests are structured inquiry-based learning activities using

computer technology (Lahaie, 2008). WebQuests have standard sections of introduction,

task, process, evaluation, conclusion, and credits (Dodge, 2005). A WebQuest is a

teaching method used with an individual student or groups.

Summary

Without improvements in the statistical literacy of employees, business workers and managers have the potential to make errors in judgment based on incorrect evaluations and false interpretations of data seen daily (Monahan, 2007). The GAISE report for

colleges suggested ways to improve statistical literacy for undergraduate students

(College Report, 2010). The research focus for this study was the evaluation of a 1-week lesson in an introductory statistics course. Participants in the study were MBA

students required to take a Business Statistics course at a private Midwestern university.

Participants were in either a control group or quasi-experimental group with non-random

assignment.

Members of the treatment group received an intervention combining online

teaching methods for learning simple linear regression. The intervention consisted of a

graphical organizer, a WebQuest, a learning object, and CFTH. Members of the control

group received traditional instruction on the same topic. Both within-group and between-

group comparisons were made on test scores at the end of 1 week. Within-group

comparisons employed Wilcoxon signed rank tests, and between-groups comparisons

used Mann-Whitney U tests.

Chapter 2: Literature Review

The purpose of the quantitative study was to further research into compliance with

the GAISE recommendations to improve statistical literacy. The study compared

combined online teaching methods to traditional methods of teaching a topic of simple

linear regression in a business statistics course over a 1-week period. The process of

literature research started with understanding statistical literacy, and then moved to online

teaching methods to address GAISE recommendations.

The literature sources cited here were the direct result of electronic database,

Web, and book searches. The searches fostered understanding statistical literacy, and

found online teaching methods to meet a gap documented in the GAISE college report

(College Report, 2010). Figure 1 depicts the gap (dotted line) between expectation and

delivery of education in terms of cost, pedagogy, and technology in coursework.

Figure 1. Conceptual Drawing of Expectation-Delivery Gap.

The existing research literature shown in the following sections is divided into

discussions on statistical literacy, factors that affect outcomes in statistics courses, current

teaching modalities, and the new online teaching methods. A discussion of GAISE

recommendations includes an analysis of the new online teaching methods' advantages and

disadvantages documented in the current research literature. To explain the research

study, the literature review contains a discussion of new training materials required for

the research, and implications for further research in hybrid and distance education.

The sources of research literature were predominantly peer-reviewed journal

articles on the four online teaching methods discussed, pluses and minuses of each, and

current pedagogy. Books on the subject of learning objects and CFTH (Wiley, 2002), an

article on graphical organizers (Ruffini, 2004), the Web site for WebQuests (Dodge,

2005), and graduate coursework using the CFTH concept (Kendall & Kendall, 1999)

were the seminal works and Web site that influenced the decision to study combining

online teaching methods.

Statistical Literacy

Schagen (2006) defined statistical literacy as "understanding basic concepts such

as uncertainty, sampling, bias, and representativeness and asking critical questions about

any statistics that are presented" (p. 21). Milo Schield (2004) defined statistical literacy

as "critical thinking about arguments that use statistics as evidence" (p. 16). More than

one definition of statistical literacy points to an ability to understand information from

collected data, correctly analyze information, and an ability to communicate results

(Cerrito, 1999; Coutis, 2007).

Another way to understand statistical literacy is from a quote in a psychology journal: "Statistical literacy is a necessary precondition for an educated citizenship in a

technological democracy" (Gigerenzer, Gaissmaier, Kurz-Milcke, Schwartz, &

Woloshin, 2007, p. 53). In that article, out of 450 adults in the United States studied,

only 30% could answer simple questions involving percentages, and one-third believed

mammograms detect problems with absolute certainty. Without statistical literacy, the

public is susceptible to what doctors and the media tell them, instead of finding out the

facts for themselves, which in the case of medical statistics and evidence based medicine

can be disastrous (Monahan, 2007).

Statistics education is pertinent to all major areas of study, all service and

manufacturing industries, and is pervasive in society as well. For the research study at

the private Midwestern university, Business Statistics (ADM515) is a part of the MBA

curriculum. Capshew (2005) cites statistics courses as required components of bachelor's and master's level curricula in the social sciences.

To increase statistical literacy, there are issues to overcome. Statistical literacy

issues begin before high school (Carmichael, Callingham, Watson, & Hay, 2009). Coutis (2007) determined from a limited study that non-English-speaking students had practical difficulties with presentations and tests. Kasonga and Corbett (2008) identified

surface learners as only acquiring enough material to pass tests compared to deep learners

who dig deeper for more meaning. Because statistical literacy involves deep

understanding of complex concepts, test-taking issues and surface learning are issues to

overcome.

Statistics anxiety and motivation are key factors of success in a statistics course

(Capshew, 2005). The ARCS model (Attention, Relevance, Confidence, and

Satisfaction), according to Capshew (2005), provides ways of increasing motivation and

decreasing anxiety. One solid tip from the article is to redefine success in mathematics in terms of effort rather than aptitude, because the learner can control effort but not aptitude.

Ideas for improvement in statistical literacy abound, but lack quantitative

information to back them up. In their extensive search of research articles on student

interest in mathematics and statistics, Carmichael, Callingham, Watson, and Hay (2009)

found that puzzles, computers and group work activities catch the interest of students.

Scores improved when the curriculum changed to a more student-centered process.

Kasonga and Corbett (2008) promote assessment changes in South Africa that focus on

literacy, thinking, and reasoning as a way to increase statistical literacy. Studying the

combined online methods supports adjusting pedagogy to increase statistical literacy by

quantitatively evaluating new teaching methods.

Outcome Factors in Statistics Courses

Many factors can affect performance outcomes in a statistics course. The factors

could include, but are not limited to, incorporation of technology, the anxiety level and

prior knowledge of the student, whether the content and format engages the student, and

the need for better assessments. Most of the existing research is limited in sample size, and the majority is qualitative rather than quantitative in nature.

It was useful to research what instructors wished for in changes to a statistics

course. In a questionnaire study of 139 American Statistical Association (ASA) and Decision Sciences Institute members, the three changes statistics instructors wished for most were greater use of computers, use of real data, and smaller classes (Strasser & Ozgur, 1995). The same wishes appeared in the GAISE college report, endorsed by

the ASA in 2005. The GAISE college report recommended increased use of technology

and real data sets for instruction. The GAISE college report did not mention class sizes.

Incorporating Technology in a Course.

One study focused on teaching bivariate data relationships (Forster, 2007).

Content for the lesson included diamond pricing compared to carat size, which succeeded

in engaging the students. A result from the study was that the type of technology could be a factor in achievement. In a study with 23 participants from an all-girl school, the use

of Microsoft Excel ™ increased support for discussion on bivariate data because the

visibility of axis labels and resolution increased when compared to graphics calculators

(Forster, 2007). Forster (2007) determined the spreadsheet tool also provided a means

for easily importing data, production of a scatter plot, calculation of a regression line, and

display of the regression line. For in-class discussions, a projector facilitated viewing

graphs.
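As a rough sketch of the spreadsheet workflow Forster (2007) described (import data, plot the points, fit and display a regression line), the same steps can be reproduced in Python; the carat and price values below are made-up placeholders, not data from that study:

# Illustrative only; hypothetical data standing in for the diamond price example.
import numpy as np
import matplotlib.pyplot as plt

carats = np.array([0.3, 0.4, 0.5, 0.7, 0.9, 1.0, 1.2])      # placeholder carat sizes
prices = np.array([400, 610, 850, 1400, 2100, 2600, 3400])  # placeholder prices

# Fit the regression line (slope and intercept) by least squares
slope, intercept = np.polyfit(carats, prices, deg=1)

# Scatter plot with the fitted regression line, analogous to an Excel chart with a trendline
plt.scatter(carats, prices, label='observed data')
plt.plot(carats, intercept + slope * carats,
         label='fit: price = %.0f + %.0f * carat' % (intercept, slope))
plt.xlabel('Carat size')
plt.ylabel('Price')
plt.legend()
plt.show()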

In a Finnish study with 53 polytechnic students, Web technology was not favored

for those with a predisposition for the subject matter of mathematics and higher

information and communication technology (ICT) orientation (Alajaaski, 2006). Those

who had a low ICT orientation or a lower predisposition for mathematics had an attitude

change for the better between a pretest and posttest.

With positive results on scores, 38 Malaysian students benefited from the

introduction of technology in the form of Microsoft PowerPoint ™ slides, Microsoft

Excel ™ spreadsheets, and concept mapping (Alias, 2009). Average scores in descriptive

statistics increased by 15% and overall performance increased as much as 21%. Content

made a difference in the Malaysian study, with presentation slides geared towards

challenging statistical misconceptions through dialog with the students. Students were

shown a presentation slide, asked to comment on it, and then when shown the answer,

their misconceptions proved a catalyst for discussion. Students also had a difficult time

with concept mapping because more than one answer could be correct.

In a quantitative study with 22 participants, student motivation increased when using technology, but conceptual understanding remained low three months after the introductory statistics course. Conceptual understanding was also low for the comparison group that did not use technology (Meletiou-Mavrotheris, Lee, & Fouladi, 2007). The

study found that the technology has to match a pedagogical purpose. For instance, a

professional statistical software package may not be useful for explaining sampling

distributions.

Nature of the Student.

Pan and Tang (2005) found four factors contributing to anxiety of students in

statistics courses. The first factor from the seven participants was fear of mathematics.

The second was lack of application to daily life. The third factor was pace of instruction

and the fourth factor was instructor attitude. Strategies to battle these factors included

flexibility in assisting students, practical real-world applications, multiple evaluation

criteria for a grade instead of a few exams, and providing a supportive environment. A

limiting factor in the study was the small number of students.

Online education may not be better for all students. In a study of 94 students, 33 of whom used online education for a statistics course, students with lower GPAs performed better with traditional education and worse with online education.

Students with higher GPAs performed equally well with either type of education

(Harrington, 1999).

In a study at Washington State University, 75% of the 267 students surveyed

preferred lab and lecture format to an Internet based approach (Johnson, Dasgupta,

Zhang, & Evans, 2009). Subject material and gender were statistically significant variables: male students preferred the use of computers, and for complex subjects or subjects requiring labs (hands-on activities), lectures with face-to-face interaction were preferred. Summers, Waigandt, and Whittaker (2005) ran a comparison of 38 students in

a nursing program at the University of Missouri. Out of the 38 students, 17 decided to

take the online version of the statistics course. There were no significant differences

found in statistical knowledge from the two groups, but satisfaction was higher in the

traditional education group because of the instructor explanations, enthusiasm, concern

for the students, and openness.

Ward (2004) stated several common sense reasons why hybrid courses are better

than traditional or online to match student needs. The reasons were that students had

face-to-face interaction while also accessing materials online, multiple types of communication were supported, the classroom was a spontaneous environment, and students had the opportunity to ask questions (Ward, 2004). Ward (2004) did not have the features of a current learning management system for turning in assignments,

chatting, discussions, pictures, videos, and conference calling.

Chiesi and Primi (2010) studied 487 University of Florence participants in a

statistics course for psychology majors. Anxiety level and prior high school background

in mathematics were contributing factors to achievement in a statistics course. A factor

analysis in the study confirmed the Statistical Anxiety Rating Scale (STARS) results.

Another study of 99 students that introduced technology as an assessment vehicle resulted in

more anxiety for students with a lack of background in the subject of statistics (Cybinski

& Selvanathan, 2005).

Engaging Format and Content.

To engage students, researchers found that developing raw data from the students had positive results (Kottemann & Salimian, 2008). Leech (2008) used a card game to reinforce statistical concepts for several semesters and found that it created a cooperative and fun environment. Wells (2006) used the hands-on teaching format of

service-based instruction, where students collaborated with external community agencies

to provide statistical skills on applied project work.

From a practical perspective, Bloomsburg University in Pennsylvania created an

MBA statistics course offered as a distance program to solve problems in professor

availability, students being part-time, and the ability to offer courses all year (Grandzol,

2004). Grandzol (2004) documented interactions between faculty member and students

changing from lectures, assignments, question and answer sessions, and office visits to

learning units, file exchanges, chat rooms, and discussion forums. Cost and convenience

were driving factors to create the online education, with no documented compromise in

pedagogy.

Hurlburt (2001) investigated whether lectlets were an effective tool in distance

education or as supplemental materials for classrooms. Lectlets are "Web streamed audio

in conjunction with interactive text or graphical displays" (p. 15). The breakdown of

students in the study who completed the courses included 116 traditional education and

36 for distance education. Hurlburt (2001) did not find lectlets to be a replacement for on-

ground education.

Lectlets had several advantages compared to traditional lectures. The advantages

included an ability for all students to answer review questions instead of one student,

more control of the material than in the classroom, ease of finding content to replay, an

ability to replay the lectlet any time that is convenient instead of specific posted times,

availability of transcripts, and fostering independent learning (Hurlburt, 2001). Even

though the lectlets were superior in many ways, better educational scores came from

traditional education, so Hurlburt (2001) called the use of lectlets more convenience

education and the best use of lectlets for students was as an adjunct to the classroom

because of an ability to make up for absences. The issue to be concerned about,

according to Hurlburt (2001) was the cost of producing lectlets. Labor hours for one

course could reach 2,000 and are a key factor for online projects in distance education or

hybrid education environments.

Budé, Imbos, Wiel, Broers, and Berger (2009) conducted a study of 206 students to research what effect tutoring had in a statistics course. A

difference of 7% (60% to 67%) resulted from examinations at the end of the statistics

course. Those students with tutoring had higher scores than students without tutoring.

The major difference in achievement was engaging students in discussions, according to

the researchers.

Sze (2004) used a video lecture format to address online students' concern about missing the lectures of traditional education. Video lectures are useful for showing software techniques and explaining complex concepts; they should concentrate on a single topic, be short enough to digest easily, and have VCR-style pause and playback capabilities (Sze,

2004; Whatley & Ahmad, 2007). Camtasia Studio was the tool used to record video

lectures (Sze, 2004). Adobe Captivate ™ is another software tool used to produce video

and audio captures simultaneously. Playing the videos requires an Internet connection or

an ability to play recorded CDs on a computer.

Gordon, Petocz, and Reid (2009) surveyed and interviewed 37 international

educators and found that when educators focus on conceptual understanding instead of

mathematical formulas it is possible to have success with a wide variety of students from

various disciplines (psychology, sociology, business) in the same classroom. The Gordon,

Petocz, and Reid (2009) study was important in that business statistics has a need for

developing conceptual understanding with a broad audience in multiple disciplines.

Tapping into basic skills of using formulas is still necessary after determining what

statistical techniques to employ for a given situation.

Assessments.

Assessments for statistics courses are important for measuring student success.

The National Science Foundation sponsored an activity to create assessment questions to measure statistical literacy from a statistics course. Three years of research with 1,470 students across the United States resulted in the Comprehensive Assessment of Outcomes in Statistics (CAOS) series of tests (delMas, Garfield, Ooms, & Chance, 2007).

The available CAOS questions for a single topic did not cover all of the conceptual material in this study, so the instrument was not useful for this particular study.

Fairfield-Sonn, Kolluri, Rogers, and Singamsetti (2009) promoted two levels of

assessments for statistics education. The first level measures a point in time of

knowledge. The second level assessment measures retained knowledge. Tutoring was

the recommended action if students performed poorly on second level assessments.

Kasonga and Corbett (2008) promoted assessment that used multiple choice

questions and open-ended essay format to test knowledge of ideas, connections and

extensions. Lawrence and Singhania (2004) compared a problem-solving format with multiple-choice questions in a statistics course, resulting in no significant difference in test scores but a substantial difference in student preference. Students preferred

multiple-choice questions, complaining that problem-solving questions were too

complex.

Teaching Modalities in Business Statistics Courses

Teaching methods and modalities for a college-level statistics course cover a wide

territory between traditional lecture-textbook to distance education. Hybrid education

bridges and merges the traditional and online formats together. There is no clear winner

in this discussion based on prior research literature. Larson and Sung (2009) took 168

students in an MIS course and compared online, face-to-face, and blended formats using

an analysis of variance technique. The finding was no significant difference between the

three formats on student performance, but satisfaction and effectiveness scores for the blended and online formats were high.

Johnson, Dasgupta, Zhang & Evans (2009) found that students preferred

traditional methods, mostly due to the possibility for face-to-face interactions. After six

years of offering online courses and material, students at the University of Wollongong in

Australia preferred hybrid formats 40% of the time, higher than either traditional or

online formats by themselves (33% and 25% respectively). Meletiou-Mavrotheris, Lee,

and Fouladi (2007) found that technology introduction in the classroom increased the

motivation of the students but not their academic achievement.

Format seems to take less of a role in some research than the interactions people

prefer or how engaged or motivated students are. Kottemann and Salimian (2008) found

that engaging students was more important than teaching format. Grandzol (2004)

explains that interactions for traditional and online formats for teaching statistics are

different. Traditional instruction includes lectures, assignments, question and answer

sessions, formal discussions, informal discussions, office visits, phone calls, and email

messages. Online interactions include learning units, file exchanges, discussion forums,

chat rooms, group pages, phone calls, and email messages. Grandzol (2004) needed to

create an online version of an MBA statistics course due to demand for it. The goal of

the Grandzol (2004) work was to create a course of equal quality to the traditional format.

Based on the prior research, any teaching method can work, and it is apparent that

student and teacher perceptions vary on the issue. To make progress in advancing

statistical literacy, a guideline is required. That guideline for improvement is the GAISE

college report, with its six recommendations. Another comparison is to the Revised

Bloom's Taxonomy.

GAISE College Report Recommendations

The American Statistical Association produced a report in 2005 recommending

changes in statistics education (College Report, 2010). Both the college report and K-12

report have recommendations for statistics education. Everson and Garfield (2008, p. 3)

documented the six recommendations from the Guidelines for Assessment and

Instruction in Statistics Education (GAISE) college report as (1) Emphasize statistical

literacy and develop statistical thinking, (2) Use real data, (3) Stress conceptual

understanding rather than knowledge of procedures, (4) Foster active learning in the

classroom, (5) Use technology for developing conceptual understanding and analyzing

data, and (6) Use assessments to improve and evaluate student learning.

While providing a few instructional ideas, the GAISE college report did not attempt to

create new teaching methods or tie them to pedagogy in any significant way. New

learning pedagogies have emerged since the 1990s, and after an extensive search, three

educational pedagogies were identified that address the GAISE college report. The three pedagogies are

cognitive flexibility theory (Rossner-Merrill, Parker, Mamchur, & Chu, 1998; Wiley,

2002), constructivism (Bush, 2006; Connolly & Begg, 2006), and problem-based learning

(Abramovich & Cho, 2006).

Cognitive flexibility theory proposes that people learn through exploration and

experience, choosing their own path to knowledge instead of a prescribed lesson plan.

Constructivism complements cognitive flexibility theory in that people bring their own

experiences into solving new problems, thus gaining knowledge in participatory

activities. Problem-based learning is self-defining in that people learn through solving

real-world problems in an activity or case study format.

To address these new pedagogies, this study evaluated combining four new

teaching methods introduced since 1990. The new teaching methods included

WebQuests (Halat, 2008; Lahaie, 2008; Zheng, Stucky, McAlack, Menchana, &

Stoddart, 2005), learning objects (Farha, 2007; Stamey, 2006), electronic mind maps as

graphical organizers (Ruffini, 2004; Schau & Mattern, 1997), and cognitive flexibility

theory hypertext (CFTH; Jang, 2000; Papastergiou, 2008).

It was important to tie the four online teaching methods to the six GAISE college

report recommendations. In doing so, pedagogy and teaching methods match a specific

outcome to achieve. The following sections elaborate on each recommendation and

discuss how the four online teaching methods attempted to meet the goals of the

recommendations.

Recommendation 1: Emphasize statistical literacy and develop statistical

thinking

The graphical organizer for the entire course in Business Statistics (Appendix A)

shows a whole course in a conceptual framework, with navigable links to content. One

type of content is a WebQuest (Appendix B), which explains the topic under discussion

and how to analyze data. A learning object (Appendix C) shows how to perform the skill

of calculating regression equation variables using Microsoft Excel ™, allowing the

student to increase statistical literacy abilities to analyze data and predict new dependent

values from independent values of their choice using the regression equation.

Finally, the cognitive flexibility theory hypertext (Appendix D), or CFTH, is an

immersive role-playing environment on the Web, allowing for exploration using a virtual

setting, employing photographs and hot spots on the Web pages, interactive canned

dialogs, and flexible movement in the scenarios. By role-playing as a consultant, the

student gains experience in a real-world scenario without the real-world expense of being

hired, trained, and sent out on assignments. Based on completion of syllabus

assignments, the student uses newly formed skills and applies them to solve complex

problems via multiple case studies.

Recommendation 2: Use real data

All CFTH assignments used real data to analyze for role-playing as a consultant.

CFTH raw data came from a myriad of sources, including public information provided

free on Web sites like the Census Bureau or NASA. For instance, CFTH assignments

used current medical and government statistics from government Web sites.

The WebQuest and learning object used real examples of data, with proper

sourcing of materials cited. To study gaining skills quickly, a short-term WebQuest

attempted to facilitate skills building. In long-term WebQuests, students would gather

real data and cite their sources.

Recommendation 3: Stress conceptual understanding

The graphical organizer provides a framework of the big picture of the whole

course in a single Web page. The graphical organizer was an electronic version of a

concept map for a Business Statistics course. The WebQuest presents conceptual

material similar to a textbook, and then allows the launching of the tutorial material in the

form of the learning object. The learning object showed the concept of first entering

data, then using Microsoft Excel ™ to calculate regression formula variables, and then

calculating new dependent variable values based on independent variable values provided

by the syllabus or, if the student chooses, the student's own values. For a deeper

understanding, the student performed the role of a consultant in two project assignments that

exercised conceptual material through role-playing and immersion. Business concepts

included working with people, finding one's way through the facility, accessing the

experts when necessary, meeting new people, and how to present findings.

Recommendation 4: Foster active learning in the classroom

Both the learning object and CFTH are examples of active learning. The learning

object, WebQuest, and CFTH use Web technologies. The learning object promoted

active skills development by showing the Microsoft Excel ™ steps involved (with

narration and caption lines) on the topic of simple linear regression and then, in

conjunction with the WebQuest, allowed the student to practice those new skills. The

CFTH promoted experiential learning and problem-solving techniques while

simultaneously providing role-playing as a consultant analyzing actual data. The CFTH

was a role-playing set of case studies. The WebQuest contained five practice examples

with correct answers and a rubric to determine self-proficiency before moving further in

the assignments.

Recommendation 5: Use technology for developing conceptual understanding

and analyzing data

All of the online teaching methods used technology. The four online teaching

methods used Web pages and small amounts of programming using Hypertext Markup

Language (HTML). NOTEPAD™, a free text editing utility included with Windows, was used

to edit the Web pages. A free image map editor created hot spots on the Web pages.

Camtasia Studio version 6 created the learning object output by merging voice captures

with screen captures and producing FLASH or other video formats. A laptop, a USB

connected microphone headset, a digital camera, and personal computer speakers were

the only equipment needed to produce the learning object. A browser consumed the

output from the Internet or from a data CD, data DVD, portable storage device like a

thumb drive, or shared disk access on a network.

Recommendation 6: Use assessments to improve and evaluate student

learning

The WebQuest had practice problems and a rubric to test knowledge of how to

calculate the regression equation from raw data. The identical pretest and posttest had

twelve total questions to prove mastery of the concepts and skills required. In an actual

course, grades with comments returned to the students on syllabus assignments using

CFTH data sets would provide another measure for improvement and learning.

Students have increased Internet access and increased exposure to digital media.

Prensky (2001) referred to students who grew up after 1980 as digital natives because of

using ICT all their lives. However, Prensky's theories are controversial. Prensky

asserted that digital natives learn differently than digital immigrants, that there is a need

for new methods of teaching and new content, and that the brains of digital natives have

changed due to the nature of processing information. A study of 2,000 pre-service

teachers (Guo, Dobson, & Petrina, 2008) showed that age was not a factor in scores of

ICT proficiency. Bennett, Maton, and Kervin (2008) concluded that Prensky's assertions

were unsubstantiated by clear evidence and quantitative research.

Several new online educational methods have originated since the 1990s. The

new methods included the development and use of learning objects (Farha, 2007; Wiley,

2002), WebQuests (Halat, 2008), CFH (Jang, 2000; Papastergiou, 2008), and electronic

mind maps (Ruffini, 2004; Schau & Mattern, 1997). Each method has advantages and

disadvantages. This literature review covers advantages and disadvantages of each new

teaching method.

Current thinking suggests that using electronic media engages students and

encourages them to seek, analyze, and synthesize data. Bernie Dodge and Tom March

developed WebQuests in 1995 at San Diego State University (Zheng, Stucky, McAlack,

Menchana, & Stoddart, 2005). WebQuests help construct new knowledge or meaning

(Lahaie, 2008) by student use of electronic media. Lahaie (2008) stated that WebQuests

fit into nursing programs, helping to develop critical thinking and problem-solving skills.

Zheng, et al. (2005) studied 207 subjects to research if gender played a part in

learning with WebQuests. Zheng, et al. (2005) concluded that gender is not a significant

factor in learning with WebQuests and that teachers need to develop environments for

learning instead of prescriptive lessons. In other words, let the students learn in their own

way.

WebQuest construction follows constructivist pedagogy (Halat, 2008). Halat

(2008) cited problems with WebQuests, referring to the lack of access to the Internet,

lack of time to prepare WebQuests, and lack of reliable links as challenges for teachers.

WebQuests use learners' time well, by focusing on using information rather than looking

for it, and by supporting thinking at the levels of analysis, synthesis, and evaluation

(Dodge, 2005).

Revised Bloom's Taxonomy

The Revised Bloom's Taxonomy of cognitive learning is a classification of levels

of intellectual behavior important in learning (Krathwohl, 2002). A pyramid is the

typical depiction for the Revised Bloom's Taxonomy (Coffey, n.d.). The bottom level is

remembering, moving to understanding, then applying, analyzing, evaluating and at the

top is creating (Krathwohl, 2002). Each word in the pyramid can facilitate lesson

planning for education at all levels (Figure 2). It is important to tie the new teaching

methods to the Revised Bloom's taxonomy to ensure quality and comprehensiveness in

developing cognitive processes.

Figure 2. Revised Bloom's Taxonomy.

Remembering requires recall of information. To be able to perform the

WebQuest, students need to recall the skills development tutorial. Remembering where

the assignments show up in the CFTH is a requirement to be successful in the lesson.

Understanding involves interpreting, summarizing or explaining. Interpreting the

results of a scatter plot and summarizing raw data in graphical form shows understanding.

Explaining takes place when writing up results from data analyses of raw data found in

the CFTH environment.

Applying means that students can use the information or skills in a new situation.

In the case of CFTH, a student would apply new skills in a case study situation online in

an immersive environment. Another example of applying new skills is the practice work

in the WebQuest to show mastery of the new skill; along with the rubric, this gives the

student instant feedback on whether to proceed or go back to the tutorial.

Analyzing involves decomposition, comparing, or exploring relationships. The

lesson of simple linear regression accomplishes exploration of bivariate data relationships

as a primary technique to analyze pairs of raw data. Comparing the new scatter plot to a

standard of sloping downward, being flat or sloping upward is another activity during

skills development.

Evaluating allows students to make judgments, hypothesize, or experiment. By

creating a scatter plot, then judging whether the shape appears linear, and then making a

further judgment about the strength of that linearity, the student evaluates the bivariate data. In the

CFTH, the student investigates opportunities presented in the online case studies and then

makes judgments on how to proceed and analyze the new data acquired there.
Creating involves the generation of something new. In the CFTH role-playing

environment, the student was encouraged to manipulate existing raw data and must

generate new analyses from that data, along with a presentation of findings. In the

WebQuest, the student was encouraged to create new scatter plots for the practice

examples to measure their skill level. In a full course, a student turns in assignments, as

compared with this research, which covered a single topic with ungraded assignments. The research did

not enforce creation of new data.

Advantages and Disadvantages of New Teaching Methods

In determining the subject matter to study, both pedagogy and student needs were

matched up with new online teaching methods. Much of the prior research yielded

advantages or disadvantages of specific new teaching methods. No one method is a

panacea. The current traditional lecture-plus-textbook format and each new teaching method

were researched and found wanting.

For example, expenses were high for students and institutions for traditional

methods in terms of tuition, books, parking, time, and professor salaries (Ward, 2004). New

methods required expensive production of content (Jang, 2000). New methods did not

reach the goal of satisfying all of the needs of a whole course by themselves, or all of the

levels of the Revised Bloom's Taxonomy.

Table 1 is a summary of literature findings, listing advantages and disadvantages

of each teaching method. For each online method, there are negative aspects. The

impact of combining online methods together is similar to the software implementation

concept of "best of breed" integration (Engle, 2008), with the hope to minimize

disadvantages as a whole.

Table 1

Teaching Methods - Advantages and Disadvantages

Traditional textbook and lecture (Johnson, Dasgupta, Zhang, & Evans, 2009; Summers, Waigandt, & Whittaker, 2005; Ward, 2004)
Advantages: Very structured; learners have access to teacher and syllabus; for all levels of Bloom's Taxonomy
Disadvantages: Faculty focused method versus learner focused method; expensive for the learner; expensive for the institution; in large institutions, faculty is not always available for questions and technical assistants do the teaching; not flexible for scheduling; proceeds at one pace for all learners

Electronic Mind Maps or Graphical Organizers (Ruffini, 2004)
Advantages: Simple to construct; provide intuitive navigation; flexible; available anytime, anywhere
Disadvantages: Require other electronic media for building a whole course; require a computer and mouse

Learning Objects (Farha, 2007; McKnight, 2006; Meister-Emerich, 2008; Stamey, 2006; Wiley, 2002)
Advantages: Just as good as or more effective than traditional methods; can be reused; for some levels of Bloom's Taxonomy; learner focused; available anytime, anywhere; more engaging than a textbook
Disadvantages: Must be combined with another method to provide a total course; require software and time; video recorders add expense; no human interaction; require a computer and mouse

WebQuests (Halat, 2008; Lahaie, 2008; Zheng, et al., 2005)
Advantages: Provide context and structure for learning; available anytime, anywhere; learner focused; more engaging than a textbook; flexible and easy to construct; reuse available content; can be used by an individual or a group; for some levels of Bloom's Taxonomy
Disadvantages: Require content from the Web that may not always exist; require a browser; require a computer and mouse; students may surf unintended pages

Cognitive Flexibility Theory Hypertext (Kendall, 1999; Jang, 2000; Wiley, 2002)
Advantages: Uses an immersive environment for role-playing and higher cognitive learning; flexible; learner focused; available anytime, anywhere; for higher levels of Bloom's Taxonomy
Disadvantages: Has a skills development prerequisite; can be expensive to develop; requires a computer and mouse; requires an independent learner; learner time commitment can be heavy

Each entry in Table 1 names a teaching method, followed by one or more related

research articles cited following APA guidelines, and then lists the advantages and

disadvantages for that teaching method.

There are advantages and disadvantages with each method. A step forward would

be to retain most if not all of the advantages, while minimizing or eliminating the

disadvantages. The text following the table explains those advantages and disadvantages

in more detail. By combining the four online teaching methods, the idea is to minimize

disadvantages while preserving the advantages. In information technology, a best-of-breed

integration strategy takes singular products known to provide high value and then

integrates them to deliver an entire solution. Best-of-breed integration strategies

are useful when the required functionality exists in silos (Engle, 2008). By using this as an

educational model for complex material, for which any one given method would be lacking

or deficient, the idea is to produce a final product that is comprehensive and of high quality.

Traditional Textbook and Lecture

The baseline for comparison to combined online methods was the traditional

textbook and lecture format of education. The traditional method is prevalent and has

existed for decades at the college level. For complex topics, there is some concern

whether online methods can compete with the traditional face-to-face instruction

(Johnson, Dasgupta, Zhang, & Evans, 2009; Summers, Waigandt, & Whittaker, 2005;

Ward, 2004). For example, mathematics requires practice, skills training, and problem

solving. Using an objectivist step-by-step approach in the classroom is a familiar

paradigm to teach mathematics. The subtle nuances of human interaction might not

replicate easily with online communication. Face-to-face interaction is a key

advantage for traditional education (Ward, 2004), but this assumes that professors

perform teaching duties instead of research. One-on-one interaction is not possible in

large classrooms. Traditional instruction follows the objectivist model in which the

learner reflects reality from the lecturer and textbook (Stoilescu, 2008). The objectivist

model ignores the background of the learner. The objectivist model equates to faculty-

centered learning. The constructivist model is learner-centered.

Other disadvantages center on flexibility, scheduling, and cost of traditional

education. The pace of a traditional class is not individual because all participants flow at

the same rate. The traditional method lacks flexibility in terms of scheduling. The

schedule is based on professor or technical assistant hours instead of the

convenience of the learner. The labor of classroom-based instruction is a major contributor

to cost in education. Cost is higher with full-time employees incurring benefits than with

adjuncts and part-time employees. Textbooks are also expensive, reaching $898 for the

academic year 2003-2004 and averaging 6% increases every year (Government

Accountability Office, 2005).

Graphical Organizers

A technique to provide a roadmap and context for online learning is an interactive

electronic mind map, also known as an e-Coursemap (Ruffini, 2004). A mind map is a

graphical picture with branches like a tree to break a larger topic into smaller topics, and

eventually small branches that lead to the substantive course material. Mind maps are a

graphical concept map, organizing the material in a simple visual breakdown. According

to Sirias (2002), graphical organizers break down complex material, allowing students to

use spatial and textual skills simultaneously. Within this study, the use of an electronic

graphical organizer provided navigation and an entry point for the study materials.

When a mind map is electronic, the learner accesses the course material using a

Web page, which includes hyperlinks to Web sites or electronic files from the graphical

picture. The hyperlinks could present a syllabus for the course, could take the learner to

skills development material, present chapters of material, assignments, or an immersive

role-playing environment. The flexibility of Web pages permits the inclusion of

documents and multi-media video and audio files, making an electronic mind map

inherently flexible.

One identified purpose of a mind map is that of a graphical organizer (Schau

& Mattern, 1997). Schau and Mattern (1997) used graphical organizers and concept

maps for planning, learning, and assessment in their graduate level courses. For this

study, an e-Coursemap provided an organized and graphical navigation to materials. The

learner sees the big picture for the course every time they enter the course, which is a

simple reminder of how the individual lesson fits into the whole course.

The main deficiency of an electronic mind map is that it does not work by itself in

a course. The electronic mind map is not meant to be the course material, so it must be

used in concert with the course material to provide a whole course. An electronic mind

map is a single graphical image with links to the content, not the content itself. It was

striking from the literature reviews that each teaching method did not try to combine with

other methods to create a complete course, but instead developed in silos. Combining an

e-Coursemap with other materials in this study makes this disadvantage moot.

An electronic mind map begins with a picture. The picture can be hand-drawn

and then scanned into a graphical image on the computer, or software can enable the

creation of the drawing. Once a drawing exists to explain the overall concept, a Web

page refers to the image and then hot spots or linkable areas on the image launch more

Web pages. An electronic mind map does not necessitate learning Hypertext Markup

Language (HTML), the building block of Web pages, but it is easier if the author of the

electronic mind map is familiar with HTML. The last aspect of using the electronic mind

map is access to the initial Web page, which can be through a Web site, CD-ROM, or

files copied to a computer, accessed by a browser like Internet Explorer ™ or Firefox ™.

A browser is a free software program, which renders and displays the Web page on the

computer. Common browsers tested in this study were Chrome ™, Internet Explorer ™,

and Firefox ™.

Software exists to help with the creation of electronic mind maps. A product

called Mind Manager ™ is an example. The simplest form of a mind map uses HTML

and an image map. The image map refers to sections, often referred to as hot spots, of a

graphical image and provides a clickable link to another document when the mouse

moves across those portions of the image. The Meracl Image map editor is free from its

originator's Web site (Meracl, 2010) and allows mapping of rectangles, circles and

polygons. After mapping the images, HTML code is produced that can be inserted into

an HTML document. If Windows is the operating system, the free NOTEPAD™ utility

included with the operating system can aid in creating HTML and image maps.
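
To make the mechanics concrete, the following sketch uses Python's standard library to write a minimal Web page containing an image map with one rectangular hot spot, similar in spirit to the output an image map editor produces. The file names, coordinates, and link target are hypothetical examples, not the study's actual materials.

    # A minimal sketch, assuming hypothetical file names and coordinates, of the
    # HTML image map described above. Opening the generated file in a browser
    # shows the course map image with one clickable hot spot.
    from pathlib import Path

    html = """<html>
      <body>
        <!-- The course map image; the usemap attribute ties it to the map below. -->
        <img src="coursemap.png" usemap="#coursemap" alt="Business Statistics course map">

        <!-- One rectangular hot spot: clicking this region opens the lesson page. -->
        <map name="coursemap">
          <area shape="rect" coords="20,40,180,90"
                href="simple_linear_regression.html" alt="Simple Linear Regression lesson">
        </map>
      </body>
    </html>
    """

    Path("coursemap_index.html").write_text(html)  # open this file in any browser
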

Learning Objects

Advantages and disadvantages of learning objects stem from controversies

regarding the many definitions, efficacy, metadata, granularity, and lack of quantitative

research. Exploration of prior research showed high interest in research about learning

objects. More research articles about learning objects existed than any of the other new

teaching methods combined. This interest level requires more in-depth explanations of

the prior research than the other methods.

David A. Wiley edited a series of essays on learning objects in 2002, forming the

seminal work on the subject called The Instructional Use of Learning Objects. Many of

the concepts regarding learning objects, constructivist teaching, and cognitive flexibility

theory stem from his work. In reading the book, the thought of how learning objects

could transform education in small chunks was akin to the way Extensible Markup

Language (XML) broke onto the scene in the 1990s for information technology as a

ubiquitous data-messaging platform. At first, it was hard to pin down XML because it

was so open to interpretation and use, much like learning objects. Indeed, learning

objects and information technology have common conceptual roots of object-oriented

designs.

Thousands of learning objects are currently available at free sites like Multimedia

Educational Resource for Online Teaching (MERLOT) on a wide variety of subject

matter (McKnight, 2006; Richards, 2003). McKnight (2006) documented that more than

2,000 learning objects exist in the MERLOT repository for business topics and more than

200 exist in accounting alone. In addition to MERLOT (Poupa & Forte, 2003), examples

of other repositories are the Alliance of Remote Instructional Authoring and Distribution

Networks for Europe (ARIADNE), AAMC MedEdPortal, The Campus Alberta

Repository of Educational Objects (CAREO), The Cooperative Learning Object

Exchange (CLOE), and the Education Network Australia (EdNA Online) (Ruiz, Mintzer,

& Issenberg, 2006).

Learning Object Definition

Two issues with learning objects are the lack of a clear definition and lack of high

quality metadata to search for them (Parrish, 2004). David A. Wiley researched learning

objects and his book in 2002 called The Instructional Use of Learning Objects was the

defining work on the subject. Wiley defined learning objects as "any digital resource that

can be reused to support learning" (Wiley, 2002, p. 6). In the same book from Wiley, the

Learning Standards Technology Committee (a part of IEEE formed in 1996) defined

learning objects as "any entity digital or non-digital, which can be used, re-used or

referenced during technology-supported learning" (Wiley, 2002, p. 5). Stamey (2006)

defined learning objects as snippets of electronic learning that are self-contained,

engaging, and combine in various ways to facilitate the building of coursework.

With so many definitions of learning objects, confusion arises and causes adoption

issues (Farha, 2007). Without a settled definition of a learning object, it is difficult to

catalog and reference learning objects for searching and to know what metadata to collect on them to

enable reuse (Bergtrom, 2006). Walker (2007) defined narrated PowerPoint ™ slides as

a learning object for a business statistics course. According to Cebeci and Tekdal (2006),

audio and multimedia podcasts are learning objects. Although students and lecturers

appreciated narrated PowerPoint ™ slides, the large file size was an issue. Using a

different tool like Camtasia Studio could deal with that issue. Using narrated screen

captures as a learning object is an alternative to narrated PowerPoint ™ slides.

Additionally, a new piece of software within the Camtasia Studio suite allows for

capturing PowerPoint ™ slides and narrating them within the Camtasia Studio editor. In

this way, a reduction in the file sizes occurs, eliminating a barrier to use.

It is not easy to know if something is a learning object. For instance, are "clicker

sets" (audience automatic responses to questions or polls) considered learning objects

(Bergtrom, 2006)? Physical videotapes and clicker sets would certainly be digital

content, but would not be reusable by multiple people simultaneously in different parts of

the world, or stored in a repository. According to Yacovelli (2003), being stored in a

repository with metadata should be part of the definition of a learning object.

Learning objects have been classified as intelligent, first order, or second order.

Mallmann and Nunes (2008) defined intelligent learning objects as those that provide feedback to

the learner instead of just displaying information. Allert, Richter, and Nejdl (2004)

classified learning objects as first order (FOLO) and second order (SOLO). FOLO

learning objects are resources for a specific learning objective. SOLO learning objects

are resources to reflect a strategy or conceptual model. The skills development portion of

the study uses a first order learning object (FOLO) to facilitate learning of basic skills

with Microsoft Excel ™.

Learning Objects Advantages

De Salas and Ellis (2006) listed advantages of learning objects from the student

and faculty perspectives. For the student, advantages included highly effective learning

environments, ability to use a combination of learning objects of their choice, allowing

for experiences for problem solving with other learners, universal access, and

individualized learning (De Salas & Ellis, 2006). For faculty, advantages of learning

objects include high degrees of interoperability and ability to use them as building blocks

within several courses, increased productivity for instructional designers, and savings in

terms of time and money for course development (De Salas & Ellis, 2006).

Farha (2007) compared the use of learning objects against traditional textbook

lessons with 327 university level students and found that learning objects were three

times more effective in learner achievement scores than traditional methods. Farha

(2007) used a quantitative ANOVA analysis on a randomly assigned control group and

experimental group to analyze the raw data in SPSS. Based on Wiley (2002) and Farha

(2007), learning objects enhanced the educational experience. Buzzetto-More and Pinhey

(2006) state that learning objects promote active learning environments.

The troubling aspect of the analysis was that there were only 226 valid responses out of 327

participants. That means 101 people could not answer any questions correctly, which is

disturbing. The content of the lesson or the students are two obvious potential factors in

this poor achievement. The object of the lesson was to create a formula for a monthly

payment calculation for a loan in Excel. Potentially, either almost one third of the

students could not perform this function or the materials were not sufficient for the

learning objective. More research into content differences of learning objects is required

to answer this question.

The main pluses for learning objects are reusability, the ability to replay or

redisplay the content, ease of updating the material and consistency of message.

Learning objects are electronic and providing them on the Web is possible. Any

electronic medium like a CD or DVD can be a method of distribution and access. In a

corporate setting, file servers, collaboration sites, or learning management systems can

house the material. For instance, Farha (2007) used Blackboard ™ as the learning

management system to house the learning object. Chen, Willis, and Mahoney (2005) used

WebCT (Web Course Tools) as a course management system.

Another advantage is the flexibility and adaptability of learning objects and

reusability for many courses. Taylor, Stewart, and Dunn (2005) used learning objects for

a database concepts course. MERLOT stores thousands of learning objects for multiple

disciplines. Business statistics, and specifically the topic of simple linear regression,

requires skills training materials. The comparison of scores from a pretest and posttest

helps determine if the learning object created builds those skills.

Electronic learning materials such as tutorials have been useful in learning

technical topics. Vendors like Microsoft have dedicated Web sites for technical support,

filled with tutorials and video lessons on how to use their products. One advantage of

learning objects is the ability to replay them as many times as required for the individual

needs of the learner, and accessibility anytime, anywhere (via the Web or CD-ROM).

Learning objects provide a consistent message no matter how many students use the

material or how much time passes.

Learning Object Disadvantages

Learning objects in isolation are not sufficient to build an entire course. Learning

objects must combine with other materials to provide a total course. Learning objects as

tutorials are well suited for skill development, but to be valuable for an entire course on a

subject, additional components are necessary. Electronic mind maps, CFTH, learning

management systems, and WebQuests fill the gaps. Butson (2003) takes issue with

learning objects as a reductionist technology, ignoring inference, abstraction, and

discovery. Agreeing with Butson would mean that something else is required besides

learning objects to complete the educational experience. In addition to a learning object,

a WebQuest and CFTH attempt to complete the educational experience.

The word "object" comes from object-oriented programming (Wiley, 2002).

Learning objects are similar to programming objects in an object-oriented programming

project. Learning objects combine to form sophisticated courses, just as

programming objects combine to form sophisticated computer applications. Jones

and Boyle (2007) describe object-oriented programming as the definition of classes to

define objects. Objects can be stored in repositories similar to learning objects. Once

tested and in repositories, the objects can be reused with minimal testing required.

According to Jones and Boyle (2007), reuse must be an initial objective for learning

objects.
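
To make the analogy concrete, the following sketch defines a small Python class for a learning object record and a simple in-memory repository; the class, field names, and sample entry are hypothetical illustrations of the reuse idea, not part of any cited standard or actual repository.

    # A minimal sketch of the object-oriented analogy: a class defines reusable
    # learning object records, and a simple repository (here, a dictionary) stores
    # them for later searching and reuse. All names and values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class LearningObject:
        title: str
        topic: str
        url: str
        keywords: list = field(default_factory=list)  # metadata used for searching

    repository = {}  # stands in for a shared repository such as MERLOT

    def add_to_repository(obj: LearningObject) -> None:
        repository[obj.title] = obj  # once tested and stored, the object can be reused

    def find_by_keyword(keyword: str) -> list:
        return [obj for obj in repository.values() if keyword in obj.keywords]

    add_to_repository(LearningObject(
        title="Simple Linear Regression in Excel",
        topic="business statistics",
        url="https://example.edu/slr-tutorial",  # hypothetical location
        keywords=["regression", "Excel", "tutorial"],
    ))
    print(find_by_keyword("regression"))
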

In a typical introductory statistics course, there are at least a dozen individual

topics for skill development and advanced skills on how to apply the material. For an

Advanced Placement statistics course, there are 15 or more major topics. Each topic

would require one or more learning objects, some method or vehicle to launch it and a

way to evaluate competence of the learner after the learning object is used. A learning

management system such as Blackboard ™ can serve in that capacity. The WebQuest

training materials in the study aid in launching the learning object; evaluating

competence occurs through testing and, in a real course, through grading of assignments in the

syllabus.

Learning objects require software and time to produce. Learning objects require a

computer and mouse to consume the content. If using video recording equipment,

especially sound peripherals, expense is a problem. Requirements for this study included

a personal computer with a Windows ® operating system, and Microsoft Excel ™.

Students, faculty, and business people may already have these items. Camtasia Studio ™

was the largest expense at 200 dollars. The USB headset was 40 dollars, and personal

computer speakers were 20 dollars. The total initial investment was 260 dollars in

hardware and software to create and play back learning objects. The biggest cost by far

is labor hours, with a single learning object taking from 60 to 300 hours, according to

Kapp (2003). The learning object using Camtasia Studio took 40 hours to build and test,

and 20 hours to learn the product, for a total of 60 hours, which is on the low side of the

Kapp estimates.

A criticism of learning objects is the lack of human interaction. This criticism

forgets the overall platform for online education of the learning management system.

Universities have electronic platforms to sign up for courses, provide assignments,

syllabi, grades, chat sessions, and discussion forums. It is not the function of a learning

object to provide learning management in addition to electronic content. The research

literature commonly used only one technique for teaching and made it the cure for all

teaching issues. Many criticisms of new teaching methods stem from this silo approach

to solving problems. This realization from prior research provided a pivotal reason for a

combination approach in this research study.

Another consideration is who "owns" the learning object content. Options for

intellectual property ownership are the faculty member or designer of the learning

objects, the educational institution or business, or the learning object can be freely

available through sites like MERLOT (Bennett & McGee, 2005; Farha, 2007). Some

universities and businesses have employees sign papers that educational or business

entities retain ownership of any products created by an individual.

Tompsett (2005) argues that reuse of learning objects to build new courses is

much more complex than instructors might naively assume. This argument makes sense

when learning objects have a large scope and functionality, but the caveat to this

argument is that large learning objects tend to overlap with learning management system

functionality - grading, assignments, assessments, quizzes, etc.

Learning Object Metadata

Meister-Emerich (2008) found that available learning objects at online

repositories were useful to teach complex analytical topics, such as statistics. Friesen

(2004) defines the Learning Object Metadata (LOM - IEEE 1484.12.1-2002) as a

storage vehicle for descriptions of learning objects. Another metadata repository for

learning objects is the Dublin Core Metadata Initiative. These repositories are an attempt

at standardization, but come with the price of creating the metadata. One repository

might be favored compared to another if it has fewer elements to fill out while still being

useful for indexing and searching for learning objects.

The Learning Object Review Instrument (LORI) was the tool used to rate 149

learning objects mentioned in statistics textbooks. The rating system scored each

learning object on a scale of one through five points. The interval between 2.0 and 2.49 was the

highest interval measured, with 97 learning objects falling into that interval.

Meister-Emerich (2008) also found that learning objects covered topics of descriptive

statistics but did not cover complex topics such as comparison of differences in means of

two samples. Higher LORI scores mean that the learning object covers more of Bloom's

taxonomy.

One reason that metadata repositories are important is the factor of reuse. Elliott

and Sweeney (2008) reused learning objects on a project that would have been 160 hours

of original development, and instead spent 48 hours. The study estimated that creating

new learning objects could take roughly three times as long as reusing existing learning

objects. One way to make reuse possible is to make the learning object SCORM

compliant. SCORM stands for Sharable Content Object Reference Model, and is a set of

specifications to allow learning objects to connect with various learning management

systems. Camtasia Studio software publishes SCORM-compliant content for Internet use

and in learning management systems.

For reuse to be possible, an expectation exists of a certain amount of granularity

at a low enough level. If the learning object is too broad, including many topics and

assessments, questions that do not pertain to another use, or just trying to be all things to

all people, this hinders reuse (Convertini, Albanese, Marengo, A., Marengo, V., &

Scalera, 2006). More issues are whether learning objects are actually being reused

(Elliott & Sweeney, 2008; Krauss, Ally, & Koohang, 2005), and the cost to create them

(Friesen, 2004). Elliott and Sweeney (2008) used a case study approach to study learning

object design. Friesen (2004) evaluated metadata repositories and concluded that return

on investment for learning objects would require high reuse. Learning objects also

require some way to launch them in an overall context to provide an entire course of

material to a student.

Repositories for learning objects metadata exist at the MERLOT site on the Web,

and at the University of Wisconsin. This means that learning objects are readily

available, but the metadata might not be of high quality (Parrish, 2004). If the learning

object is too large in content, its reuse is in jeopardy. If the learning object content is

light, it might not be as useful as it could be (Elliott & Sweeney, 2008). The ability to

reuse a learning object enough times to offset the cost of creation means that the learning

object must also be easy to find when searching the repositories (Bennett & McGee,

2005).

The metadata captured may be of poor quality (Friesen, 2004; Knowles, 2005),

causing problems when searching for learning objects. There are three competing

standards of the Dublin Core (Friesen, 2004), MERLOTfromMIT and Learning Object

Metadata (LOM) from IEEE. All three standards have different elements to capture for

each learning object (Heath, McArthur, McClelland, & Vetter, 2005).

A recommendation from Cher, Siew, and Richards (2006) is to use proper

instructional design for learning objects, making them clean and neat, easy to read, and

digestible small chunks of content. Cher, Siew, and Richards (2006) also considered

mathematics as requiring clear instructions, but also a facility to explore and work with

real data examples.

WebQuests

WebQuests, a learning method created by Bernie Dodge and Tom March in 1995

from San Diego State University (Halat, 2008), provide a context for delivery of

educational materials that are online. Lahaie (2008) considered WebQuests as engaging

for students while allowing them to construct knowledge by seeking, analyzing, and

synthesizing data. WebQuests (Zheng, Stucky, McAlack, Menchana, & Stoddart, 2005)

follow constructivist pedagogy (Halat, 2008). WebQuests are available online to use free

of charge (Dodge, 2005).

WebQuests have several standard sections to the online material. The sections are

Introduction, Task, Process, Evaluation, Conclusion and Credits (Halat, 2008). Each

section becomes a single web page or sections of a large Web page, and the evaluation

section contains a rubric for the learner and teacher to know how to grade the activity.

The Process section is the connection point to a learning object. The learning object and

WebQuest in this study focus on simple linear regression concepts and techniques

using Microsoft Excel ™.

An advantage of WebQuests is structure, providing a recipe or standardized way

to develop course lessons. WebQuests are very flexible and introduce technology to

classrooms. Skills development in this study used a short-term WebQuest (1-3 periods).

Long-term WebQuests (more than 3 periods) have the time to allow for writing papers or

essays, showing critical thinking. Weller (2004) kept costs low for online activities by not producing

expensive learning objects, but instead assigning participative activities (similar to long-

term WebQuests). Weller (2004) investigated using learning objects by producing 120

learning objects to represent material for an entire Economics course, but also recognized

that learning objects are only one part of a total solution to enhance distance education.

One expectation of this study is to document the costs of preparing the online learning

materials in terms of dollars, time, and skills required.

There are issues with using WebQuests. WebQuests have their roots in K-12

educational environments. The apparent success of WebQuests with K-12 education has

not translated into higher education. Not much literature exists about using WebQuests

for adult learning environments, and very little quantitative research (Abbitt & Ophus,

2008). This raises the question of whether secondary and higher education resources share

ideas and methods with each other or if there is a flaw preventing WebQuests from use in

higher education.
Another issue is that Web content does not always exist when needed. Missing

links for Web content are a problem because the Web site owners may not be aware of

the usage, do not guarantee reliability of service and do not have to maintain content

indefinitely for these purposes. A third issue is that students may not like the content of

the lesson and drift off searching the Internet (Halat, 2008).

Cognitive Flexibility Theory Hypertext

Cognitive Flexibility Theory Hypertext (CFTH) or Cognitive Flexibility

Hypertext (CFH) is an educational approach rooted in constructivist theory (Wiley,

2002), with students guiding their own education, learning from course material and

applying it to real-world situations. Wiley (2002) stated that Cognitive Flexibility

Hypertext (CFH) environments are "flexible, generative, constructivist learning

environments rather than merely efficient instructional systems" (p. 87). CFTH or CFH

are especially useful in ill-structured domains (Spiro, Collins, Thota, & Feltovich, 2003).

Business statistics can be well-structured and ill-structured. Well-structured training

serves skills development. When acquiring understanding instead of simple skills and

memorization, business statistics includes the ability to solve real-world problems in

various ways, with the learner challenged to apply appropriate skills to new situations. In

a university in Brazil, 29 students used CFH with case studies and, based on a survey,

reflection and authenticity were found to have increased (Lima, Koehler, & Spiro, 2004).

An example of CFTH is the HyperCase ® developed by Dr. Julie Kendall and Dr.

Ken Kendall on the topic of Systems Analysis and Design (Kendall & Kendall, 1999).

By immersing the learner into a case study environment, with role-playing, the

interconnected network of Web pages becomes a playing ground and complex learning

environment that is easy to navigate but challenging enough to be engaging and

stimulating (Kendall & Kendall, 1999).

One advantage of CFTH is the extensibility and richness to provide adequate

complexity to engage and challenge the student. The number of Web pages and their

links to each other provide the complexity of environment that simulates real-world

activity. For instance, the Web pages within the systems analysis and design course by

Kendall and Kendall (1999) number in the hundreds.

Although there is an environment to wander in the Kendall and Kendall (1999)

example, more Web pages are devoted to data structure content and object documentation

than to the environment being wandered. For this single research lesson on simple linear

regression the opposite is true. There are 87 Web pages for the research materials, with

39 pages devoted to the fictional company area called C4Hire, Inc. and 38 pages devoted

to the two online case studies, leaving 10 miscellaneous Web pages such as the course

description and syllabus. The 38 Web pages involved in the case studies are composed of

a higher percentage of wandering and experiential details than with data (Appendix E).

With 17 lessons in a course, the expectation is to have two cases to work for each lesson,

making 34 cases in the course. Only time and effort limit the potential size of the

environment. For solely the CFTH, a minimum number of 685 (39 + 17*38) Web pages

for a complete course is an educated estimate. More pages would be necessary for the

WebQuests and general tutorials. With that number of pages, the students would have

the potential for a challenging and engaging learning experience.

Jang (2000) supported the conclusion that online education using CFTH for

complex and analytical subject matter is more effective for Korean high-school history

students than traditional learning from a textbook. CFTH has advantages of being able to

simulate real-world experiences, interactions between the learner and fictional characters,

more complex assignments that integrate politics, environment, people, organization

structure, practical problems to be solved and expected real-world outcomes (Jang, 2000;

Wiley, 2002).

Unless tutorials combine within the CFTH, a disadvantage of CFTH is that basic

skills development is a prerequisite before moving into a virtual role-playing

environment. Another disadvantage is the expense to create the environment. Web

programmers are expensive resources, along with instructional designers and hosting

servers. The approach Kendall and Kendall (1999) used was to create static Web pages

to keep the cost down, and not include avatars to simulate the user of the system. Static

interconnected Web pages kept costs down for this study.

CFTH environments lack simulated human movement such as in virtual reality

environments, but are less expensive to create. Virtual environments that include avatars

require Web programming and complex server environments to run on. An example is

Second Life, which has as one of its uses, a virtual learning environment hosted on the

Web. Another example is the Appalachian Educational Technology Zone (AETZone),

used at Appalachian State University to provide a 3-D virtual reality space for students to

interact (Bronack, Sanders, Cheney, Riedl, Tashner, & Matzen, 2008). AETZone

requires an Active Worlds server (Active Worlds, 2010) and programming is involved to

create the interactive environments. The license of under $1,000 to run the software is a

small portion of the cost to host an Active Worlds environment. One would need to add

the hardware for internal hosting or a dedicated and managed hosting service, and the

labor to create the programming of the virtual world.

The immersive Web environment in this study had the student role-play as

a statistics consultant, solving real-world problems. Virtual reality environments such as

Second Life ™ attempt to be photo-realistic, with realistic looking environments and

avatars. Photos combined with defined areas on them called hot spots accomplish the

goal of photo-realism at a much lower cost than programming virtual reality

environments. With a free hot spot editor like Meracl, the programming required is auto-

generated.

The business aspect of the CFTH is another characteristic of the environment.

The student has access to an environment that shows nuances of how one must interact

with people, who the chain of command is, key resources to talk with, where to find the

data required, visiting multiple locations as a consultant, and doing the "legwork" it takes

to accomplish a task. This experience in a virtual Web environment gave the student

access to projects normally given to senior people with years on the job, without the fuel

to drive to destinations, the hiring process of attaining a job, or the years spent gaining

contacts and experience, while also building a reference library of knowledge.

One of the critical factors is to make all materials accessible electronically to

meet the cost-effectiveness requirements of universities and students. Static Web pages and

the lack of avatars that move reduce the cost in the immersive environment by not

requiring heavy programming in the virtual learning environment. The movement

between rooms can take the form of clicking on an arrow to navigate, which minimizes

the complexity of any HTML programming. The cost of the CFTH environment in this

study centers on stock photos for the role-playing environment. In 2010, a typical cost of

a stock photo is under two dollars for an ability to use the photo for one project. This

lesson required 50 dollars' worth of stock photos from a Web site with more than a

million photographs to choose from (Big Stock Photo, 2010). Some photographs used

were free from the Web site, and some were readily and freely available prior to the

research.

A Combination Approach

The control group in this study consisted of business degree students. The control

group obtained traditional instruction with book materials assigned by the university and

one of the regular professors that teach the ADM515 Business Statistics course.

Microsoft Excel ™ performs any complex calculations as part of the course. One chapter

on the topic of simple linear regression is the scope of content for this study. Homework

assignments are evidence of learning the material. The control group receives a pretest

and posttest before and after that module of the course to determine their achievement.

The traditional training intervention consisted of lecture and textbook, with the

students working problems in class and at home. A topic module of an AP Statistics

course can take up to 15-20 hours of in-class work. Assessment takes the form of

multiple choice quizzes or exams. For a college-level business statistics course at a

private Midwestern university, the topic takes less than four hours in class, with

homework spanning one week. A test on the topic can be in the form of a written exam

or long quiz, with a time limit and in multiple-choice format.

The quasi-experimental group is another set of business degree students from the

same university. The group is quasi-experimental because there was no random assignment.

Instead of learning from lecture and textbook, the intervention is a combination of four

online teaching methods. Learners measured themselves against multiple practice

problems and a rubric in the WebQuest; in a full course, an assignment using the data

set found in the CFTH would be required to practice the new skills, but it was not required in this

research. The group receives a pretest and posttest before and after that module.

The combined online materials presented concepts established with the AP

Statistics test. The concepts taught for the AP Statistics test are included in the traditional

instruction and online instruction, which allows a valid comparison of learning methods.

The concepts studied within simple linear regression are collection of data, creation of a

scatter plot, interpretation of the scatter plot, calculation of the slope and y-intercept for a

best-fitting line of the data, calculation of the coefficient of correlation statistic, and

prediction of a y-value using the simple linear regression equation. Hypothesis testing of

the correlation value is one of the more complicated concepts of the course topic. The

test instrument reflects the evaluation of these skills and conceptual knowledge,

determined by two independent statistics professors.
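
To illustrate these concepts in a compact, reproducible form, the following sketch uses Python with the SciPy library rather than Microsoft Excel ™; the data values are invented solely for illustration and are not part of the study materials or test instrument.

    # A minimal sketch of the simple linear regression concepts listed above,
    # using SciPy instead of Microsoft Excel. The data values are invented.
    from scipy import stats

    x = [1, 2, 3, 4, 5, 6, 7, 8]                   # independent variable
    y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 8.2, 8.8]   # dependent variable

    result = stats.linregress(x, y)  # slope, intercept, correlation, p-value, std. error
    print("slope (b1):", result.slope)
    print("y-intercept (b0):", result.intercept)
    print("correlation coefficient (r):", result.rvalue)

    # Predict a new dependent value from a chosen independent value.
    new_x = 10
    predicted_y = result.intercept + result.slope * new_x
    print("predicted y for x = 10:", predicted_y)

    # Hypothesis test: linregress reports a two-sided p-value for the null
    # hypothesis that the slope (equivalently, the correlation) is zero.
    print("p-value for the correlation:", result.pvalue)
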

The creation and interpretation of a residuals plot is a part of AP Statistics lessons

and included in the traditional textbook under research. To be consistent, the online

materials covered residuals plots. To cover the additional material, more content under

the WebQuest and tutorial learning object was required. The online and traditional

materials covered hypothesis testing of the correlation coefficient as well.

The online training intervention starts with the use of an electronic mind map,

called an e-Coursemap (Ruffini, 2004), providing a graphical organizer for navigation to

course materials. A WebQuest and a learning object cover the topic of simple linear

regression and generating a simple linear regression equation using Microsoft Excel ™.

The second portion of the online training intervention incorporates CFTH, providing a

role-playing environment for the student. Students had exercises in the WebQuest to test

their new skills and a rubric to judge their progress. As part of the role-playing

environment, the student had opportunity to use a data set found in the CFTH, and apply

the skills learned in the WebQuest and online tutorial to a real-world example.

Summary

Businesses require employees who can collect data, synthesize and analyze data

into information, make judgments based on the information, and present findings to

colleagues or associates. Statistical literacy has been and remains an improvement

opportunity for college graduates. The American Statistical Association documented and

approved six recommendations for improvement in how to teach college introductory

statistics courses in the 2005 GAISE college report.

One of the GAISE college report recommendations was to integrate technology

into the classroom. Web 1.0 represented static pages controlled by a webmaster. Web

2.0 introduced social media and collaboration to the landscape, and Web 3.0 promises

cultural shifts as well, with personal search engines and databases (Goodfellow & Maino,

2010). With more technology available to students, integrating familiar technologies into

the classroom becomes imperative for the future, whether using on-ground or online

modalities.

Multiple new teaching methods have originated since the 1990s. Research is required

to learn more about how to comply with the recommendations of the GAISE college

report. Studying the impact of combining online teaching methods for a single topic in

an introductory college statistics course was a step toward meeting that research need. By

combining four teaching methods, it was possible to investigate if weaknesses found in

individual methods remained for the combined approach, and if new problems surfaced.

Chapter 3: Research Method

Business leaders require workers and managers capable of critical thinking and

analyzing real data. Statistical literacy is an important skill for educated employees, with

health care as an example (Monahan, 2007). Advancements in statistical literacy for

college students, fostered by good-quality statistical instruction, have the potential to

provide more workers and managers who think critically, use real world data, and

demonstrate problem-solving skills (College Report, 2010).

The general problem is a lack of statistical literacy for students (Schagen, 2006),

which translates into the same lack of statistical literacy in employees of corporations.

The current level of pedagogy does not meet the needs of current student populations for

college introductory statistics courses, inspiring the GAISE recommendations to improve

current pedagogy. In order to further research into how to comply with the GAISE

recommendations, this study focused on combined online methods of instruction.

The purpose of the quantitative research was to compare combined online

teaching methods to traditional lecture and textbook format for teaching simple linear

regression in a 1-week segment of a required introductory college-level business statistics

course. Participants were two groups of students, non-randomly assigned, at a private

Midwestern university. The concepts taught in the research study for simple linear

regression are from the textbook, Statistics for Managers Using Microsoft Excel (Levine,

Stephan, Krehbiel, & Berenson, 2011). Members of the treatment group received an

intervention combining online teaching methods for learning simple linear regression,

and members of the control group received only traditional instruction for the same topic.

Both within-group and between-groups comparisons were conducted using

nonparametric statistics. Nonparametric statistics "make no assumptions about

population parameters or distributions" (Vogt, 2007, p. 68). Because this is an

exploratory study, with a small sample size, generalizations about population normality

or its mean and standard deviation do not take place.

Research Questions

To investigate the effects of combining online teaching methods with traditional

delivery of instruction on learning simple linear regression in an introductory college-

level statistics course, the following research questions were addressed.

Q1. To what extent will test scores differ for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and CFTH for

learning simple linear regression from the beginning to the end of a 1-week segment of a

college-level statistics course?

Q2. To what extent will test scores differ between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and CFTH for

learning simple linear regression and students receiving traditional methods for learning

simple linear regression in a 1-week segment of a college-level statistics course?

Hypotheses

The following are the null and alternative hypotheses used to test the research

questions.

H10. Test scores will not differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H1a. Test scores will differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H20. Test scores will not differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

H2a. Test scores will differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

Research Method and Design

The research design was quasi-experimental because participants were assigned to

group membership non-randomly (Vogt, 2007). The research used a pretest and posttest

design with the identical test instrument used in both tests (Black, 1999). Both between-

groups and within-groups comparison were documented in the results.

Under random assignment, individual students rather than intact cohorts would have been

assigned to either the control or experimental group. In this particular study, the whole cohort itself was

assigned to the control group or the quasi-experimental group, so individual participants

neither chose nor were randomly assigned to a group. The research would have been impossible to administer for

the faculty member if a portion of the cohort was in the control group and another portion

of the same cohort was in the quasi-experimental group due to timing considerations.

The control group timing was just before and after the simple linear regression lesson.

The quasi-experimental group timing was any time in the program of courses before the

lesson on simple linear regression.

A quantitative study was chosen because there is a lack of quantitative

research material about combining online learning methods. The analysis served to provide

quantitative information about combining online methods that prior research studies

lacked. Any future research will be able to compare against these

measures.

Participants

Students at this particular private Midwestern university belong to cohorts,

assigned by the university during registration processes. Faculty members were

assigned by the university to participate in the study and preside over the control group

participants. The faculty for control group participants had experience teaching the

ADM515 course.

One or more cohorts were chosen to participate in the control group, depending

on level of voluntary participation. The study represents extra work for the student and

does not replace any activities of the traditional education. For the control group, the

extra time commitment is up to two hours to take the pretest and posttest before and after

the traditional instruction on simple linear regression.

One or more cohorts were chosen to participate in the quasi-experimental group,

depending on level of voluntary participation. The study represents extra work for the

student and does not replace any activities of the traditional education. For the quasi-

experimental group, the extra time commitment includes up to two hours to take the

pretest and posttest before and after the combined online instruction on simple linear

regression. Additionally, the students spent time working with the combined online

learning materials, expected to be two to six hours to match the minimum of two hours of

lecture and four hours of homework. The student time commitment was under student

control, similar to the amount of time spent working on homework problems, and there

was no way to enforce the time commitment in either the control or quasi-experimental

group.

The quasi-experimental group received a pretest, combined online instruction, and

a posttest, all without the aid of the faculty member. The quasi-experimental faculty

member responsibilities begin and end with handing out participation requests and

consent forms. Although it would have been optimal to have the same faculty member

preside over both control and quasi-experimental groups to simplify any possible external

validity concerns, it was not necessary for the study because the faculty member did not

participate in delivering the combined online materials. In a different type of

study, such as a hybrid education study, the faculty member would be crucial to preside

over both sets of education and participate in them both, but that is not the case here.

The reason more than one cohort was required was the lack of volunteer

participation. Because the study represented an extra workload, some students declined to

participate. A minimal incentive of a gift card was offered to recognize the extra time

commitment and workload of the students who participated in the study.

The only timing factor for the quasi-experimental group was to receive the

combined online training before receiving the traditional instruction on the topic of

simple linear regression. Every attempt was made to ensure that the training intervention

was the only difference between the pretest and posttest.

Materials/Instruments

Test Instrument

Participants who completed both the pretest and posttest assessments made it possible to

evaluate conceptual knowledge of simple linear regression. The examination consisted

of an online questionnaire with 12 multiple-choice questions (see Appendix F). All the

questions had five possible answers. The format of the questions followed a question

design similar to that of the Advanced Placement Statistics test (College Board, 2009),

with multiple-choice format. The participants had 60 minutes to choose answers to the

questions and then submit them.

Vogt (2007) and Black (1999) recommended either gaining expert approval for

the test instrument or conducting a pilot study to determine reliability and internal

validity. Based on those recommendations, the test instrument was validated with two

independent faculty members, one considered an expert in teaching statistics, having

taught over 300 statistics courses, authored a textbook, and reviewed a statistics textbook,

and an adjunct professor who had taught the ADM515 course many times at the private

Midwestern university where the research was conducted. It took several iterations to

edit the questions to test student knowledge on the topic of simple linear regression

before the test instrument was approved as a valid test instrument.

Reliability could also be an issue if the situation of taking the test changes

between the pretest and posttest. Because test delivery was over a professional Web site,

with thousands of customers daily, performance should not have deviated enough to

warrant any concern, but the connection to the Internet could deviate for a participant if

the person chose to use a slow or unreliable connection during testing. Measuring

completion in a 60-minute timeframe can provide evidence that network connectivity and

speed were not an issue in completion of the test.

The concepts of simple linear regression tested match the chapter devoted to

simple linear regression in the textbook used in the ADM515 business statistics course

for the traditional education control group. The concepts covered also matched the study

material for the AP Statistics exam for high school students to attain college credit. The

questions tested the following knowledge: (a) recognizing the coefficients of the

regression equation, (b) predicting a dependent variable value based on an independent

variable value, (c) identifying which variable is dependent or independent, (d)

interpreting the shape of curves or linear data on a scatter plot, (e) interpreting coefficient

of correlation values, (f) identifying the strength or direction of a slope, (g)

extrapolation, (h) hypothesis testing of the correlation coefficient, (i) assumptions for

simple linear regression, (j) interpreting the slope, (k) that linearity does not prove a

causal relationship, and (l) interpreting the coefficient of determination.

The online test required participants to enter a unique identification number

distributed at the beginning of the study after the informed consent form (Appendix H for

the control group and Appendix I for the quasi-experimental group) was signed and

turned in by the student. Demographic and descriptive questions (Appendix G) asked

about age, gender, number of hours spent per week on the computer, estimated skill level

with Microsoft Excel™, and estimated skill level with simple linear regression. The

scales used for demographics follow the format used by Farha (2007), except that instead

of check boxes, which allow for multiple inputs, radio buttons and dropdown lists

enforced singular answers. A paper form was available if the online instrument was not

available for any reason.

Training Materials

The traditional training used lecture, a textbook, and Microsoft Excel ™ for

calculations and charting. Homework problems were assigned and due over a 1-week

duration for a given chapter or topic. For the ADM515 course, chapter 13 in the textbook

Statistics for Managers Using Microsoft Excel covers the topic of simple linear

regression.

The combined online materials are all original works, and the student begins by

using an electronic concept map of the course, termed an e-Coursemap (Ruffini, 2004).

The e-Coursemap depicts a picture of the concept, along with navigable links on the Web

page to direct the learner to a syllabus, general course description, lessons, and tutorials.

Appendix A shows the image of what the student saw when beginning the first page of

content for the course. Lesson 14 on bivariate data relationships contained a link to the

WebQuest Web page covering the topic of simple linear regression. The only other

active link was the consultant-for-hire button area at the top, used to launch the CFTH.

The research also contained a WebQuest, which is a Web page devoted to content

related to the subject matter in a very structured format. The WebQuest provided a Web

link to launch the multimedia learning object. The learning object was generated using a

tool called Camtasia Studio ™. The FLASH™ material requires a FLASH™ reader,

downloadable without charge from the Adobe Web site. Another format provided was the

Windows Media Player ™ compatible content, allowing for maximum accessibility by

personal computers with Windows operating systems.

The fourth component of original work was the CFTH, which is an immersive

Web role-playing environment, consisting of over 80 Web pages specifically designed for

the student to role-play as a consultant using their knowledge of simple linear regression

on a real-world problem. The CFTH contained two case study assignments. Appendix B

shows the WebQuest content. Appendix C shows screen captures of critical stages of the

multimedia learning object. Appendix D shows screen captures of some of the CFTH

Web pages.

The CFTH was an immersive environment built entirely using pictures on Web

pages. Some pictures were free of charge, and some were purchased from Big Stock

Photo (Big Stock Photo, 2010). The only programming tool used to build the Web pages

was NOTEPAD™, a free text editor available with all versions of Windows, although

any text editor provides the capability required for this function. Limited knowledge of

Hypertext Markup Language (HTML) was required to edit the text Web pages. The Web

pages depended heavily on tables, images, and fonts. Any reference materials for HTML

were freely available through browser searches on the Internet.

The only requirement for viewing the Web page was a personal computer with a

Web browser. Microsoft Internet Explorer version 6 ™ or above, Mozilla Firefox ™,

and Google Chrome ™ were tested to ensure that the course materials run successfully on

these browsers. Multiple browser support allows more students to participate even if

their software is different.

The expected costs of the training materials were a factor in evaluating the

findings of the study. Costly items include stock photographs, hardware, software

utilities, and particularly software labor. The cost breakdown excludes the cost of

hosting a website ($5.00 per month) because the website is multipurpose, and providing

quizzes ($3.00 per month) for the same reason. The cost analysis excludes labor for

learning about the research topic, although this could involve hundreds of hours over an

entire course. Costs do not reflect time needed to learn the basics of HTML

programming, and therefore is an assumed skill of the person creating Web content.

The summarized expected costs for this study appear in Table 2. Some of the

costs are one-time and do not recur for each lesson. Meracl ™ software and Easy

Thumbnails ™ software are utilities available without charge. Stock photos represent a

cost for each lesson. The rate per hour for labor is $50.00, even though cheaper

alternatives are possible. Costs increase if professional video and audio capturing

devices, along with lighting and other equipment, become involved.

Table 2

Expected Costs for a Simple Linear Regression Lesson

General or per lesson   Item                                                  Cost

General                 Personal computer                                     0
                        Microsoft Windows XP®                                 0
                        Microsoft Office™                                     0
                        Digital camera                                        0a
                        USB headset                                           $40
                        Speakers                                              $20
                        Camtasia Studio™                                      $200
                        Meracl software ("hot spot" editor)                   0
                        Easy Thumbnails software (to resize images easily)    0
                        Labor for CFTH C4Hire corporate buildingb             $2,000
                        Subtotal                                              $2,260

Per lesson              Stock photographs                                     $50
                        Labor for learning object(s) and CFTHb                $2,000
                        Subtotal                                              $2,050

                        Total                                                 $4,310c

a $100 if needed. b 40 hours @ $50 / hour. c $4,000 of the total represents labor costs.

To develop the Web pages for this study, several utilities were required,

some of which are free. Microsoft PowerPoint™ aided in tailoring any images of

photographs, with the "Save As" feature used to output the slide as a graphic file. The

Meracl ™ software generates Web page lines to support "hot spots" on an image. Easy

Thumbnails ™ is a free utility to reduce the size of graphic files for faster downloads of

Web pages, and it was used on all images to maintain consistency.

StatCounter ™ (StatCounter, 2010) is a free Web utility that gathers statistics

from the Web page for each link traveled. Data collection occurred on each page, with

activity logs available for download from the StatCounter website. The logs allow for

further research by providing a mechanism to check whether students navigate at random

through the training materials. For this study, the logs verify usage of the training

materials on a summary level and not an individual level because no identifying

information is collected.

Operational Definition of Variables

The independent variable was the type of instruction (online versus traditional)

for between-groups assessment and the test administration (pretest versus posttest) for the

within-group assessment. The dependent variable was the test scores. Evaluating the

learning outcomes required nonparametric statistics because of the small sample size.

The following are operational definitions of the variables used for this study.

Test administration. The independent variable for Research Question Ql was

the test administration. Test administration was a nominal, dichotomous variable, with a

value of 0 for pretest and a value of 1 for posttest.

Type of instruction. The independent variable for Research Question Q2 was

the type of instruction. The type of instruction was a nominal, dichotomous variable,

with a value of 0 for traditional instruction and a value of 1 for combined online

instruction.

Test score. The dependent variable for both research questions was the test

score. Administration of an identical test occurred both before and after the intervention.

The test score was a ratio variable because it has a possible absolute zero, with possible

values ranging from 0 to 12.

Demographic information collected during pretest and posttest provided input for

descriptive statistics. Demographic variables included gender, age, and level of

Microsoft Excel™ experience. However, demographic variables were not covariates for

this study.

Data Collection, Processing, and Analysis

Before data collection began, permission to conduct the study was obtained from

the Institutional Review Board (IRB) of Northcentral University. Testing of participants

was voluntary. Participants live in Southern Indiana, Ohio, or the Kentucky area and

study at a private Midwestern university. University procedures dictate the signup

process and recruitment of participants. Correct or incorrect answers in the study did not

reflect at all on class grades. Participants used an online questionnaire to complete the

pretest and posttest. The test for this study is in Appendix F. A paper version of the

questionnaire served as a backup only in the event that network access was not available.

Review by a proctor ensured that electronic entries into a computer matched written values.

The test consisted of 12 multiple-choice questions. The questions adequately

measured conceptual knowledge of the topic and basic skills. Microsoft Excel ™ was an

allowable tool to create the prediction values because it was a part of both the control group

and quasi-experimental group lessons. To test knowledge and more advanced

understanding, the participant was shown multiple-choice questions that were situational.

Designing questions similar to those appearing on the Advanced Placement

Statistics examination (College Board, 2009) ensured that the test followed current examination

formats in widespread use.

To protect anonymity, all participants entered a unique numeric identifier when

accessing the questionnaire. The identifier enabled the matching of pretests and posttests

and distinguished the treatment group from the control group. Unique identifiers also

made it possible to identify people who completed the test multiple times. Only first

attempts count as valid scores.
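As an illustration of this matching step, a hypothetical Python sketch (the study stored responses

in Microsoft Excel ™; the column names and identifier values below are illustrative assumptions)

keeps only first attempts and pairs each participant's pretest and posttest scores:

    import pandas as pd

    responses = pd.DataFrame({
        "participant_id": [111111, 111111, 111111, 222222, 222222],
        "administration": ["pretest", "pretest", "posttest", "pretest", "posttest"],
        "score":          [5, 7, 8, 4, 6],
    })

    # Only first attempts count as valid scores
    first_attempts = responses.drop_duplicates(
        subset=["participant_id", "administration"], keep="first")

    # Pair pretest and posttest scores by the anonymous identifier
    paired = first_attempts.pivot(
        index="participant_id", columns="administration", values="score")
    print(paired[["pretest", "posttest"]])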

After data collection, data flowed from Microsoft Excel™ into SPSS (version

15.0) statistical software for analysis. Nonparametric statistics were required to analyze

the data because the sample size was too small for parametric statistics. For Research

Question Ql, the Wilcoxon matched-pairs test was appropriate to measure differences in

test scores between pretest and posttest. For Research Question Q2, the Mann-Whitney

U test for independent samples was appropriate to compare differences in test scores

between the treatment group and the control group. The length of time required for each

student to complete the examination, or the number of questions completed within the

allotted hour appears in the results.
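As an illustration of this analysis plan (the actual analysis was run in SPSS, and the score lists

below are hypothetical), the two nonparametric tests can be expressed in Python as follows:

    from scipy import stats

    # Hypothetical paired pretest/posttest scores for one group (Research Question Q1)
    pretest = [5, 4, 6, 7, 5, 6, 4, 8, 5]
    posttest = [7, 6, 8, 9, 6, 8, 5, 10, 7]
    w_stat, w_p = stats.wilcoxon(pretest, posttest)   # Wilcoxon matched-pairs signed rank test
    print("Wilcoxon signed rank:", w_stat, w_p)

    # Hypothetical independent posttest scores for two groups (Research Question Q2)
    control_posttest = [5, 6, 4, 7, 6, 5, 8, 6, 5, 7, 6]
    quasi_posttest = [8, 9, 7, 10, 8, 6, 9, 8, 7]
    u_stat, u_p = stats.mannwhitneyu(control_posttest, quasi_posttest, alternative="two-sided")
    print("Mann-Whitney U:", u_stat, u_p)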

Each test item had five possible answers. A learner could randomly guess the

correct answer 20% of the time (one of five choices for each question). Therefore, an

overall score between 0 and 3 indicates a lack of knowledge, matching what is achievable

randomly. As a passing score on the Advanced Placement Statistics test is 60%, a score

of 8 or above exceeds passing level on that standardized examination and represents

mastery of the material. A score between 4 and 7 represents some knowledge of the

material.
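To illustrate the chance baseline behind these bands, a brief Python sketch (an illustration only,

not part of the study's analysis) models guessing on 12 five-option questions as a binomial variable:

    from scipy.stats import binom

    n_questions, p_guess = 12, 0.20   # 12 questions, 1-in-5 chance of guessing each correctly
    print("expected correct by guessing:", binom.mean(n_questions, p_guess))   # 2.4
    print("P(score <= 3) by guessing:", binom.cdf(3, n_questions, p_guess))    # about 0.79
    print("P(score >= 8) by guessing:", binom.sf(7, n_questions, p_guess))     # about 0.0006

Under this model, a score of 0 to 3 is consistent with guessing, while a score of 8 or above is very

unlikely to occur by chance alone.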

Methodological Assumptions, Limitations, and Delimitations

Assumptions. Part of the findings lists any variations between the

demographics of the two groups. Descriptive statistics for age, sex, level of Microsoft

Excel™ experience, and test scores aid in determining if the groups are similar. If the

assumption of similar groups were satisfied, t tests on equivalent groups would become usable.

Assumptions about the distribution of scores were not required because of nonparametric

statistics usage for the data analysis.

Limitations. Two independent statistics professors have reviewed the

examination and have verified that the questions reflect the concepts of simple linear

regression. Five possible test answers for each question minimize the ability to attain a

high score by guessing answers.

The possibility of carryover existed because the pretest and posttest used an

identical measurement instrument. To address this concern, students did not receive

feedback on correct or incorrect answers after the pretest. Therefore, even if participants

remembered the questions and all possible answers, they would have to learn the

material through the training intervention or through external means to affect the score.

The time between pretest and posttest was one week, so the opportunity to learn from others

was minimized, although such an opportunity is inherent in any pretest and posttest design.

Another limitation of the study is that group assignment was not random. The

students did not control the group assignments. In addition, the sample size for this study

was small, and nonparametric statistics were required for the analysis. With small

sample sizes, generalizations to a larger population from sample data are not reliable

(Black, 1999). Despite the lack of external validity for this study, the results are useful as

an exploratory pilot study to determine any problems with data collection (Vogt, 2007),

issues with testing measurements (Isaac & Michael, 1997), and as a basis for more

research.

Because the participants understood they were part of a study, the "Hawthorne

Effect" (Adair, 1984) of changing their behavior due to being studied was possible. The

groups did not have knowledge of each other's activities or the testing and materials for

the other group. The consent form explained the type of study and requirements of the

study, which could make the students more alert or vigilant, but the same condition was true

for both groups, so within-group and between-groups comparisons had the same level of

opportunity, making the comparisons still valid.

Delimitations. The scope of the study included one lesson in an introductory

statistics course. An entire course was not feasible to produce during the limited

timeframe and budget of the research, although the framework for additional content and

for an immersive environment exists for further additions. The choice of campus

locations was limited to existing learning centers.

Ethical Assurances

Four major ethical areas of concern are protection from harm, informed consent,

right to privacy, and honesty with professional colleagues. Before data collection began,

it was crucial to obtain permission to conduct the study from the IRB of Northcentral

University and from the university where the research took place. IRB approval is a pivotal

step to protect human participants in research.

For human trials, informed consent was required. The informed consent form

contained language about protection from harm, informed consent, and right to privacy

(Appendix H, Appendix I). The informed consent document stated that participation was

voluntary and that participants had the right to withdraw from the study at any time. To

protect the privacy and confidentiality of participants, each participant received a unique

identifier, provided with the consent form to participate in the study. The unique six-digit

identifier eliminated the need for personal identifying information when signing up or

taking the pretest or posttest. The consent form also stated that no matter what the score

on the tests, participants' grades in the course would not suffer as a result.

Honesty with professional colleagues in this study covers the complete and

accurate reporting of results, any communications between colleagues, and procedures to

ensure accurate reporting. For instance, a proctor's signature verified moving the data

from Microsoft Excel ™ into SPSS ™ for analysis and any documentation included in

the appendix. For the test instrument, validated by independent experts, the electronic

mail messages providing approval of the test instrument are included in the appendix.

The test instrument itself is in Appendix F. Screen captures of online materials are

placed in the appendix for independent verification as required (Appendix A, B, C, D).

Summary

This exploratory pilot study was quasi-experimental in design. Non-randomized

groups of MBA students at a private Midwestern university received a 1-week course

segment in simple linear regression. Members of one group received a combination of

four online instructional methods, while members of the other group received traditional

instruction. Pretests and posttests provided data for within-group assessments, and

posttest scores provided data for between-groups assessments. Data was analyzed using

nonparametric statistics.

Chapter 4: Findings

The purpose of the quantitative study was to further research into complying with

the GAISE recommendations to improve statistical literacy. The study compared

combined online teaching methods to traditional methods of teaching a topic of simple

linear regression in a business statistics course over a 1-week period. Participants came

from MBA cohorts at a private Midwestern university. The participants who volunteered

for the study attended school in the geographic region of southern Indiana and Kentucky.

The study was quasi-experimental, because of non-random assignment of

participants (Vogt, 2007). It was time sensitive to have control group participants take

the pretest and posttest around the lesson on simple linear regression in the ADM515

Business Statistics course. The quasi-experimental group only used combined online

materials. The quasi-experimental group cohorts were not time sensitive in the sense that

they could take the pretest, use the online materials, then take a posttest any time in their

academic studies prior to the ADM515 Business Statistics course. Random assignment

was not feasible in this instance and would have placed an undue hardship on the faculty

member to track which participants were from each group.

Descriptive statistics were useful to compare the demographics of the participants

in the control group and the quasi-experimental group. Along with descriptive statistics,

a quantitative analysis using nonparametric statistics on a small sample size was useful to

analyze differences within and between the two groups of participants in comparison to

pretest and posttest scores. The following sections will display the results of the

descriptive and nonparametric statistics run on the data collected.

Results

Not everyone asked to participate in the study signed up and actually completed

the study. For the control group, 11 out of 44 people completed the study. For the quasi-

experimental group, 9 out of 66 people completed the study. All available MBA cohorts

within a three-month period in all of Indiana, Kentucky, and Ohio had an opportunity to

participate in the study for the control group. All available MBA cohorts within a three-

month period in southern Indiana and Kentucky geographic regions had an opportunity to

participate in the quasi-experimental group.

The demographics of the groups included variables of age, gender, number of

hours spent per week on the computer, estimated skill level with Microsoft Excel™, and

learning preference. All demographic variables were broken down into categories, and

analyzed using descriptive statistics and Chi-Square analysis within SPSS v19. A

Chi-Square test enables us to determine whether the distributions are equal regarding

demographic categorical data.

It is reasonable to question if the control and quasi-experimental groups were

similar with regard to the demographic variables. Under a level of significance of 0.05,

the Chi-Square analysis for gender yielded χ²(1, N = 20) = 0.002, p = 0.9640. The Chi-

Square analysis for Microsoft Excel ™ skill levels yielded χ²(1, N = 20) = 0.02, p =

0.8876. The Chi-Square analysis for age yielded χ²(2, N = 20) = 4.7042, p = .0952. The

Chi-Square analysis for number of computer hours per week yielded χ²(3, N = 20) =

4.2232, p = .2384. The Chi-Square analysis for learning preferences yielded χ²(2, N = 20)

= 2.2992, p = .3168. The results showed that the counts were not statistically different for

the two groups.
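As an illustration, the gender comparison can be reproduced in Python from the frequencies reported

in Table 3 below (5 female and 6 male in the control group; 4 female and 5 male in the

quasi-experimental group), assuming the reported value was computed without a continuity correction:

    from scipy.stats import chi2_contingency

    gender_counts = [[5, 6],   # control group: female, male
                     [4, 5]]   # quasi-experimental group: female, male
    chi2, p, dof, expected = chi2_contingency(gender_counts, correction=False)
    print(f"chi2({dof}, N = 20) = {chi2:.3f}, p = {p:.4f}")   # approximately 0.002 and 0.9640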

Descriptive Statistics

The following tables display descriptive statistics for both the control and quasi-

experimental groups. Table 3 displays gender data, age data is within Table 4, number of

computer hours per week is within Table 5, Microsoft Excel ™ skill level is within Table

6, and learning preference data is within Table 7. Frequency, percentage, and cumulative

percentage for each demographic variable are common throughout all of the tables.

Table 3

Frequencies for Gender

Group Gender Frequency Percent Valid Percent Cumulative


Percent
Control Female 5 45.5 45.5 45.5
Male 6 54.5 54.5 100.0
Total 11 100.0 100.0

Q-Exp Female 4 44.4 44.4 44.4


Male 5 55.6 55.6 100.0
Total 9 100.0 100.0

Table 4

Frequencies for Age

Group Age Frequency Percent Valid Percent Cumulative


Percent
Control 21-30 3 27.3 27.3 27.3
31-40 2 18.2 18.2 45.5
41-50 6 54.5 54.5 100.0
Total 11 100.0 100.0

Q-Exp 21-30 3 33.3 33.3 33.3


31-40 5 55.6 55.6 88.9
41-50 1 11.1 11.1 100.0
Total 9 100.0 100.0

Table 5

Frequencies for Computer Hours per Week

Group Hours Frequency Percent Valid Percent Cumulative


Percent
Control 1-10 1 9.1 9.1 9.1
11-20 3 27.3 27.3 36.4
21-30 1 9.1 9.1 45.5
>30 6 54.5 54.5 100.0
Total 11 100.0 100.0

Q-Exp 1-10 2 22.2 22.2 22.2


11-20 4 44.4 44.4 66.6
21-30 2 22.2 22.2 88.8
>30 1 11.1 11.1 100.0
Total 9 100.0 100.0

Table 6

Frequencies for Microsoft Excel ™ Skill Level

Group Skill Frequency Percent Valid Percent Cumulative


Percent
Control None 0 0.0 0.0 0.0
Below
Average 4 36.4 36.4 36.4
Above
Average 7 63.6 63.6 100.0
Expert 0 0.0 0.0 100.0
Total 11 100.0 100.0

Q-Exp None 0 0.0 0.0 0.0


Below
Average 3 33.3 33.3 33.3
Above
Average 6 66.7 66.7 100.0
Expert 0 0.0 0.0 100.0
Total 9 100.0 100.0

Table 7

Frequencies for Learning Preference

Group Preference Frequency Percent Valid Cumulative


Percent Percent
Control Online 0 0.0 0.0 0.0
Textbook + Lecture 3 27.3 27.3 27.3
Mixture 8 72.7 72.7 100.0
Total 11 100.0 100.0

Q-Exp Online 1 11.1 11.1 11.1


Textbook + Lecture 4 44.4 44.4 55.6
Mixture 4 44.4 44.4 100.0
Total 9 100.0 100.0

Nonparametric Statistics on Research Questions

The first research question for this research was whether test scores for a lesson on

simple linear regression would differ over a single week from a pretest to a posttest when

college students received a combination of online teaching methods. In short, do the

combined online methods produce significantly different outcomes between the pretest

and the posttest?

H10. Test scores will not differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H1a. Test scores will differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

To test the hypothesis that test scores will not differ significantly for college

students receiving a combination of a graphical organizer, a WebQuest, a learning object,

and CFTH for learning simple linear regression from the beginning to the end of a 1-

week segment of a college-level statistics course, a Wilcoxon signed rank test for paired

data was performed. Using SPSS to analyze the control group data (z = -.960, p = .337)

and the quasi-experimental group data (z = -2.263, p = .024), the analysis led to the decision

not to reject the null hypothesis for the control group and to reject the null hypothesis for

the quasi-experimental group. There was no statistically significant difference between

pretest and posttest scores for the control group. There was a statistically significant

difference found between pretest and posttest scores for the quasi-experimental group.

The second research question for this research was whether test scores for a lesson on

simple linear regression would differ over a single week from a pretest to a posttest

between a group of college students who received a combination of online teaching

methods and another group of college students who received traditional methods.

H20. Test scores will not differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

H2a. Test scores will differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

Do the combined online methods help MBA students achieve higher outcomes on

the posttest than traditional methods? To test the hypothesis, a Mann-Whitney U test on

two independent samples using SPSS showed that there was not enough evidence to

reject the null hypothesis when comparing pretest scores (z = -.699, p = .485). When

comparing posttest scores, the Mann-Whitney U test provided enough evidence to reject

the null hypothesis that the posttest scores were equal (z = -2.265, p = .024). There was not

enough evidence to conclude that the pretest scores of the two groups differed statistically.

There was evidence that the posttest scores of the two groups differed statistically.

Table 8 shows median values between the two groups for pretest and posttest

scores. Based on this data, the median value changed from 5 to 6 as a test score for the

control group using traditional methods. In addition, the median value changed from 5 to

8 for the quasi-experimental group using combined online methods.

Table 8

Median Values for Pretest and Posttest Scores

Pretest Posttest Posttest


Group Mdn Mdn Percent
Control 5 6 50%
Q-Exp 5 8 66%

Table 9 shows sample size, mean and standard deviation values between the two

groups for pretest and posttest scores. Based on this data, the mean value changed from

5.18 to 5.72 as a test score for the control group using traditional methods. In addition,

the mean value changed from 6.00 to 8.00 for the quasi-experimental group using

combined online methods. The standard deviation increased for the control group (SD =

2.49) and decreased for the quasi-experimental group (SD = 2.29) on the posttest.

Table 9

Mean Scores for Pretest and Posttest

Pretest Posttest
Group N M SD M SD
Control 11 5.18 1.66 5.72 2.49
Q-Exp 9 6.00 3.04 8.00 2.29

Using nonparametric analysis techniques, equal variances of distributions and

normality are not required assumptions, as they are with parametric tests on means. Small

sample sizes are a reason to use nonparametric analysis techniques, as long as a

generalization to a population is not stated.

Production Actual Costs for the Content in the Study

Costs documented in the study are critical to show that high-quality materials are

feasible at an affordable price. One of the main objections to learning objects and CFTH

is the expense involved (Jang, 2000; Ward, 2004). Expenses fall into four main areas

of hardware, software, photographs, and labor. Hardware includes a computer, digital

camera, USB headset, and speakers. Software includes free utilities called Meracl ™ and

Easy Thumbnails ™, and Camtasia Studio ™, which has an educational license at a

reduced rate. Some photographs were taken free of charge with the digital camera.

Other photographs were purchased from an Internet site. The labor involved included Web

site development and adding clickable areas to the photographs for the Web pages.

Table 10 shows the actual costs for producing the content for the single lesson in

the study. The labor amount doubled to $4,000 because the CFTH Web pages took longer

than expected (not an ongoing payment because it was internal labor). The learning

object took two days to build and test, so the bulk of the labor went to creation of the

Web pages. Time spent on the creation of Web pages included conceptual design, script

dialogs, researching the publicly available data to use in the case studies, choosing

photographs to use, writing the HTML for Web pages, and testing the application.

Table 10

Actual Costs for a Simple Linear Regression Lesson

General or per lesson   Item                                                  Cost

General                 Personal computer, Microsoft Windows XP®,             0
                        Microsoft Office™
                        Digital camera                                        0a
                        USB headset                                           $40
                        Speakers                                              $20
                        Camtasia Studio™                                      $200
                        Meracl software ("hot spot" editor)                   0
                        Easy Thumbnails software (to resize images easily)    0
                        Labor for CFTH C4Hire corporate buildingb             $2,000
                        Subtotal                                              $2,260

Per lesson              Stock photographs                                     $50
                        Labor for learning object(s) and CFTHb                $4,000
                        Subtotal                                              $4,050

                        Total                                                 $6,310c

a $100 if needed. b 40 hours @ $50 / hour. c $6,000 of the total represents labor costs.

Evaluation of Findings

First, this was an exploratory study with a small sample size that used nonparametric

statistics to evaluate the results, and the study made no inference to a population from this

data. Having said that, the study proved useful in identifying any flaws in the research

procedure for such an original set of materials, documenting demographics of the MBA

students involved, and documenting the costs involved in the study. Because no prior research existed that
combined the four online methods involved, no direct comparison was possible to prior

evidence.

Perhaps the most significant aspect of this research is that the results are quantitative.

The quantifiable results measured whether the materials can be useful in raising test

scores from a pretest to a posttest using the combined online methods and whether the

test scores compared favorably to traditional textbook and lecture format of education for

this single lesson. The absence of prior research combining these methods, and the

originality of the materials produced at a low cost, is what makes this study useful.

The findings showed that the combined methods produced statistically significant

positive outcomes in the limited setting of MBA students in a private Midwestern

university. Having median scores improve from 5 out of 12 questions on the pretest to 8

out of 12 questions on the posttest could mean the difference in passing the 60% mark for

the AP Statistics exam that provides college credit. Using mean posttest scores, the

quasi-experimental group had 16% more correct answers than the control group.

The costs documented in this study include laying the groundwork for additional

material. Some of the costs will decrease for future lessons. The cost of building the

corporate setting for C4Hire, Inc. is fixed, and most of it remains unchanged for any of the

remaining immersive case studies. For example, adding another set of assignments to the

C4Hire corporate building will require only adding lines to an existing Web

page to launch the case studies and writing the textual material for the statistics guru

of the immersive environment. The labor for the case studies decreases drastically because

of the existence of template Web pages to copy and edit. Learning objects will take

similar amounts of time because of limiting their size, scope, and features to be similar.

Design time for case studies in the CFTH would also remain similar in locating pictures

and publicly available data sets.

Comparison to Theoretical Framework

A "best of breed" integration concept was implemented (Engle, 2008), which

combined an e-Coursemap (Ruffini, 2004) for navigation, a learning object and

WebQuest to address skills development, and CFTH for an immersive case study

environment. While the individual methods demonstrated positive results in prior

literature, the original materials in this study and the combination of the four individual

methods were new. The theoretical framework concepts of constructivism (Bush, 2006;

Connolly & Begg, 2006), problem-based learning (Abramovich & Cho, 2006), and

cognitive flexibility theory (Rossner-Merrill, Parker, Mamchur, & Chu, 1998; Wiley,

2002), demonstrated in this research study, formed the foundation for the results.

WebQuests follow constructivist pedagogy (Halat, 2008), and by merging a

WebQuest and learning object, a skill can be developed and subsequently used in the case

studies. The fact that no learner asked for support during the research meant the learner

could follow the structure of the WebQuest independently. One complaint of prior

research is that very little quantitative research exists (Abbitt & Ophus, 2008). The fact

that this study is quantitative helps to address this prior complaint.

Learning objects provide a simple and content-rich method to gain skills quickly,

incorporating problem-based learning. The Farha (2007) study quantitatively demonstrated

learning object efficacy, showing statistically significant positive improvement over

traditional methods. Weller (2004) produced 120 learning objects to represent material

for an entire Economics course in a cost-effective manner.

Cognitive flexibility theory ensures that information comes from multiple

perspectives (Jacobson & Spiro, 1995). Cognitive flexibility is the ability to take

different conceptual and case perspectives in order to represent knowledge (Jang, 2000).

The multiple perspectives to the single topic of simple linear regression in the research

were a WebQuest, learning object, and business case studies in the CFTH. Two case

studies were included in the CFTH immersive case study environment. While any

number of cases could be set up within the CFTH, two were sufficient to provide multiple

industry examples of using the statistical technique within the development time limits

involved.

The findings have practical implications for developing effective online training

materials at a low cost (for both corporate and university settings). In addition, the

findings add to the research on online methods, adding to the quantitative research on

learning objects and CFTH, as well as to the sparse literature on WebQuests for

adult students. More specifically, the findings provide one more step in achieving the

goals reflected in the GAISE college report from 2005.

Summary

The results of gathering research data can be surprising. For instance, it was a

surprise how many people would volunteer, and where they would come from. In this

study, the control group came predominantly from a single cohort in Greenwood,

Indiana, and the quasi-experimental group came from Lexington, Kentucky and Madison,

Indiana. The expectation was to see more participants come from Louisville, Cleveland,

or Fort Wayne, but that was not the case. It was surprising that not a single question

came up on how to navigate the combined online materials, showing that MBA students

were able to navigate the research materials and the testing required within the study.

By comparing two groups of participants in terms of online and traditional

methods, students in the study achieved higher outcomes in terms of median and mean

posttest scores with the combined online materials. This result was not a foregone

conclusion before the research. Before the study, it was a question as to whether the

combined online methods would be as effective as traditional methods. The materials

were original and untested, took time and effort to build, causing the need for an

exploratory pilot such as this one to determine if it is worth proceeding any further and in

what directions.

Based on these results, it is now clearer that more research would not be a waste

of time, money, and energy. It is more obvious what directions research in this area can

take. More research studies focusing on the areas of types of participants, types of

content, amount of content, and hybrid learning are the topics for the next chapter.

Chapter 5: Implications, Recommendations, and Conclusions

On a practical level, businesses require employees who can collect data,

synthesize and analyze data into information, and present findings to colleagues and

associates (Seifer, 2009). The general problem is a lack of statistical literacy for students,

which translates into the same lack of statistical literacy in employees of corporations.

The specific problem is how to improve pedagogy in an introductory statistics course. To

improve statistical literacy, students need to be effectively and efficiently educated in

college-level statistics courses (Ben-Zvi & Garfield, 2008). The American Statistical

Association recognized the need to change the pedagogy for introductory college

statistics courses, funding a study and approving six recommendations for how to

increase statistical literacy called the college GAISE report (Everson & Garfield, 2008).

The purpose of the quantitative study was to further research into complying with

the GAISE recommendations to improve statistical literacy. The study compared the

effectiveness of combined online teaching methods to traditional methods of teaching a

topic in a business statistics course. The topic of the study was simple linear regression

in a 1-week segment of a required introductory college-level statistics course.

The literature review yielded several articles on each individual online teaching

method, but no existing literature on the combination of the four methods. For instance,

Farha (2007) reported that learning objects have a more favorable outcome for students

than traditional methods. WebQuests (Dodge, 2005) are prolific and touted as a

beneficial teaching tool, but there is a lack of quantitative evidence to support that claim

(Abbitt & Ophus, 2008). Cognitive Flexibility Theory Hypertext (CFTH) and e-

Coursemaps had the least amount of formal research written in the literature review.

Adding a quantitative study to the body of knowledge is very beneficial to all of these

topics.

The research method was quantitative, with a pretest and posttest design (Vogt,

2007), and participants from two groups of students, non-randomly assigned, at a private

Midwestern university. The two groups were a control group and a quasi-experimental

group. Members of the quasi-experimental group received an intervention combining

online teaching methods for learning simple linear regression, and members of the

control group received only traditional instruction for the same topic. The concepts

taught in the research study for simple linear regression are from the textbook, Statistics

for Managers Using Microsoft Excel (Levine, Stephan, Krehbiel, & Berenson, 2011).

Both within-group and between-groups comparisons were conducted using

nonparametric statistics. Nonparametric statistics "make no assumptions about

population parameters or distributions" (Vogt, 2007, p. 68). Because this is an

exploratory study, with a small sample size, generalizations about population normality

or its mean and standard deviation did not take place.

When performing research, ethical areas of concern are protection from harm,

informed consent, right to privacy, and honesty with professional colleagues. Successful

approval from the IRB of Northcentral University and from the university where research

took place helped to protect human participants in research. Collection of signed consent

forms prior to research beginning also satisfied some of the ethical protections required.

The informed consent form contained language about protection from harm,

participation being voluntary, and right to privacy (Appendix H, Appendix I). To protect

the privacy and confidentiality of participants, each participant received a unique

identifier, only provided after signed consent to participate in the study. The consent

form stated that grades in the course would not suffer because of the study, thus covering

any ethical concerns about harming the participant.

For the test instrument, electronic mail messages provided documentation of

validation by independent experts. Screen captures of online materials and the test

instrument were placed in the appendix for independent verification as required

(Appendix A, B, C, D, F).

This chapter discusses implications by drawing conclusions from each research

question, and any potential limitations that may have affected the results. A myriad of

future research opportunities exist. Included in the following section are implications of

the research, possible future research, and any conclusions.

Implications

The first research question for this research was whether test scores for a lesson on

simple linear regression would differ over a single week from a pretest to a posttest when

college students received a combination of online teaching methods. In short, do the

combined online materials improve test scores for the student?

H10. Test scores will not differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

H1a. Test scores will differ significantly for college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression from the beginning to the end

of a 1-week segment of a college-level statistics course.

Evidence using median and mean test scores (Mdn = 5 to 8, M = 6 to 8), along

with a Wilcoxon signed rank test (z = -2.263, p = .024) suggests that with the quasi-

experimental group, the combined online materials increased the test scores between the

pretest and posttest. More research is required to determine if the materials are useful in

a hybrid teaching environment as supplemental materials, because that was not the focus

of this study, but was the highest learning preference from the demographic questions

(60% prefer a mixture).

The second research question for this research was whether test scores for a lesson on

simple linear regression would differ over a single week from a pretest to a posttest

between a group of MBA students who received a combination of online teaching

methods and another group of MBA students who received traditional methods. In short,

were the combined online materials more effective than the traditional lecture and

textbook?

H20. Test scores will not differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

H2a. Test scores will differ significantly between college students receiving a

combination of a graphical organizer, a WebQuest, a learning object, and

CFTH for learning simple linear regression and students receiving

traditional methods for learning simple linear regression in a 1-week

segment of a college-level statistics course.

The traditional methods of lecture and textbook produced a 4.5 percentage-point improvement in test scores, while the combined online methods and content produced a 16.7 percentage-point improvement. For these students, the combined online methods produced more positive results.
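As a hedged illustration of how the between-group posttest comparison and the percentage-point figures can be computed, the sketch below uses hypothetical score vectors (not the confidential study data) and assumes that a percentage-point improvement is the gain in mean correct answers divided by the 12 test items.

    # Sketch of the between-group posttest comparison (Mann-Whitney U) and the
    # percentage-point improvement calculation. All scores are hypothetical.
    from statistics import mean
    from scipy.stats import mannwhitneyu

    control_post = [5, 6, 6, 5, 7, 6, 5, 6, 7, 6, 5]   # hypothetical, N = 11
    online_post = [8, 7, 9, 8, 8, 7, 9, 8, 8]          # hypothetical, N = 9

    u_stat, p_value = mannwhitneyu(control_post, online_post, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat}, p = {p_value:.3f}")

    def pct_point_gain(pre_scores, post_scores, n_items=12):
        # Gain in mean correct answers, expressed in percentage points of the test.
        return (mean(post_scores) - mean(pre_scores)) / n_items * 100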

Limitations of the Study

Limitations of the study centered on the number of participants. The small sample size limits how the results can be interpreted for any population outside the one studied (Black, 1999); for that reason, no inferences to an entire population have been made. Demographic data showed no statistically significant difference between the control group and the quasi-experimental group.
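A minimal sketch of the kind of chi-square comparison of demographics described here is shown below; it assumes a hypothetical 2 x 2 table of group membership against one demographic category, since the actual demographic tables are not reproduced in this chapter.

    # Sketch of a chi-square test of independence between group membership and a
    # demographic variable. The counts below are hypothetical placeholders.
    from scipy.stats import chi2_contingency

    #         category A  category B
    table = [[6, 5],   # control group (hypothetical counts)
             [4, 5]]   # quasi-experimental group (hypothetical counts)

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p_value:.3f}")
    # Note: with cell counts this small, Fisher's exact test may be preferable.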

A realistic limitation of the study is that it included no Microsoft Excel™ experts and no participants with zero spreadsheet experience. Another realistic limitation is the professor who taught the traditional lecture and textbook format. No professor was present in the combined online condition, so the benefit, or lack of benefit, of having a professor as part of the learning experience is a potential variable. In a full online course environment, a professor would be required to answer questions and grade assignments. The fact that no professor enhanced the online experience lends more credibility to the conclusion that the online materials themselves were the factor in student achievement on the posttest.

The MBA students themselves are a limitation of the study. A test of IQ would have been beneficial to determine aptitude, and a test for math anxiety could have been a useful measurement in the study. Status as an MBA graduate student was the only prerequisite for participation. The nearly identical pretest results suggest that the control and quasi-experimental groups began learning the subject matter at the same level.

Recommendations

The combined online materials using new technology outperformed the traditional methods in this particular study in terms of student achievement. The median pretest score for both the traditional and quasi-experimental groups was 5 correct answers out of 12 questions. On the posttest, the traditional group achieved a median of 6 correct answers out of 12, while the quasi-experimental group achieved a median of 8 correct answers out of 12.

Low-cost development of the new online materials was documented, which is of interest when duplicating these results in future research. High-quality, affordable training materials are a practical application of this study for any company or educational institution. The software used to create the online tutorial learning object cost under $200, and the labor for the initial learning object was under 2 days. In a business setting, creating training materials can be very expensive, so low-cost methods are an advantage.

In an Information Technology (IT) environment, each project presents new challenges for both technical and business people. For example, implementation of a new medical records application requires training of business staff, medical staff, and information technology staff to be successful. Low-cost learning objects are a possible solution for small sessions of training and skills development.

More research in this area is recommended, due to the positive outcome. The

question becomes what types of new research are possible. Many different research

efforts can spring from this dissertation. The scope of the study was to create one lesson

out of 17 for an introductory college statistics course. The content of the study was a

lesson in statistics on simple linear regression. The first set of options for more research

relates to using the same materials in new ways, such as varying the participants, varying

the type of research study, and using hybrid education. The next set of options would

increase the content for a whole course. A third set of options would be to apply the

same techniques to other types of subject matter. For instance, information technology

topics like database design, systems analysis, or programming courses could be the new

subject matter for combining online methods. Any conceptually complex material could

be the subject matter for new research.

Using the Same Research Materials

One way to extend this research is to broaden the audience of participants, thus increasing the sample size and allowing the use of more quantitative methods. The research participants in this study were MBA students in Indiana and Kentucky at a private for-profit institution. Opening up the research to public education is an option. Future research could include geographically dispersed participants in the United States or at international institutions. Future research could also use different participants, such as undergraduates, doctoral-level students, or Advanced Placement Statistics students at the high school level. Research with corporate employees is another way to apply this research. Research with people who are hearing-impaired would be beneficial for a population not well supported by traditional education. According to Richardson (2009), individuals with hearing loss have a 51% lower chance of obtaining a good college degree compared to students with normal hearing.

Another approach would be to vary the type of research. Quantitative research was the choice for this particular study. Future work could be a qualitative or mixed-methods study of students' reactions to the training materials. Focus groups could help determine whether the content is engaging enough for class use.

Costs documented in this study illustrate what resources are required for the

combined online materials. Performing more quantitative cost analysis research in

comparison to traditional costs would be beneficial to determine return on investment for

administrators.

The type of education could also vary in future research. The education in this study was fully online, using four combined methods. Research with hybrid education is possible. Combining the online training with a learning management system such as Blackboard is one option. Using the combined online training methods as supplemental materials is another avenue.

One lesson in statistics was tested during this research. An approach for future research is to develop the entire statistics course instead of just one lesson and to evaluate student results over a broader set of subject matter. The question to answer would be whether all topics in the course are suited to produce degrees of competence equal to those observed in this research. Having a whole course would allow creation of content by the participant and grading of assignments, which is not possible with a single lesson and its time constraints.

Same Online Techniques, Different Content

Opening up the research to different topics of instruction is another possibility. Can learning objects, CFTH, e-Coursemaps, and WebQuests be useful for other complex conceptual subject matter, such as database design, systems analysis and design, information security, natural sciences, physical sciences, behavioral sciences, finance, management, or human resources? Would different combinations of online materials and methods, such as using only WebQuests and learning objects, create better outcomes for some disciplines?

Future research on the type of content is also possible. Changing the case study content or the learning object content are two options. For instance, what is the measurable difference in outcomes if a picture of an instructor is visible with the learning object? What is the difference in outcome if the case study data in the CFTH match the participant's industry, or if the participant can choose which case study or studies to work with? What is the difference in outcome if production of the audio-visual materials uses professional equipment but adds more expense? Would posting the learning objects on a video-sharing Web site produce better outcomes?

Conclusions

The results are consistent with the positive student achievement recorded in prior research for the individual online teaching methods. The important difference is that by combining four online teaching methods, a whole course can take shape. In prior research literature, researchers tried to use only a single online method for the entire development of a course, and the flexibility required to develop an entire course was an idealistic goal for a single teaching method.

Most of the prior research studies only marginally discussed the cost of the online methods (Weller, 2004). By making the cost and affordability of a solution a critical component of the educational materials, the applicability of the results can be more widespread than if the materials were too expensive to develop. Using photographs in the CFTH made an engaging and more realistic environment possible at a very low cost. The programming tool used to create clickable hot spots on each picture was free and did not require special training to use (Meracl, 2010). The same tool was useful in creating the e-Coursemap. Camtasia Studio™ helped create the multimedia learning object. These inexpensive alternatives to costly development are an innovative option for developing educational materials for complex content. The cost component is a concern for corporations because of limited budgets.

High-quality, effective online content at a low cost is a reality. For a relatively small investment of time and money, a lesson on simple linear regression was fully developed and tested on a group of MBA students with positive results. Using combined online teaching methods allowed for flexibility in the materials, as well as a challenging environment for the learner. Creating materials potentially useful for people with a hearing impairment was a surprise, and a natural fit for the format of the materials.

A wide variety of research efforts can spring from this dissertation. Keeping the same materials intact but varying the participants, varying the type of research study, and using hybrid education would yield new research opportunities. Increasing the content from one lesson to an entire course, or applying the combination of online techniques to other topics of instruction, is another path to future research.

It is encouraging to know that positive outcomes are feasible, even considering the limited setting of the research. Helping corporate employees and students learn complex material, while adhering to the GAISE recommendations (College Report, 2010), is possible. Years of potential additional research remain to improve statistical literacy while keeping cost in mind.

References
Abbitt, J., & Ophus, J. (2008). What we know about the impacts of WebQuests: A review
of research. AACE Journal, 16, 441-456.

Abramovich, S., & Cho, E. K. (2006). Technology as a medium for elementary


preteachers' problem-posing experience in mathematics. Journal of Computers in
Mathematics and Science Teaching, 25, 309-323.

Active Worlds (2010). Main Page. Retrieved from http://www.activeworlds.com

Adair, J. G. (1984). The Hawthorne effect: A reconsideration of the methodological


artifact. Journal of Applied Psychology, 69(2), 334-345. doi:10.1037/0021-9010.69.2.334

Alajaaski, J. (2006). How does web technology affect students' attitudes towards the
discipline and study of mathematics/statistics? International Journal of
Mathematical Education in Science & Technology, 37(1), 71-79.
doi: 10.1080/00207390500226002

Albaum, G., Roster, C. A., Wiley, J., Rossiter, J., & Smith, S. M. (2010). Designing Web
surveys in marketing research: Does use of forced answering affect completion
rates? Journal of Marketing Theory & Practice, 18(3), 285-293.
doi:10.2753/MTP1069-6679180306

Alias, M. (2009). Integrating technology into classroom instructions for reduced


misconceptions in statistics. International Electronic Journal of Mathematics
Education, 4(2), 77-91.

Allert, H., Richter, C , & Nejdl, W. (2004). Lifelong learning and second-order learning
objects. British Journal of Educational Technology, 35, 701-715.
doi:10.1111/j.1467-8535.2004.00428.x

Alsup, J. (2005). A comparison of constructivist and traditional instruction in


mathematics. Educational Research Quarterly, 28(4), 3-17.

Archibald, R. B., & Feldman, D. H. (2008). Explaining increases in higher education


costs. Journal of Higher Education, 79,268-295. doi:10.1353/jhe.0.0004

Ayas, C. (2006). An examination of the relationship between the integration of


technology into social studies and constructivist pedagogies. Turkish Online
Journal of Educational Technology, 5(1), 14-25.

Bennett, S., Maton, K., & Kervin, L. (2008). The 'digital natives' debate: A critical
review of the evidence. British Journal of Educational Technology, 39, 775-786.
doi:10.1111/j.1467-8535.2007.00793.x

Bennett, K., & McGee, P. (2005). Transformative power of the learning object debate.
Open Learning, 20(1), 15-30. doi:10.1080/0268051042000322078
Ben-Zvi, D., & Garfield, J. (2008). Introducing the Emerging Discipline of Statistics
Education. School Science & Mathematics, 108(8), 355-361.
Bergtrom, G. (2006). Clicker sets as learning objects. Interdisciplinary Journal of
Knowledge & Learning Objects, 2,105-110.
Big Stock Photo (2010). Main page. Retrieved from http://www.bigstockphoto.com
Black, T. R. (1999). Doing Quantitative Research in the Social Sciences. Sage
Publications, Ltd.
Bordens, K. S., & Abbot, B. B. (2004). Research Design and Methods (6th ed.). New
York, NY: McGraw-Hill.
Bronack, S., Sanders, R., Cheney, A., Riedl, R., Tashner, J., & Matzen, N. (2008).
Presence pedagogy: Teaching and learning in a 3D virtual immersive world.
International Journal of Teaching & Learning in Higher Education, 20(1), 59-69.

Budé, L., Imbos, T., Wiel, M., Broers, N., & Berger, M. (2009). The effect of directive tutor guidance in problem-based learning of statistics on students' perceptions and achievement. Higher Education, 57(1), 23-36. doi:10.1007/s10734-008-9130-8

Bush, G. (2006). Learning about learning: From theories to trends. Teacher


Librarian, 34(2), 14-18.
Butson, R. (2003). Learning objects: Weapons of mass instruction. British Journal of
Educational Technology, 34, 667-669. doi:10.1046/j.0007-1013.2003.00359.x

Buzzetto-More, N. (2008). Student perceptions of various E-learning components.


Interdisciplinary Journal of Knowledge & Learning Objects, 4,113-135.

Buzzetto-More, N., & Pinhey, K. (2006). Guidelines and standards for the development
of fully online learning objects. Interdisciplinary Journal of Knowledge &
Learning Objects, 2, 95-104.

Buzzetto-More, N., Sweat-Guy, R., & Elobaid, M. (2007). Reading in A digital age: E-
books are students ready for this learning object? Interdisciplinary Journal of
Knowledge & Learning Objects, 3,239-250.

Capshew, T. F. (2005). Motivating social work students in statistics courses. Social Work
Education, 24,857-868. doi:10.1080/02615470500342207

Carmichael, C., Callingham, R., Watson, J., & Hay, I. (2009). Factors influencing the
development of middle school students' interest in statistical literacy. Statistics
Education Research Journal, 8(1), 62-81.

Carter, S. D. (2002). Matching training methods and factors of cognitive ability: A means
to improve training outcomes. Human Resource Development
Quarterly, 13(1), 71-87. doi:10.1002/hrdq.l014
Cebeci, Z. & Tekdal, M. (2006). Using podcasts as audio learning objects.
Interdisciplinary Journal of Knowledge & Learning Objects, 2,47-57.

Cerrito, P. B. (1999). Teaching statistical literacy. College Teaching, 47(1), 9.

Chen, I., Willis, J., & Mahoney, S. (2005). WebCT and its growth as a type II
application. Computers in the Schools, 22(1), 147-156.
doi:10.1300/J025v22n0M3

Cher, P. L., Siew, L. L., & Richards, C. (2006). Developing interactive learning objects
for a computing mathematics module. International Journal on E-Learning, 5(2),
221-244.

Chiesi, F., & Primi, C. (2010). Cognitive and non-cognitive factors related to students'
statistics achievement. Statistics Education Research Journal, 9(1), 6-26.

Coffey, H. (n.d.). Bloom's Taxonomy. Retrieved from http://www.learnnc.org/lp/pages/4719
College Budgets. (2006). Radical Teacher, (77), 43.
College Report (2010). GAISE college report. Retrieved from http://www.amstat.org/
education/gaise/.
Connolly, T. M., & Begg, C. E. (2006). A constructivist-based approach to teaching
database analysis and design. Journal of Information Systems
Education, 17(1), 43-53.

Convertini, V. N., Albanese, D., Marengo, A., Marengo, V., & Scalera, M. (2006). The
OSEL taxonomy for the classification of learning objects. Interdisciplinary
Journal of Knowledge & Learning Objects, 2, 125-138.
Coutis, P. F. (2007). Responsive curriculum design in a statistics service
unit. International Journal of Mathematical Education in Science & Technology, 38, 501-515. doi:10.1080/00207390701240836

Cozby, P. (2004). Methods in behavioral research (8th ed.). New York, NY: McGraw-
Hill.

Cybinski, P., & Selvanathan, S. (2005). Learning experience and learning effectiveness in
undergraduate statistics: Modeling performance in traditional and flexible
learning environments. Decision Sciences Journal ofInnovative Education, 3,
251-271. doi:10.1111/j.l540-4609.2005.00069.x

Delmas, G., Joan, G., Ooms, A., & Chance, B. (2007). Assessing students' conceptual
understanding after a first course in statistics. Statistics Education Research
Journal, 6(2), 28-58.

De Salas, K., & Ellis, L. (2006). The development and implementation of learning
objects in a higher education setting. Interdisciplinary Journal of Knowledge &
Learning Objects, 2, 1-22.

Dinov, I. D., & Christou, N. (2009). Statistics online computational resource for
education. Teaching Statistics, 31(2), 49-51. doi:10.1111/j.1467-9639.2009.00345.x

Dodge, B. (2005). The WebQuestpage: Site overview. San Diego, CA: Educational
Technology Department, San Diego State University. Retrieved from
http://webquest.org/index.php
Elliott, K., & Sweeney, K. (2008). Quantifying the reuse of learning objects. Australasian
Journal of Educational Technology, 24(2), 137-142.
Engle, P. (2008). Best of breed. Industrial Engineer: IE40(S), 18.

Evans, B. (2007). Student attitudes, conceptions, and achievement in introductory


undergraduate college statistics. Mathematics Educator, 17(2), 24-30.

Everson, M., Zieffler, A., & Garfield, J. (2008). Implementing new reform guidelines in
teaching introductory college statistics courses. Teaching Statistics, 30(3), 66-70.
doi:10.1111/j.l467-9639.2008.00331.x

Everson, M., & Garfield, J. (2008). An innovative approach to teaching online statistics
courses. Technology Innovations in Statistics Education, 2(1). Retrieved from
http://www.escholarship.org/uc/item/2v6124xr
Fairfield-Sonn, J., Kolluri, B., Rogers, A., & Singamsetti, R. (2009). Enhancing an
undergraduate business statistics course: Linking teaching and learning with
assessment issues. American Journal of Business Education, 2(7), 101-112.
Farha, N. (2007). An exploratory study into the efficacy of learning objects. Retrieved
from ProQuest Digital Dissertations (ATT 3259530).

Flash (2009). Flash player penetration. Retrieved from


http://www.adobe.com/products/player_census/flashplayer/
version_penetration.html

Ford, E. W., Menachemi, N., Huerta, T. R., & Yu, F. (2010). Hospital IT adoption
strategies associated with implementation success: Implications for achieving
meaningful use. Journal of Healthcare Management, 55(3), 175-188

Forster, P. A. (2007). Technologies for teaching and learning trend in bivariate


data. International Journal of Mathematical Education in Science &
Technology, 38(2), 143-161. doi:10.1080/00207390600913293

Friesen, N. (2004). The international learning object metadata survey. International


Review of Research in Open & Distance Learning, 5(3), 1-4.

Friesen, N., & Koohang, A. (2005). Interoperability and learning objects: An overview of
E-learning standardization. Interdisciplinary Journal of Knowledge & Learning
Objects, 1,23-31.

Gigerenzer, G., Gaissmaier, W., Kurz-Milcke, E., Schwartz, L. M., & Woloshin, S.
(2007). Helping doctors and patients make sense of health
statistics. Psychological Science in the Public Interest (Wiley-Blackwell), 8(2),
53-96. doi:10.1111/j.1539-6053.2008.00033.x

Goodfellow, G. W., & Maino, D. M. (2010). ASCOTech: World wide web as easy as 1.0,
2.0, 3.0. Optometric Education, 35(2), 62-63.

Gordon, S., Petocz, P., & Reid, A. (2009). What makes a "good" statistics student and a
"good" statistics teacher in service courses? Montana Mathematics
Enthusiast, 6(1), 25-39.

Government Accountability Office (2005). College Textbooks: Enhanced offerings


appear to drive recent price increases. Retrieved from http://www.gao.gov/
new.items/d05806.pdf

Graduate Management Admission Council (2009). 2009 Application Trends Survey


Report. Retrieved from http://www.gmac.com/NR/rdonlyres/32D2B92A-776F-
4DFA-9903-C251B1B7862D/0/2009AT_SR_Web.pdf

Grandzol, J. R. (2004). Teaching MBA statistics online: A pedagogically sound process


approach. Journal of Education for Business, 79,237-244.

Guo, R. X., Dobson, T., & Petrina, S. (2008). Digital natives, digital immigrants: An
analysis of age and ICT competency in teacher education. Journal of Educational
Computing Research, 38,235-254. doi:10.2190/EC.38.3.a

Hakeem, S. A. (2001). Effect of experiential learning in business statistics. Journal of


Education for Business, 77(2), 95-98. doi:10.1080/08832320109599056

Halat, E. (2008). A good teaching technique: WebQuests. Clearing House, 81(3), 109-
111. doi:10.3200/TCHS.81.3.109-112

Hanna, W. (2007). The New Bloom's Taxonomy: Implications for music education. Arts
Education Policy Review, 108(4), 7-9,12-16. doi:10.3200/AEPR.108.4.7-16
Harrington, C. F., & Schibik, T. J. (2004). Methods for maximizing student engagement
in the introductory business statistics course: A review. Journal ofAmerican
Academy of Business, Cambridge, 4(1), 360-364.

Harrington, D. (1999). Teaching statistics: A comparison of traditional classroom and


programmed instruction/distance learning approaches. Journal of Social Work
Education, 35, 343-352.

Harvey, D. M., Godshalk, V. M., & Milheim, W. D. (2001). Using cognitive flexibility
hypertext to develop sexual harassment cases. Computers in the Schools, 18(1),
213.
Heath, B. P., McArthur, D. J., McClelland, M. K., & Vetter, R. J. (2005). Metadata
lessons from the iLumina digital library. Communications of the ACM, 48(7), 68-74. doi:10.1145/1070838.1070839
Hughes, M., & Daykin, N. (2002). Towards constructivism: Investigating students'
perceptions and learning as a result of using an online environment. Innovations
in Education & Teaching International, 39,217-224.
doi:10.1080/13558000210150036

Hurlburt, R. T. (2001). "Lectlets" deliver content at a distance: Introductory statistics as a


case study. Teaching of Psychology, 28(1), 15-20.

Isaac, S., & Michael, W. B. (1997). Handbook in research and evaluation. San Diego,
CA: EdITS.

Jacobson, M. J., & Spiro, R. J. (1995). Hypertext learning environments, cognitive


flexibility, and the transfer of complex knowledge: An empirical investigation.
Journal of Educational Computing Research, 12(4), 301-333.
Jang, S. (2000). Effects of cognitive flexibility theory-based instruction on Korean high
school history teaching. Distance Education, 21(1), 136.
doi:10.1080/0158791000210109
Johnson, H. D., Dasgupta, N., Zhang, H., & Evans, M. A. (2009). Internet approach
versus lecture and lab-based approach for teaching an introductory statistical
methods course: Students' opinions. Teaching Statistics, 31(1), 21-26.
doi:10.1111/j.l467-9639.2009.00335.x

Jones, D. L. (2003). An overview of the sharable content object reference model. College
& University Media Review, 10(1), 27-36.

Kapp, K. M. (2003). How long does it take? Estimation methods for developing e-
learning. ASTD Learning Circuits. Retrieved from
http://www.astd.org/LC/2003/0703_kapp.htm

Kasonga, R. A., & Corbett, A. D. (2008). An assessment model for improving student
learning of statistics. South African Journal of Higher Education, 22,602-614.

Kendall, K. E., & Kendall, J. E. (1999). Systems analysis and design (4th ed.). Upper
Saddle River, NJ: Prentice Hall, 1999.

Knowles, A. (2005). Online education using learning objects. Journal of Distance


Education, 20(1), 104-106.

Kottemann, J. E., & Salimian, F. (2008). Engaging students in statistics. Decision


Sciences Journal of Innovative Education, 6, 247-250. doi:10.1111/j.1540-4609.2008.00170.x

Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory into


Practice, 41, 212. doi:10.1207/s15430421tip4104_2

Krauss, F., Ally, M., & Koohang, A. (2005). A study of the design and evaluation of a
learning object and implications for content development. Interdisciplinary
Journal of Knowledge & Learning Objects, 1,1-22.

Lahaie, U. D. (2008). Is nursing ready for WebQuests? Journal of Nursing


Education, 47, 567-570. doi:10.3928/01484834-20081201-05

Larson, D. K., & Sung, C-H. (2009). Comparing student performance: Online versus
blended versus face-to-face. Journal of Asynchronous Learning Networks, 13(1),
31-42.

Lawrence, J. A., & Singhania, R. P. (2004). A study of teaching and testing strategies for
a required statistics course for undergraduate business students. Journal of
Education for Business, 79,333-338.

Leech, N. L. (2008). Statistics poker: Reinforcing basic statistical concepts. Teaching


Statistics, 30(1), 26-28. doi:10.1111/j.l467-9639.2007.00309.x

Levine, D. M., Stephan, D. F., Krehbiel, T. C., & Berenson, M. L. (2011). Statistics for managers using Microsoft Excel (6th ed.). Boston, MA: Prentice Hall.


Lima, M., Koehler, M. J., & Spiro, R. J. (2004). Collaborative interactivity and integrated
thinking in Brazilian business schools using cognitive flexibility hypertexts: The
Panteon project. Journal of Educational Computing Research, 31,371-406.
doi: 10.2190/TTK2-TDRP-D0DX-M8XN
Mallmann, E. M., & Nunes, I. K. C. (2008). The potentiality of the teaching-learning
objects (T-LO) and teaching-learning virtual environments (T-LIVE).
International Journal of Emerging Technologies in Learning, 5(4), 51-59.

Martin, R. (2002). Why tuition costs are rising so quickly. Challenge (05775132), 45(4),
88.

McKnight, J. (2006). MERLOT-Not just for wine connoisseurs. Strategic


Finance, 87(1), 21-22.

Meister-Emerich, K. A. (2008). Analysis and evaluation of learning objects for use in an


introductory statistics course. (Ed.D., University of Wyoming). (3320748)

Meletiou-Mavrotheris, M., Lee, C , & Fouladi, R. T. (2007). Introductory statistics,


college student attitudes and knowledge - a qualitative analysis of the impact of
technology-based instruction. International Journal of Mathematical Education in
Science & Technology, 38(1), 65-83. doi:10.1080/00207390601002765

Meracl (2010). Meracl image map generator. Retrieved from http://www.stockholm.bonet.se/meracl/mimg.htm

Monahan, J. (2007). Statistical literacy: A prerequisite for evidence-based


medicine. Psychological Science in the Public Interest, 8(2), i-ii.
doi:10.1111/j.1539-6053.2008.00033_1.x

Mykytyn, P. P. (2007). Educating our students in computer application concepts: A case


for problem-based learning. Journal of Organizational and End User
Computing, 19(1), 51-61.

Neo, M., & Neo, T. (2009). Engaging students in multimedia-mediated constructivist


learning - students' perceptions. Journal of Educational Technology & Society,
12,254-266.

Pan, W., & Tang, M. (2005). Students' perceptions on factors of statistics anxiety and
instructional strategies. Journal of Instructional Psychology, 32,205-214.

Papastergiou, M. (2008). Online computer games as collaborative learning environments:


Prospects and challenges for tertiary education. Journal of Educational
Technology Systems, 37(1), 19-38. doi:10.2190/ET.37.1.c

Parrish, P. E. (2004). The trouble with learning objects. Educational Technology


Research & Development, 52(1), 49-67. doi:10.1007/BF02504772

Porter, A., Griffiths, D., & Hedberg, J, (2003). From classroom to online teaching:
Experiences in improving statistics education. Journal of Applied Mathematics &
Decision Sciences, 7(2), 65.

Poupa, C, & Forte, E. (2003). Collaborative teaching with learning objects in an
international, non-profit context. The example of the ARIADNE community.
Educational Media International, 40,239-248.

Prensky, M. (2001). Digital natives, digital immigrants. Retrieved from


http://www.marcprensky.com/writing/
Ramos, F. (2009). Teaching engineering concepts by interactive-animated learning
objects. International Journal of Learning, 16(1), 143-151.
Richards, G. (2003). Learning object repositories. Journal of Distance Education, 17(3),
47-48.

Richardson, J. T. E. (2009). The attainment and experiences of disabled students in


distance education. Distance Education, 30(1), 87-102.
doi:10.1080/01587910902845931
Ruffini, M. F. (2004). Designing an eMap to teach multimedia applications
online. International Journal of Instructional Media, 31,383-391.
Ruiz, J. G., Mintzer, M. J., & Issenberg, S. B. (2006). Learning objects in medical
education. Medical Teacher, 28, 599-605. doi:10.1080/01421590601039893

Rossner-Merrill, V., Parker, D., Mamchur, C, & Chu, S. (1998). Using constructivist
instructional design featured in two online courses: Notes from the field.
Educational Media International, 35,282. doi:10.1080/0952398980350412
Schagen, I. (2006). Statistical literacy is the essential skill for educational
managers. Education Journal, 98,21-21.

Schau, C, & Mattern, N. (1997). Use of map techniques in teaching applied statistics
courses. American Statistician, 51,171-175. doi:10.2307/2685413
Schield, M. (2004). Statistical literacy and liberal education at Augsburg College. Peer
Review, 6(4), 16-18.

Schweizer, H., Whipp, J., & Hayslett, C. (2002). Quality control in online courses: Using
a social constructivist framework. Computers in the Schools, 19(3), 143.
doi:10.1300/J025v19n03_12
Seifer, R. (2009). Statistical literacy an essential competency for both producers and
consumers of data. Brown University Child & Adolescent Behavior Letter, 25(1),
1-7.
Shah, R. R., Chandrasekaran, A. A., & Linderman, K. K. (2008). In pursuit of
implementation patterns: the context of Lean and Six Sigma. International
Journal of Production Research, 46(23), 6679-6699.
doi: 10.1080/00207540802230504
Sirias, D. (2002). Using graphic organizers to improve the teaching of business
statistics. Journal of Education for Business, 78(1), 33.

Sloan Survey (2008). Staying the course - Online education in the United States, 2008.
Retrieved from http://www.sloan-c.org/publications/survey/staying_course

Spiro, R. J., Collins, B. P., Thota, J. J., & Feltovich, P. J. (2003). Cognitive flexibility
theory: Hypermedia for complex learning, adaptive knowledge application, and
experience acceleration. Educational Technology, 43(5), 5-10.

Stamey, J. W., Jr. (2006). A comparison of the performance of undergraduate statistics


students using intelligent learning objects versus those receiving traditional
classroom instruction. Retrieved from ProQuest Digital Dissertations (ATT
3233063).

StatCounter (2010). Main page. Retrieved from http://www.statcounter.com

Stoilescu, D. (2008). Modalities of using learning objects for intelligent agents in


learning. Interdisciplinary Journal of Knowledge & Learning Objects, 4,49-64.

Strasser, S. E., & Ozgur, C. (1995). Undergraduate business statistics: A survey of topics
and teaching methods. Interfaces, 25(3), 95-103.

Straub, D. (1989). Validating instruments in MIS research. MIS Quarterly, 13(2), 147-
170.

Summers, J., Waigandt, A., & Whittaker, T. (2005). A comparison of student


achievement and satisfaction in an online versus a traditional face-to-face
statistics class. Innovative Higher Education, 29, 233-250. doi:10.1007/s10755-005-1938-x

Sze, D. (2004). The role of video lectures in teaching introductory statistics. International
Journal for Technology in Mathematics Education, 11(2), 59-62.

Taylor, D. L. L., Stewart, G., & Dunn, T. L. (2005). The design of reusable learning
objects to teach database concepts. International Journal of Learning, 12(8), 129-
141.

Taylor, S. I., & Hsueh, Y. (2005). Implementing a constructivist approach in higher


education through technology. Journal of Early Childhood Teacher Education,
26(2), 127-132. doi:10.1080/10901020590967353

Tompsett, C. (2005). Reconfigurability: Creating new courses from existing learning


objects will always be difficult! Journal of Computer Assisted Learning, 21, 440-448. doi:10.1111/j.1365-2729.2005.00154.x

Tsolakidis, C. C., & Fokiali, P. P. (2010). Costs of low-scale distance learning programs. International Journal of Emerging Technologies in Learning, 5(1), 32-38. doi:10.3991/ijet.v5i1.1132

United Nations Development Programme (2010). The United Nations Development


Programme (UNDP). Development dictionary. Retrieved from
http://www.undp.org/poverty/devglossary_main.shtml
U.S. Department of Education. (2009). Fast Facts. Retrieved from
http://nces.ed.gov/fastfacts/display.asp?id=76
Viles, E. (2008). Paper-clip case: A practical activity to improve statistical thinking for
engineering students. Teaching Statistics, 30(2), 57-60. doi:10.1111/j.1467-9639.2008.00322.x

Vogt, W. P. (2007). Quantitative research methods for professionals. Boston, MA:


Pearson.
Walker, L. (2007). Narrated PowerPoint as a self-learning resource. International Journal
of Learning, 13(12), 1-5.

Ward, B. (2004). The best of both worlds: A hybrid statistics course. Journal of Statistics
Education, 12(3). Retrieved from http://www.amstat.org/publications/jse/
v12n3/ward.html
Watts, M., & Carlson, W. L. (1999). A case method for teaching statistics. Journal of
Economic Education, 30(1), 52-58.

Weller, M. (2004). Learning objects and the e-learning cost dilemma. Open Learning,
19(3), 293-302. doi:10.1080/0268051042000280147
Wells, M. (2006). Teaching notes: Making statistics "real" for social work
students. Journal of Social Work Education, 42, 397-404.

Whatley, J., & Ahmad, A. (2007). Using video to record summary lectures to aid
students' revision. Interdisciplinary Journal of Knowledge & Learning Objects, 3,
185-196.

Wiley, D. A. (2002). The instructional use of learning objects. Bloomington, IN:


Tichenor Printing. ISBN: 0784208921
Yacovelli, S. R. (2003). Understanding learning objects: The basic "chunks". College &
University Media Review, 10(1), 17-26.
Yahya, Y., & Yusoff, M. (2008). Towards a comprehensive learning object metadata:
Incorporation of context to stipulate meaningful learning and enhance learning

object reusability. Interdisciplinary Journal of Knowledge & Learning Objects, 4,
13-48.

Yordanova, K. (2007). Meta-data application in development, exchange and delivery of


digital reusable learning content. Interdisciplinary Journal of Knowledge &
Learning Objects, 3,229-237.

Zheng, R., Stucky, D., McAlack, M., Menchana, M., & Stoddart, S. (2005). WebQuest
learning as perceived by higher-education learners. TechTrends, 49(4), 41-49.
doi:10.1007/BF02824110

Appendix

Appendix A:
Graphical Organizer for Business Statistics Course

[Figure: e-Coursemap graphical organizer for the business statistics course. The map links the syllabus and the "Consultant for Hire" case environment to the 17 course lessons (databases; surveys; types of data; charts for central tendency; charts for variability; combinations and permutations; basic statistics such as mean, median, mode, variance, and standard deviation; probability of events occurring; conditional probability; probability as an area under a curve; a normal distribution workshop; sampling techniques; testing a hypothesis; relationships between variables; improving processes with control charting; reporting; and interpretation) and to basic and advanced Excel tutorials covering entering and formatting data, working with formulas, creating charts, summarizing data, modeling spreadsheets, and integrating with MS PowerPoint.]
Appendix B:
WebQuest Sample Pages

[Figure: screen captures of the WebQuest, "Simple Linear Regression: A WebQuest for exploring relationships between variables using simple linear regression," designed by Joseph A. Snider, 2009. The captured process page tells learners they may run the video and audio tutorial lesson as many times as needed, directs them to run the spreadsheet tutorial (available as a Flash file or Windows Movie file demonstrating Microsoft Excel), and then asks them to enter a set of X and Y pairs into a blank spreadsheet, create a scatterplot, add a trend line showing the simple linear regression equation, and predict a Y value for a given X value.]
Appendix C:
Learning Object Sample Screens

[Figure: sample screens from the multimedia learning object "Simple Linear Regression using Microsoft Excel (TM)," including Step 2 (Create a Scatterplot) and Step 3 (Predict a Y Value), which displays the estimated slope and y-intercept.]
Appendix D:
Cognitive Flexibility Theory Hypertext (CFTH)

These are samples of Web pages in the CFTH. There are more than 80 pages.

[Figure: screen captures of sample CFTH Web pages; the page text is not legible in the captures.]
Appendix E:
Cognitive Flexibility Theory Hypertext (CFTH) Flow Diagrams

[Figure: flow diagrams of the CFTH case environment, "Consultant for Hire" (C4H). The diagrams map a description of the company and the C4H headquarters to locations and artifacts used in assignments 14A and 14B, including the front desk phone at Rabin Consulting, the reception desk, Mr. Rabin's office with a presentation on a laptop, Dr. Bhutan, Mr. Chothe the manager, Gerald the analyst, an HTML table, an Excel file, phone directories, a wall chart of phone numbers, and other staff for informational purposes or additional assignments.]
Appendix F:
Pretest and Posttest Survey Instrument
[Figure: screen captures of the online pretest and posttest survey instrument, administered through ProProfs. The recoverable question stems include: interpreting regression output (slope, Y-intercept, and coefficient of determination) for a scatter plot of vehicle age and price; identifying the dependent variable in the car example; predicting a Y value for a given X (vehicle age of 4 years); stating the percentage of variability in car price explained by vehicle age; describing what happens to price with each additional year of vehicle age; testing the significance of an r value of .9509 (n = 14, level of significance = .05, t statistic = +10.6411, critical value = +2.1788) against the null hypothesis that the population correlation coefficient equals zero; interpreting a negative coefficient of correlation; interpreting a scatter plot of planetary distance from the sun versus length of year (pattern and r-squared value); identifying which of the four simple linear regression assumptions is violated in the solar system scatter plot (equal variances, independence, normality, or linearity); selecting the correct regression equation for a height-weight scatter plot; and stating what coefficient of correlation is needed to conclude that X causes Y. The instrument closes by asking participants to share any comments about the research before submitting their answers.]
Appendix G:
Sign-Up Demographics Questions
[Figure: screen captures of the online sign-up form ("SIGNUP SLR: Quiz"), which collects the participant's unique identifier (e.g., SLR01) and demographic information: age range, gender, hours per week spent on the computer, and preferred way to learn.]
Appendix H:
Informed Consent Form for Control Group

Quantitative Comparison of Traditional to Combined Online Instruction


for Simple Linear Regression
Purpose. You are invited to participate in a research study being conducted for a dissertation at
Northcentral University in Prescott Valley, Arizona. The purpose of this study is to evaluate new
online teaching methods compared to traditional methods of lecture plus textbook format. You
will receive the traditional teaching format outlined in your syllabus. There is no deception in
this study. We are testing knowledge of simple linear regression in a college statistics course.
Participation requirements. If you consent to the study by signing the document below, you will
be asked to complete a sign-up online which asks for demographics (no identifying information).
You will also take a pretest before the lesson on simple linear regression in the ADM515 course.
You then go through the material of the course and take the same test again after you learn the
material in class. The total time commitment for this study is up to 2 hours for taking the tests
and 5 minutes for the online sign up process. This is additional to your coursework.
Research Personnel. The following people are involved in this research project and may be contacted at any time: Joseph A. Snider, Primary Investigator, Cellphone: 502-641-2337, and Dr. Larry Flegle, Dissertation Committee Chairman, Phone: 770-720-6346.
Potential Risk/Discomfort. There are no known risks in this study, and this study in no way
impacts grades. However, even with no known risks, you may withdraw at any time.
Potential Benefit. There are direct benefits to you of participating in this research. Incentives are
offered to acknowledge your participation and your time commitment. The results will have
scientific interest and may have benefits for future students. Participants who fully complete the
pretest and posttest will receive a $15 gift card.
Anonymity/Confidentiality. The data collected in this study are confidential. All data are coded
such that your name is not associated with them. In addition, the coded data are made available
only to the researchers associated with this project. A unique identifier will be provided to use
when taking the tests and signing-up. If you provide consent to participate in this study, the
unique identifier will be provided along with more detailed instructions on how to participate.
Right to Withdraw. You have the right to withdraw from the study at any time without any
penalty to your grades whatsoever. If you feel uncomfortable answering any questions in the
study at any time, just withdraw from the study. It is important that the study have answers to all
questions to be valid.
We would be happy to answer any question that may arise about the study. Please direct your
questions or comments to: Joseph A. Snider, Primary Investigator, Cellphone: 502-641-2337, or
Dr. Larry Flegle, Dissertation Committee Chairman, Phone: 770-720-6346

Signatures
I have read the above description of the study and understand the conditions of my participation.
My signature indicates that I agree to participate in the experiment.

Date: MBA Cohort (such as MBA584):


Participant's Name: Researcher's Name: _____
Participant's Signature: Researcher's Signature:
Appendix I:
Informed Consent Form for Experimental Group

Quantitative Comparison of Traditional to Combined Online Instruction


for Simple Linear Regression
Purpose. You are invited to participate in a research study being conducted for a dissertation at
Northcentral University in Prescott Valley, Arizona. The purpose of this study is to evaluate new
online teaching methods compared to traditional methods of lecture plus textbook format. There
is no deception in this study. We are testing the new combined methods in the topic of simple
linear regression in a college-level statistics course. You will be part of the experimental group.
Participation requirements. All participants will be asked to complete a sign-up online which
asks for demographics (no identifying information). The sign-up process takes less than 5
minutes. After sign-up, you will take a pretest online, then use the new online training materials
over a 1 week span of time, and then after 1 week, take the same test again. This will all occur
before and in addition to receiving this material in the traditional class. The total commitment of
time is up to 2 hours taking the tests, 5 minutes to sign up, and from 1 to 4 hours using the online
materials, depending on how much wandering in the online case study environment you do.
Research Personnel. The following people are involved in this research project and may be contacted at any time: Joseph A. Snider, Primary Investigator, Cellphone: 502-641-2337, and
Dr. Larry Flegle, Dissertation Committee Chairman, Phone: 770-720-6346
Potential Risk/Discomfort. There are no known risks in this study, and this study in no way
impacts grades. However, even with no known risks, you may withdraw at any time.
Potential Benefit. There are direct benefits to you of participating in this research. Having access
to the new online training materials might prove useful in understanding this subject matter.
Participants who fully complete the pretest, access and use the online materials, and fully
complete the posttest will receive a $25 gift card.
Anonymity/Confidentiality. The data collected in this study are confidential. All data are coded
such that your name is not associated with them. In addition, the coded data are made available
only to the researchers associated with this project. A unique identifier will be provided to use
when taking the tests and signing-up. If you provide consent to participate in this study, the
unique identifier will be provided along with more detailed instructions on how to participate.
Right to Withdraw. You have the right to withdraw from the study at any time without penalty. If
you feel uncomfortable answering any questions in the study at any time, just withdraw from the
study. It is important that the study have answers to all questions to be valid.
We would be happy to answer any question that may arise about the study. Please direct your
questions or comments to: Joseph A. Snider, Primary Investigator, Cellphone: 502-641-2337, or
Dr. Larry Flegle, Dissertation Committee Chairman, Phone: 770-720-6346

Signatures
I have read the above description of the study and understand the conditions of my participation.
My signature indicates that I agree to participate in the experiment.

Date: MBA Cohort (such as MBA584):


Participant's Name: Researcher's Name:
Participant's Signature: Researcher's Signature:

Appendix J
IRB Approval Email from Northcentral University

March 7, 2011

Reference: Joseph A. Snider


IRB: 2011-03-04-039

Dear Dr. Larry Flegle, Dissertation Chair:

On March 6, 2011, Northcentral University approved Joseph's


research project entitled, Quantitative Comparison of Traditional
to Combined Online Instruction for Simple Linear Regression.

IRB approval extends for a period of one year and will expire on
March 7, 2012.

Please inform the Northcentral University IRB when the project is


completed.

Should the project require an extension, an application for an


extension must be submitted within three months of the IRB
expiration date.

In the interim, if there are any changes in the research protocol


described in the proposal, a written change request describing
the proposed changes must be submitted for approval.

Sincerely,

Dr. Chris Cozby


IRB Committee Chair
Northcentral University

