ORIGINAL COPY: This thesis copy is a reading copy made available by the author.
"In life we make the best mistakes we know how to make. Then, with luck, we go out and make new ones." — Joan Oliver Goldsmith, How Can We Keeping from Singing? (Norton)
“The very aim and end of our institutions is just this: that we may think what we like and say what we think.” — Oliver Wendell Holmes
“It's not that I'm so smart, it's just that I stay with problems longer.” — Albert Einstein
"The wisdom of the wise and the experience of the ages are perpetuated by quotations." — Benjamin Disraeli, Earl of Beaconsfield (1804-1881)
ABSTRACT

THE ROLE OF COMPUTERS IN SCHOOL RESTRUCTURING: A META-ANALYSIS

This study explored how educators can more effectively use computer technology to meet the needs of 21st century students. David Jonassen proposes that the effectiveness of computers as instructional tools depends upon how they are used. An exploratory meta-analysis was performed to examine the relationship between instructional technique and computer use and their combined effect on student achievement. The results suggest that the instructional technique of collaborative learning, in conjunction with the use of computers as a tool, facilitates learning better than any other such combination of variables investigated. Additionally, the findings support Jonassen’s theory that the way computers are used in instruction determines the extent to which they affect learning. Further, computers used as MindTools had the largest effect on student achievement of all methodologies included in the meta-analysis. Finally, the findings suggest that computers’ greatest impact on student achievement seems to occur among students in grades 6-12.

Robin Michael Roberts
December 2002
THE ROLE OF COMPUTERS IN SCHOOL RESTRUCTURING: A META-ANALYSIS
by Robin Michael Roberts
A thesis submitted in partial fulfillment of the requirements for the degree of Master of Arts in Education in the Kremen School of Education and Human Development, California State University, Fresno. December 2002
© 2002 Robin Michael Roberts All Rights Reserved
APPROVED For the Department of Curriculum and Instruction:
AUTHORIZATION FOR REPRODUCTION OF MASTER’S THESIS
Permission to reproduce this thesis in part or in its entirety must be obtained from the author.
ACKNOWLEDGMENTS

This thesis has been what might be termed an “exercise in actuality”: doing what one is studying. Collaborative computing is both studied and practiced in the present study. One person cannot complete a study like this alone, and I found myself collaborating with a number of people—yet nearly always within the context of larger groups. In most cases, there were three members in each group that participated with me in this endeavor. Thus, I would like to thank those individuals and the groups of which they were a part.

First, my three thesis advisors deserve a great deal of the credit for bringing this thesis to its present form. Dr. Roy Bohlin, Dr. Sharon Brown-Welty, and Dr. Susan Tracz provided encouragement, insight, and information when needed and pushed me to greater economy of prose and clarity of organization.

Secondly, three professors not on my thesis committee played important parts in the early development of this thesis: Dr. Ron Unruh, Dr. Susan Harris, and, especially, Dr. Carol Fry Bohlin. Each of them, in their classes, provided the opportunity, motivation, and training for much of what eventually became the review of literature; each also provided, in different ways, encouragement and support long after the last day I attended their class. Dr. Carol Fry Bohlin, wife of Dr. Roy Bohlin, both at California State University, Fresno, has been a good friend and teacher for five years and shares with me whatever rewards accrue from the finished product.

Three projects provided financial and collegial support during the time I was engaged in writing and researching this study: the San Joaquin Valley History-Social Science Project, the Teaching And Leading for Educational Needs with Technology (TALENT) Project, and the Teacher-Researcher Initiative Project (TRIP) at the University of California, Los Angeles.

Three colleagues at Pioneer Middle School in Hanford acted as sounding boards for ideas, discussion partners, and emotional releases for the stresses inevitably associated with looming deadlines: Rich Callaghan; Miguel Rodriguez; and Sylvia, Literacy Mentor at Pioneer Middle School, who read parts of the literature review and served as a peer evaluator and sounding board for many ideas. My classroom aide took on extra duties to allow me to work on portions of the meta-analysis and listened sympathetically when things weren’t working right.

Collaboration also took place within more informal, social contexts. Henry Placenti and Sopheak Real joined me in running the twin gauntlets of red tape and deadlines to finish our programs together. The Three Musketeers created a mutual support group for ourselves, and the time was more special for sharing it with these two good friends.

Lastly, three individuals who are not associated with each other also played small parts in the process leading to completion of my studies. Mark Cave, my best friend, was a good companion and lent his considerable intellect to the discussion of several key aspects of the study. Laurie Goodman, Principal of Pioneer Middle School, provided encouragement and flexibility in scheduling that allowed me to physically meet many of these deadlines. And I wish to thank Kathleen Vandermeer of the CSUF Graduate Office for her knowledgeable and skillful editing of the manuscript; she went above and beyond to see that it was worthy of everyone above and of the university whose name is on the cover.

Finally, my family provided support throughout the 5 years I spent pursuing my studies. During that time I spent far less time with them than they deserved, but still they managed to assist me in numerous ways. My wife, who has been through this process herself, served as the main collaborator for every idea that germinated whilst I pursued my degree program. My eldest daughter, Tamara, has joined me at CSU, Fresno—what fun to attend the same college as your child. My youngest daughter, Terra, began her college career at West Hills during the last semester in which this thesis was completed. My parents, Joe and Maureen Roberts, remained interested in my education even though they have long since fulfilled any responsibilities they had for it.
TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES

Chapter

1. INTRODUCTION
   Background
   Relevance of the Study
   Purpose of the Study
   Design of the Study
   Definition of Terms

2. REVIEW OF THE LITERATURE
   Background
   Causative Factors for the 21st Century Educational Imperative
   Characteristics of 21st Century Students
   School Restructuring
   The Role of Computer Technology
   Computer Use Categories
   Instructional Techniques Categories

3. METHODOLOGY
   Introduction
   Restatement of the Research Question
   Design of the Study
   Limitations of the Study

4. RESULTS
   Introduction
   Individual Study Results
   Results of the Meta-Analysis
   Controlling for Errors
   Accounting for Heterogeneity

5. DISCUSSION
   What Effect Size Estimates Represent
   Discussion Summary

6. CONCLUSIONS
   Findings
   Implications of the Study
   Recommendations for Further Research

REFERENCES

APPENDIX

LIST OF TABLES

Table
1. Categories of Instructional Use of Computer Technology
2. Included Studies
3. Number of Included Studies by Year of Publication
4. Number of Studies per Subject Age Group
5. Instructional Technique (IT) Coding Guide
6. Computer Use (CU) Coding Guide
7. Study Independent Variable IT: Instructional Technique by Year
8. Study Independent Variable CU: Computer Use by Year
9. Independent and Dependent Variables for Each Study
10. Notation and Symbols
11. Complete Individual Study Statistics
12. Meta-Analysis Overall and Sub-Group Statistics: Standardized Mean Difference
13. Binomial Effect Size Display (BESD) for All Studies and Sub-Groups

LIST OF FIGURES

Figure
1. Graphic depiction of standardized mean difference
2. Intervention variable
3. Interaction variable
4. Intervention in the meta-analysis
5. Interaction in the meta-analysis
6. Multiple interactions
7. Overall mean effect of computers
8. The overall mean effect of computers used as tools
9. Mean effect of collaborative learning
Chapter 1

INTRODUCTION

In recent years, education has experienced an influx of computers into the classroom. The presence of this new technology has caused many to ask how computers might best be used to facilitate learning. Many studies have found that classroom computer use contributes to learning (de Jong & van Jooligan, 1998; Gerlic & Jarusovec, 1999; Ku & Sullivan, 2000; Kulik, 1983; Kulik, Kulik, & Cohen, 1980; Wenglinsky, 1998). Other studies have found no significant computer-related effect on classroom performance (Barry & Runyan, 1995; Clark, 1994; McClure, 1996; Russell, 1999; Shaffer & Hannefin, 1995; Wilson, 1995). Still other studies have found that computer use leads to poorer performance on measures of learning (Brook & Boal, 1986; Healy, 1998). These conflicting results make it difficult to assess the instructional value of computers.

A number of hypotheses have been developed to explain the range of findings obtained from studies of classroom computer use. One such hypothesis, developed by Jonassen (1996, 2000), suggests that it is not the computer itself that is responsible for differences in learning, but how the computer is used. While there is currently little empirical evidence to support Jonassen’s theory, it has received favorable attention from many educators.

Relevance of the Study

Beyond the presence of computers in the classroom, the advent of the 21st century has occasioned another series of questions about the best way to prepare students for this new century. A number of educators as well as political groups have questioned whether the current American school system is meeting the educational needs of 21st century students (Carnegie Forum on Education and the Economy, 1986; National Commission on Excellence in Education, 1983; Oblinger & Rush, 1997; Perelman, 1992; Pogrow, 1996; Schlechty, 1990; Shouse & Mussoline, 1999; The Secretary’s Commission on Achieving Necessary Skills, 1991). Among the reasons cited for this concern are the shift from an industrial age economy to a knowledge economy (Bowman, 1999; Tapscott, 1996; Tapscott & Caston, 1993), the effects of an emerging postmodern culture (Bell, 1973; Best & Kellner, 1997; Poster, 1990; Ulmer, 1997), and new understandings of learning (Kuhn, 1970; Pinker, 1997; Sprenger, 1999).

Some researchers have suggested that computers might be an important part of meeting the educational needs of students in the 21st century (Bohlin, 2002; Institute for Learning Technologies, 1996; Jonassen, 2000; Knapp & Glenn, 1996; Mehlenger, 1996; Smith & Curtin, 1998; Thornburg, 1999). A primary reason for this suggestion is the central role that computers play in current and future economies (Eastin, 2000; Gershenfeld, 1999; Kaku, 1997; Simon, 1998). This central role highlights the view of some educators that computers are not being used effectively in American education (Charp, 2000; D’Agnese, 2000; Layton, 1995; Mayers & Swafford, 1998; O’Reilly, 1996; Oppenheimer, 1997; Patterson, 1999; Selfe, 1999). Recent attempts at restructuring schools to address these emerging needs have yielded mixed results, and the question of which restructuring strategies might meet those needs remains unresolved (Knapp & Glenn, 1996). This unresolved question is compounded by a lack of evidence for the best directions that school restructuring should take (Knapp & Glenn, 1996; Schlechty, 1997).
Purpose of the Study

The purpose of this study was to explore ways in which educators can more effectively use computer technology to meet the needs of 21st century students. This can be partially accomplished by answering the following questions: Which use of computer technology leads to the greatest student achievement? Is there a particular instructional technique that, when applied to the use of computers in education, contributes to greater student achievement? Taken together, these two questions formed the guiding research question for this study: Is there one combination of computer use and instructional technique that appears to be most effective in maximizing student achievement? Along with answering this question, this study will add to the research base on Jonassen’s theories.

Design of the Study

In light of the large number of studies that exist that address either the effects of computers or instructional techniques on student achievement, it seems possible that the answer to the guiding research question posed above might be discovered among the results of those studies. Consequently, this study applied the technique of meta-analysis to a convenient sample of those studies gathered from the Educational Resources Information Center (ERIC) database to derive a preliminary answer to this research question. The studies comprising the sample were placed in sub-groupings based on computer use and instructional technique. The data from each study were subjected to statistical analysis to derive an estimator of effect size, d: the standardized mean difference (Cohen, 1977). These results were ranked according to the magnitude of the effect size and compared to the overall main effect size. Homogeneity χ² (Rosenthal, 1991a) for each subgroup was tested to measure the impact of influences other than the independent variables on the effect sizes. The fail-safe N (Orwin, 1983) was calculated to determine the adequacy of each sample subgroup. Lastly, a Binomial Effect Size Display (BESD) (Rosenthal & Rubin, 1982) was calculated for each sub-group to assist in interpreting the effect sizes.

Definition of Terms

For the purposes of this study, the following definitions were used:

Between-studies sample size: “The number of studies in a meta-analysis” (Cooper & Hedges, 1994, p. 532).

Cluster analysis/clustering of effect sizes: Cluster analysis is a multivariate procedure for detecting natural groupings among study-level data. It can be used to identify potential moderators. Hedges and Olkin (1985) suggest grouping or clustering studies by effect size magnitude and examining each cluster for common characteristics that covary across the clusters. It is an inductive procedure for identifying factors that influence study outcomes. According to Davis (1986), cluster analysis is heuristic in nature and rests upon no underlying statistical theory.

Combining results: “Putting effect sizes on a common metric (e.g., r or d) and calculating measures of location (e.g., mean, median) or combining tests of significance” (Cooper & Hedges, 1994, p. 533).

Exploratory data analysis: “A data analysis that focuses on description rather than inference. Supplements numerical summaries with visual displays” (Cooper & Hedges, 1994, p. 535).

Formal methods of retrieval: The retrieval of articles and papers using published abstracts and indexes, popular computer-readable databases, and literature reviews (Cooper & Hedges, 1994, p. 535).

Inclusion criteria (eligibility criteria): “Conditions that must be met by a primary study in order for it to be included in the research synthesis” (Cooper & Hedges, 1994, p. 534).

Moderator variable: “Any factor that influences the size of a particular relationship and is itself not a consequence of the relationship” (Cooper & Hedges, 1994, p. 537).

Overall effect size: The effect size derived from statistically combining a sample population comprised of individual effect sizes from various single studies.

Study population: The actual group of extant studies from which the study sample was selected. It can be thought of as “a collection of ensembles of studies.”

Study sample: “The ensemble of studies that are used in the review and that provide the effect size data used in the research synthesis” (Hedges, 1994a, p. 30). See between-studies sample size.

Transformation: The application of some arithmetic principle to a set of observations to convert the scale of measurement to “one with more desirable characteristics” (Cooper & Hedges, 1994, p. 542).

Universe: “The hypothetical collection of studies that could be conducted in principle and about which we wish to generalize” (Hedges, 1994a, p. 30). Hunt (1997) differentiates between a “universe of studies” of a phenomenon and the “universe of actual instances” of that phenomenon. Since it is unlikely that every study ever conducted on a given phenomenon would be identical with the number of instances of that phenomenon, even a study sample (see above) comprised of every extant study of a given phenomenon represents only a sample of the actual instances of the phenomenon (p. 57). In addition, since the actual extant collection of studies on any given phenomenon rarely remains static for long, the search for a comprehensive meta-analysis is an illusory one.
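The statistics named in the design section above can be made concrete with a short sketch. This is not the thesis’s actual analysis code, and the effect sizes below are hypothetical; it simply shows, using standard textbook formulas, how the standardized mean difference (Cohen, 1977), Orwin’s (1983) fail-safe N, a homogeneity Q statistic, and the BESD (Rosenthal & Rubin, 1982) are computed.

```python
# Illustrative sketch of the meta-analytic statistics described in the
# Design of the Study; all example numbers are hypothetical.
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference: (treatment mean - control mean) / pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def orwin_failsafe_n(d_mean, k, d_criterion=0.2):
    """Orwin's fail-safe N: how many unretrieved zero-effect studies would
    reduce the mean effect of k studies to the negligible criterion."""
    return k * (d_mean - d_criterion) / d_criterion

def homogeneity_q(ds, variances):
    """Homogeneity Q: inverse-variance-weighted squared deviations from the
    weighted mean effect (referred to a chi-square with k - 1 df)."""
    weights = [1.0 / v for v in variances]
    d_w = sum(w * d for w, d in zip(weights, ds)) / sum(weights)
    return sum(w * (d - d_w) ** 2 for w, d in zip(weights, ds))

def besd(r):
    """BESD: hypothetical treatment/control 'success rates' implied by r."""
    return 0.5 + r / 2.0, 0.5 - r / 2.0

# Hypothetical effect sizes from five studies
ds = [0.35, 0.50, 0.20, 0.65, 0.40]
d_bar = sum(ds) / len(ds)                # unweighted mean effect, ~0.42
n_fs = orwin_failsafe_n(d_bar, len(ds))  # ~5.5 null studies to reach 0.2
r = d_bar / math.sqrt(d_bar**2 + 4)      # convert d to the correlation metric
treatment_rate, control_rate = besd(r)
```

The fail-safe N answers a robustness question (how many unlocated null studies would dilute the observed mean effect to a trivial size), while the BESD re-expresses an effect as the gap between two hypothetical “success rates” centered on 50%, which is the interpretation aid used for the sub-groups in this study.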
Chapter 2

REVIEW OF THE LITERATURE

Background

This review of literature falls largely within the bounds of a relatively new approach to evaluating the effects of information technologies upon human beings, particularly those human beings within a societal context. Called social informatics (Kling, 1997), it is defined as the “interdisciplinary study of the design, uses and consequences of ICTs [information and communication technologies] that takes into account their interaction with institutional and cultural contexts” (Kling, Crawford, Rosenbaum, Sawyer, & Weisband, 2000).

Perhaps the most comprehensive example of social informatics is the three-volume series by Castells (1996, 1997, 1998) entitled The Information Age: Economy, Society and Culture. Castells’s work centers around the idea of “the bipolar opposition of the Net and the Self” (1996, p. 3) as the organizing feature of a networked new century. In his view, social changes and technological changes are always intimately related by virtue of their interaction vis-à-vis production and development. As he puts it: “A new society emerges when and if a structural transformation can be observed in the relationships of production, in the relationships of power, and in the relationships of experience” (1998, p. 340).

This literature review examines three causative factors that have combined to help create a new educational imperative for schools, followed by an overview of recent studies on school restructuring and a similar look at the recent use of computers in schools. Finally, categories of instructional technique and computer use are derived for use in the meta-analysis to follow.
Causative Factors for the 21st Century Educational Imperative

Moving From the Industrial Age to the Information Age

The industrial economy of the 19th and early 20th centuries has transitioned into what has been called the Information Age (Lubar, 1993; Machlup, 1962). Toffler (1991) argues that this period of time represents a “third wave”—an information revolution which succeeds the agricultural and industrial revolutions as the next important wave in human history. A number of writers have suggested that the latter decades of the 20th century exhibited the characteristics of what Thomas Kuhn (1970) called a “paradigm shift” (e.g., Castells, 1996; Toffler & Toffler, 1995). Newt Gingrich, former Speaker of the House of Representatives, said upon taking office in 1995 that the most accurate analogy to what is happening to us now is to look at the period between 1770 and 1800, when America was changing from a rural to a manufacturing society: “What is happening to us now—the transition from the industrial era . . . is forcing us to ask very similar questions about ourselves” (quoted in Tapscott, 1996, p. 2). Among the characteristics attributed to this new “information age” are the centrality of digital computing technologies (Negroponte, 1995), an increase in the pace of change (Drexler, 1987; Toffler & Toffler, 1995), information overload (Besser, 1995), the shrinking “information float” (Thornburg, 1996), and the network as metaphor for the age (Castells, 1996; Minsky, 1985; Ronfeldt, 1996).

Negroponte (1995), in comparing the Industrial Revolution and the Information Revolution, suggests that the former was “a period in which society learned how to process, sort, rearrange, recombine, and transport atoms in unprecedented fashion” (p. 5). The information economy, on the other hand, processes bits (data) rather than atoms. Negroponte goes on to point out that the atom is the fundamental unit of matter and the bit is the fundamental unit of information. One ramification of the move from atoms to bits is that groups of bits can generally be moved more quickly and easily than groups of atoms. The digital computer is the prime mover of these bits of information, but its ability to move them depends on being physically connected to other computers—that is, networked.

Dewar (1998) suggests that no invention since the printing press has had as profound an effect on world society as have networked computers: “There has only been one comparable event in the recorded history of communications—the printing press. It was the first true one-to-many communications medium, and no change since has been as dramatic as networked computers” (p. 3). Thornburg (1996) observes that this causes not just a change in speed but also a change in the rate of change, as the power of the computing devices that control and move bits increases according to Moore’s and Metcalf’s laws (Drexler, 1987; Moravec, 1988). Kurzweil (2001) calls it the “law of accelerating returns”—a phenomenon marked by technological change occurring as an exponential extension of Moore’s law. The resulting acceleration in the pace of change creates two phenomena that are unique to the Information Age: the volume of information doubles every 18-24 months, while the “information float” (the time lag between a discovery and its application) shrinks or collapses (Thornburg, 1996).

The importance of the networked computer is echoed by a number of theorists who suggest that the best metaphor for the information age is the network (Castells, 1996; Ronfeldt, 1996; Strackbein, 1996). Minsky, in his influential book Society of Mind (1985), portrays even the mind itself as a network of simple, interacting agents. Ronfeldt (1996) sees the network as the next stage in the progression of societal forms that started with tribes and moved through hierarchies to markets.

Postmodern Society

The term Postmodernism is applied, generally, to the corpus of ideas surrounding a new philosophy which started as a movement within the art community in the early half of the 20th century. It may have its genesis in the rapidity and fecundity of technological innovation. Jean-François Lyotard (1984), one of the leading theorists of the postmodern “condition,” opens his seminal book, The Postmodern Condition, by tying the causes of postmodernism to modern technology:

Our working hypothesis is that the status of knowledge is altered as societies enter what is known as the postindustrial age and cultures enter what is known as the postmodern age. . . . The nature of knowledge cannot survive unchanged within this context of general transformation. It can fit into the new channels, and become operational, only if learning is translated into quantities of information. . . . the miniaturization and commercialization of machines is already changing the way in which learning is acquired, classified, made available, and exploited. (p. 1)

Hlynka (1995) agrees with Lyotard that “technology is an integral part of the postmodern dilemma” (p. 118). Johnson-Eilola (1998) believes that the 21st century transition from modern to postmodern has created a dichotomous culture formed from the last of the modern generation and the first of the postmodern generation:

Those of us raised in the modernist first world tend to deride the second, postmodernist world as superficial, artificial, and dehumanizing. . . . We have lived through the shift . . . and are unfamiliar and uneasy (at best) with what we experience. (pp. 185-186)

The modern world Johnson-Eilola speaks of sees each generation as a link in a chain from past to present; it orients itself by where it has been and sees the present as a preparation for passing the past on to the future. The post-modern world, as Johnson-Eilola sees it, “lives on the surface,” constantly stimulated from multiple directions simultaneously, and orients itself to itself. Learning, then, is done “on-the-fly,” as needed, and then discarded, because without a temporal anchor (supplied by the past) there is no reason to retain it for the future. For the post-modernist, there are no universal truths, no depth, and no differentiation between appearance and substance (Hlynka, 1995).

The Knowledge Economy

Workers in an information age economy need skills that differ from those required by an industrial age economy (Benjamin, 1998; Castells, 1996), but even that structure itself is beginning to change. According to Brown and Duguid (2000) and Tapscott (1996), the information economy is in the process of becoming the knowledge economy. This emerging economy revolves around the manipulation of information rather than of objects. Drucker (1969), Demming (2000), and Senge (1990) have all advocated that the basic economic resource of today’s economy is not labor or capital, but knowledge.

The workers in a knowledge economy will need to be “better educated” than either their industrial age or their information age predecessors (Nax, 1997). They will need to be “knowledge workers”: “A knowledge worker is a person who puts to use facts, ideas, theories, beliefs, and supposed forms of knowledge to produce a product” (Schlechty, 1997, p. 37). In other words, the worker in a knowledge economy uses information rather than possesses it. Unlike an industrial age worker, whose primary task was the production of a physical object, the knowledge worker produces ideas and applications of knowledge (Tapscott, 1996). The knowledge economy worker is characterized by “the ability to process multiple streams of information simultaneously and the propensity to experiment in free-form, ill-defined problem domains” (p. 191). This is not unlike what academicians and researchers do.

Knowledge workers are the most vital resource in a 21st century company (Microsoft, 1999; see also Davis & Meyer, 1998), but their productivity depends on how both they and the knowledge they use and create are managed (Gates, 1999). “The ability to capture information, knowledge, and data has far outstripped people’s ability to absorb and analyze this information in a focused way” (Gates, 1999, p. 223). This means that computers must take on much of the task of dealing with the overabundance. Keith Bogg, quoted in Gates (1999), says that people should “use their intelligence to deal only with the exceptions [to repetitive, non-thinking work], letting computers make decisions about everything else.”

Senge (1990) conceptualizes the 21st century company as a “learning organization.” According to Senge, in the context of the learning organization, learning is both an individual and a team effort and means “expanding the ability to produce results we truly want in life” (p. 142). Integral to the success of the learning organization is the concept of “creative tension” (p. 154); it is what stimulates the learning and drives the productivity.

The foregoing suggests that 21st century students will need more than the traditional three R’s to be viable members of a knowledge economy. As future knowledge workers, they will need to know how to learn whatever is needed to apply analyzed information in useful and creative ways. Tapscott (1996) suggests that this is as true for the educational world as it is for the business world. Computer technology is both the imperative for, and the instrument of, a successful knowledge economy.
deriving primarily from cognitive science and the study of the brain. 1990): Connectionism. In general. Apart from the way societal changes wrought by information technology and the differing demands of a knowledge economy give rise to changes in the way 21st century students are educated. The 21st century brings with it new understandings of knowledge and learning. cognitive science is defined as the interdisciplinary study of the mind and intelligence which attempts to further the understanding of intelligent activities and the nature of thought (Audi. The first. and Dynamism (Dynamical Systems).13 If knowledge management and learning organizations are to be hallmarks of business in the economic world of the next generation. The idea of brain-based learning seems redundant. Dynamical Systems. New Understandings of Learning Overview. takes a mathematical view of cognitive behavior. but scientific discoveries—made possible by new technological developments—have generated new thoughts regarding how that learning takes place. Symbolicism. as educators have known for quite a while that the brain was involved in learning. 1995). uses a systems or network model in its approach to explaining cognition. Theories within cognitive science can be classified into three broad categories (Lehrer. Connectionism. it seems likely that those businesses will expect the schools to follow suit.
. new understandings about the way people learn suggest that some past educational practices may not be as effective as previously thought. The second. The last. uses a semantic or symbolic language processing model in its attempt to explain thinking processes. Cognitive science. Symbolicism.
These broad categories all have at least two things in common: the recognition that cognition is a process, and an approach to the problem from the point of view of one particular discipline, i.e., language processing, computational systems, or mathematics.

Connectionism is a computational approach to understanding the function of the brain. It sees cognitive processes as consisting of interconnections which take place between nerve cells in a vast network (Cotrell & Small, 1983; Rumelhart, McClelland, & The PDP Research Group, 1986; Bechtel & Abrahamsen, 1990). In so doing, it parallels the discoveries made by recent research into the physical functioning of the brain. Brain-based learning (Jensen, 1998) is the instructional method most closely associated with connectionism.

Symbolicism (Classicism) is a semantic modeling approach to cognition. It is based on the hypothesis that the mind is a type of computer whose functions can be reduced to algorithms (Bechtel, 1987; Pinker, 1997). Central to this model is the belief that symbolic language processing best explains the functions of the mind. It grew out of research in artificial intelligence (Minsky & Papert, 1969) and Chomsky's (1957) ground-breaking work in syntactical structures for linguistics. Representative of this approach is the Physical Symbol System hypothesis of Newell and Simon (1976).

The theory of Dynamical Systems, the most recently developed of these categories, attempts to explain the behavior of the brain by treating it as a complex system and employs differential and difference equations to explain it. This approach had been used for the analysis of any complex system prior to its application to cognitive science. Van Gelder and Port (1995) have developed the view that natural cognitive systems are a kind of dynamic system and thus are best understood from the perspective of dynamics. Van Gelder and Port reject the validity of both the connectionist and symbolic schools of thought (which, in turn, reject the claims of dynamicism as well as the claims of each other).

Functionalism is the view that the essential property of a component is its role in relating inputs to outputs and to other components. Ned Block (1980) recognizes three types of functionalism: Decompositional functionalism, Computation-representation functionalism, and Meta-physical functionalism. Decompositional functionalism, which is primarily a methodology (Eliasmith, 1996), explains any system in terms of the functionality of its component parts (i.e., the parts' functionality and the relationship of the parts to every other component). In other words, the entire system is the sum of its parts. This is integral to any systems or network view. Computation-representation functionalism (Block, 1980) adheres to the mind-as-computer analogy; it is a special case of decompositional functionalism and is more a theory of mind than a methodology. The mind is a computer and, as such, mental processes are decomposable to the point where they are thought of as simply instantiations of a digital computer (e.g., the Turing machine). Meta-physical functionalism (Block, 1980) is a theory of mind which sees cognition as a mental state or a functional state. The actual physical implementation of the processes is irrelevant to what makes a mind; only the functional relations matter. In this view, understanding how the brain itself works is less important than understanding how the mind works—at least for educators. However, recent discoveries regarding the way the brain physically functions suggest that there are some things that educators can do to facilitate the thinking that the mind does.
Brain research. Brain research is concerned with the physical processes that take place when the brain thinks or learns. What happens chemically and electrically in the brain is only now being understood—primarily as the result of newer technologies that allow scientists to observe the process. In the late 19th century a connection between thinking and neurons was posited by Spencer (1872), Meynert (1884), James (1890), and Freud (1895). Later, researchers such as Lashley (1929), Rashovsky (1938), and others expanded on this earlier work. Modern brain research builds upon that work.

Physically, the brain is what might best be described as a neural network (Pinker, 1997). What is known suggests that the brain "learns" by creating electro-chemical connections that represent, in some way not currently understood, meaning and memory. These connections can be invoked, in turn, again through a process which is only beginning to be understood, so as to reproduce in the "mind" the meanings and memories stored in those connections (Jensen, 1998).

Brain-based learning. Brain-based learning, on the other hand, is concerned with the instructional strategies that facilitate the way the mind learns by supporting the physical processes used by the brain (Sprenger, 1999). Thus, the learning theory associated with brain research is based on the structure and function of the brain. Integral to this is the idea that as long as the brain is not prohibited from fulfilling its normal processes, learning will occur. According to Robin and Malkas (2000), the Core Principles of brain-based learning are the following:

1. The brain is a parallel processor, meaning it can perform several activities at once, like tasting and smelling.
2. Learning engages the whole physiology.
3. The search for meaning is innate.
4. The search for meaning comes through patterning.
5. Emotions are critical to patterning.
6. The brain processes wholes and parts simultaneously; the "big picture" can't be separated from the details (holistic learning).
7. Learning involves both focused attention and peripheral perception.
8. Learning involves both conscious and unconscious processes.
9. We have two types of memory: spatial and rote.
10. We understand best when facts are embedded in natural spatial memory.
11. Learning is enhanced by challenge and inhibited by threat.
12. Each brain is unique.

Because every brain is different, one implication is that educators should allow learners to customize their own learning environments. This suggests that teachers may want to design learning around student interests and make learning contextual. Because people learn best when solving realistic problems (contextual learning), learning experiences should be structured around real problems, encouraging students to also learn in settings outside the classroom and the school building (Blumenfeld et al., 1991; Chard, 1998). Educators should let students learn in teams and use peripheral learning.

There are three instructional techniques primarily associated with brain-based learning. The first is orchestrated immersion, in which learning environments are created that fully immerse students in an educational experience. Relaxed alertness, a second technique, tries to eliminate fear in learners while maintaining a highly challenging environment. The last, active processing, allows the learner to consolidate and internalize information by actively processing it (Robin & Malkas, 2000).
Characteristics of 21st Century Students

Part of children's attraction to—and connection with—computers may stem from the fact that computers interface with them via a screen (Johnson, 1997). Today's children are members of what might be termed the educational television generation: their first formal learning experiences are via TV. Parents have often utilized TV for direct teaching. As a result, today's children might be more comfortable with TV instructors than they are with parental instruction or classroom teachers. That is, it is more natural for them to learn from a screen than from an adult. Johnson calls them "screen-agers" (p. 2).

Smith and Curtin (1998) believe that postmodern children differ from children of the past. They based this upon the fact that new forms of communication (such as that offered by computers) affect social relationships and, thus, psychological make-up. The shift in social relationships is, as they see it, from "face-to-face to symbolic communities" (p. 214).

Rushkoff (1998) believes that today's children are people of the future. Part of this difference lies in the fact that they seem to be intrinsically forward-focused: they appear to have fewer ties to the past than to the future. This may be one reason why the children of today seem so ignorant concerning history: they perceive it as irrelevant. In many ways, they seem to be better equipped for life in the 21st century than their parents or even their teachers. They are, as he puts it, "the latest model of human being." It may also be a mistake to assume that the current generation is a continuation of the previous one: "Looking at the world of children," Rushkoff suggests, "is not looking backwards at our own pasts—it's looking ahead. They are our evolutionary future" (p. 7).

Tapscott (1998), in his recent book, Growing Up Digital: The Rise of the Net Generation, observes that "for the first time in history youth are an authority on an innovation central to society's development" (p. ix). He calls today's postmodern children the "N-Gen"—the Net Generation. Tapscott defines the Net Generation as "children who, in 1999, will be between the ages of two and twenty-two" (p. 3). They are different from the generation McLuhan (1964) wrote about, which grew up as part of the post-war "baby boom," shaped by the pervasive presence and influence of television. If McLuhan's children were the TV generation, the N-Gen is equally likely to be shaped by the interactive nature of the Internet. For Tapscott, this difference is central: "The shift from broadcast to interactive is the cornerstone of the N-Generation. They want to be users—not just viewers or listeners" (p. 3).

A major difference between the two, and one that educational researchers seem to have overlooked, is that a computer screen is connected to an interactive program—it is two-way communication. The television, on the other hand, is one-way; it broadcasts to the viewer. Viewers' interactions with television are limited to changing channels or turning it off. From this it follows that teaching methods predicated on a broadcast nature—whether by television or lecture—may not "connect with" children of the post-modern N-Generation. The interactive—with the emphasis on active—nature of the post-modern child, who is also a 21st century citizen facing a career in a globally-networked "knowledge economy," is something substantially different from that of previous generations.

There is a kind of "generation gap" between the Net generation and all other generations currently alive. The difference is, unlike the Sixties, the gap doesn't appear to be focused around cultural mores or societal values, but around technology and infrastructure. Today's parents deal with chat-rooms and web surfing rather than rock music and war as ignition points in the traditional battle of "growing pains." The nature of the Internet as a legitimate tool of business and
government has blunted its potential as a device of teen protest. "Hackers," as they are sometimes incorrectly called, can still cause mischief and trouble, but it is hard for parents to deny their child access to something that has become an everyday tool for so much of the working world. This may explain why the N-Gen's fundamental differences in world outlook and cognitive constructs are so often overlooked—their manifestations are obscured by the more overt expressions of identity and protest. The effects of the transition from manipulating data to manipulating bits, of a networked world, and so on, suggest a new task for education—which, in turn, implies new methods and new structures.

School Restructuring

Recent attempts to meet the needs of 21st century students have resulted in a number of restructuring efforts that have met with limited success, possibly because they are based on Industrial Age principles. School restructuring refers to the practice of using non-traditional structural practices at the local or district level. Elmore (1990) notes the emergence of a general agreement that restructuring is "about" at least three types of changes: teaching and learning in schools, the conditions of teachers' work in schools, and the governance and incentive structures in schools. The first, teaching and learning in schools, is what he refers to as "reforming the core technology of schools." He wonders what form schools would take if they were designed around the best available knowledge about teaching and learning. Hunter Moorman and John Egermeier (1992) add to this idea:

Restructuring suggests the need to rethink the mission of education in light of changing conditions and imperatives of the coming century, to shift from one set of guiding values and assumptions to another, to exchange traditional forms of schooling for pedagogical and organizational processes that fit new missions, to embark on an ongoing process of transformation instead of seeking static solutions to fixed problems. (p. 18)

In general, traditional school structures are those that are identified by Lee and Smith (1994) as being fundamentally bureaucratic in nature, while those considered "restructured" exhibited more "communal" tendencies. Using a list of 12 practices that they identified as "significant departures from conventional practices," Lee and Smith concluded that there was "solid" evidence that students in schools using non-traditional structures learned more than those in traditionally structured schools. Such a categorical statement is hard to accept without broad-based support. A follow-up study based on data from the National Education Longitudinal Study in 1988 and 1990 made by Lee, Smith, and Croninger (1995) not only confirmed their earlier findings but included the further revelation that the positive effects of restructuring appear to be cumulative. Moreover, Lee, Smith, and Croninger identified three features common to more effective communal (or, as they termed them, "organic") schools: common academic curriculum, academic press (the expectation of high standards and maximum effort), and authentic instruction.

It is important to note that both studies identified the significant role which the social component of restructuring efforts played in the positive findings. Shouse and Mussoline (1999) support this view when they note that ". . . restructuring offers to make school systems more collegial and participatory—indeed more democratic" (p. 31). This social significance is the focus of a study by Andrew Coulson (1994), who argues that "the success of any human organization depends upon the unification of its participants' goals"—which is fundamentally social in nature (p. 1). To support his contention that success in educational achievement derives from social factors rather than from organizational factors, Coulson surveyed a variety of studies on the effects of desegregation upon student achievement and found that there was no significant positive effect on student academic achievement to be gained from forced desegregation, but that voluntary desegregation resulted in a few positive significant differences. Coulson believes that it is the shared goals of the participants who voluntarily chose to desegregate, rather than the act of being desegregated itself, that resulted in positive academic achievement. Psychologically, this may be a manifestation of the "locus of control" effect whereby a person does not seek a goal that he or she believes does not fall within his or her sphere of control (Coulson, 1994).

In their meta-analysis of studies on the effects of non-graded schools on student performance, Gutierrez and Slavin (1992) found that the preponderance of evidence indicated that nongraded schools resulted in higher achievement, but that the conclusion was only valid for simple forms of non-grading and not for the more complex forms which exist. In other words, they concluded that the effects of nongrading depend on the form that the nongrading strategy takes. Their analysis suggests that much of the benefit of nongrading accrues from two factors: flexible grouping and flexible timeframes. Further, perhaps, the effectiveness of the nongraded program stems from the increased amount of time for "direct instruction at the students' precise instructional level" (p. 360).

There is evidence that some restructuring efforts can be detrimental to student performance. In a study completed at Pennsylvania State University, Shouse and Mussoline (1999) said, ". . . our data show that it [restructuring] has been disruptive to student performance in poor school districts and especially the very poor. Even in affluent schools, it has had no empirical benefit" (p. 1). Shouse and Mussoline opined that the primary cause for these disruptions lay in two areas: (a) the inherent complexity of most restructuring plans and (b) the demand on resources that such plans make on the institution being reformed. In essence, the effort applied to making complex restructuring work offset any educational gain that might have accrued due to those changes. The more complex the plan, the more disruptive it became, thus adversely affecting the possibility of a positive result. Secondly, restructuring is expensive. Many educators and members of the public believe spending money on educational reform of dubious value is a poor use of scarce monetary resources—particularly when many schools are in such a state of disrepair that the GAO in 1995 reported that one-third of the nation's schools were either unsafe or unsuitable for children (cited in Mehlinger, 1995, p. 51).

Other researchers find that certain types of restructuring make no significant difference at all in student academic success. For instance, Childs and Shakeshaft (1986) and Alspaugh (1993), cited in Coulson (1994, pp. 14-15), find that there is no correlation between educational spending and student achievement. Frequently, educators and politicians look to increase funding as a remedy for education's ills (whatever those may be). The fact that simply spending money doesn't necessarily result in any measurable gains was addressed by Coulson, who hypothesized that this was because money did not guarantee any alteration in instructional practices and often ended up being spent for noninstructional purposes. The similarity between Coulson's findings and those of Shouse and Mussoline is obvious: many restructuring efforts seemed to get sidetracked in the implementation stage.

There does not seem to be a consensus about what works in regard to restructuring. While the evidence presented so far seems to indicate that restructuring is a risky but conditionally effective strategy for increasing student achievement, it is not conclusive for any one particular restructuring method.
Newmann and Wehlage (1995) suggest that the reason for this is that no single reform is sufficient to ensure ongoing success (p. 2). Even in combination, restructuring methods do not always result in the desired increase in student performance. Newmann and Wehlage conclude from their study that:

The quality of education for children depends ultimately not on specific techniques, practices or structures, but on more basic human and social resources in a school, especially on the commitment and competence (the will and skill) of educators, and on students' efforts to learn. (pp. 49-50)

Potential Impacts

Schlechty (1997) suggests that the reason reform has been tried so often is because what the schools were designed to do is no longer serving the needs of American society. The schools were designed to ensure that all citizens will be basically literate (able to decode words), that most will be functionally literate (able to read well), and that a relatively small number (20 percent or less) will be able to meet reasonably high academic standards. This goal has been achieved. However, learning is not the business of schools:

The business of schools is to design, create, and invent high-quality, intellectually demanding work for students: schoolwork that calls on students to think, to reason, and to use their minds well and that calls on them to engage ideas, facts, and understandings whose perpetuation is essential to the survival of the common culture and relevant to the particular culture, group, and milieu from which students come and in which they are likely to function. (p. 11)

Learning in schools, according to Schlechty (1997), like profit in business, is what happens when schools do their business right. In terms of a knowledge economy, it might be said that while computers may create the need for knowledge management and anchor business' ability to put knowledge workers to effective use, it is learning that makes intellectual property, capital, and assets useable (Brown & Duguid, 2000)—and learning is the business of schools, or should be. In other words, learning is the business of learners. The business of schools, therefore, seems to be much the same as the business of business, i.e., knowledge management. The question is whether knowledge management is what today's schools are doing or what they were designed to do.

Schlechty (1997) believes that today's schools are better at doing what they were designed for than they have ever been, but what they were designed for is not what is needed for the 21st century. The system emulated the prevailing 19th century thought about industrial management by being designed with an eye toward centralization of authority and funding (Lane & Epps, 1992). This 19th century structure served its purpose better than most give it credit for, but that does not diminish the fact that it is poorly suited for the 21st century task of knowledge management.

In order for students to engage in the kind of activity Schlechty proposes, they need the same type of support that knowledge workers require. They are both doing what Microsoft CEO Bill Gates calls "thinking work." "Thinking work" is what people do when they find, select, organize, and present information in a new way (Gates, 1999, chapter 13). Integral to thinking work is the ability to "innovate and adapt in the face of change" (Microsoft, 1999). As Schlechty (1997) points out:

America's schools are now being asked to do things they have never done [before] in an environment that is more hostile to supporting quality education than has ever before existed. . . . What educators must do, therefore, is to invent a system of education the like of which has never been seen anywhere in the world: a system of education that provides an elite education for nearly every child. (pp. 14-15)
The Role of Computer Technology

Positive Effects of Computer Technology

Perhaps the most comprehensive study concerning the use of—and potential of—technology in education was completed in 1997 by the President's Committee of Advisors on Science and Technology. In its report on the use of technology to strengthen K-12 education in the United States, the committee found that the use of traditional computer-based learning systems resulted in superior performance by the students using them when compared to students who did not use them. Further, these same students were found to learn significantly faster and to have a more positive attitude toward their classes and toward computers. However, some researchers—including some on the committee itself—questioned the validity of such a conclusion in light of what they term "serious problems" with both the meta-analyses and the studies upon which they were based (President's Committee of Advisors on Science and Technology, Panel on Educational Technology, 1997, p. 42). These problems focused on questions of methodology and interpretation of the results, typical problems faced in any area of educational research. The committee felt that these limitations merely spelled out the need for more research rather than invalidating the results. Such a blanket finding, particularly when based upon such a broad-based study (a meta-analysis of four meta-analyses encompassing a total of 172 studies), would seem to suggest that the use of computer technology, at least in general, is practically mandatory in the interests of effective and efficient education.

Knapp and Glenn (1996), in a meta-analysis of research that represented an aggregate of over 120 studies on the effectiveness of computers in producing
positive educational outcomes, concluded that children favor computers over television because of the interactive nature of computers and that computer-assisted instruction (CAI) leads to higher academic gains. They noted that CAI primarily addressed lower-cognitive material and that research on the effect of computers on higher-order thinking skills (HOTS) was still "emerging." The one caveat Knapp and Glenn placed on their sweeping conclusions is that "computer applications alone do not achieve the results teachers and learners want." Effectiveness resides in CAI being a part of a total program (i.e., instructional milieu).

The California Education Technology Task Force (1996) reported that a 1995 survey of more than 100 studies showed that technology-based instruction "significantly improved student performance" in the core academic disciplines. The same study reported that the U.S. military found that computer-based instruction required 30% less time to achieve its educational goals than did traditional methods (see the executive summary).

In a paper presented to the Conference on Teacher Education and the Use of Technology Based Learning Systems in 1996, J. D. Fletcher surveyed the bulk of research on the effectiveness of technology as a teaching tool and derived from that study ten commonalities regarding instructional technology use: (a) Technology can teach—in other words, technology is more effective than no instruction at all; (b) technology increases instructional effectiveness; (c) technology reduces the time required to reach instructional objectives; (d) technology promotes equity in achievement; (e) technology appears to be equally effective for knowledge and performance outcomes; (f) technology can be used to teach "soft skills" (social or interpersonal skills); (g) interactivity is important (i.e., increased interactivity yields increased student achievement); (h) simulation
requires guidance; (i) students enjoy using technology; and (j) technology lowers instructional costs and appears to be cost-effective. Fletcher does note that "hardware alone does not define an instructional approach—what is done with the hardware is what counts" (p. 2). He also points out that a major problem with assessing innovative technologies is their very nature: because they are innovative, such technologies often have nothing with which they can be compared.

Negative Effects of Computers on Learning

Hawley and Duffy (1998) found that the benefits of computer simulations tended to be diluted when teachers either failed to coach students in problem-solving strategies or played too big a role in the actual discovery process. This coincides with one of the findings of Fletcher (1996) listed previously. The delicate balance required for optimal effectiveness seemed a difficult one for the majority of participating teachers to maintain. Hawley and Duffy's study illustrates a common problem with using modern technology in the classroom: new technologies are not always a good fit with traditional teaching methods, nor with traditional learning theories. In fact, new technologies used in traditional fashion have been shown to have a detrimental effect on academic achievement. Among the findings by Wenglinsky (1998) was the disturbing fact that the use of computers to teach lower-order thinking skills was negatively related to academic achievement and the social environment of the school—at least for eighth grade students.

One of the more high-profile books to attack the use of computers in the classroom, Failure to Connect (Healy, 1998), finds fault with the educational use of computer technology, not the technology per se. Healy argues that (a) computers divert scarce resources from other, more sound, educational disciplines;
(b) computers are used in age-inappropriate ways—especially with younger children; (c) so-called "edutainment" software teaches children more about impulsive pointing and clicking than about thinking; and (d) some software may interfere with the child's natural impulse to learn. Much of the fault for these problems lies, according to Healy, not with the computers themselves, but with the fact that schools do not provide sufficient budgets for technical support or teacher training. These arguments parallel those which characterize school restructuring and are the type of deployment problem that social informatics tries to address (Kling et al., 2000). The similarity in the arguments against the use of computers and those against school restructuring suggests that the two may address the same core structures in modern education. Further, Healy does not contend that computers are bad for children, but that they are bad for children when used improperly. This is the converse of saying that computers are good for children when used appropriately. Again, Jonassen's point that the way in which computers are used is of the greatest consequence is supported.

Like Healy, Oppenheimer (1997) believes that computers divert resources that might be better used for more verifiable change. Oppenheimer bases his belief not so much on what research shows, but on what it doesn't show and on the history of repeated failure of technological innovations to produce lasting change (see pp. 45-46). It is in the closing quote that Oppenheimer reveals the core argument behind his resistance to computers:

The purpose of the schools [is] to, as one teacher argues, "Teach carpentry, not hammer," . . . We need to teach the whys and ways of the world. Tools come and tools go. Teaching our children tools limits their knowledge to these tools and hence limits their futures. (p. 62)
Here again, the problem is not with the computer, but how it is used, as Jonassen has suggested.

The Role of the Computer in School Restructuring

Even granting that each of the major meta-analyses detailed above may have looked at some of the same individual studies, the scope of the total is staggering. The sheer number of individual studies on the effectiveness of computer-based education says volumes about the preoccupation of both the academic world and the general public with this new technology. That in itself might be enough to recommend the use of computers in the classroom but, when combined with the generally positive results of those studies as regards the effectiveness of computer technology upon student achievement, suggests that educational institutions should do whatever is required to deploy that technology as quickly as possible—even if that means radical departures from traditional organizational structures. In fact, it is the popularity of the computer—particularly among businesses and politicians—that argues most soundly for adopting the computer as an instrument of restructuring. David Gelernter, professor of computer science at Yale, in a recent interview for MIT's Technology Review, characterized the computer as follows:

The PC isn't a Swiss Army knife. It's like a hammer. People don't want a million tools. They want a single hammer that can do a million things . . . because it's a tremendously flexible, elegant and powerful tool. (quoted in Tristram, 2001, p. 59)

Because it has been shown to be an effective tool for learning and because it coincides well with the best guess about what students will spend their careers using, the willingness of the power structures in education—
learning from computers (“Computer Assisted Instruction” or CAI). what he calls “Mindtools. they are not using the computer as a tool. The teacher. His analysis of existing computer use produced three such interactions: learning about computers (Computer Literacy). The computer. in his or her role as person in charge of the learning environment. and learning with computers. This notion of instrumentality focuses on the relationship between the user and the computer and the roles each plays in the learning process. and so on. according to the interaction between humans and computers. it is the teacher who. Computer literacy (learning about computers). that is. While knowing how to use the computer is essential to using it in any other way there is little that
. 2000) categorizes computer use in education relationally. The relationship between the learner and the computer is completely one-sided: only the student is active and only to a certain extent. in this case. The teacher plays a more active role than does the student and the body of knowledge to be learned is set and static. determines how the computer is to be used. how to use it. in this case. what the various constituent parts are called. When students learn about the computer. Computer Use Categories Theoretical Basis: Jonassen Jonassen (1996. plays an external role.” The keys to understanding Jonassen’s differentiations is in the instrumentality accorded the computer and linked to the role of the learner. usually in terms of defining the instrumentality. That is. its instrumentality. is the object of the learning—it is what the student is learning.31 which might otherwise resist other forms of restructuring—to make computers available allows educators to better meet the needs of 21st century students.
is required of the student in the way of higher order thinking skills (see Jonassen, 2000).

Computer assisted instruction (CAI) (learning from computers). In computer assisted instruction (CAI) the computer takes on a significant portion of the teacher's role, and the student responds to the computer's lead, with the teacher primarily demanding accountability. The learner is active but not in charge of the learning. The computer is generally "in charge," and the body of knowledge that can be learned is, as before, predetermined and static.

Mindtools (learning with computers). It is only in the last of the three interactions between students and computers that the role of the learner takes precedence. In fact, the role of the computer is reversed from that in CAI: it is the learner who is in charge and the computer that responds to the learner. Both the computer and the student are now involved in the learning process, and the body of knowledge that can be learned is essentially unlimited, because the student uses the computer to discover or construct it. The teacher's role becomes that of a designer of the learning environment, a problem poser, and a holder of accountability; teaching takes on the role of the proverbial "guide on the side" so emphasized in constructivism. Jonassen (2000) elaborates on the basic concept of the computer used as a Mindtool:

Mindtools are computer-based tools and learning environments that have been adapted or developed to function as intellectual partners with the learner in order to engage and facilitate critical thinking and higher order learning. (p. 9)

[A] Mindtool is a concept . . . [which] . . . represent[s] a constructivist approach for using computers or any other technology, environment, or activity to engage learners in representing, manipulating, and reflecting on what they know, not reproducing what someone tells them. (p. 10)
Categories of Computer Use

Using Jonassen's three computer interactions as a basic model, and adding the defining elements of role of the learner, role of the computer, and locus of control (who is "in charge"), categories for "computer use" in the meta-analysis can be created, regardless of what label might have been placed on the use of the computer in the sample studies. By creating categories based upon the above characteristics, instructional situations involving computers can be grouped with others of similar functionality. The categories of computer use so created appear in Table 1. These categories were later adapted for use in the meta-analysis and will be discussed in Chapter 3.

Instructional Techniques Categories

Unlike the categories of computer use, which were preselected and relatively limited in scope, the variety of instructional techniques that might appear in the sample studies is potentially so large that grouping them ahead of time would be counterproductive. Consequently, creating categories for grouping according to instructional technique was reserved until after the study sample was finalized and the actual techniques were inventoried. The categories for instructional technique, as finally used in the meta-analysis, will be introduced in Chapter 3.

Restatement of the Research Question

The educational needs of 21st century students are not being met, and computer technology seems to hold some potential for meeting those needs. The purpose of this study was to explore how educators can more effectively use computer technology to meet the needs of present-day students. This can be
Table 1

Categories of Instructional Use of Computer Technology

Computer Literacy
Learning to use the computer: hardware use, software use, networking, programming, etc.

Computer Assisted Instruction (CAI)
Defined, static content

                       Delivery      Tutoring         Tutoring         Assessment
                       Mechanism     Mechanism I      Mechanism II     Mechanism*
Role of the Learner    Passive       Active           Active
Role of the Computer   Passive       Active           Active
Locus of Control       Third Party   Program control  Learner control

*Testing tool: a special category of tutoring mechanism.

Computer as Tool
Something is produced

                       Communications  Productivity     Mindtool         "Living Tool"
                       tool            tool
Role of the Learner    Active          Active           Active           Active
Role of the Computer   Passive         Active           Active           Active
Locus of Control       —               Learner control  Learner control  Shared control

Note: These categories of computer use will be adapted for use in coding the studies included in the meta-analysis (Chapter 4). See Loveless, A., DeVoogd, G. L., & Bohlin, R. (2001). Something old, something new . . . Is pedagogy affected by ICT? In A. Loveless & V. Ellis (Eds.), ICT, pedagogy and the curriculum: Subject to change (pp. 71-77). New York: RoutledgeFalmer.
accomplished by answering the following question: Is there one combination of computer use and instructional technique that leads to greater student achievement than any other such combination? Implied within this question are two others: Which use of computer technology leads to the greatest student achievement? And is there a particular instructional technique that, when applied to the use of computers in education, contributes to greater student achievement? In light of the large number of existing studies that address either the effect of computers or of instructional techniques on student achievement, it seems possible that the answers to the questions posed above might be discovered among the results of those studies. Consequently, this study will apply the technique of meta-analysis to a sample of those studies in an attempt to derive a preliminary answer.
Chapter 3

METHODOLOGY

Introduction

The purpose of this thesis is to explore how educators can most effectively use computer technology to meet the educational needs of 21st century students. The preceding literature review suggests that today's students have different educational requirements than students in the past, and that those differences call for a change in the instructional milieu in which they are educated. It was suggested that the instructional milieu required for the new millennium be centered around the use of computers and brain-based educational strategies. The assumption, based upon the Review of the Literature, is that computers, in general, have a positive effect on student learning, but that the effect is determined more by how the computer is used than simply by the fact that it was used.

Though it has been established that computers seem to facilitate learning under a variety of circumstances, previous research syntheses have tended to focus on the effect of computers in relation to particular subject matter areas or on the use of particular computer software types or functions. In that respect, these studies are somewhat isolated and undifferentiated. This thesis seeks a somewhat different set of data: the combined effect of the use of computers and particular teaching strategies or techniques on learning or student achievement. That is, this study seeks to ascertain the effective use of computers as instructional tools, not merely the effect of the tools themselves. If the foregoing assumption is in fact true, then the resultant effect on measurement of student
achievement derived from the instructional use of computers will vary according to changes in the way they are used. This variance should be accompanied by a corresponding change in estimated effect size. Thus, this study used a convenient study sample (i.e., a convenience sample), derived from a finite study population, to explore the foregoing notion using the statistical technique of meta-analysis.

Design of the Study

The empirical research design of this study centers around the use of an exploratory meta-analysis as an analytic procedure to estimate the instructional effect that computers, used under various instructional strategies, have on student learning. A meta-analysis, according to Glass (1976, 1978b), compares the results of individual studies by translating those results into a standardized metric he called "effect size." Briefly, an effect size is a proportion that compares the difference between the means of two sample distributions as measured in standard deviations. The two distributions can be either a control group and a treatment group (also called an experimental group), or the pre-treatment and post-treatment performances of the same group. By comparing the difference or change between the means of the two groups in terms of standard deviations, the effect of the treatment on the experimental or post-treatment group can be calculated. The advantage of this translation is that the resulting effect sizes can be used to compare studies that use different dependent measures. By comparing the estimated effect sizes of various combinations of computer use and instructional technique, some idea of which such combinations are more effective for student learning can be estimated. Effect size is calculated according to Glass's (1976) formula, expressed mathematically as (2):

Glass's d = (Me – Mc) / sdc    (2)

where Me = mean of the experimental group, Mc = mean of the control group, and sdc = standard deviation of the control group.

Outline of the Procedure

Cooper and Hedges (1994) outline five major steps to conducting a meta-analytic research synthesis. The five steps are:

1. Problem Formulation Stage: Primary research must exist, consisting of "a minimum of two hypothesis tests" (p. 11).
2. Data Collection Stage: Identify, locate, and retrieve all relevant study documents.
3. Data Evaluation Stage: Coding the literature; "missing data will arise in every research synthesis" (p. 11).
4. Analysis and Interpretation Stage: Estimating the magnitude of an effect—the degree to which the phenomenon is present in the population or the degree to which the null hypothesis is false (p. 9).
5. Public Presentation Stage: Assembling and presenting the results of the analysis.

With a few minor modifications, this is the procedure followed in the present study.
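Glass's formula above—the difference between group means divided by the control group's standard deviation—can be illustrated with a brief computational sketch. The code below is not part of the thesis's own analysis (which relied on Schwarzer's meta-analysis programs, discussed later); the means and standard deviation in the example are invented purely for illustration.

```python
def glass_d(mean_exp: float, mean_ctrl: float, sd_ctrl: float) -> float:
    """Glass's d: the difference between the experimental and control group
    means, expressed in control-group standard deviation units."""
    return (mean_exp - mean_ctrl) / sd_ctrl

# Hypothetical data: treatment group mean 78, control group mean 72,
# control group standard deviation 12.
print(glass_d(78.0, 72.0, 12.0))  # 0.5
```

Because the denominator is the control group's standard deviation, the result is read directly: here the treatment group scored half a control-group standard deviation above the control group.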
Problem Formulation

There is a body of extant research on many of the individual aspects of the problem addressed by this study, and that reservoir of data, or universe, provides ample material upon which to conduct a research synthesis. Unfortunately, many of the studies forming this universe are in direct contradiction to other studies or involve dramatically different study populations. Both instances create difficulties for those looking for some point of consensus or mutual agreement. That, however, is what a research synthesis strives to locate: "Any research synthesis should allow the researcher to see patterns across studies that are not apparent when studies are examined individually or serially" (Cooper & Hedges, 1994, p. 360). Such a research methodology exists: the meta-analysis.

Light and Pillemer (cited in Hedges, 1994a) identify two types of questions or hypotheses that can be asked in a research synthesis. The "Type 1" question refers to a precisely specified hypothesis posed in advance of the analysis: for example, "On average, does this treatment work?" The Type 2 question, on the other hand, asks a vague question intending to derive a more explicit hypothesis from the data gathered and analyzed: for example, "Under what kind of conditions does the treatment work best?" (Type 1 and Type 2 research questions should not be confused with Type I and Type II errors in statistical analysis.) The Type 2 question allows one to modify the research hypothesis as a greater understanding of the subject is developed in response to the information gathered. Normally, as Cooper points out, "in primary research, [this] redefinition of a problem as a study proceeds is frowned on. In research synthesis, it appears that some flexibility may be necessary and may indeed be beneficial [italics added]" (Cooper, 1984, p. 55). Research hypotheses, in other words, may emerge as the data "speak for themselves" in this thesis.
In other words, a Type 2 question was posed, namely: "Under what instructional strategies (conditions) do computers (the treatment) produce the best results?"

Data Collection

Rationale for data selection. "The most egalitarian sources of literature are the reference database systems such as PsychINFO, ERIC and MEDLINE. Still, these broad, non-evaluative systems exclude the unpublished and most recent literature" (Cooper & Hedges, 1994, p. 10). The ERIC database provides an accessible, convenient, and replicable study population. By using the ERIC database with a defined time frame of studies to choose from, it becomes possible to compare the results of this meta-analysis with future meta-analyses composed of groups of studies which precede and succeed this one.

Criteria for document selection. The criteria for document selection used in this study are shown below; explanations for the establishment of the criteria follow.

Criterion 1: The report must have been published on the ERIC Document Retrieval Service (EDRS).

Criterion 2: EDRS search criteria
1. The word "Computer" must be found in the document's abstract.
2. Only studies published since 1997 will be considered.
3. Studies must be research documents.
4. Studies must be English language documents.

Criterion 3: Initial screening of the documents returned by the search
1. The use of computers was recorded.
2. The instructional strategy was identified or inferred.
3. Student achievement was recorded.

Criterion 4: Final stage of selection, which required that
1. the study identified either a control group and a treatment group, or a pre-test/post-test assessment of a treatment group; and
2. the necessary statistical data were reported (i.e., F, d, z, r, df, N, t).

The search was limited to studies published since 1997 for two reasons: first, to allow for the possibility that, with the passage of time, a larger number of educators will have become technologically literate, leading to a greater variety in the instructional milieu surrounding the use of computers in the classroom; and second, to take advantage of any advances in computer technology that might have a meaningful impact on its use in the classroom. The intent is to avoid, as much as possible, the limitations that earlier, less powerful computer technologies might have placed on instructional choices.

The ERIC database was chosen as the study population from which to extract the specific study samples. ERIC was chosen over PsychINFO and MEDLINE because it specializes in educational studies, including instructional technology, whereas PsychINFO is focused on psychological issues, of which learning is only a part, while MEDLINE is oriented around medical research. Computing, for search purposes, includes more than just computers; it includes such categories of computer use as distance education (using computers via the Internet or other network), on-line education, computer-based training (CBT), computer-based instruction (CBI), computer assisted instruction (CAI), multimedia, software, hardware, peripherals, networked environments, and the World Wide Web.
Lastly, the documents retrieved from the EDRS were further limited in the following ways: only documents located on January 28, 2002 were included in the study, and only documents purporting to be experiments, quasi-experiments, or correlational studies were considered.

Importance of document selection. In some ways, the selection of documents to be included in a meta-analysis is the most important step in the process. The studies which are included and excluded greatly determine both the statistical outcomes and the subsequent conclusions drawn from those outcomes (Thompson, 1993). Slavin (1986) argues for using the "best evidence"—that is, leaving out seriously flawed studies. Limiting the study universe to those studies found in the ERIC database as of a particular date (in this case January 28, 2002), on the other hand, runs contrary to Glass's concept of the meta-analysis as exhaustive study (Glass, McGaw, & Smith, 1981). Slavin's point is reinforced by White (1994):

The point is not to track down every paper that is somehow related to the topic. [italics in original] . . . The point is to avoid missing a useful paper that lies outside one's regular purview. . . . Research synthesists who reject this idea are quite sensible. (p. 44)

This thesis will take the "sensible" view that a well-documented, albeit limited, study universe of "useful papers" is contained within the ERIC database. This represents a selection criterion somewhere between Slavin's (1986) "best evidence" and Glass's (Glass, McGaw, & Smith, 1981) "exhaustive inclusion." The mediating value of publication, as well as the criteria for inclusion in the ERIC database, lend to each included document the cachet of respectability, if not actual validity in the statistical sense. It is also appropriate to point out that the notion of a limited study population has gained acceptability in other situations. For instance, medical
researchers—from whom meta-analysis originated—often use what is called a cross-sectional (prevalence) study. Cochrane (2000) defines a cross-sectional study as:

[a] study that examines the relationship between diseases (or other health related characteristics) and other variables of interest as they exist in a defined population at one particular time [italics added]. The temporal sequence of cause and effect cannot necessarily be determined in a cross-sectional study. (p. 8)

That description closely fits the present study population.

Document selection. In general, the procedure for determining which studies to include was a process of reduction. Documents were eliminated or retained through the progressive application of four categories of evaluative criteria: conceptual criteria (Does the study involve the basic conceptual elements being sought?); categorical criteria (Does the study include the required categories of background information and variables?); statistical criteria (Does the study report include statistical results that yield information about the conceptual and categorical criteria?); and, lastly, meta-analytic criteria (Does the study report include the statistical information required by the meta-analysis?). The movement is from general to specific and from aggregate to disaggregate. Each "level" of selection involves examining documents for evidence of the criteria required to retain them for the next level of selection. At each selection level in the present study, the number of documents declined while the time spent per document increased. This was occasioned by the progressively exacting nature of the criteria being applied.

From the original 825 document titles returned by the search of the ERIC database on January 28, 2002, a numerical list of descriptions by ERIC Database Retrieval Number was finalized on February 7, 2002 from the raw web pages
saved off the Internet. This document list formed the population from which the meta-analytic study sample was formed. As a precaution, to ensure that every possible applicable document was found, all of the ERIC database document descriptions for these studies were examined in several stages that yielded a list of possible studies to be included in the meta-analysis. Reduction at the first level was according to "Conceptual Criteria": if the description of a study did not immediately eliminate it, the study was retained for the next level of examination. Those studies that were not obviously dismissed because the subject was inapposite (i.e., did not involve student learning) or because the actual study date (as opposed to the publication date) lay outside the established time frame were more closely examined by scanning their abstracts; many of these were initially identified as "marginal" or "peripheral." Once reduction for conceptual relevancy was completed, only 191 of the original 825 studies remained (23.2%).

Next, the abstracts for these documents were located, downloaded, and saved on a CD-ROM, and the abstract of each of the 191 studies that met the conceptual criteria was scanned to identify those which were likely to contain the necessary categorical information. The actual results of each study (i.e., "significant" or "not significant") were not considered as criteria for retention; what mattered was merely that the abstract suggested the study contained the requisite categorical data and that the variables reflected the interest of the thesis. This initial reduction by category narrowed the list of potential studies to 130. A more thorough reading of the abstracts further reduced to 81 the actual number of study documents to be purchased (9.8% of the original 825).

Studies were excluded because they were obviously nonempirical (policy statements, theoretical treatises, literature reviews, etc.), because they employed phenomenological inquiry techniques (i.e., subjective or introspective reporting—not to be confused with the related philosophical movements from which this inquiry method is derived; see Eliasmith, 2001), because they dealt with affective or attitudinal matters, or because they did not deal with learning. Particularly numerous were studies of "computer use," meaning time spent using a computer without regard to how it was used.

Data Evaluation

Rationale for coding procedures. Only information deemed essential to the purpose of the thesis was coded (Cooper & Hedges, 1994), and the coding list was intentionally kept simple for clarity. The following information about each study was deemed essential to the meta-analysis and was given priority during coding:

1. The reported or calculated effect size
2. The assessment instrument
3. The instructional technique used (drill and practice, inquiry learning, lecture, etc.)
4. The computer activity or use (simulations, multimedia, word processing, etc.)
5. Activity information (projects, reports, etc.)
6. Instructional level: K-3, 4-6, 7-8, 9-12, College/University, Adult
7. Control/treatment or pre/post-test design

Coding of studies. The coding process is generally the lengthiest part of a meta-analysis. A subset of the coded variables—those most relevant to the goal of this thesis—was used in the actual meta-analysis; the remainder were coded in case they were needed during the interpretation phase.
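The essential variables listed above amount to one record per study. The sketch below pictures such a record as a small data structure; the field names and sample values are illustrative only and do not reproduce the thesis's actual Excel column headers (the CU and IT code labels are those defined in Tables 2 and 3 below).

```python
from dataclasses import dataclass

@dataclass
class StudyRecord:
    """One coded study (an illustrative sketch, not the thesis's spreadsheet)."""
    eric_number: str              # study characteristics
    citation: str
    study_type: str               # experiment, quasi-experiment, correlational
    n_subjects: int               # population characteristics
    grade_level: str
    random_selection: bool
    subject_matter: str           # environmental characteristics
    treatment_length: str
    computer_use: str             # conceptual: CU1, CU23, CU45, CU678
    instructional_technique: str  # conceptual: CL, AO, CO, DE, PA, LC, OT
    statistic_type: str           # statistical characteristics
    statistic_value: float

# Hypothetical entry (the ERIC number and all values are invented).
record = StudyRecord("ED999999", "Hypothetical (1999)", "experiment",
                     48, "7-8", True, "mathematics", "12 weeks",
                     "CU678", "CL", "F", 4.21)
print(record.computer_use, record.instructional_technique)  # CU678 CL
```

Keeping every coded field in one record per study is what later allows the same data to be re-sorted by computer use, instructional technique, or grade level without re-reading the source documents.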
Coding took place directly on the computer using Microsoft Excel. A spreadsheet was created with categories for study characteristics (ERIC document number, citation, type of study), population characteristics (number of subjects, age/grade, selection basis—random or not), environmental characteristics (subject matter studied and length of treatment), conceptual characteristics (Instructional Technique and Computer Use—see below), and statistical characteristics (independent/dependent variable). Five values for Computer Use (CU) were used to code each study (see Table 2), and seven values for Instructional Technique (IT) were employed (see Table 3). For convenience's sake, each numerical code for CU and IT was also assigned an alphanumeric identification label; these labels will be used in the narration rather than the numerical codes.

Particular attention was paid to consistency in coding. Each paper was fully coded twice, with a third partial coding focusing on the statistical data. By recoding each study a second time, consistency was improved, since coding papers provides its own practice and practice helps produce proficiency. The second coding of the studies that were the first to be coded was especially important, as the increased proficiency developed after coding the entire 37 studies resulted in numerous adjustments to the initial coded data. Lastly, relevant statistics were extracted or calculated for each study. Since only one coder was used, there were no intercoder reliability problems; it must be acknowledged, however, that the use of only one rater is a limitation on the study. On the other hand, the chance that coding errors might influence the outcome of the meta-analysis is minimal.

The final error type commonly arising from coding activity involves mistakes in coding. Three types of errors can occur due to mistakes: errors of omission (not finding relevant information that was included in the study report),
Table 2

Computer Use (CU) Coding Guide

Category    Code     No. of Studies
Delivery    CU1      10
Tutoring    CU23     12
Other       CU45     3
Tool        CU678    7

Table 3

Instructional Technique (IT) Coding Guide

Category                                            Code
Collaborative Learning (cooperative learning)       CL
Advance Organizers/informed strategy training       AO
None/computer used as delivery agent                CO
Distance Education/distributed learning             DE
Process Aid/Formative Feedback                      PA
Learner Control/problem-solving (constructivist)    LC
Other                                               OT
errors of transcription (incorrectly transcribing numbers or categorical information), and errors of application (incorrectly applying otherwise correct information).

The first of these errors was minimized by the multiple coding process (three times for statistical data); consequently, it is expected that very little relevant data were omitted. Errors of transcription are more difficult to catch. To reduce this type of error, data were entered directly into the Excel spreadsheet with the study in hand, and the figures were double-checked. Particular care was taken with the sample and population size values, as the addition of even one to the value of n can have a dramatic effect—especially on effect size and tests of homogeneity.

The last of the three types of errors is the hardest to address. The impact of any errors in application that may have been made was minimized by averaging the aggregate of the multiple statistics. Orwin (1994) points out that "it is a psychometric truism that averages of multiple independent ratings will improve reliability (and therefore reduce error) relative to individual ratings" (p. 145). In addition to the normal averaging process, the meta-analysis software programs further corrected for application errors by employing calculations using estimations of reliability and range departure. The Hunter and Schmidt method corrects for this lack by "estimating the corrections for reliability and range departures by constructing distributions for the independent and dependent variables" (Lyons, 1998, p. 5); the meta-analysis software performed these corrections. Reliability and range departure information were not always available in the included studies and, short of contacting the authors, impossible to obtain; that information was simply not entered into the coding database.

The last step in processing the individual studies was to check each of the printed study manuscripts for the presence of the statistical information needed to
calculate effect size. The presence of F, t, z, or r statistics was desired, although a probability coefficient coupled with the mean and standard deviation could suffice. In addition, where two or more reported statistics could be used, the sample and population sizes were required. The final reduction used a combination of statistical and meta-analytic categories and took place as part of the coding process. After reduction by statistical categorization was completed, a total of 37 studies remained (4.5% of the original 825 returned by the search of the ERIC database). While coding, two additional studies were eliminated, and an additional four studies were discarded during the conversion to common statistical measures because the data necessary for the conversion could not be found or interpolated. The final number of included studies was thus 31 (3.8% of the original search returns). The final study documents included in the study are listed in ERDC number order in Table 4.

Identifying variables. Each study was represented by one identified Instructional Technique (IT) as the independent variable (where available) and by student achievement as the dependent variable. Computer use (CU) was treated more like a constant (that is, computer use was part of every treatment) than as a second independent variable. The selected study statistics were chosen with the intention of most closely measuring the combined effect of computer use and instructional technique. Since only one statistic was selected from each study, the following priorities were followed in determining which one would be included:

1. Complete ANOVA statistics were reported; if not, then
2. The reported statistic was an r effect size;
3. When more than one statistic met the above criteria, the smallest effect size was selected as a conservative measure;
4. Coder judgment.

Converting to common statistical measures to create study-level summary statistics. The next step was to convert each study statistic into a common measure (Egger, Smith, & Phillips, 1997; Lyons, 1998; Schwarzer, 1991). Most studies yielded an F statistic (analysis of variance); 13 required extrapolation of the df for the F statistic, while five did not include either an analysis of variance or the individual statistics necessary to calculate it. Four of those five studies reported a t-test or a z-score, and one reported only an r effect size statistic. Four studies were eliminated from the meta-analysis at this point because they did not contain the data necessary to generate the measures required for the meta-analysis. Two of those studies involved large sample populations and were performed by commercial statisticians as part of a government-supported study; they reported only aggregate findings and summary statistics, neither of which could be used to generate the required values.

Compiling study population data. After all the relevant study information was coded and entered into the Excel spreadsheet, and prior to performing any comparative statistics on the constituent studies, an examination of the study sample was conducted and the results were placed in tables for use later during the analysis phase of the meta-analysis. Those results are exhibited in the following tables: Number of Included Studies by Year of Publication (Table 5), Number of Studies per Subject Age Group (Table 6), Independent and Dependent Variables for Each Study (Table 7), Study Independent Variable CU: Computer Use by Year (Table 8), and Study Independent Variable IT: Instructional Technique by Year (Table 9).
Table 5

Number of Included Studies by Year of Publication

Year of Publication    Number of Included Studies    Percentage of Total
1997                   6                             20%
1998                   8                             26%
1999                   11                            36%
2000                   5                             16%
2001                   1                             3%
Total                  31                            100%

Table 6

Number of Studies per Subject Age Group

Age Group                      Common Designation    Number of Included Studies
Kindergarten – Grade 2         Primary               3
Grades 3 – 5                   Intermediate          4
Grades 6 – 8                   Middle School         8
High School (Grades 9 – 12)    High School           2
College Undergraduate          College               15
College Graduate               Graduate School       1
Adult                          Professional          1
Total                                                34*

Note: *Three studies combined Grades 5 & 6 or Grades 5 through 8, giving a total greater than 31.
Table 8

Study Independent Variable CU: Computer Use by Year

CU Code     1997    1998    1999    2000    2001    Totals
Delivery    4       4       2       0       0       10
Tutoring    1       3       6       2       0       12
Other       0       0       0       1       1       2
Tool        1       1       3       2       0       7
Total       6       8       11      5       1       31

Table 9

Study Independent Variable IT: Instructional Technique by Year

Category                      1997    1998    1999    2000    2001    Total
Collaborative Learning        2       2       2       2       0       8
Advance Organizers            1       0       1       0       0       2
Computer as delivery agent    0       4       4       3       0       11
Distance Education            0       0       2       0       0       2
Process Aid                   2       0       1       0       1       4
Learner Control               1       2       1       0       0       4
Other                         0       0       0       0       0       0
Totals                        6       8       11      5       1       31
Transforming to r. Mixing apples and oranges has been a criticism of meta-analysis in the past, and meta-analysts have typically dealt with that problem by using a number of corrections, weights, and controls to overcome it. By far the best way is to treat apples and oranges as fruit where possible and compare them only in respect to their characteristics as fruit (Glass, 1978b; Glass, McGaw, & Smith, 1981; Smith, Glass, & Miller, 1980). This creates the statistical effect of changing all the different fruits in the meta-analysis basket to apples, so that apples can be compared to apples.

To that end, Pearson's r correlation coefficient was used as the common measure of effect size: it is unit-free, as is d; it remains consistent across sample types; it requires no computational adjustment; and it is more simply interpreted (Rosenthal, 1990, 1991, 1994; Schwarzer, 1991). Statistically speaking, the r measure of effect size is more general in interpretation than d, making it useful in a broader variety of cases. From a practical standpoint, it was the easiest and most accessible conversion to make using the software available. Categorically speaking, the r statistic is appropriate to use when the means of two groups are being compared (i.e., control/treatment
In a small number of cases. the r statistic is appropriate to use when the means of two groups are being compared (i.. 1991). Schwarzer. 1994. & Miller. 1990. By far the best way is to treat apples and oranges as fruit where possible and compare them only in respect to their characteristics as fruit (Glass. an effect size r statistic (correlation coefficient) was generated for each by transformation of a t or z or by conversion from F. making it useful in a broader variety of cases. Schwarzer. Rosenthal. Once the statistics for each study were identified and coded. 1978b. it requires no computational adjustment. Mixing apples and oranges has been a criticism of meta-analysis in the past and metaanalysts have typically dealt with that problem by using a number of corrections. Smith. weights.
1)(sc)2 ) / ( ne + nc . effect sizes.1)(se)2 + (nc . as was the case in every included study. Ralf Schwarzer’s (1991) meta-analysis software programs were used to convert the study level data into an effect size estimate r for each study.2) (4)
. Tests of homogeneity ? 2 and the fail-safe N were conducted simultaneously. Lastly.57 or pre/post-test) or when the variables are continuous. Standardized mean difference d is computed by subtracting the mean of a control group from the mean of an experimental group.e. the Binomial Effect Size Display (BESD) was calculated for selected effect sizes to aid in explaining what the effects sizes represented.Mc ) /SD (3)
Where SD is the square root of the weighted average of the two pooled variances calculated as follows: s2 = ( (ne . mean effect sizes were calculated for all sub-groups of the study sample grouped according to computer use and instructional strategy. and the fail-safe N for additional sub-groups were calculated as necessary to identify the relative effect of various treatments (i.. Next. One estimator of effect size used in this meta-analysis is standardized mean difference as proposed by Glass (1976). Reporting effect size. Instructional Technique and Computer Use). and dividing the result by the pooled standard deviation of both groups: d = (Me . tests of homogeneity. Calculating Mean Effect Sizes The final step was to calculate the mean effect size for the entire 31 study sample to derive an overall effect. Next.
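In code, the transformations above can be sketched as follows. This is an illustrative reconstruction, not Schwarzer's implementation: the function names are mine, and the F-to-r conversion assumes a one-degree-of-freedom numerator (i.e., a two-group comparison, where F = t²).

```python
import math

def r_from_t(t, df):
    # r = sqrt(t^2 / (t^2 + df)) for a two-group comparison
    return math.sqrt(t ** 2 / (t ** 2 + df))

def r_from_F(F, df_error):
    # valid only for a one-df numerator (two-group ANOVA), where F = t^2
    return math.sqrt(F / (F + df_error))

def r_from_z(z, n):
    # r = z / sqrt(N)
    return z / math.sqrt(n)

def glass_d(me, mc, se, sc, ne, nc):
    # Formulas (3) and (4): pooled variance, then standardized mean difference
    pooled_var = ((ne - 1) * se ** 2 + (nc - 1) * sc ** 2) / (ne + nc - 2)
    return (me - mc) / math.sqrt(pooled_var)
```

For example, a two-group study reporting F(1, 38) = 6.08 would yield r = sqrt(6.08 / (6.08 + 38)), roughly .37.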
Hedges and Olkin (1985, p. 80) show that d, as calculated above, has a small sample bias. They remove this bias using the following correction, yielding what they term the unbiased effect size estimator d:

di = (1 - 3 / (4N - 9)) gi    (5)

Formula 5: Hedges's corrected effect size

This is the effect size measure calculated by Schwarzer's (1991) Meta-Analysis Programs and reported in this meta-analysis as d. For the purposes of this study, the more commonly recognized symbol d will be used to indicate Hedges's gi for effect size.

Tests of homogeneity. Homogeneity, the degree to which effect size estimates "exhibit greater variability than would be expected if their corresponding effect size parameters were identical" (Cooper & Hedges, 1994, p. 536), was calculated using the formula for χ² introduced by Rosenthal (1984, p. 77):

(6)

Formula 6: Where k is the number of effect sizes

According to Hedges and Olkin (1985), "The observed variability in sample estimates of effect size is partly due to the variability in the underlying population parameters and partly due to the sampling error of the estimator about the parameter value" (p. 191). Schwarzer's Meta-Analysis Programs separate the observed effect size variance into both parts using the formulas found in Hedges and Olkin (1985, p. 194), following the Schmidt-Hunter method of computing sampling error variance (Hunter, Schmidt, & Jackson, 1982; Schwarzer, 1991, p. 44). The population variance is computed by subtracting the sampling error variance from the observed variance:

population variance = observed variance - sampling error variance    (7)

The percentage of the observed variance accounted for by sampling error is then computed by Schwarzer's Meta-Analysis Programs as shown in (8):

(sampling error variance × 100) / observed variance = % of observed variance accounted for by sampling error    (8)

The resulting percentage measures the degree of homogeneity of the sample data set. If all of the observed variance (i.e., 100%) is accounted for by sampling error, the study data sets are homogeneous. If not, then the residual variance is due to other (i.e., systematic) factors, and the presence of moderator or mediator variables (study characteristics, etc.) is indicated (Kenny, 1999; Schwarzer, 1991, p. 14). The sampling error variance is estimated as in (9):

s²e = ((1 - r̄²)² · k) / N    (9)

where r̄² is the squared weighted mean of the effect sizes, k is the number of studies, and N is the total sample size. Subsequently, the residual population variance is obtained by subtracting the sampling error variance s²e from the observed variance s²r:

s²res = s²r - s²e    (10)

The population variance s²res is also called the residual variance, and its square root is called the residual standard deviation sres (Schwarzer, 1991).
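The decomposition in formulas (7) through (10) can be sketched as follows. This is an illustrative rendering of the Schmidt-Hunter computation; the function names are mine, not Schwarzer's.

```python
def sampling_error_variance(mean_r, k, total_n):
    # Formula (9): s_e^2 = ((1 - mean_r^2)^2 * k) / N
    return ((1 - mean_r ** 2) ** 2) * k / total_n

def residual_variance(observed_var, mean_r, k, total_n):
    # Formula (10): s_res^2 = s_r^2 - s_e^2
    return observed_var - sampling_error_variance(mean_r, k, total_n)

def pct_accounted_by_sampling_error(observed_var, mean_r, k, total_n):
    # Formula (8): percentage of the observed variance due to sampling error;
    # 100% indicates a homogeneous set of studies
    return 100 * sampling_error_variance(mean_r, k, total_n) / observed_var
```

A percentage well below 100 (the thesis reports 22% across all studies) signals that moderator or mediator variables are at work.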
According to Schwarzer (1991), the "variance of a single effect size depends on its sample size" (p. 29). The program calculates this value using the formula provided by Hedges and Olkin (1985, p. 86, Formula 14):

estimated s²(di) = (ne + nc) / (ne · nc) + di² / (2(ne + nc))    (11)

Intervening and interactive variables. Though there seems to be some disagreement among statisticians regarding semantics and terminology when referring to both interactive and intervening effects, this thesis will adopt the terminology and distinctions suggested by Baron and Kenny (1986). In general terms, the core concepts focus on why, when, and how (and to what degree) the relationship between an independent variable and its associated/related dependent variable is changed. Baron and Kenny (1986) distinguished between moderator and mediator variables when assessing the effects of influencing factors upon the relationship between independent and dependent variables. According to Baron and Kenny (1986):

In general terms, a moderator is a qualitative (e.g., sex, race, class, etc.) or quantitative (e.g., level of reward) variable that affects the direction and/or strength of the relation between an independent or predictor variable and a dependent or criterion variable. Specifically within a correlational analysis framework, a moderator is a third variable that affects the zero-order correlation between two other variables. . . . In the more familiar analysis of variance (ANOVA) terms, a basic moderator effect can be represented as an interaction between a focal independent variable and a factor that specifies the appropriate conditions for its operation. (p. 1174)

A mediator variable, on the other hand, "may be said to function as a mediator to the extent that it accounts for the relation between the predictor and the criterion." Baron and Kenny go on to further distinguish between the two: "Whereas moderator variables specify when certain effects will hold, mediators speak to how or why such effects occur" (p. 1176). Hunt (1997) characterizes the difference as follows:

Moderator variables, as we have seen, exert a direct influence upon the dependent variable from an external position. . . . Mediator variables, though less obvious and less often studied, play a role that is just as significant, if not more so. They are the effects of the independent variable which become causes of the change in the dependent variable. Mediators explain how external physical events take on internal psychological significance. (p. 51)

In general, a moderator variable may either accentuate or minimize the influence that the independent variable (treatment or intervention) exerts over the dependent variable (outcome), exerts its influence from an external position, and represents a non-linear effect. A mediator variable is, on the other hand, a link in a causal chain between independent and dependent variables and represents a linear effect. It might be said that interaction effects are occasioned by moderator variables and intervening effects are caused by mediator variables (Kenny, Kashy, & Bolger, 1998). Graphically, the two can be represented as in Figures 1 and 2:

Figure 1. Intervention variable. A = Independent variable, B = Mediator variable, C = Dependent variable

Figure 2. Interaction variable. IV = Independent variable, DV = Dependent variable, MV = Moderator variable

According to Pearl (1998), "when there is a third variable Z that influences both X and Y, such a variable is then called a 'confounder' of X and Y." He prefaces this statement by observing that "the presence of spurious association, due for example to the influence of extraneous variables, is called confounding as it tends to confound our reading and to bias our estimate of the effect studied" (p. 108).

Fail-safe N. Rosenthal (1984) described what he called the "file-drawer problem," which assumes that, in any given meta-analysis universe, an unknown number of non-significant studies with effect sizes of zero have either not been submitted for publication (reporting bias) or have been rejected (publication bias) and, so, have remained in file drawers somewhere. According to Rosenthal, it is possible to estimate the number of additional studies that would be required to reverse the overall p to a value higher than significance (Rosenthal, 1979; 1984, p. 108; Wolf, 1986, p. 38). Schwarzer's program uses formula (12) to estimate how many no-effect findings would have to exist in the file drawers in order to invalidate a significant overall p. The fail-safe N calculates the number of these non-significant file-drawer studies required to bring the mean effect size down to a non-significant level as well.
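The thesis relies on Schwarzer's implementation of formula (12). As an illustrative stand-in, Rosenthal's (1979) combined-significance version of the fail-safe N can be sketched as follows; it may differ in detail from the formula Schwarzer's program uses (here a one-tailed α of .05 is assumed, so the critical z is 1.645).

```python
def fail_safe_n(z_scores, z_alpha=1.645):
    # Rosenthal's (1979) file-drawer estimate: the number of zero-effect
    # studies needed to pull the combined significance below the criterion:
    # N_fs = (sum of the studies' Z scores)^2 / z_alpha^2 - k
    k = len(z_scores)
    return (sum(z_scores) ** 2) / (z_alpha ** 2) - k
```

A negative result means the combined significance is already below the criterion; a large positive result suggests the finding is robust against unpublished null results.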
(12)

Binomial Effect Size Display

Rosenthal and Rubin (1982) created a metric to statistically illustrate this movement called the Binomial Effect Size Display (BESD). BESD is the difference between the success rates of the post-treatment group and the pre-treatment group. It is calculated as in (13):

BESD = (ESt / 2 + .50) - (.50 - ESc / 2)    (13)

where ESt = treatment effect size and ESc = control effect size. (Rosenthal and Rubin originally calculated this using r, but the principle holds for d as well.)

Random effects model. Meta-analyses use weighted averages of the individual study results to generate an overall effect size (Egger, Smith, & Phillips, 1997). Two models exist to apply statistical techniques to this averaging process: the fixed effects (or conditional) model and the random effects (unconditional) model (Egger et al., 1997; Hedges, 1994b). The most typical procedures are those developed by Hunter, Schmidt, and Jackson (1982) and Hedges and Olkin (cited in Schwarzer, 1991). A random effects model uses both the within-study sampling error and the between-studies variation to generate the meta-analysis confidence level (Cochrane, 2000). In meta-analysis, the random effects model is used when sample populations and sample effect sizes are not homogeneous, that is, when "the observed variability in sample estimates of effect size is partly due to the variability in the underlying population parameters and partly due to the sampling error of the estimator about parameter value" (Hedges & Olkin, 1985, p. 29). Though the limited study universe used for this meta-analysis might better constitute a fixed effects model (because the current study only concerns itself with what is happening within the included studies), the fixed effects model does not reflect the effects of heterogeneity as well as the random effects model, and the detection of heterogeneity is important to the goals of this thesis (Hedges, 1994a; Raudenbush, 1994).

Reporting Meta-Analysis Results

Reports for each included study contain the study name, author, citation, independent and dependent variables, and the sample statistics n and r, translated or converted as required (Light, Singer, & Willett, 1990; Schwarzer, 1991). Mean effect sizes for groups are reported in d, along with the study sample size k (number of effect sizes), subject sample size N (aggregate number of individual subjects in the sample studies), significance levels p, degrees of freedom df, homogeneity χ², and fail-safe N statistics for each group. The Binomial Effect Size Display (BESD) is reported for selected sub-group effect sizes (see Table 10, Notation and Symbols).

Limitations of the Study

The following limitations exist in this study:
1. A convenience sample was used: only studies from the ERIC database were used.
2. A limited study population was used: only studies published since 1997, only documents located on January 28, 2002, and only experiments, quasi-experiments, and correlational studies were considered.
3. The reliability and validity of individual studies were not established.
4. Only one rater (coder) was used.
5. A formal statistical accounting for the heterogeneous nature of some of the meta-analysis subgroups was not conducted, leaving the percentage of variation observed in effect sizes not accounted for by sampling error unaccounted for.

Table 10

Notation and Symbols

Symbol   Definition
ES       = average effect size across a set of effect sizes (Glass's symbol)
es       = effect size for a single study
N        = number of subjects per grouping
n        = number of samples in a study (participants)
k        = number of effect sizes per grouping (or studies)
M        = mean
SD       = average standard deviation across a set of studies
sd       = standard deviation for a single study
p        = significance
χ²       = Chi-squared, result of test of homogeneity
Nfs      = Fail-safe N
d        = effect size, standardized mean difference
di       = symbol used for Hedges's g corrected effect size
d̂        = "d-hat," mean effect size
BESD     = Binomial Effect Size Display
Chapter 4

RESULTS

Introduction

A total of 31 studies representing 6,388 participants whose age-levels ranged from kindergarten through adult formed the study sample for this meta-analysis. These studies were selected from a study population consisting of the studies in the ERIC Database Retrieval Center as of January 31, 2002, according to the criteria and limitations given in Chapter 3. One study-level effect size r was created for each study, following the criteria and procedures listed before. Sample-level statistics were calculated using the 31 individual effect sizes thus derived. These same effect sizes were further grouped into subgroups related according to Instructional Technique (regardless of Computer Use) and Computer Use (regardless of Instructional Technique).

Individual Study Results

The reported or calculated effect size r for each individual study ranged from .8266 to .0496 (k = 31). These values are listed in Table 11 along with the original study data from which r was calculated or transformed. The individual study with the highest effect size was Mason-Mason (1999), r = .8266 (n = 33). The study with the lowest individual effect size was Dehn (1997), r = .0496 (n = 90).

The range of r values for each subgroup of Instructional Technique (IT) was as follows: the r values for Distance Education (DE) were .7370 to .4489 (k = 2); for Collaborative Learning (CL), .8266 to .1339 (k = 8); for Process Aid (PA), .6746 to .1612 (k = 4); for Computers as Delivery Agent (CO), .4989 to .0630 (k = 11); for Learner Control (LC), .4989 to .0943 (k = 4); and for Advance Organizers (AO), .2323 to .1488 (k = 2). Likewise, the range of r values for subgroups under Computer Use (CU) was as follows: the r values for Other Uses of the Computer (CU45) were from .5311 to .4489 (k = 2); for Tools (CU678), .8266 to .0943 (k = 7); for Tutors (CU23), .2323 to .0897 (k = 12); and for Delivery (CU1), .4989 to .0496 (k = 10).

Results of the Meta-Analysis

The first step was to calculate the mean effect size for the entire 31-study sample to derive an overall effect. Next, mean effect sizes were calculated for all sub-groups of the study sample grouped according to computer use and instructional strategy. Tests of homogeneity χ² and the fail-safe N were conducted simultaneously. The same statistics for additional subgroups were later calculated as necessary to aid in identifying the influence of various study characteristics on the relationship between the independent and dependent variables that modify the resulting effect sizes; these results will be reported in Chapter 5. Lastly, the Binomial Effect Size Display (BESD) was calculated for selected effect sizes to aid in explaining what those effect sizes represented.

Statistics for the overall and sub-group samples are reported in Table 12, which lists values for each sub-group by the standardized mean difference d, group homogeneity χ², degrees of freedom df, significance level p, group size k (number of effect sizes), and "fail-safe N" Nfs. The table ranks the comparison groups by effect size. When discussing effect sizes in this thesis, the conventional values suggested by Cohen (1977) for d will be used: small (d = .20), medium (d = .50), and large (d = .80) (Schwarzer, 1991; Yu, 2002). Additionally, effect sizes over 1.0 are considered "very large."
Table 12

Meta-Analysis Overall and Sub-Group Statistics: Standardized Mean Difference

Grouping                        d       χ²      df    p       k    Nfs
All studies                    0.69   138.40    30   .0000    31   165
Instructional Technique
  Distance Education           1.56    20.69     1   .0000     2    20
  Collaborative Learning       0.69    44.73     7   .0000     8    56
  Process Aid                  0.93     9.43     3   .0241     4    21
  Computer as Delivery Agent   0.57    29.89    10   .0015    11    48
  Learner Control              0.63     3.84     3   .2822     4    10
  Advance Organizers           0.32     1.84     1   .1752     2     4
Computer Use
  Other                        1.17    21.29     1   .0000     2    17
  Tool                         1.03    56.45     6   .0000     7    53
  Tutoring                     0.36    30.33    11   .0005    12    58
  Delivery                     0.52    52.55     9   .0000    10    39

Notes: Statistics generated by Ralf Schwarzer's MA.
d = Hedges's g corrected (see Formula 5)
χ² = Chi-square using the Hunter-Schmidt Method
df = degrees of freedom for χ²
p = significance level for χ²
k = number of effect sizes per group
Nfs = Fail-safe N
Overall Statistics

The overall mean effect size for the 31 studies comprising the sample for this meta-analysis was d = .69 of a standard deviation (medium). Homogeneity for all studies combined was χ² = 138.40 (df = 30, p = .0000). The fail-safe N, Nfs = 165, indicates that it would require 165 non-significant studies to cause the combined significance of the meta-analysis to become non-significant. The mean BESD for all studies was .66 for the treatment groups versus .34 for the control groups.

Sub-Group Statistics

Effect sizes. Effect sizes for groupings by Instructional Technique (IT) ranged from a high of d = 1.56 standard deviations for the sub-group Distance Education (DE) (very high) to d = .32 of a standard deviation (small to medium) for the use of Advance Organizers (AO). The effect size for Collaborative Learning (CL) was d = .69 of a standard deviation (medium); for Process Aid (PA), d = .93 of a standard deviation (high); for Computer as a Delivery Agent (CO), d = .57 (medium); and for Learner Control (LC), d = .63 (medium). Effect sizes for groupings by Computer Use (CU) scaled from a high of d = 1.17 standard deviations (very high) for Other uses of the computer (CU45) to d = .36 (small to medium) for Tutoring (CU23). The effect size for use of the computer as a Tool (CU678) was d = 1.03 (very high), and for Delivery (CU1), d = .52 (medium).

Tests of homogeneity. The χ² test for homogeneity for Instructional Technique (IT) groupings indicated that three groups, Distance Education (DE), χ² = 20.69 (df = 1, p = .0000), Collaborative Learning (CL), χ² = 44.73 (df = 7, p = .0000), and Computer as Delivery Agent (CO), χ² = 29.89 (df = 10, p = .0015), were heterogeneous but had significantly low probability values for homogeneity. The remainder of the groups were also heterogeneous, with the exception of Learner Control (LC) and Advance Organizers (AO), which were homogeneous. The χ² test for homogeneity for groupings by Computer Use (CU) indicated that all four groups, Other uses of the computer (CU45), χ² = 21.29 (df = 1, p = .0000), Tool (CU678), χ² = 56.45 (df = 6, p = .0000), Tutoring (CU23), χ² = 30.33 (df = 11, p = .0005), and Delivery (CU1), χ² = 52.55 (df = 9, p = .0000), were heterogeneous, but also had significantly low probability values for homogeneity.

Fail-safe N. The fail-safe N values for the seven groupings listed above indicate that considerably more studies with non-significant findings would have to be included with the studies in those sub-groups to change their significant findings to non-significant. For Instructional Technique (IT) groupings: for Collaborative Learning (CL), it would require 56 non-significant studies to change the significance of the eight studies comprising this group; Distance Education (DE) would require 20 non-significant studies to change the significance of the two included studies to non-significant; and the fail-safe N for Computer as Delivery Agent (CO) was 48. For sub-groupings by Computer Use (CU): for Other uses of the computer (CU45), n = 2, the fail-safe N was Nfs = 17; for Tool (CU678), n = 7, Nfs = 53; for Tutoring (CU23), n = 12, Nfs = 58; and for Delivery (CU1), n = 10, Nfs = 39. All of these statistics indicate a fairly low impact by reporting or publication bias.
Binomial effect size display (BESD). The Binomial Effect Size Display (BESD) figures for all studies combined and for sub-groups of Instructional Technique (IT) and Computer Use (CU) are shown in Table 13.

Table 13

Binomial Effect Size Display (BESD) for All Studies and Sub-groups

Grouping                        d      n    BESDt   BESDc   Difference
All Studies                    0.69   31    .66     .34     .32
Instructional Technique
  Distance Education           1.56    2    .77     .23     .54
  Collaborative Learning       0.69    8    .66     .34     .32
  Process Aid                  0.93    4    .70     .30     .40
  Computer as Delivery Agent   0.57   11    .63     .37     .26
  Learner Control              0.63    4    .65     .35     .30
  Advance Organizers           0.32    2    .59     .41     .18
Computer Use
  Other                        1.17    2    .77     .23     .54
  Tool                         1.03    7    .73     .27     .46
  Tutoring                     0.36   12    .62     .38     .24
  Delivery                     0.52   10    .71     .29     .42

Notes: Statistics generated by Ralf Schwarzer's MA. BESD is calculated using r.

The BESD figures for the three sub-groups under Instructional Technique (IT) with low probabilities for heterogeneity previously identified are: for Distance Education (DE), BESDt = .77 and BESDc = .23; for Collaborative Learning (CL), BESDt = .66 and BESDc = .34; and for Computer as Delivery Agent (CO), BESDt = .63 and BESDc = .37. Similarly, the BESD figures for groupings by Computer Use (CU) also reflect high levels of success for the treatment group versus the control group: for the sub-group Other uses of the computer (CU45), BESDt = .77 and BESDc = .23; for the sub-group Tool (CU678), BESDt = .73 and BESDc = .27; for Tutoring (CU23), BESDt = .62 and BESDc = .38; and for Delivery (CU1), BESDt = .71 and BESDc = .29. All of these values indicate a considerable effect on achievement by the treatment group when compared to the effect on achievement by their associated control groups.
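The BESDt and BESDc columns follow Rosenthal and Rubin's rule of centering two "success rates" on .50 and separating them by r; a minimal sketch:

```python
def besd(r):
    # Binomial Effect Size Display (Rosenthal & Rubin, 1982):
    # treatment and control "success rates" centered on .50,
    # separated by the correlation r
    return 0.50 + r / 2, 0.50 - r / 2
```

For example, besd(0.32) returns (0.66, 0.34), the all-studies row of Table 13.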
Chapter 5

DISCUSSION

There are three basic tasks to be accomplished to make meaning of the results of this meta-analysis: first, controlling for study errors; second, accounting for heterogeneity of the study groups; and, last, determining what the effect size magnitudes generated actually represent.

Controlling for Errors

Hunter and Schmidt (1994) classify all errors that may occur in a meta-analysis into two categories: systematic and unsystematic artifacts. Unsystematic artifacts are generally of two types: sampling error and bad (faulty) data. Bad data stems primarily from mechanical mistakes in handling the data. The precautions taken to reduce bad data were spelled out in the previous chapter. As has been stated previously, both of the meta-analysis software programs used in this study, including Schwarzer's MA, correct for sampling error as much as is possible using study-level statistics.

Systematic artifacts create errors that, if identified, can be corrected. Hunter and Schmidt (1994) put it this way: "If such an effect [of artifacts on effect size parameters] can be measured and quantified, often there is an algebraic formula for the effect of the artifact. Most algebraic formulas can be inverted. Thus, for most systematic artifacts there is a 'correction formula'" (p. 324). There are two ways of correcting such errors: within each individual study, if artifact information is available, or at the level of the meta-analysis itself. Hunter and Schmidt (1994) have developed a systematic series of corrections for dealing with correctable (controllable) errors. These methods have been well documented and will not be discussed here other than to point out that both of the meta-analysis software programs used in the meta-analysis employ these and other correction methods to simultaneously calculate both corrected and uncorrected results. In most cases, the actual corrected effect sizes vary from the uncorrected values by relatively small amounts. For instance, the mean uncorrected effect size across all studies in this meta-analysis was d = .68 and the corrected value was .69, a difference of only .01 of a standard deviation. For a more in-depth explanation of the exact corrections employed, see both Kenny (1999) and Schwarzer (1991). Thus, those errors that can be corrected without going back to the original studies have been corrected and, provided the study data is homogeneous, should occasion little concern.

When multiple study conditions exist that may affect the outcomes being measured, as is the case with a number of the groups in the present meta-analysis, a phenomenon known as confounding occurs. Confounding takes place when the measure of an effect is distorted because of its exposure to other factors that may influence the outcome under investigation (Fleiss, 1994; Graziano & Raulin, 2002; Hall, 1994; Matt & Cook, 1994; Moore & McCabe, 1999; Pearl, 1998).

Accounting for Heterogeneity

A population effect size can only be interpreted reliably if the underlying data set is sufficiently homogeneous (Schwarzer, 1991). If it is determined that a group is heterogeneous, then "the observed variability in sample estimates of effect size is partly due to the variability in the underlying population parameters and partly due to the sampling error of the estimator about parameter value" (Schwarzer, 1991, p. 38). According to Eagly and Wood (1994), "when the effect sizes integrated into a synthesis prove to be heterogeneous, characteristics of the studies in the
sample can be treated as moderators of the main-effect findings" (p. 489). When such a heterogeneous condition occurs, one must go back to the studies to account for it. Heterogeneity can generally be attributed to two influences: (a) the interaction between the independent variables and among the study characteristics, and (b) the intervening effects of procedural and systematic factors at the level of the original studies.

Theoretical Interaction of CU and IT

Interaction in the meta-analysis. In the present case, CU and IT interact with each other; that is, both directly affect learning, but the hypothesis behind the study was that the effect of the use of computers (CU) on student learning (DV) is dependent on how the computers are used, that is, which instructional technique (IT) is employed when using them (see Figure 3). Employing the definition provided by Baron and Kenny (1986), IT can therefore a priori be considered a moderator variable for CU. This suggests that IT moderates the effect of CU through mutual interaction.

Figure 3. Interaction in the meta-analysis. CU = Independent variable, IT = Moderator variable, Learning = Dependent variable

Intervention in the meta-analysis. There is, however, also a sense in which the use of the computer can be considered to mediate the effect of instructional technique. In the sample studies comprising the present meta-analysis, when Instructional Technique (IT) was the independent variable, Computer Use (CU) could be considered to mediate its effects on learning. However, there seem to be few examples where the use of the computer actually modified the associated instructional technique. This relationship is illustrated below (Figure 4):

Figure 4. Intervention in the meta-analysis. IT = Independent variable, CU = Mediator variable, Learning = Dependent variable

Intervention Factors

Study characteristics provide another way in which the relationship between the independent variable and the dependent variable can be altered. Participant characteristics at the individual study level, such as age and prior knowledge, exert an influence on the relationship between the independent and dependent variables. For instance, a great number of the individual studies included in the meta-analysis did not have randomly drawn samples. Many of these samples were formed from pre-existing groups whose make-up suggests that they are not representative of the population as a whole (e.g., 1st-year computer studies students or preservice teacher trainees) and whose prior knowledge might have impacted their performance on tests of assessment. Systematic and procedural factors also affect this relationship. Variations between studies such as length of treatment (some studies consisted of one short treatment, while others lasted for several years) and assessment instrument added to the potential modification of the relationship between independent and dependent variable.
N = 1577) versus a
Differences in treatment length (some treatments lasted only a single session, while others lasted for several years) and assessment instrument added to the potential modification of the relationship between independent and dependent variable.

First, the percent of variance in the observed values for effect size due to unreliability and sampling error was 22%, a very low number. That left 78% of the variance in effect sizes to be accounted for by population parameters and study characteristics. Two tactics were used to identify factors which may have contributed to the observed variance in effect sizes across all studies: blocking, in which separate meta-analyses were conducted for each of these potential influences on the variation in effect sizes observed across all studies, and cluster analysis, where clusters according to individual study effect sizes were generated.

Using the strategy of blocking (Cooper & Hedges, 1994), four study characteristics were investigated regarding their influence on the across-studies outcomes and their possible roles as mediators or moderators: instructional level of participants, study type, study size, and length of treatment.

Blocking by instructional level of the study participants revealed that studies involving participants in grades 6 through 12 (middle school and high school) showed a mean effect size of d = .85 (k = 10, N = 2332) versus a mean effect size for all other ages (K-5, College, Post-Graduate, and Adult combined) of .49 (k = 24, N = 2278). The difference of .36 of a standard deviation in the effect of computers under various instructional strategies for teen-aged students was quite apparent. The results of this blocking suggest that the instructional level of the participants in the individual studies had a large influence on the effect size and accounts for a portion of the variation not accounted for by sampling error.

Blocking by study type revealed that the mean effect size for studies involving one group tested prior to treatment (pre-test) and after treatment (post-test), i.e., a pre-test/post-test study design, was d = .84 (k = 11), versus d = .58 (k = 30, N = 2169) for studies involving a control group and a treatment group (i.e., a control-treatment study design). In effect, the mean effect size across all studies of d = .69 dropped to d = .58 when studies conducted under a pre-test/post-test design were removed. The difference of .11 of a standard deviation suggests that study type had a small, but discernible, influence on the observed effect sizes and can account for some of the variance unaccounted for by sampling artifacts.

Blocking by length of treatment time revealed that the three studies involving the longest treatment times, Bar-Natan and Hertz-Lazarowitz (2000), Ignatz (2000), and Mann, Shakeshaft, Becker, and Kottkamp (1999), with treatments of one year (n = 599, r = .1849), two years (n = 504, r = .3310), and 6 years (n = 242, r = .5311) respectively, yielded a combined mean effect size of d = .94 (k = 3, N = 1345). The mean effect size across all studies dropped from d = .69 to d = .56 when those three studies were removed. The change of .13 of a standard deviation accounted for by those three studies suggests that length of treatment time has some influence on effect size and accounts for a portion of the variance in observed effect size not accounted for by sampling error.

Lastly, blocking by size of study revealed, first of all, that the three studies with the longest treatment times were also the three studies with the largest sample sizes. Thus, it is difficult to tell whether the apparent influence on effect size by these three studies, as shown above, is attributable to the influence of length of treatment or to study size. There was virtually no change in the mean effect size across all studies when the two studies with the smallest sample sizes (Kao & Lehman, 1997, and Murphrey, 1999) were removed: d = .69 when all studies were included and d = .68 without the aforementioned two studies. With both the largest study and the smallest studies removed, the effect size across the remaining studies was unchanged (d = .69, k = 28, N = 3541). This suggests that length of treatment time may have had more of an influence on the variation in observed effect sizes than did study sample size.
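The blocking procedure described above is simple to express in code. The sketch below (Python, written for this reading copy; the study records are hypothetical and are not the thesis data set) groups studies by a characteristic and computes a sample-size-weighted mean effect size per block:

```python
from statistics import fmean

def mean_effect_size(studies, weight_by_n=True):
    """Combine study-level effect sizes d, optionally weighting by sample size."""
    if weight_by_n:
        total_n = sum(s["n"] for s in studies)
        return sum(s["d"] * s["n"] for s in studies) / total_n
    return fmean(s["d"] for s in studies)

def block(studies, key):
    """Group studies by a characteristic and return each block's mean d."""
    groups = {}
    for s in studies:
        groups.setdefault(s[key], []).append(s)
    return {level: round(mean_effect_size(g), 2) for level, g in groups.items()}

# Hypothetical illustrative studies (not the 34 blocks analyzed in the thesis)
studies = [
    {"d": 0.90, "n": 300, "level": "6-12"},
    {"d": 0.80, "n": 250, "level": "6-12"},
    {"d": 0.50, "n": 400, "level": "other"},
    {"d": 0.45, "n": 350, "level": "other"},
]
print(block(studies, "level"))  # -> {'6-12': 0.85, 'other': 0.48}
```

Comparing the per-block means against the overall mean, as done above for instructional level and study type, is exactly this computation repeated for each candidate moderator.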
A cluster analysis (Davis, 1986; Moore & McCabe, 1999) for study-level clusters using the 1%, 5%, and 10% levels of significance (α = .01, .05, .10) identified three outlier studies: Mason-Mason (1999) (r = .8266, n = 33) at all three significance levels, with Shyu (1997) (r = .6746, n = 37) and Din and Caleo (2000) (r = .6257, n = 47) at the 5% and 10% levels (see Tables A, B, and C in the Appendix). Mason-Mason was clustered by itself at each significance level, and Shyu (1997) and Din and Caleo (2000) were clustered together apart from the remainder at the 5% and 10% significance levels. These three studies not only had the highest individual effect sizes, but also the most separation from the next smaller studies.

Examining the characteristics of these three studies finds that they had three characteristics in common: the treatment (instructional technique and computer use) involved problem solving, in a contextual learning setting (i.e., situated learning, anchored learning), using collaborative learning. No other characteristic, save the use of the computer as part of the instructional milieu, was held in common by the three studies. There were, however, some characteristics of each study that were unique to that study in respect to the rest of the studies in the meta-analysis.

The Mason-Mason study, for instance, was the only study in the meta-analysis whose participants were from a noneducational context: the participants were working adults from various occupations who volunteered their participation. It was also the only study among the entire ERIC database to specifically tie the use of computers to the concept of Mindtools, and, as stated in the study itself, the instructional milieu for this study was highly constructivist in nature. The fact that the highest effect size in the study sample came from the only study to specifically investigate the use of Jonassen's Mindtool theory makes a powerful statement regarding the validity of that theory.

The two other studies involved elementary school children. The Shyu study involved Chinese fifth graders in Taiwan who were randomly chosen and stratified according to ability. The Din and Caleo study was likewise unique in that it was the only study to involve giving the members of the treatment group a computer (actually, a game machine) for home use, and it structured the study to involve the parents; no other study involved the parents. There was no specific instructional technique associated with this study, but it was one of only a few studies where the treatment (use of game machines at school and home) took place every day for an extended period of time (40 minutes per day at school plus 30 minutes a day at home with the parent).

The a priori interactions of the independent variables IT and CU with each other, and between the independent variables and the study characteristics from their associated studies, when combined with interventions by study and participant characteristics in the individual sample studies, accounted for the residual variation that led to the heterogeneity indicated by the χ² test of homogeneity. These influences, identified by observation of the study characteristics, accounted for a portion of the observed variation in effect size from study to study, along with the procedural and systematic differences between studies that could be corrected, the systematic errors that could be controlled, and the variations due to chance that could be compensated for. A visual representation of these interactions and interventions can be found in Figure 5. Note that, despite the apparent similarity to the results of a path analysis, Figure 5 was not generated statistically, but merely represents a conceptual model based upon the a priori theoretical models discussed earlier combined with insights gained from the preceding search for intervention factors.
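The study-level clustering described above can be approximated with a simple single-linkage pass over the sorted effect sizes: a new cluster begins wherever the gap to the previous study exceeds a chosen threshold. This is only a conceptual sketch of that kind of procedure, not a reproduction of the Davis (1986) method, and the r values below are hypothetical:

```python
def gap_clusters(sizes, max_gap):
    """Group sorted study-level effect sizes; start a new cluster
    whenever the gap to the previous value exceeds max_gap."""
    ordered = sorted(sizes)
    clusters = [[ordered[0]]]
    for prev, cur in zip(ordered, ordered[1:]):
        if cur - prev > max_gap:
            clusters.append([cur])       # large gap: this study stands apart
        else:
            clusters[-1].append(cur)     # small gap: same cluster
    return clusters

# Hypothetical study-level r values; the three largest sit apart from the rest.
rs = [0.10, 0.15, 0.18, 0.22, 0.25, 0.31, 0.63, 0.67, 0.83]
print(gap_clusters(rs, 0.15))
# -> [[0.1, 0.15, 0.18, 0.22, 0.25, 0.31], [0.63, 0.67], [0.83]]
```

The pattern in the illustrative output mirrors the thesis finding: one extreme study alone in its own cluster, two more clustered together, and the remainder in a single group.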
Figure 5. Multiple interactions. IT = independent variable; CU = mediator/moderator variable; Learning = dependent variable. (Other labeled influences in the figure include age, time, and prior characteristics.)

Baron and Kenny (1986) delineate a four-part method for identifying mediator variables statistically, while Pearl (1998) discusses the possibilities for detecting confounding variables. However, none of these strategies were deemed necessary for the purposes of this study. Given the exploratory nature of the present meta-analysis and the use of a convenience sample, little could be accomplished toward the generalizability of the study findings by further accounting for moderating influences. Nevertheless, it must be acknowledged that the lack of a formal statistical accounting for the heterogeneous nature of some of the meta-analysis sub-groups places a further limitation on the results of this study.
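For reference, Baron and Kenny's four steps amount to a sequence of regressions: show that X predicts Y, that X predicts the mediator M, that M predicts Y controlling for X, and that the X coefficient shrinks once M is included. The following self-contained sketch demonstrates the pattern on synthetic data (not the thesis data); ordinary least squares is implemented by hand so no external libraries are assumed:

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination; fine for the tiny systems used here."""
    n, p = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    for col in range(p):                      # forward elimination with pivoting
        piv = max(range(col, p), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, p):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [a - f * b for a, b in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    beta = [0.0] * p                          # back substitution
    for i in reversed(range(p)):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j] for j in range(i + 1, p))) / xtx[i][i]
    return beta

random.seed(1)
x = [random.gauss(0, 1) for _ in range(500)]
m = [0.8 * xi + random.gauss(0, 0.3) for xi in x]   # mediator driven by X
y = [0.9 * mi + random.gauss(0, 0.3) for mi in m]   # Y driven only by M

c = ols([[1, xi] for xi in x], y)[1]                             # step 1: Y ~ X
a = ols([[1, xi] for xi in x], m)[1]                             # step 2: M ~ X
b, c_prime = ols([[1, mi, xi] for mi, xi in zip(m, x)], y)[1:]   # steps 3-4: Y ~ M + X
print(round(c, 2), round(a, 2), round(b, 2), round(c_prime, 2))
```

In this construction the raw effect c is sizable, while c_prime (the X coefficient after controlling for M) collapses toward zero: the signature of full mediation.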
What Effect Size Estimates Represent

The effect size estimate d used in this meta-analysis represents the mean standardized difference between the performance on assessments of learning by treatment or experimental groups and their associated control groups. This applies also to the differences in pretest and posttest scores by single groups. The effect size is measured in standard deviations and indicates relative performance between the control and treatment groups (two or more groups) or between the pre-treatment and post-treatment scores on the same measure of achievement (one group).

The meta-analytic procedure averages the relative performance of each study's pre-treatment group together and compares it to the average of the performance of each study's post-treatment group. The result is two distributions, one for the average performance of all the pre-treatment groups in the meta-analysis and one for the average performance of all the post-treatment groups from the same studies. These distributions normally overlap, but their means are separated by the difference of those means in standard deviations. In effect, the distribution curve of the post-treatment group is shifted, usually toward the right (i.e., higher, or toward the hundredth percentile). This is illustrated in Figure 6 using statistics from a hypothetical meta-analysis that resulted in an effect size d = .85 of a standard deviation.

Figure 6. Graphic depiction of standardized mean difference. Modified from Marzano (1998). Effect size figures are for illustrative purposes only and do not reflect the actual results of this or any other meta-analysis.

Applying this same principle to the overall meta-analysis of all 31 studies and using the effect size from Table 12 (page 71, above), the shift represented by d = .69 can be illustrated as in Figure 7 below.

Figure 7. Overall mean effect of computers. Overall mean effect of computers on student achievement indicated by the results of the 31 studies included in this meta-analysis. Adapted from Marzano (1998).

The two highest effect sizes attributed to Instructional Technique are Distance Education (DE), d = 1.93, n = 2, and Collaborative Learning (CL), d = .93, n = 8. Because only two studies comprise the sub-group Distance Education (DE), the results of that group should be interpreted with caution. Accordingly, this study will focus on the more conservative result from the next highest sub-group, Collaborative Learning (CL). An effect size of .93 is considered "high." Figure 8 illustrates the effect of Collaborative Learning on student achievement within the studies included in the meta-analysis.

Figure 8. Mean effect of collaborative learning. The combined effect of Collaborative Learning (CL) on student achievement indicated by the results of the 8 studies included in this meta-analysis sub-grouping. Modified from Marzano (1998).

Likewise, the two sub-groups with the largest effect sizes for Computer Use (CU) are Other (CU45), d = 1.17, n = 2, and Tool (CU678), d = 1.03, n = 7. As with the sub-group Distance Education (DE), the Computer Use sub-group Other (CU45) is comprised of only 2 studies and should also be interpreted with caution. In this case, the more conservative measure, from the Computer Use (CU) sub-group Tool (CU678), is shown in Figure 9. An effect size of 1.03 is considered "very high."

Figure 9. The overall mean effect of computers used as tools. The overall mean effect of Computers Used as Tools on student achievement indicated by the combined results of the 7 studies included in this meta-analysis sub-group. Modified from Marzano (1998).

Put in perspective, Lipsey and Wilson (1993) observe that a sampling of meta-analyses from the fields of medicine and psychology report effect sizes ranging from .11 to .96. Springer, Stanne, and Donovan (1997) reported effect sizes from .46 to .55 in a meta-analysis of the effects of small-group learning in undergraduate science, mathematics, engineering, and technology courses. Greene (1998), in a study on the effectiveness of bilingual education, reported four effect sizes ranging from .18 (all assessments in English) to .74 (all assessments in Spanish). Rosenthal (1991b), in a meta-analysis of studies on the effect of cyclosporine on organ rejection, reported an effect size of .39. In their study on the effects of the drug dipyridamole on angina, Sacks, Berrier, Ancona-Berk, Nagalingham, and Chalmers (1988) reported an effect size of .24. The results of the present meta-analysis compare favorably with those results.
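For readers who want to translate a d value into the kind of distribution shift shown in Figures 6 and 7: under the normality assumption, the mean of the shifted (post-treatment) distribution sits at the Φ(d) percentile of the unshifted one. A minimal computation (added for illustration, using only the standard library's error function):

```python
import math

def percentile_of_treatment_mean(d):
    """Percentile of the average treated score within the control
    distribution, assuming normality: Phi(d) computed via erf."""
    return 0.5 * (1 + math.erf(d / math.sqrt(2))) * 100

print(round(percentile_of_treatment_mean(0.69)))  # d = .69 -> about the 75th percentile
```

That is, the average participant in the combined treatment groups would outscore roughly three quarters of the combined control groups.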
Binomial Effect Size Display

Another way to represent what the effect sizes reported in this thesis mean is to use the Binomial Effect Size Display (BESD). Since BESD translates easily into percentiles, the difference between the terms in the BESD can be roughly thought of as increases in percentile points (Marzano, 1998). For instance, the difference in BESD between the combined treatment groups and the combined control groups for all 31 studies in this meta-analysis is .32. This translates into a mean improvement of approximately 32 percentile points on measures of achievement after treatments using computers by the participants in all of the studies included in this meta-analysis.

BESD can also be used to compare the relative effects of the different treatments represented by the studies in this meta-analysis. For instance, when the percentile growth figures for each sub-group above were compared, the studies grouped under Collaborative Learning (CL) resulted in a percentile point increase of 40 points whereas studies in the sub-group Advance Organizers (AO) showed a 17 percentile point increase on the scores measuring achievement after treatment. In terms of relative effectiveness, Collaborative Learning (CL) was associated with increases in student achievement measuring more than twice that of Advance Organizers (AO). Likewise, the Computer used as a Tool (CU678) showed a 42 percentile point increase between treatment and control groups versus a 24 point increase associated with the computer used as a Delivery device (CU1), 1.75 times as much. Of course, all these results should be interpreted with caution due to the limits on this meta-analysis previously stated.
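The BESD arithmetic can be reproduced directly. Using the standard d-to-r conversion for equal group sizes, the overall d = .69 gives r of about .33, in line with the roughly 32-point difference reported above (the per-subgroup figures in the text were presumably computed from the subgroup statistics directly). A minimal sketch:

```python
import math

def besd(d):
    """Convert Cohen's d to r, then to a Binomial Effect Size Display.

    Returns (treatment_rate, control_rate); their difference equals r and
    is read, roughly, as a percentile-point gain."""
    r = d / math.sqrt(d * d + 4)      # d -> r, assuming equal group sizes
    return 0.50 + r / 2, 0.50 - r / 2

treat, ctrl = besd(0.69)              # overall mean effect from this meta-analysis
print(round(treat - ctrl, 3))         # difference between the BESD terms
```

The display itself would be read as the hypothetical "success rate" of the treatment group versus the control group on a binary outcome.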
Chapter 6 CONCLUSIONS

The purpose of this study was to explore how educators can more effectively use computer technology to meet the needs of present-day students. This was accomplished by pursuing the answer to the following question: Is there one combination of computer use and instructional technique that leads to greater student achievement than any other such combination? When restated as a Type 2 research question, the above question takes the form: Under what instructional conditions do computers most affect student achievement? Answering the latter question requires addressing two questions implicit in the first: Which use of computer technology leads to the greatest student achievement? Likewise, is there a particular instructional technique that has been demonstrated to increase learning more than others?

Findings

This thesis used two different synthetic research techniques to fulfill its purpose. A review of relevant literature suggested that the educational requirements of students in schools now and in the near future will be dictated by the need to function in a society that is becoming increasingly dependent on digital computing technologies. Those technologies are the driving force behind the transition of the dominant form of "making a living" from an industrial model to a knowledge-work model. To prepare today's students for this new world requires that schools rethink their educational practices and modify their instructional environments by the creation of more communal and flexible school and classroom environments. Computer technology has been shown, in
aggregate, to contribute to student achievement, but the level of that achievement depends on how the computer is used.

One of the goals of this thesis, and the primary purpose for conducting the meta-analysis, was to identify those combinations of computer use (CU) and instructional technique (IT) that led to the greatest learning. The statistics derived from the meta-analysis suggest that this goal has been achieved by identifying collaborative learning and the use of the computer as a tool (especially Mindtools, but including such other uses as productivity tool, programming tool, individualized management tool, problem-solving tool, etc.) as having the strongest effect sizes. While the magnitude of the effect sizes and the pervasiveness with which they are maintained through different groupings make it tempting to generalize from this study to the "real world," the exploratory nature of this study and its relatively limited study universe relegate those findings to a less ambitious role: they serve as triggers for future study.

It is the finding of this meta-analysis that sufficient statistical and logical evidence exists to suggest that the instructional technique of collaborative learning, in conjunction with the use of computers as a tool, facilitates learning better than any other such combination of variables investigated. This finding applies specifically to the study population included in this meta-analysis, but might be indicative of what a larger, more general population of studies would also show. In addition, the findings support Jonassen's theory that it is the way in which computers are used in instruction that determines the extent to which they affect learning. Further, the specific identification of the Mason-Mason (1999) study in support of MindTools as having the largest effect on student achievement of all the studies comprising the sample for this meta-analysis likewise supports
Jonassen's contention that using computers as MindTools is the most effective use of computers for instruction. Lastly, the suggestion, deriving from the search for factors influencing heterogeneity, that the greatest improvement in student learning when using computers seems to accrue to students in grades 6-12 supports the conclusions made by Wenglinsky (1998).

Discussion

The meta-analysis suggests that cooperative learning and the use of the computer as a tool are the best combination of instructional technique and computer use when it comes to "inducing" student achievement. This suggests that the best way to prepare students for facing the realities of the 21st century is to employ cooperative learning instructional techniques in conjunction with the use of the computer as a tool, and to do so within the context of a communal social atmosphere and a flexible schedule. This accords nicely with the idea of "knowledge work" because cooperative learning and using the computer as a tool are integral to knowledge work.

It seems possible that the findings with regard to cooperative learning, as significant as they were, would have been even higher if the majority of the included studies had involved longer treatment times. According to Littleton and Light (1999), "Howe, Tolmie, & Rogers (1992) offer evidence that progress made as a result of peer interaction may not be apparent in individual performance until perhaps several months later" (p. 4). Since a considerable number of studies involved only a few (sometimes only one) isolated treatment sessions, it seems likely that these studies would have shown a stronger effect if more time had been allotted to treatment. On the other hand, it is possible that cooperative learning instructional techniques might produce the best results only when used for
short periods of time. Either actuality poses interesting questions that could not be answered by the present study.

Perhaps the reason that cooperative learning works so well with people is related to the way the mind processes input (i.e., neural networks). Working in groups involves what might be called "socio-cognitive" processing, that is, the joint construction of understanding (this is the phenomenon expressed by the old adage "Two heads are better than one"). It also seems reasonable to speculate that computers might be effective for the same reason. Cognitive science arose from the attempt to create artificial intelligence and was first applied to computers. The modern computer, then, was deliberately designed to emulate the best thinking machine that could be found on Earth: the human mind (Pinker, 1997; Winograd & Flores, 1987). Thus, it seems equally reasonable to suppose that the reason computers appear to be effective is because they "think" like humans. By extension, the reason children (and some adults) "get along" so well with computers is because both are "wired" (in the systems-functioning sense) the same way. If this is so, then one computer and one human is more like two minds working together than one. One might suggest that any work with a computer is, in effect, collaboration. Some knowledge work tasks require more than one person; others may simply end up better taken care of when more persons are involved.

21st Century Educational Tasks

In the networked world where humans work best, whether in school or at work, the educational task will be to (a) develop the human component of the human-computer interface to its fullest extent, that is, learning; (b) develop the human component
of social networks (social skills); and (c) develop in the individual the skills to leverage the strengths of both to do knowledge work. The question is no longer what can computers do that schools cannot, but rather, what can schools do better than other alternatives? Dave Hughes suggests that these will become the major tasks of the 21st century school (Barabási, 2002, p. 21): (a) socialization, (b) "normal" human face-to-face communications, (c) athletics, and (d) the provision of resources that can't be taken home (what he calls "collective affordances").

Many would argue that the picture painted here of a future educational system is fanciful and elitist. It is beyond the scope of this thesis to do more than superficially address the issues of access and elitism; however, a few brief thoughts are in order. Acknowledging first of all that the solutions to those problems are as much socio-political as they are educational or technical, the notion of a networked world educational system is indeed a fanciful one, in the sense that nothing in the future is certain and all glimpses into the future involve a bit of fanciful thinking.

Can the contents and structures of education described here, which involve access to high technology, be available to everyone, regardless of economic status? Even at the dawn of the 21st century there exist households without a telephone (for instance, the Kennedy Meadows area in the Southeastern Sierra Nevada, 4 hours from Fresno, did not receive telephone wiring until just a few years ago) and those without a television. In virtually every case, however, the lack of those technologies involves some degree of choice on the part of the individuals involved (i.e., trading residential location for modern convenience). It can be argued, though, that universal access of the type described here is likely to be the equal of, if not superior to, the type of access available currently through the likes of television, radio, and telephone. In America, most low-income students today who have access to computers gain that access through schools. The schools have become one of the universal organizing elements in today's diverse society; indeed, every child (and by extension, a goodly proportion of adults) will come into contact with the school system. What better place to ensure that the technology of the future is in place than the place where everyone can get to it? Thus, the schools are likely to be the agent of access to a large number of people in the world. It would be sad indeed if the only thing those people had access to were 19th or 20th century technology and thinking.

Is the education described herein elitist? That is, is it realistic to expect that every person in the future will be a "knowledge worker" and need an education like the one described here? First, it is not likely that less technical occupations will cease to exist in the next century (or at least the next "working generation," those that today's educators are preparing today), nor is it likely that individuals will choose not to engage in knowledge work. Secondly, in regard to whether everyone needs an elite education, one can simply say, "why not?" Is it not easier to get a job that requires lesser qualifications than one possesses than it is to get a job requiring more? Instead of aiming education at the less capable and expecting the more capable or more willing to find their own way to excellence, perhaps it is less elitist to see that everyone receives the best education, and the most appropriate education, that can be offered. That the American student cannot be prepared for the best jobs available in a knowledge economy would be the fault of the education system; choosing not to become prepared is the student's own choice, but giving them the option to do so is the responsibility of education.
Summary

The literature review argued that 21st century society is changing and likely to change even more. Among those impacts were qualitative changes in communication, transportation, and scientific discovery. Various examples of those changes and a glimpse at how those innovations might impact the world and education in the near future were presented. It was also argued that 21st century students have different needs from their 19th or 20th century counterparts and that those needs stem from the characteristics of a post-modern generation, raised in an interactive digital world, and from the demands of the emerging "knowledge economy."

The transition from the 19th century industrial age economy to the 21st century "knowledge economy" was accompanied by a parallel shift in the dominant way of thinking about learning and knowledge. These shifts took place on both the theoretical and the empirical levels. The impact of cognitive science, itself derived from attempts to further the thinking capabilities of computers, contributed to the development of the constructivist theory of knowledge and, from that, the educational theory of constructivism. The teaching techniques associated with constructivist theory seem to accord well with the nature of 21st century students. Likewise, the results of recent research into the physical functioning of the brain suggest that constructivist teaching techniques in general, and the specific instructional tactics advocated by brain-based learning models in particular, best serve 21st century student needs.

The literature review likewise examined recent school restructuring efforts and discovered that the preponderance of the evidence indicated that genuine efforts at restructuring can have, and have had, a positive impact on student achievement. It was determined, however, that little evidence existed for advocating any specific restructuring strategy as being more efficacious than any other, but that two general beneficial principles emerged: flexible scheduling and a communal atmosphere.

According to the review of recent and relevant literature, it is clear that digital computer technology, as a whole, has had a beneficial impact on student learning more often than not. The sometimes conflicting research findings suggested, however, that certain uses of the computer for instructional purposes were more beneficial than others. David Jonassen has theorized that the most effective educational use of computers is "learning with computers," that is, using them as a tool: what he termed a Mindtool.

Two observations with wide implications emerged from the meta-analysis. First, the most effective combination was cooperative learning instructional techniques coupled with the use of computers as a tool. Second, the results suggested that instructional technique and computer use mutually influence each other; that is, each affects the effectiveness of the other as regards student performance on measures of achievement. In general, it was observed that the greater the active role of the learner, the more "in charge" the learner was, and the greater the instrumentality accorded the computer, the greater the effect on learning. Both principles seem consanguineous with, and can provide support for, cooperative learning techniques and the use of networked computer technologies within a social context and, to a lesser extent, as a communication device (which, it may be noted, is a type of tool).

Implications of the Study

There are several ways in which the findings above can immediately impact the future. First, they provide "grist" for the mill of discussion, a setting for
future discussion. Second, the data provide a "jumping off" place, a "prior distribution" from which to pursue future study. Third, the results of this meta-analysis suggest that a number of specific Type 1 research questions can now be created from it. Each of these implications is discussed further below.

A Setting for Further Discussion

According to Cooper and Hedges (1994), "A qualitative synthesis of research is typically not an endpoint or first step in the investigation of a research topic. A formal synthesis rarely offers an integration that is conclusive enough to answer a question decisively. Instead, a meta-analysis more commonly serves as a way station along a sometimes winding route to answering a particular question" (p. 486). This is an apt observation regarding the applicability of the findings of the meta-analysis just completed. In the beginning, a more general Type 2 question was posed. From it, specific Type 1 research questions can now be created, for instance, "Is the collaborative use of the computer as a tool more effective at increasing learner achievement than [any other instructional technique or computer]?"

A Prior Distribution

This study provides data for some preliminary power calculations from which to embark on future studies. Wilkinson and the Task Force on Statistical Inference (1999) suggest that "power computations are most meaningful when done before [emphasis added] data are collected and examined . . ." (Power and sample size subsection, ¶ 1). This is a "Catch-22": without data no computations can be made, and without prior computations no studies should be made. The effect sizes derived from this exploratory study can perhaps be used as the basis for devising a Bayesian distribution scheme with which the results of future meta-analyses can be compared. By acting as a "prior distribution" (Lewis & Zelterman, 1994) for later Bayesian analyses, a posterior distribution can be developed by incorporating these results.
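In its simplest conjugate form, the Bayesian updating suggested here is a precision-weighted average of the prior mean and a new estimate. A minimal normal-normal sketch (every number other than this study's overall d = .69 is hypothetical):

```python
def update_normal_prior(prior_mean, prior_var, est, est_var):
    """Conjugate normal-normal update: combine a prior for the mean
    effect size with a new meta-analytic estimate (known variance)."""
    w_prior, w_est = 1 / prior_var, 1 / est_var
    post_var = 1 / (w_prior + w_est)
    post_mean = post_var * (w_prior * prior_mean + w_est * est)
    return post_mean, post_var

# Prior centered on this study's overall d = .69; a hypothetical later
# meta-analysis reports d = .50 with a smaller variance.
mean, var = update_normal_prior(0.69, 0.04, 0.50, 0.02)
print(round(mean, 3), round(var, 4))
```

The posterior mean lands between the two estimates, pulled toward the more precise one, and the posterior variance shrinks, which is exactly the sense in which each new synthesis refines the last.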
Recommendations for Further Research

The results of this meta-analysis can be compared with the results of prior studies to discover (a) whether there are any conditions (i.e., instructional strategies) that seem to result in greater student achievement (as measured by whatever measurement devices were used in the original studies), and (b) whether the conjunction of computer use and specific instructional conditions suggests "best practices" that may lead to restructuring (and, if so, which current restructuring schemes seem best suited to take advantage of those computer-based "best practices").

Specific recommendations for future research arise out of this study: (a) An exhaustive meta-analysis should be conducted, extending the limited study universe employed in this meta-analysis to the entire accessible body of similar research studies (i.e., the total extant universe) to ascertain whether the limited generalities of this present study can be generalized to the entire educational population. (b) A similar study should likewise be conducted, only at the level of systemic reform (strategic educational change), focusing on which organizational structure best affects student learning; this is similar to the question regarding restructuring which this thesis addressed. (c) Primary research studies using the Type 1 research question derived from the original Type 2 question posed at the onset of this study should be performed for all possible combinations of computer use and instructional technique. (d) Meta-analytic studies focusing on all possible combinations of computer use and instructional technique should be performed to provide comparison mean effect sizes that can then be compared with one another. (e) Of particular import is the suggestion that the confounding effect of certain instructional techniques on specific uses of computer technology could lead to negative student achievement. As something to be avoided, another meta-analysis aimed at identifying those combinations would be a valuable contribution to the field of social informatics.
Cambridge. strategic. April). ED429 912). (ERIC Document Reproduction Service No. *Ayres.) (1995). E. *Adonri. Linked: The new science of networks. (1998. June). (Ed. J. San Diego. (ERIC Document Reproduction Service No. & Runyan. Paper presented at the Annual Meeting of the National Association for research in Science Teaching. *Bar-Natan. & Gittman. San Francisco. 51. October). and statistical considerations. & Melear.. ED 444191). M. MA: Blackwell. Ellenville. April). W. (ERIC Document Reproduction Service No. (1987). A. Effect of computer assisted instruction on students’ achievement in global studies. (1986). (1995). R. *Avitabile.. (1990). Connectionism and the philosophy of mind: An overview.REFERENCES *References marked with an asterisk indicate studies included in the metaanalysis. A. A. LA. 11731182. & Abrahamsen. D. Bechtel. UK: Cambridge University Press. Connectionism and the mind: An introduction to parallel processing in networks. T. ED 418 873). In NECC ’98: Proceedings of the National Educating Computing Conference. W. 37-47.. ED 419 492). Paper presented at the Annual Meeting of the Northeastern Educational Research Association.
. M. Barry. 17-41 Bechtel.S. (2002). (1998. Supplement. The Cambridge dictionary of philosophy. Baron.. New Orleans. A review of distance-learning studies in the U. Audi.. I. C. Cambridge. (1998.. The American Journal of Distance Education. C. Cambridge. (2000. MA: Perseus Publishing. The moderator-mediator distinction in social psychological research: Conceptual. R. CA. & Hertz-Lazarowi tz. Measures of writing developmentin learning environments using cooperative learning (CL) and computermediated communication (CMC). R. CA. (ERIC Document Reproduction Service No. O. & Kenny. 9(3). NY. Barabási. military. The Southern Journal of Philosophy. E. Increased learning of physical science concepts via multimedia exhibit compared to hands-on exhibit in a science museum.. Interaction of presentation mode and learning style in computer science. Journal of Personality and Social Psychology. Paper presented at the Annual Meetingof the American Educational Research Association. R.
Bell, D. (1973). The coming of the post-industrial society. New York: Basic Books.
Benjamin, T. (1998). Education reform in the information age. Renaissance. Published as part of the chapbook series for the Institute of Practical Philosophy at Malaspina University-College, British Columbia, Canada. Retrieved September 21, 2002 from http://www.mala.bc.ca/www/ipp/1tom.htm
Besser, H. (1995). From internet to information superhighway. In J. Brook & I. Boal (Eds.), Resisting the virtual life: The culture and politics of information. San Francisco: City Lights.
Best, S., & Kellner, D. (1997). The postmodern turn. New York: The Guilford Press.
*Bland, R. (2000). Computer assisted instruction in mathematics can improve students' test scores: A study. Research report. (ERIC Document Reproduction Service No. ED 436 177).
Block, N. (1980). Troubles with functionalism. In N. Block (Ed.), Readings in philosophy of psychology (Vol. 1, pp. 268-305). Cambridge, MA: Harvard University Press.
Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26, 369-398.
Bohlin, R., & Tessmer, M. (1997). Computer technology can be the irresistible force needed to move education.
Bowman, F. (1999, Summer). Revisiting tomorrow's classrooms.
Brook, J., & Boal, I. (Eds.). (1995). Resisting the virtual life: The culture and politics of information. San Francisco: City Lights Books.
*Brown, A. (1999, February). Student model construction: An interactive strategy for mental models learning. In Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology [AECT], Houston, TX. (ERIC Document Reproduction Service No. ED 443 688).
Brown, J. S., & Duguid, P. (2000). The social life of information. Boston: Harvard Business School Press.
The California Education Technology Task Force. (1996, February). Connect, compute, and compete: The report of the California Education Technology Task Force. Sacramento, CA: California Department of Education. Retrieved December 6, 2002 from http://www.cde.ca.gov/edtech/ccc/
Carnegie Forum on Education and the Economy. (1986). A nation prepared: Teachers for the 21st century. The report of the task force on teaching as a profession. Washington, DC: The Carnegie Forum.
*Cassady, J. C., Budenz-Anders, J., Pavlechko, G., & Mock, W. (2001, April). The effects of Internet-based formative and summative assessment on test anxiety, perceptions of threat, and achievement. Paper presented at the Annual Meeting of the American Educational Research Association, Seattle, WA. (ERIC Document Reproduction Service No. ED 453 815).
Castells, M. (1996). The rise of the network society. The information age: Economy, society and culture (Vol. 1). Cambridge, MA: Blackwell.
Castells, M. (1997). The power of identity. The information age: Economy, society and culture (Vol. 2). Cambridge, MA: Blackwell.
Castells, M. (1998). The end of the millennium. The information age: Economy, society and culture (Vol. 3). Cambridge, MA: Blackwell.
Chard, S. (1998). The project approach: Making curriculum come alive. New York: Scholastic Books.
Charp, S. (2002, February). Editorial: Educators' acceptance of computer technology? T.H.E. Journal (Technological Horizons in Education), 29(9), 10-12.
Childs, T., & Shakeshaft, C. (1986). A meta-analysis of research on the relationship between educational expenditures and student achievement. Journal of Education Finance, 12(3), 249-263.
Chomsky, N. (1957). Syntactic structures. The Hague: Mouton.
*Chou, C. (1997, February). The effects of training method on language learning performance. Paper presented at the Annual Meeting of the Association for Educational Communications and Technology [AECT], Albuquerque, NM. (ERIC Document Reproduction Service No. ED 403 882).
*Chou, C., & Lin, H. (1999, June). Navigation maps in a computer-networked hypertext learning system. In ED-MEDIA 99 World Conference on Educational Telecommunications Proceedings, Seattle, WA. (ERIC Document Reproduction Service No. ED 435 097).
Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53, 445-459.
Clark, R. E. (1994). Media will never influence learning. ETR&D, 42(2), 21-29.
Cochrane Collaboration. (2000). The Cochrane reviewers' handbook glossary (Version 4.1.5). Retrieved December 6, 2002 from http://www.cochrane.de/software/Documentation/Handbook/glossary.doc
Cohen, J. (1977). Statistical power analysis for the behavioral sciences. New York: Academic Press.
Cooper, H. (1984). The integrative research review: A social science approach. Beverly Hills, CA: Sage.
Cooper, H., & Hedges, L. V. (Eds.). (1994). The handbook of research synthesis. New York: The Russell Sage Foundation.
Cotrell, G., & Small, S. (1983). A connectionist scheme for modeling word sense disambiguation. Cognitive and Brain Theory, 6, 89-120.
Coulson, D. (1999). Human life, human organization and education. Research report.
D'Agnese, J. (2000). What you'll need to know in twenty years that you don't know now. Discover, 21(10), 58-61.
*Davis, T. (1998). The effects of computer skills and feedback on the gains in students' overall writing quality in college freshman composition courses. Research report. (ERIC Document Reproduction Service No. ED 402 545).
Davis, J. C. (1986). Statistics and data analysis in geology. Toronto: John Wiley and Sons.
Davis, S., & Meyer, C. (1998). Blur: The speed of change in the connected economy. Reading, MA: Addison-Wesley.
de Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179-201.
*Dehn, M. (2000, April). "Reasoner's Workbench" program supports students' individual and collaborative argumentation. Paper presented at the Annual Meeting of the National Association for Research in Science Teaching, New Orleans, LA. (ERIC Document Reproduction Service No. ED 440 873).
Deming, W. E. (1986). Out of the crisis. Cambridge, MA: MIT Press.
Dewar, J. A. (1998). The information age and the printing press: Looking backward to see ahead. Paper funded by a RAND President's award. Rand Corporation. Retrieved September 21, 2002 from http://www.rand.org/publications/P/P8014/#fn1
*Diehl, W. (1999). The effects of informed strategy training and computer mediated test on comprehension monitoring and reading comprehension. Research report. (ERIC Document Reproduction Service No. ED 435 097).
*Din, F., & Caleo, J. (2000, February). Playing computer games versus better learning. Paper presented at the Annual Conference of the Eastern Educational Research Association, Clearwater, FL. (ERIC Document Reproduction Service No. ED 438 905).
Drexler, K. E. (1986). Engines of creation. New York: Anchor Books.
Drucker, P. (1969). The age of discontinuity: Guidelines to our changing society. New York: Harper & Row.
Eagly, A., & Wood, W. (1994). Using research synthesis to plan future research. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 485-500). New York: Russell Sage Foundation.
Egger, M., Smith, G. D., & Phillips, A. (1997). Meta-analysis: Principles and procedures. Retrieved October 16, 2002 from http://bmj.com/cgi/citmgr?gca=bmj.315/7121/1533
Eliasmith, C. (1996). The third contender: A critical examination of the dynamicist theory of cognition. Philosophical Psychology, 9(4), 441-463.
Eliasmith, C. (Ed.). (2001). Dictionary of philosophy of mind. Retrieved December 10, 2002 from http://www.artsci.wustl.edu/~philos/MindDict/phenomenology.html
Elmore, R. F. (1990). Restructuring schools: The next generation of education reform. San Francisco: Jossey-Bass.
Elmore, R. F. (1999). Getting to the heart of the matter: Education in the 21st century. In D. Marsh (Ed.), Preparing our schools for the 21st century. Alexandria, VA: ASCD.
*Fernandez, M. (1999). Emergent readers' responses to read aloud stories and stories presented by a computer. Research paper. (ERIC Document Reproduction Service No. ED 418 380).
Fleiss, J. (1994). Measures of effect size for categorical data. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 245-260). New York: Russell Sage Foundation.
Fletcher, J. D. (1996). Does this stuff work? Some findings from applications of technology to education and training. In Proceedings of the Conference on Teacher Education and the Use of Technology Based Learning Systems. Warrenton, VA: Society for Applied Learning Technology. Retrieved February 9, 2002 from http://www.judybrown.com/judy/Docs/SALT_Sum.doc
Freud, S. (1895). Project for a scientific psychology.
Gates, B. (1999). Business @ the speed of thought. New York: Warner Books.
Gerlic, I., & Jausovec, N. (1999). Multimedia: Differences in cognitive processes observed with EEG. Educational Technology Research and Development, 47(3), 5-14.
Gershenfeld, N. (1999). When things start to think. New York: Henry Holt & Co.
Glass, G. V. (1976). Primary, secondary and meta-analysis of research. Educational Researcher, 5, 3-8.
Glass, G. V. (1978a). Integrating findings: The meta-analysis of research. Review of Research in Education, 5, 351-379.
Glass, G. V. (1978b). Reply to Mansfield and Busse. Educational Researcher, 7, 3.
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
Graziano, A. M., & Raulin, M. L. (1999). Research methods: A process of inquiry (4th ed.). New York: Allyn and Bacon.
Greene, J. P. (1998). A meta-analysis of the effectiveness of bilingual education. Claremont, CA: The Tomas Rivera Policy Institute in the Department of Government, University of Texas at Austin. Retrieved December 9, 2002 from http://ourworld.compuserve.com/homepages/JWCRAWFORD/greene.htm
Gutierrez, R., & Slavin, R. (1992). Achievement effects of the nongraded elementary school: A best evidence synthesis. Review of Educational Research, 62(4), 333-376.
Hall, R. (2002). Extraneous and confounding variables and systematic vs nonsystematic error. The Psychology World website at the University of Missouri – Rolla. Retrieved December 6, 2002 from http://web.umr.edu/~psyworld/extraneous.htm
Hawley, C., & Duffy, T. (1998, April). The role of the teacher in simulation learning environments. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.
Healy, J. (1998). Failure to connect: How computers affect our children's minds—for better and worse. New York: Simon and Schuster.
Hedges, L. V. (1994a). Statistical considerations. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 29-38). New York: Russell Sage Foundation.
Hedges, L. V. (1994b). Fixed effects models. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 285-299). New York: Russell Sage Foundation.
Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. New York: Academic Press.
Hlynka, D. (1995). Six postmodernisms in search of an author. In G. Anglin (Ed.), Instructional technology: Past, present, and future (2nd ed.). Englewood, CO: Libraries Unlimited.
Howe, C., Tolmie, A., & Rogers, C. (1992). The acquisition of conceptual knowledge in science by primary school children: Group interaction and the understanding of motion down an incline. British Journal of Developmental Psychology, 10, 113-130.
Hunt, M. (1997). How science takes stock: The story of meta-analysis. New York: The Russell Sage Foundation.
Hunter, J. E., & Schmidt, F. L. (1994). Correcting for sources of artificial variation across studies. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 323-336). New York: Russell Sage Foundation.
Hunter, J. E., Schmidt, F. L., & Jackson, G. B. (1982). Meta-analysis: Cumulating research findings across studies. Beverly Hills, CA: Sage.
*Ignatz, M. (2000). The effectiveness of the Read, Write, & Type! Program in increasing the phonological awareness of first grade students. Research report. (ERIC Document Reproduction Service No. ED 453 814).
Institute for Learning Technologies. (1997). Educating America for the twenty-first century: A strategic plan for educational leadership, January 2000 through December 2004. New York: Columbia University Teachers College.
James, W. (1890). Psychology (Briefer course). Reprinted 1962, New York: Collier Books.
Jensen, E. (1998). Teaching with the brain in mind. Alexandria, VA: ASCD.
*Johari, A. (1998, February). Effects of inductive multimedia programs including graphs on creation of linear function and variable conceptualization. In Proceedings of Selected Research and Development Presentations at the National Convention of the Association for Educational Communications and Technology (AECT) Sponsored by the Research and Theory Division, St. Louis, MO. (ERIC Document Reproduction Service No. ED 423 841).
Johnson, S. (1997). Interface culture: How new technology transforms the way we create and communicate. San Francisco: Harper Collins.
Johnson-Eilola, J. (1998). Living on the surface: Learning in the age of global communications networks. In I. Snyder (Ed.), Page to screen: Taking literacy into the electronic era (pp. 185-210). New York: Routledge.
Jonassen, D. H. (1996). Computers in the classroom: Mindtools for critical thinking. Englewood Cliffs, NJ: Merrill.
(1999. & G. Kulik. & Cavalier. Storrs. 2002 from http://www. Effectiveness of computer-based college teaching: A meta-analysis of findings. 525-544. A. C. C.. R.. A. & Glenn. 252-259). Kashy. Boston: McGraw-Hill. In Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology [AECT]. *Klein. (1999. H. Center for Social Informatics.edu/CSI?report. Personalization of mathematics word problems in Taiwan. J. TX. H. 2001 from http://users. & Lehman. TX. Kenny. A.
. J. Handbook of social psychology (Vol. 49-59. (ERIC Document Reproduction Service No. & Sullivan. (2000). D.. Rosenbaum. & Weisband. Retrieved September 22. CT: University of Connecticut.html Knapp. Houston. Scaffolding in a computer-based constructivist environment for teaching statistics to college learners. Using cooperative learning and objectives with computer-based instruction. In D. S. Boston: Allyn and Bacon. Restructuring schools with technology. Houston. S. ED 436 134). ED 408 317). T. D. D. (2000).. ETR&D.. Retrieved October 25. February). Crawford. Learning from social informatics: Information and communication technologies in human contexts. Ku. H. (1997. J.). Indiana. P. Transforming learning with technology: Beyond Modernism and Postmodernism or whoever controls the technology creates the reality. Kling. Meta-analysis: Easy to answer (Version II). *Kao. February). Kuhn. Chicago: University of Chicago Press. (1980). 2(2). D.107 Jonassen. Fiske. (2000). M. The structure of scientific revolutions. (1998) Data analysis in social psychology.rcn. (1970).. (ERIC Document Reproduction Service No. Paper presented at the Annual Meeting of the American Educational Research Association. Indiana University. The use of audio in computer-based instruction. S. 1. 21-25. S. D. (1996). R. A. Lindzey (Eds. Kulik. J.. D. 40(2). T. M (1997). 2 nd ed.com/dakenny/meta. & Sullivan. ED 436 152). & Cohen. J. H. (1999). Bolger. In Proceedings of Selected Research and Development Papers Presented at the National Convention of the Association for Educational Communications and Technology [AECT]. 48(3). (ERIC Document Reproduction Service No.slis. N. L.htm Kenny. M. Gilbert. H. pp. H. Kaku. Chicago. March). Visions: How science will revolutionize the 21st century. J. *Koroghlanian. Review of Educational Research. C. Educational Technology. Sawyer. New York: Anchor Books. [Computer software and Manual]. D.
Chicago: University of Chicago Press. 2002 from http://www. D. Lewis. Issues in Restructuring Schools. K. V. E. Learning with computers: Analyzing productive interaction. Smith.com/MetaAnalysis. Retrieved December 8. Lubar. New York: Russell Sage Foundation.). (1994). New York: Russell Sage Foundation. 7. & Croninger. V. D.wcer. J.). C. D. 16. Restructuring the schools: Problem sand prospects. Cooper & L. (1994). B. London: Routledge. Lee. The visual presentation and interpretation of meta-analyses. B. (1999).. Boston: Houghton Mifflin. Singer. Issues in Restructuring Schools. Boulder. S.. T. Retrieved February 6. High school restructuring and student achievement: A new study finds strong links. & Light. The age of the spiritual machines..edu/archives/completed/cors/ Issues_in_Restructuring_Schools/ISSUES_NO_9_FALL_1995. M. Lane. T. (Eds. W. G.monumental. Cooper & L. CO: Westview Press. K. G. K. 439-453). New York: Viking Press. (1992). (1998). Bayesian approaches to research synthesis.wcer. Electronic School. S. (Eds. D. R.). High school restructuring and student achievement: A new study finds strong links. 9. Lashley. (1995. (Journal of the Center on Organization and Restructuring of Schools). InfoCulture: The Smithsonian book of information and inventions.edu/archives/completed/cors/Issues_in_Restructurin g_Schools/ISSUES_NO_7_FALL_1994.pdf Lehrer. The handbook of research synthesis (pp. (1993). J. Lipsey.). Lyons.). The efficacy of psychological. (Ed. J. Digital learning: Why tomorrow’s school must learn to let go of the past. (Dimensions of Philosophy series). 48(12). (2000). Light. (1999). L. Brain mechanisms and intelligence. Theory of knowledge.pdf Lee.V.wisc. 2002 from http://www. J. & Smith. (1994. Meta-analysis: Methods of accumulating results across research domains. CA: McCutchan Publishing Corporation. In H. & Zelterman.htm
. A. Fall). P.lyonsmorris. E. 2002 from http://www. & Wilson.. E. 1181-1209. (1990)..108 Kuzweil. Fall).G. R. American Psychologist..wisc. 411-422). Retrieved December 8. V. J. 187(9). Berkeley. R.html Now available at: http://www.com/Solomon/MetaA/Mapage1.. & Willett. educational. The handbook of research synthesis (pp. 1-5. (Journal of the Center on Organization and Restructuring of Schools). and behavioral treatment. 1-21. (1993). Littleton. 22-24. B. & Epps. Hedges (Eds. P. Hedges (Eds.. (1929). In H. J.. B. Layton.
Lyotard, J. F. (1984). The postmodern condition. Manchester, UK: Manchester University Press.
Machlup, F. (1962). The production and distribution of knowledge in the United States. Princeton, NJ: Princeton University Press.
*Mann, D., Shakeshaft, C., Becker, J., & Kottkamp, R. (1999). West Virginia story: Achievement gains from a statewide comprehensive instructional technology program. Beverly Hills, CA: Milken Family Foundation. (ERIC Document Reproduction Service No. ED 429 575).
Marzano, R. J. (1998). A theory-based meta-analysis of research on instruction. A report sponsored by the Office of Educational Research and Improvement, United States Department of Education. Aurora, CO: Mid-Continent Regional Educational Laboratory.
*Mason-Mason, S. D. (1999, November). Expert systems as a Mindtool to facilitate mental model learning. Paper presented at the Annual Meeting of the Mid-South Educational Research Association, Point Clear, AL. (ERIC Document Reproduction Service No. ED 435 755).
Matt, G. E., & Cook, T. D. (1994). Threats to the validity of research syntheses. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 503-520). New York: Russell Sage Foundation.
Mayers, T., & Swafford, K. (1998). Reading the networks of power. In T. Taylor & I. Ward (Eds.), Literacy theory in the age of the Internet (pp. 146-157). New York: Columbia University Press.
McClure, P. A. (1996). Technology plans and measurable outcomes. Education Review, 31(3), 29-30.
McLuhan, M. (1964). Understanding media: The extensions of man. Cambridge, MA: Massachusetts Institute of Technology.
Mehlinger, H. D. (1995). School reform in the information age. Bloomington, IN: Center for Excellence in Education, Indiana University.
Meynert, T. (1884). Psychiatry (B. Sachs, Trans.). New York: G. P. Putnam's Sons.
Microsoft. (1999). Practicing knowledge management: Turning experience and information into results. Retrieved January 21, 2002, from http://www.microsoft.com/business/articles/productivity/kmoverview.asp
Minsky, M. (1985). Society of mind. New York: Simon and Schuster.
Minsky, M., & Papert, S. (1969). Perceptrons: An introduction to computational geometry. Cambridge, MA: MIT Press.
Moore, D. S., & McCabe, G. P. (1999). Introduction to the practice of statistics (3rd ed.). New York: Freeman.
Moorman, H., & Egermeier, J. (1992). Educational restructuring: Generative metaphor and new vision. In J. J. Lane & E. G. Epps (Eds.), Restructuring the schools: Problems and prospects (pp. 15-59). Berkeley, CA: McCutchan Publishing Corp.
Moravec, H. (1988). Mind children. Cambridge, MA: Harvard University Press.
*Murphrey, T. P. (1999). Comparing and contrasting the effectiveness of computer-based instruction with traditional classroom instruction in the delivery of a cross-cultural educational module for agriculturalists. A summary report of research. Research paper. (ERIC Document Reproduction Service No. ED 438 427).
National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: US Government Printing Office.
Nax, S. (1996, October 25). "Education, skills called 21st century job musts." Fresno Bee, p. C1.
Negroponte, N. (1995). Being digital. New York: Alfred A. Knopf.
Newell, A., & Simon, H. A. (1976). Computer science as empirical enquiry: Symbols and search. Communications of the Association for Computing Machinery, 19, 113-126.
Newmann, F. M., & Wehlage, G. G. (1995). Successful school restructuring: A report to the public and educators by the center on organization and restructuring of schools. Madison, WI: Board of Regents of the University of Wisconsin System.
Oblinger, D. G., & Rush, S. C. (Eds.). (1998). The future compatible campus: Planning, designing, and implementing information technology in the academy. Bolton, MA: Anker Publishing Company, Inc.
Oppenheimer, T. (1997, July). The computer delusion. The Atlantic Monthly, 45-48, 50-56, 61-62.
O'Reilly, T. (2000, Winter). Everything is connected: Ubiquitous computing will turn the computing world topsy-turvy. The Future of Software Magazine, 1(1), 72-73.
Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational Statistics, 8, 157-158.
Orwin, R. G. (1994). Evaluating coding decisions. In H.
Cooper & L.V. Hedges (Eds.), The handbook of research synthesis (pp. 139-162). New York: Russell Sage Foundation.
Patterson, D. A. (1996). Microprocessors in 2020. In Key technologies for the 21st century: A Scientific American special issue (pp. 1-8). New York: W. H. Freeman and Company.
Pearl, J. (1998). Why there is no statistical test for confounding, why many people think there is, and why they are almost right. Technical Report R-256, Department of Computer Science, University of California, Los Angeles. Retrieved October 17, 2002 from ftp://ftp.cs.ucla.edu/pub/stat_ser/R256.ps
Perelman, L. (1992). School's out. New York: Morrow.
Pinker, S. (1997). How the mind works. New York: Norton.
Pogrow, S. (1996, June). Reforming the wannabe reformers: Why education reforms almost always end up making things worse. Phi Delta Kappan, 656-663.
Poster, M. (1990). The mode of information. Chicago: University of Chicago Press.
President's Committee of Advisors on Science and Technology, Panel on Educational Technology. (1997, March). Report to the President on the use of technology to strengthen K-12 education in the United States. Washington, DC: The White House. Retrieved November 21, 1999 from http://www.whitehouse.gov/WH/EOP/OSTP/NSTC/PCAST/k-12ed.html
*Priebe, R. (1997, March). The effects of cooperative learning in a second-semester university computer science course. Paper presented at the Annual Meeting of the National Association for Research in Science Teaching, Chicago, IL. (ERIC Document Reproduction Service No. ED 406 189).
*Ramirez, A., & Rivard, S. (1998, December). Hypermedia aids for advanced learning in complex and ill-structured knowledge domains. In Proceedings of the International Academy for Information Management (IAIM) Annual Conference, Helsinki, Finland. (ERIC Document Reproduction Service No. ED 431 435).
Rashevsky, N. (1938). Mathematical biophysics: Physicomathematical foundations of biology. Chicago: The University of Chicago Press.
Raudenbush, S. W. (1994). Random effects models. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp.
301-322). New York: Russell Sage Foundation.
Robin, D., & Malkas, B. (2000). Contextual learning: Using work-based learning experiences in the content area. PowerPoint presentation. Retrieved September 18, 2002 from http://www.jedco.com/malkasweb/powerpoint/sld001.htm
Ronfeldt, D. (1996). Tribes, institutions, markets, networks: A framework about societal evolution. RAND document P-7967.
Rosenthal, R. (1979). The file-drawer problem and tolerance for null results. Psychological Bulletin, 86, 638-641.
Rosenthal, R. (1984). Meta-analytic procedures for social research. Beverly Hills, CA: Sage.
Rosenthal, R. (1990). An evaluation of procedures and results. In K. Wachter & M. Straf (Eds.), The future of meta-analysis (pp. 123-133). New York: Russell Sage Foundation.
Rosenthal, R. (1991a). Meta-analytic procedures for social research (rev. ed.). Beverly Hills, CA: Sage.
Rosenthal, R. (1991b). Meta-analysis: A review. Psychosomatic Medicine, 53, 247-271.
Rosenthal, R. (1994). Parametric measures of effect size. In H. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 231-244). New York: Russell Sage Foundation.
Rosenthal, R., & Rubin, D. B. (1982). A simple, general purpose display of magnitude of experimental effect. Journal of Educational Psychology, 74, 166-169.
Rumelhart, D. E., McClelland, J. L., & the PDP Research Group. (1986). Parallel distributed processing (Vol. 1). Cambridge, MA: MIT Press.
Rushkoff, D. (1996). Playing the future: How kids' culture can teach us to thrive in an age of chaos. New York: HarperCollins.
Russell, T. (1998). The "no significant difference phenomenon." Retrieved September 25, 1998 from http://teleeducation.nb.ca/nosignificantdifference/
Sacks, H. S., Ancona-Berk, V. A., Berrier, J., Nagalingham, R., & Chalmers, T. C. (1988). Dipyridamole in the treatment of angina pectoris: A meta-analysis. Clinical Pharmacologic Therapy, 43, 610-615.
*Savelsbergh, E., Ferguson-Hessler, M., & de Jong, T. (1998, April). Physics learning with a computer algebra system: Towards a learning environment that promotes enhanced problem representations. Paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA. (ERIC Document Reproduction Service No. ED 421 346).
*Schackenberg, H. L. (1997). Learner control over full and lean computer-based instruction under differing ability levels. Ph.D. dissertation, Arizona State University. (ERIC Document Reproduction Service No. ED 408 935).
Schaffer, L., & Hannafin, M. (1986). The effects of progressive interactivity on learning from interactive video. Educational Communication and Technology Journal, 34(2), 89-96.
Schlechty, P. C. (1997). Inventing better schools: An action plan for educational reform. San Francisco: Jossey-Bass.
*Schoffner, M., & Dalton, D. (1998, February). Effects of problem-based, networked hypermedia, and cooperative strategies on visual literacy instruction. In Proceedings of Selected Research and Development Presentations at the National Convention of the Association for Educational Communications and Technology (AECT), Sponsored by the Research and Theory Division, St. Louis, MO. (ERIC Document Reproduction Service No. ED 423 860).
Schwarzer, R. (1991). Statistics software for meta-analysis (Version 5.3) [Computer software and manual]. Berlin: Freie Universität Berlin. Retrieved from http://www.yorku.ca/faculty/academic/schwarze/meta_e.htm
*Scott, L. (1999). The Accelerated Reader program, reading achievement, and attitudes of students with learning disabilities. Research report. Atlanta, GA: Georgia State University. (ERIC Document Reproduction Service No. ED 434 431).
The Secretary's Commission on Achieving Necessary Skills (SCANS). (1991). What work requires of schools: A SCANS report for America 2000. Washington, DC: U.S. Government Printing Office.
Selfe, C. (1999). Technology and literacy in the 21st century: The importance of paying attention. Carbondale, IL: Southern Illinois University Press.
Senge, P. (1990). The fifth discipline: The art and practice of the learning organization. New York: Doubleday.
Shaffer, D. (1999). Culture to the nth power. In R. Rhodes (Ed.), Visions of technology (pp. 246-250). New York: Simon and Schuster.
Shouse, R., & Mussoline, L. (1999, August). Will 'one-size' school restructuring fit all? Some cautionary evidence for disadvantaged schools. Paper presented at the annual meeting of the American Sociological Association, Chicago, IL.
*Shyu, H. (1997, February). Effects of anchored instruction on enhancing Chinese students' problem-solving skills. Paper presented at the Annual Conference of the Association for Educational Communications and Technology, Albuquerque, NM. (ERIC Document Reproduction Service No. ED 405 841).
Slavin, R. (1986). Best evidence synthesis: An alternative to meta-analytic and traditional reviews. Educational Researcher, 15, 5-11.
Snyder, I. (Ed.). (1998). Page to screen: Taking literacy into the electronic era. New York: Routledge.
Smith, M. L., Glass, G. V., & Miller, T. I. (1980). The benefits of psychotherapy. Baltimore: Johns Hopkins University Press.
Spencer, H. (1872). The principles of psychology (Revised and enlarged ed.). London: Williams & Norgate.
Sprenger, M. (1999). Learning & memory: The brain in action. Alexandria, VA: Association for Supervision and Curriculum Development.
Springer, L., Stanne, M., & Donovan, S. (1997). Measuring the success of small-group learning in college-level SMET teaching: A meta-analysis. Madison, WI: The National Institute for Science Education, University of Wisconsin-Madison. Retrieved September 21, 2002 from http://www.wcer.wisc.edu/nise/cl1/cl/resource/scismet.pdf
Strackbein, D., & Miller, M. (1996). Survive the transition: Industrial age to information age. Article series: How to survive in the information age. Retrieved December 6, 2002 from http://www.strackbein.com/html/articles.html
Tapscott, D. (1996). The digital economy: Promise and peril in the age of networked intelligence. New York: McGraw-Hill.
Tapscott, D. (1998). Growing up digital: The rise of the net generation. New York: McGraw-Hill.
Tapscott, D., & Caston, A. (1993). Paradigm shift: The new promise of information technology. New York: McGraw-Hill.
Taylor, T., & Ward, I. (Eds.). (1998). Literacy theory in the age of the Internet. New York: Columbia University Press.
Thompson, S. (1993). Controversies in meta-analysis: The case of the trials of serum cholesterol reduction. Statistical Methods in Medical Research, 2, 173-192.
Thornburg, D. (1996). Campfires in cyberspace. San Carlos, CA: Starsong Publications.
Toffler, A. (1980). The third wave. New York: Bantam Books.
Toffler, A., & Toffler, H. (1993). War and anti-war: Survival at the dawn of the 21st century. Boston: Little, Brown and Company.
Tristram, C. (2001, December). The next computer interface. MIT's Technology Review, 104(10), 52-59.
Ulmer, G. (1998). Foreword/Forward (Into Electracy). In T. Taylor & I. Ward (Eds.), Literacy theory in the age of the Internet. New York: Columbia University Press.
Where nanotechnology and the computer industry meet: Shrinking the PC. (2002, March). CPU: Computer Power User, 2(3), 56-59.
org/journals/amp/amp548594. Understanding computers and cognition: A foundation for design. E. C. Research report. American Psychologist. 594-604. M. Paper presented at the Joint Meeting of the Central States Communication Association and the Southern States Communication Association. & Flores. (1999. Louis. R.html Wilson. Cooper & L. (2002). (1999). D. April). Reading. Meta-analysis: Quantitative methods for research synthesis. In Mind as motion: Explorations in the dynamics of cognition. L. S. NJ: Educational Testing Service. It's about time: An overview of the dynamical approach to cognition.V.. *Yaw. St. F. T. (1987).. MO. Statistical methods in psychology journals: Guidelines and explanations. ED 444 435). IL: Wolfram Media. L. D. H. Winograd. Cambridge. Chronicle of Higher Education. F. D. Princeton. Wolfram. Meta-analysis and effect size. Retrieved April 30.edu/~alex/teaching/WBI/es. The handbook of research synthesis (pp.html
. 4155). (ERIC Document Reproduction Service No. White.asu. Yu. *White. 2002. Champaign.apa. D. (1986). S.115 Van Gelder. & Task Force on Statistical Inference.). (ERIC Document Reproduction Service No. (1998). (1999). 47 (21):A19-A20. (1994). Wolf. New York: Russell Sage Foundation. Scientific communication and literature retrieval. H. Retrieved December 7. (1996). ED 430 261). The effectiveness of web-based instruction: A case study. 54(8). CA: Sage.ed. (2002). from http://seamonkey. Wilkinson. Hedges (Eds. Does it compute? The relationship between educational technology and student achievement in mathematics. and Port. A new kind of science.. In H. Beverly Hills. A comparison of final grades of distance learners to classroom learners. C. T. MA: MIT Press. 2002 from http://www. & Gilman. (1995). Wenglinsky. MA: Addison-Wesley. Inc. Self-paced studies.
Table A

Clusters at 1% Level of Significance (r)

Cluster 1: Study 19
Cluster 2: Studies 3, 24, 23, 27, 9, 5, 29, 2, 17, 14, 25, 13, 11, 18, 7, 26, 16, 20, 8, 4, 30, 31, 15, 12, 6, 21, 10, 22, 28, 1

Table B

Clusters at 5% Level of Significance (r)

Cluster 1: Study 19
Cluster 2: Studies 3, 24
Cluster 3: Studies 23, 27, 9, 5, 29, 2, 17, 14, 25, 13, 11, 18, 7, 26, 16, 20, 8, 4, 30, 31, 15, 12, 6, 21, 10, 22, 28, 1

Table C

Clusters at 10% Level of Significance (r)

Cluster 1: Study 19
Cluster 2: Studies 3, 24
Cluster 3: Studies 23, 27, 9, 5, 29, 2, 17, 14, 25, 13, 11, 18, 7, 26, 16, 20, 8, 4, 30, 31, 15, 12, 6, 21, 10, 22, 28, 1

Effect sizes (r) of the 31 studies, in ascending order: 0.0496, 0.0630, 0.0897, 0.0943, 0.1046, 0.1057, 0.1339, 0.1488, 0.1612, 0.1849, 0.2020, 0.2164, 0.2323, 0.2376, 0.2745, 0.2786, 0.2845, 0.3184, 0.3310, 0.3885, 0.4443, 0.4489, 0.4704, 0.4807, 0.4989, 0.5281, 0.5311, 0.5315, 0.6257, 0.6746, 0.8266

Notes: Average Sample Size = 120.8387; Sample Size Std. Dev = 128.5703; Correlation between Sample and Effect Sizes = -0.0865
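The tables above group studies whose effect sizes are statistically homogeneous at each significance level. As a rough illustration of the idea only — not the significance-test clustering performed by the META software used in this study — the following Python sketch sorts the 31 observed effect sizes and starts a new cluster wherever the jump between neighboring values exceeds a threshold; the 0.10 threshold and the gap heuristic are illustrative assumptions.

```python
# Illustrative sketch: gap-based grouping of the 31 effect sizes (r).
# This is NOT the META program's cluster algorithm; the 0.10 gap
# threshold is an arbitrary stand-in for the 1%/5%/10% cut-offs.
effect_sizes = sorted([
    0.0496, 0.0630, 0.0897, 0.0943, 0.1046, 0.1057, 0.1339, 0.1488,
    0.1612, 0.1849, 0.2020, 0.2164, 0.2323, 0.2376, 0.2745, 0.2786,
    0.2845, 0.3184, 0.3310, 0.3885, 0.4443, 0.4489, 0.4704, 0.4807,
    0.4989, 0.5281, 0.5311, 0.5315, 0.6257, 0.6746, 0.8266,
])

mean_es = sum(effect_sizes) / len(effect_sizes)

def gap_clusters(values, gap):
    """Group sorted values, opening a new cluster whenever the
    difference between consecutive values exceeds `gap`."""
    clusters = [[values[0]]]
    for prev, cur in zip(values, values[1:]):
        if cur - prev > gap:
            clusters.append([])
        clusters[-1].append(cur)
    return clusters

print(f"k = {len(effect_sizes)} studies, mean r = {mean_es:.4f}")
print(f"clusters at gap 0.10: {len(gap_clusters(effect_sizes, 0.10))}")
```

With this heuristic the largest effect size (0.8266) separates into a cluster of its own, echoing the single-study Cluster 1 reported in Tables A-C, although the heuristic does not reproduce the two- and three-cluster solutions of the significance-based method.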