

Will the rigor of the course be compromised?

No. Studies show that students need time to move from their naïve conceptions about
science to more expert-like views. In addition, students who study fewer topics in greater
depth in high school have higher success in college science than their peers who had a
greater breadth of coverage. Instead of being asked to demonstrate knowledge of a
wide range of topics at a superficial level (as the Regents exam requires), students must show
they have a deep understanding of the essentials.
[BLUE Attached: Depth vs. Breadth study]

What is essential to prepare students for college?

The College Board recently identified the knowledge and skills students need to be
successful in college science. These standards adhere to the “more depth, less breadth”
approach. I can attest that the physics standards are pedagogically sound – they draw on a
large volume of research on how students learn physics.
[PINK Attached: College Board Standards for College Success]

How will students learn physics to obtain a deep understanding of the essentials?
We currently teach physics using Modeling Instruction whenever possible. Frank was trained
in Modeling Instruction at a 2.5 week summer institute at Buffalo State in 2004. Jim
attended 2.5 weeks of summer training at Arizona State in 2005. Modeling Instruction
embraced Tony Wagner’s seven essential skills long before he wrote the book! Instead of relying
on lectures and textbooks, Modeling Instruction has students design experiments to
determine physics concepts and relationships themselves. Students must reconcile the
results of their experiments with their own (often naïve) views of how the world works. They
then present their results to the class so we can come to a consensus. This entire process
takes much more time than a traditional lecture-based approach, but the depth of student
understanding and retention is much greater.
[GREEN Attached: Modeling Instruction article]

What types of alternate assessments might students do?

There are a variety of ways students can be assessed besides a paper-and-pencil exam. I’ve
included a list of possible projects/assessments that can be used as a culminating activity in
a modeling cycle or even given at the start of a unit as motivation to drive instruction
forward. Again, implementing experiences like these for students takes more time than
traditional instruction.
[YELLOW Attached: Sample assessments]

How will you know students are performing at the same level without the exam?
I am implementing a criterion-based grading system. Learning goals for each topic are
clearly laid out, and students are graded using a 1-4 rubric for each goal. Every HW, class
discussion, quiz, project, etc. can be used as evidence to see how well students are
reaching these learning goals. Students are evaluated on eventual mastery; initial
missteps made while exploring the content do not count against them. This system ensures I know (and
students know) whether students have the essential skills and knowledge needed for
success in college science.
[WHITE Attached: Sample criterion-based grading and rubrics used by students]
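As a rough illustration of how a criterion-based gradebook like the one described above could work, here is a minimal Python sketch. The class, method names, and the learning-goal label are hypothetical, not any actual gradebook product; it records 1-4 rubric scores per learning goal and reports the most recent score, so early missteps are not penalized:

```python
from collections import defaultdict

class GradeBook:
    """Hypothetical criterion-based gradebook: 1-4 rubric scores per goal."""

    def __init__(self):
        # learning goal -> chronologically ordered list of rubric scores
        self.evidence = defaultdict(list)

    def record(self, goal, score):
        # Any HW, quiz, discussion, or project becomes a piece of evidence.
        if not 1 <= score <= 4:
            raise ValueError("rubric scores run from 1 to 4")
        self.evidence[goal].append(score)

    def current_level(self, goal):
        # "Eventual mastery": only the latest evidence determines the grade,
        # so an early misstep is superseded by later success.
        scores = self.evidence[goal]
        return scores[-1] if scores else None

gb = GradeBook()
gb.record("Constant-velocity model", 2)   # early misstep, later superseded
gb.record("Constant-velocity model", 3)
gb.record("Constant-velocity model", 4)   # mastery demonstrated
print(gb.current_level("Constant-velocity model"))  # -> 4
```

Other mastery policies (e.g., the mode of the last few scores) drop into `current_level` without changing the rest of the sketch.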
Depth Versus Breadth: How Content Coverage in High School Science Courses Relates to Later Success in College Science

University of Texas, Arlington, TX 76019, USA


Science Education Department, Harvard-Smithsonian Center for Astrophysics,
Cambridge, MA 02138, USA

Curriculum, Instruction, and Special Education Department, Curry School of Education,
University of Virginia, Charlottesville, VA 22904, USA

Received 5 March 2008; revised 1 September 2008, 10 October 2008;

accepted 17 October 2008

DOI 10.1002/sce.20328
Published online in Wiley InterScience.

ABSTRACT: This study relates the performance of college students in introductory science
courses to the amount of content covered in their high school science courses. The sample
includes 8310 students in introductory biology, chemistry, or physics courses in 55 randomly
chosen U.S. colleges and universities. Students who reported covering at least 1 major topic
in depth, for a month or longer, in high school were found to earn higher grades in college
science than did students who reported no coverage in depth. Students reporting breadth in
their high school course, covering all major topics, did not appear to have any advantage in
chemistry or physics and a significant disadvantage in biology. Care was taken to account
for significant covariates: socioeconomic variables, English and mathematics proficiency,
and rigor of their preparatory high school science course. Alternative operationalizations of depth
and breadth variables result in very similar findings. We conclude that teachers should
use their judgment to reduce coverage in high school science courses and aim for mastery
by extending at least 1 topic in depth over an extended period of time. © 2008 Wiley
Periodicals, Inc. Sci Ed 1 – 29, 2008

Correspondence to: Marc S. Schwartz; e-mail:

Contract grant sponsor: Interagency Educational Research Initiative.
Contract grant number: NSF-REC 0115649.
© 2008 Wiley Periodicals, Inc.

– Academically weaker students? We systematically investigated the interactions between
performance variables and the depth and breadth variables and found only
tween performance variables and the depth and breadth variables and found only
two interactions significant below the .05 level. Contrary to the main effects analysis
results, the significance of the interactions was not consistent and did not provide
a clear picture of the connections between the outcome and predictors. The results
suggest that the associations for high- and low-achieving students are similar with
respect to depth and breadth of content coverage.4
– Students of highly rated teachers. Our survey asked students to rate several teacher
characteristics. If depth and breadth were to a considerable extent correlated with
those teacher characteristics, this would make the results harder to interpret because
of the conundrum of collinearity: that particular teacher characteristics (rather than
depth or breadth of coverage) might be the actual determinants of student outcomes,
and that depth or breadth themselves might merely reflect those teacher charac-
teristics by association. Teacher characteristics, however, were found to be only
minimally correlated with breadth and depth. Among the teacher characteristics,
only a very weak correlation was found between breadth and the teacher’s rated ability
to explain problems in several different ways (r = .04). Depth correlated weakly
with the ratings of the teacher’s subject knowledge (r = .05). It appears that highly
rated teachers are no more likely than others to teach with depth or breadth.

The baseline model (Table 2) shows the association between breadth and depth of high
school science curricula and first-course introductory-level college science performance.
The most striking analytical characteristic is the consistency of the results across the three
baseline models, revealing the same trends across the disciplines of biology, chemistry,
and physics. In addition, the contrast of positive associations between depth and perfor-
mance and negative associations between breadth and performance is striking, though not
significant in all cases. This replication of results across differing disciplines suggests a
consistency to the findings. However, we caution that the baseline model offers only out-
comes apart from the consideration of additional factors that may have some influence on
these associations. The robustness of these associations was put to the test, as additional
factors were included in an extended model with additional analyses.
The extended model, shown in Table 3, included variables that control for variations
in student achievement as measured by test scores, high school course level (regular,
honors, AP), and high school grades, as well as background variables such as parents’
educational level, community socioeconomic level, public high school attendance, and
year in college. With the addition of these variables in the extended model, we find that
the magnitude of the coefficients decreased in the majority of instances, but the trends
and their significance remained. Keeping in mind the generalized nature of the analyses
we have undertaken, the most notable feature of these results stems not simply from the
magnitude of the coefficients, but rather from the trends and their significance. Here, the
positive associations between depth of study and performance remain significant whereas
the negative association between breadth of study and performance retains its significance
in the biology analysis. Although this association is not significant in the chemistry and
physics analyses, the negative trend is consistent with our result for biology.
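The extended model described above is, in form, a multiple regression of college science grade on the depth and breadth indicators plus control covariates. The sketch below is schematic only: the data, coefficients, and the single SAT covariate are invented for illustration and are not the authors' actual FICSS model or variable set.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Invented student-level data: binary depth/breadth indicators and one
# stand-in covariate (math SAT), nothing like the real survey's full set.
depth = rng.integers(0, 2, n)
breadth = rng.integers(0, 2, n)
sat = rng.normal(600, 80, n)

# Synthetic outcome built so depth helps (+2 points) and breadth
# slightly hurts (-1 point), echoing the direction of the reported trends.
grade = 80 + 2.0 * depth - 1.0 * breadth + 0.02 * (sat - 600) + rng.normal(0, 5, n)

# Ordinary least squares: intercept, depth, breadth, centered SAT.
X = np.column_stack([np.ones(n), depth, breadth, sat - 600])
coef, *_ = np.linalg.lstsq(X, grade, rcond=None)
print(coef.round(2))  # depth coefficient ~ +2, breadth ~ -1 by construction
```

The paper's point about the extended model corresponds to adding covariate columns to `X`: the depth and breadth coefficients shrink somewhat but keep their signs.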

4. A significant interaction (p = .03) was found between the math SAT score and depth. In addition, a
second significant interaction (p = .02) was found between breadth and most advanced mathematics grade.
Apart from these two significant interactions, all others were nonsignificant.

Science Education

What these outcomes reveal is that although the choice to pursue depth of study has
a significant and positive association with performance, the choice to pursue breadth of
study appears to have implications as well: students whose teachers choose broad
coverage of content, on average, experience no benefit. In the extended
model, we arrive at these results while accounting for important differences in students’
backgrounds and academic performance, which attests to the robustness of these findings.
The findings run counter to philosophical positions that favor breadth or those that advocate
a balance between depth and breadth (Murtagh, 2001; Wright, 2000).
Finally, a particularly surprising and intriguing finding from our analysis indicated that
the depth and breadth variables as defined in this analysis appear to be uncorrelated;
additionally, the interactions between both variables are not significant in either model.
Figure 6 offers an analysis of the outcomes treating breadth and depth as independent
characteristics of high school science curriculum. The first panel reveals that over 40%
of the students in our survey fall within the “depth present–breadth absent” grouping,
whereas the other three varied between 14% and 23%. The distribution across these four
groupings shows that the teachers of students in our survey are making choices about which
topics to leave out and which to emphasize. These choices have consequences. The second
panel indicates that those students reporting high school science experiences associated
with the group “depth present–breadth absent” have an advantage equal to two thirds of a
year of instruction over their peers who had the opposite high school experience (“Depth
Absent–Breadth Present”). This outcome is particularly important given that high school
science teachers are often faced with choices that require balancing available class time
and course content. It appears that even teachers who attempt to follow the “best of both
worlds” approach by mixing depth and breadth of content coverage do not, on average,
provide an advantage to their students who continue to study science in college over their
peers whose teachers focused on depth of content coverage.
Rather than relying solely on one definition of depth and breadth, we examined alternative
operationalizations of the breadth and depth variables and replicated the analysis. The
findings remained robust when these alternative operationalizations were applied.
However, a limitation in our definition of depth and breadth stems from the smallest unit
of time we used in the study to define depth of study, which was on the order of weeks.
Our analysis cannot discern a “grain size” smaller than weeks. Teachers may opt to study
a single concept in a class period or many within that period. Likewise, teachers might
decide that a single laboratory experiment should go through several iterations in as many
days, while others will “cover” three different, unrelated laboratories in the same time span.
These differences are beyond the capacity of our analysis to resolve. In the end, our work
is but one approach to the analysis of the concepts of depth and breadth generated from
students’ self-reports of their high school science classroom experiences.

Findings in Light of Research From Cognitive Science

Most students hold conceptions that are at odds with those of their teachers or those
of scientists. In general, students cling tenaciously to these ideas, even in the face of
concerted efforts (Sadler, 1998). These “misconceptions” or “alternative conceptions” have
been uncovered using several research techniques, but qualitative interviews, traceable to
Piaget, have proven highly productive (Duckworth, 1987; Osborne & Gilbert, 1980). A
vast number of papers have been published revealing student misconceptions in a variety
of domains (Pfundt & Duit, 1994).
Students do not quickly or easily change their naïve scientific conceptions, even when
confronted with physical phenomena. However, given enough time and the proper impetus,

students can revisit and rethink their ideas. This change can often be painstakingly slow,
taking much longer than many teachers allow in the rapid conveyance of content that is
the hallmark of a curriculum that focuses on broad coverage. In the view of Eylon and
Linn (1988), “in-depth coverage can elaborate incomplete ideas, provide enough cues to
encourage selection of different views of a phenomenon, or establish a well-understood
alternative” (p. 263). Focused class time is required for teachers to probe for the levels of
understanding attained by students and for students to elucidate their preconceptions. It
takes focused time for students to test their ideas and find them wanting, motivating them to
reconstruct their knowledge. Eylon and Linn note, “Furthermore, students may hold onto
these ideas because they are well established and reasonably effective, not because they are
concrete. For example, naïve notions of mechanics are generally appropriate for dealing
with a friction-filled world. In addition, students appear quite able to think abstractly as
long as they have appropriate science topic knowledge, even if they are young, and even
if their ideas are incomplete. Thus, deep coverage of a topic may elicit abstract reasoning”
(p. 290).
The difficulty of promoting conceptual growth is well documented in neo-Piagetian
models (Case, 1998; Fischer & Bidell, 2006; Parziale & Fischer, 1998; Schwartz & Sadler,
2007; Siegler, 1998). These researchers argue that more complex representations, such as
those prized in the sciences, require multiple opportunities for construction in addition to
multiple experiences in which those ideas and observations may be coordinated into richer,
more complex understandings of their world. This process is necessarily intensive and time
consuming because the growth of understanding is a nonlinear process that is sensitive to
changes in context, emotions, and degree of practice. The experience of developing richer
understandings is similar to learning to juggle (Schwartz & Fischer, 2004). Learning to
juggle three balls at home does not guarantee you can juggle the same three balls in front of
an audience. Alternatively, learning to juggle three balls does not mean that you can juggle
three raw eggs. Changes in context and variables lead to differences in the relationships
that students have with ideas (both new and old) and their ability to maintain them over
time (Fischer & Pipp, 1984; Fischer, Bullock, Rotenberg, & Raya, 1993). The process
of integrating many variables that scientists consider in scientific models is a foreign and
difficult assignment for anyone outside the immediate field of research.
As just noted, the realization in the cognitive sciences that the learning process is not
linear is an important pedagogical insight. Because learning is a dynamic operation that
depends on numerous variables, leaving a topic too soon deprives students and teachers
of the time to experience situations that allow students to confront personal understandings
and connections and to evaluate the usefulness of scientific models within their personal

Implications in Light of High-Stakes Testing

We are currently in the “Age of Accountability” in U.S. education. The hallmark of this
era is high-stakes standardized testing. The nature and consequences of these examinations
influence the pedagogy that teachers use in the classroom. As Li et al. (2006) observed:
“Academic and policy debates seem somewhat remote to practitioners on the frontline of
science education. Science teachers, department heads, and instructional specialists need to
survive and thrive in a teaching environment increasingly driven by standards and measured
by accountability tests. They are the ones who must, here and now, find solutions to the
pressing problems of standards-based reform” (p. 5).
Our concern stems from the connection between testing and pedagogy: to the extent that
high-stakes state examinations focus on the “facts” of science, secondary school science
teachers will focus their pedagogy on ensuring that students can recite these “facts.” Clearly, high-stakes ex-
aminations that require recall of unrelated bits of scientific knowledge in the form of facts
and isolated constructs will increase the likelihood that teachers will adjust their teaching
methodologies to address these objectives. The more often discrete facts appear on high-
stakes examinations, the more often we imagine that teachers will feel pressured to focus
on breadth of knowledge. Conversely, we feel that the adoption of a different approach in
high-stakes testing that is less focused on the recall of wide-ranging facts but is, instead,
focused on a few widely accepted key conceptual understandings will result in a shift of
pedagogy. In such situations, we envision teachers’ instructional practice and curricula
designed to offer students greater opportunity to explore the various contexts from which
conceptual understandings emerge. We suspect that the additional focused time will allow
students to recognize (and teachers to challenge) students’ naïve conceptions of nature.
These concerns fall in line with Anderson’s (2004, p. 1) three conclusions about
standards-based reform in the United States:

• The reform agenda is more ambitious than our current resources and infrastructure
will support.
• The standards advocate strategies that may not reduce achievement gaps among
different groups of students.
• There are too many standards, more than students can learn with understanding in
the time we have to teach science.

Anderson’s (2004) final conclusion strongly resonates with that of the NRC (2007)
and more specifically with findings, at the international level, by Schmidt et al. (1997,
2005). In a comparison of 46 countries, Schmidt et al. (2005) noted that in top-achieving
countries, the science frameworks cover far fewer topics than in the United States, and
that students from these countries perform significantly better than students in the United
States. They conclude that U.S. standards are not likely to create a framework that develops
and supports understanding or “coherence,” a strategy that encourages the development of
a deeper understanding of the structure of the discipline. By international standards, the
U.S. science framework is “unfocused, repetitive, and undemanding” (p. 532).
From the perspective of our study, both Schmidt et al. (2005) and Anderson’s (2004)
conclusions are particularly relevant. Clearly, increasing the quantity of information re-
quired for examination preparation will lead to an increased focus on broader coverage
of course content, an approach that our findings indicate is negatively (or certainly not
positively) associated with performance in subsequent science courses. If teachers know
that they and their students are accountable for more material, then the pressure to focus
on breadth would seem like the natural pedagogical choice to make. Hence, teachers must
decide whether they choose to maximize students’ test scores or maximize preparation
for success in future study. Although they have considerable feedback concerning the test
scores of their students (from state tests, SATs, ACTs, and AP examinations), they have
almost no knowledge of how the bulk of their students do in college science, save the few
who keep in touch. Moreover, the rare college student who reports back to their high school
science teacher is often the most successful and simply reinforces a teacher’s confidence in
his or her current methods.

There are several concerns we wish to clearly elucidate. A potential criticism of this
study stems from a central element of this analysis, the length of time spent on a topic

and how this variable might be influenced by the ability level of students to grasp the
material. One might imagine classrooms containing many struggling students actually
spending more time on particular topics and classes with generally high achieving students
requiring less time to grasp the concepts and quickly moving on to other topics. In this
scenario, depth of content coverage would be an indicator for remedial assistance by the
teacher at the classroom level. However, we assume that students in our sample—those
who went on to take introductory science courses—are typically well above average in
their science abilities. Thus we would expect our depth measure, if it really signified a
remedial scenario, to have a negative impact, if any, on student performance in introductory
college science courses. As it was, we observed the opposite (i.e., depth was associated
with higher performance, even when controlled for student achievement in high school).
This argument suggests that the remedial depth scenario is unlikely.
Another issue in this analysis is the degree to which we might expect other variables in
the FICSS survey to correlate with depth and breadth as defined. We identified one such
item in our survey: “How would you best describe learning the material required in your
[high school science] course?” The respondents were provided with a 5-point rating scale
ranging from “A lot of memorization of facts” to “A full understanding of topics.” This
question is constructed in a manner that clearly alludes (at one end of the scale) to a type
of memorization that presumes a cursory or superficial level of understanding. This was
certainly our intention when we wrote this question. However, Ramsden (2003) notes that
the act of memorization may help learners deconstruct the structure and connections in the
body of information they are attempting to commit to memory. Thus, is memorization a
feature of both depth and breadth in learning? On the basis of this more carefully considered
characterization of memorization and taking into account the tone and clear intention of
this questionnaire item, we hypothesized that the responses to this item would have a
weak positive correlation, if any, with depth, and a weak negative correlation, if any, with
breadth. In the analysis, we found the correlation with depth was r = .15 (very weak) and
the correlation with breadth was r = −.09, a result commonly considered “not correlated.”
The results in this case suggest that memorization (as a new variable) is orthogonal to
our definition of breadth and depth and is not subsumed by depth or breadth. However, it
remains to be seen how well our definitions hold up as new variables are identified.
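For readers unfamiliar with the statistic, a Pearson correlation like the r = .15 and r = −.09 values interpreted above is the covariance of two variables divided by the product of their standard deviations. The sketch below uses invented data, not the FICSS survey responses:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance numerator and the two standard-deviation factors
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical toy data: a binary depth indicator and a 1-5 survey rating.
depth = [1, 0, 1, 1, 0, 1]
rating = [5, 3, 4, 5, 4, 4]
print(round(pearson_r(depth, rating), 2))  # -> 0.69
```

Values near 0, like those reported in the paper, indicate the two variables carry essentially independent information; values near ±1 indicate a strong linear relationship.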

Further Research: Looking at Individual Subject Areas

A key issue for further research in the analysis of the constructs of breadth versus depth is
whether there are specific topics that high school teachers should focus on or whether high
school teachers might choose to focus on any topic and still potentially reap an advantage
in terms of students’ enhanced future performance. In other words, do students perform
better in introductory college science courses if they were exposed to at least one science
topic in depth, regardless of what it was, or does it help them if they specifically studied
topic “A” in depth, but not if they studied topic “B” in depth? An analogous argument can
be made regarding breadth. It therefore appears useful to examine the potential effects of
depth in individual subject areas, and of the omission of individual subject areas, in future research.

The baseline model reveals a direct and compelling outcome: teaching for depth is
associated with improvements in later performance. Of course, there is much to consider
in evaluating the implications of such an analysis. There are a number of questions about

this simple conclusion that naturally emerge. For example, how much depth works best?
What is the optimal manner to operationalize the impact of depth-based learning? Do
specific contexts (such as type of student, teacher, or school) moderate the impact of depth?
The answers to these questions certainly suggest that a more nuanced view should be
sought. Nonetheless, this analysis appears to indicate that a robust positive association
exists between high school science teaching that provides depth in at least one topic and
better performances in introductory postsecondary science courses.
Our results also clearly suggest that breadth-based learning, as commonly applied in
high school classrooms, does not appear to offer students any advantage when they enroll
in introductory college science courses, although it may contribute to higher scores on
standardized tests. However, the intuitive appeal of broadly surveying a discipline in an
introductory high school course cannot be overlooked. There might be benefits to such a
pedagogy that become apparent when using measures that we did not explore. The results
regarding breadth were less compelling because in only one of the three disciplines were
the results significant in our full model. On the other hand, we observed no positive effects
at all. As it stands, our findings at least suggest that aiming for breadth in content coverage
should be avoided, as we found no evidence to support such an approach.

The authors thank the people who made this large research project possible: Janice M. Earle, Finbarr
C. Sloane, and Larry E. Suter of the National Science Foundation for their insight and support; James
H. Wandersee, Joel J. Mintzes, Lillian C. McDermott, Eric Mazur, Dudley R. Herschbach, Brian
Alters, and Jason Wiles of the FICSS Advisory Board for their guidance; and Nancy Cianchetta,
Susan Matthews, Dan Record, and Tim Reed of our High School Advisory Board for their time
and wisdom. This research has resulted from the tireless efforts of many on our research team:
Michael Filisky, Hal Coyle, Cynthia Crockett, Bruce Ward, Judith Peritz, Annette Trenga, Freeman
Deutsch, Nancy Cook, Zahra Hazari, and Jamie Miller. Matthew H. Schneps, Nancy Finkelstein,
Alex Griswold, Tobias McElheny, Yael Bowman, and Alexia Prichard of our Science Media Group
constructed our dissemination website. We also appreciate advice and interest
from several colleagues in the field: Michael Neuschatz of the American Institute of Physics, William
Lichten of Yale University, Trevor Packer of the College Board, Saul Geiser of the University of
California, Paul Hickman of Northeastern University, William Fitzsimmons, Marlyn McGrath Lewis,
Georgene Herschbach, and Rory Browne of Harvard University, and Kristen Klopfenstein of Texas
Christian University. We are indebted to the professors at universities and colleges nationwide who
felt that this project was worth contributing a piece of their valuable class time to administer our
surveys, and to their students for their willingness to answer our questions. Any opinions, findings, and conclusions or
recommendations expressed in this material are those of the authors and do not necessarily reflect
the views of the National Science Foundation, the U.S. Department of Education, or the National
Institutes of Health.

American Association for the Advancement of Science. (1989). Project 2061: Science for all Americans.
Washington, DC: Author.
American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York:
Oxford University Press.
Anaya, G. (1999). Accuracy of self-reported test scores. College and University, 75(2), 13 – 19.
Anderson, C. W. (2004). Science education research, environmental literacy, and our collective future. National
Association for Research in Science Teaching. NARST News, 47(2), 1 – 5.
Anderson, R. D. (1995). Curriculum reform. Phi Delta Kappan, 77(1), 33 – 36.
Baird, L. (1976). Using self-reports to predict student performance. Research Monograph No. 7. New York:
College Entrance Examination Board.

College Board Standards for College Success Page 1 of 2


College Board Standards for College Success

About the College Board Standards for College Success (CBSCS)
The College Board Standards for College Success (CBSCS) define the knowledge and skills students need to
develop and master in English language arts, mathematics and statistics, and science in order to be college and
career ready. The CBSCS outline a clear and coherent pathway to Advanced Placement® (AP®) and college
readiness with the goal of increasing the number and diversity of students who are prepared not only to enroll in
college, but to succeed in college and 21st-century careers. The College Board has published these standards freely
to provide a national model of rigorous academic content standards that states, districts, schools and teachers may
use to vertically align curriculum, instruction, assessment and professional development to AP and college
readiness. These rigorous standards:

provide a model set of comprehensive standards for middle school and high school courses that lead to
college and workplace readiness;
reflect 21st-century skills such as problem solving, critical and creative thinking, collaboration, and media and
technological literacy;
articulate clear standards and objectives with supporting, in-depth performance expectations to guide
instruction and curriculum development;
provide teachers, districts and states with tools for increasing the rigor and alignment of courses across
grades 6-12 to college and workplace readiness; and
assist teachers in designing lessons and classroom assessments.

Standards Development Process

To guide the standards development process, the College Board convened national committees of middle school
and high school teachers, college faculty, subject matter experts, assessment specialists, teacher education faculty
and curriculum experts who had experience in developing content standards for states and national professional

The standards advisory committees relied on college-readiness evidence gathered from a wide array of sources to
design and develop the CBSCS. These sources include national and international frameworks such as National
Assessment of Educational Progress (NAEP), Programme for International Student Assessment (PISA), and Trends
in International Mathematics and Science Study (TIMSS); results of surveys and course content analyses from
college faculty regarding what is most important for college readiness; assessment frameworks from relevant AP
exams, the SAT®, the PSAT/NMSQT®, and College Level Examination Program® (CLEP®) exams, and selected
university placement programs.

Beginning with the end goal in mind, the committees first defined the academic demands students will face in AP or
first-year college courses in English, mathematics and statistics, and science. After identifying these demands, the
committees then backmapped to the start of middle school to outline a vertical progression, or road map, of critical
thinking skills and knowledge students need to be prepared for college-level work.

Standards and Curriculum Alignment Services

States and districts are encouraged to use the CBSCS as a guiding framework to strengthen the alignment of their
standards, curriculum and assessments to college readiness. The College Board offers standards and curriculum
alignment services to states and districts, providing independent reviews of standards, curriculum and assessments.
College Board content specialists use established alignment methods and review criteria to analyze the alignment
and offer meaningful findings and recommendations tailored to a state's or district's needs.
College Board Standards for College Success Page 2 of 2

The College Board also uses the CBSCS to align our own curriculum and assessment programs, including
SpringBoard®, to college readiness.

Learn More
Please contact Standards and Curriculum Alignment Services at or your
regional College Board representative for more information on the CBSCS and our alignment services.

Science (.pdf/4.4M)
English Language Arts (.pdf/1.9M)
Mathematics & Statistics (.pdf/535K)
Mathematics & Statistics Adapted for Integrated Curricula (.pdf/368K)
Mathematics & Statistics Three-Year Alternative for Middle School (.pdf/121K)
Jane Jackson, Larry Dukerich, David Hestenes

Modeling Instruction: An Effective

Model for Science Education
The authors describe a Modeling Instruction program that places an emphasis
on the construction and application of conceptual models of physical
phenomena as a central aspect of learning and doing science.

Introduction

Modeling Instruction is an evolving, research-based program for high school science education reform that was supported by the National Science Foundation (NSF) from 1989 to 2005. The name Modeling Instruction expresses an emphasis on the construction and application of conceptual models of physical phenomena as a central aspect of learning and doing science (Hestenes, 1987; Wells et al., 1995; Hestenes, 1997). Both the National Science Education Standards (NRC, 1996) and the National Council of Teachers of Mathematics Standards (NCTM), as well as Benchmarks for Science Literacy (AAAS, 1993), recommend "models and modeling" as a unifying theme for science and mathematics education. To our knowledge, no other program has implemented this theme so thoroughly.

From 1995 to 1999, 200 high school physics teachers participated in two four-week Leadership Modeling Workshops with NSF support. Since that time, 2500 additional teachers from 48 states and a few other nations have taken summer Modeling Workshops at universities in many states, supported largely by state funds. Participants include teachers from public and private schools in urban, suburban, and rural areas. Modeling Workshops at Arizona State University (ASU) are the cornerstone of a graduate program for teachers of the physical sciences. Recently, Modeling has expanded to embrace the entire middle/high school physical science curriculum. The program has an extensive web site at

Product: Students Who Can

Modeling Instruction meets or exceeds NSES teaching standards, professional development standards, assessment standards, and content and inquiry standards.

Modeling Instruction produces students who engage intelligently in public discourse and debate about matters of scientific and technical concern. Betsy Barnard, a modeler in Madison, Wisconsin, noticed a significant change in this area after Modeling was implemented in her school: "I teach a course in biotechnology, mostly to seniors, nearly all of whom had physics the previous year. When asked to formally present information to the class about controversial topics such as cloning or genetically modified organisms, it is delightfully clear how much more articulate and confident they are."

Students in modeling classrooms experience first-hand the richness and excitement of learning about the natural world. One example comes from Phoenix modeler Robert McDowell. He wrote that, under traditional instruction, "when asked a question about some science application in a movie, I might get a few students who would cite 1-2 errors, but usually with uncertainty. Since I started Modeling, the students now bring up their own topics … not just from movies, but their everyday experiences." One of his students wrote, "Mr. McDowell, I was at a Diamondback baseball game recently, and all I could think of was all the physics problems involved." A former student of another modeler, Gail Seemueller of Rhode Island, described it as follows: "She wanted us to truly LEARN and more importantly UNDERSTAND the material. I was engaged. We did many hands-on experiments of which I can still vividly remember, three years later." Kelli Gamez Warble of rural Arizona, who has taught physics and

10 Science Educator
calculus for a decade using Modeling Instruction, has had numerous students whose career choices were influenced by Modeling Instruction. She wrote about discovering several former students when visiting an ASU Engineering Day, and all but one were females. She wrote, "As a former female engineering student myself, I was gratified but not surprised. Modeling encourages cooperation and discourse about complicated ideas in a non-threatening, supportive environment. Females who view science, engineering, and technology as fields encouraging cooperation and supportiveness will, I believe, become much more attracted to these non-traditional areas."

Excellence

Many modeling teachers have been recognized nationally. For example, three users of Modeling Instruction have received the National Science Teachers Association (NSTA) Shell Science Teaching Award. Numerous modelers have received the Presidential Award for Excellence in Math and Science Teaching (PAEMST). At Modeling Workshops and related graduate courses for science teachers at ASU, as well as at Modeling Workshops across the United States, teachers learn to impact students of various backgrounds and learning styles. Modeling Workshops and classrooms are thriving centers of interactive engagement.

The Essence of Modeling Instruction

The Modeling method of instruction corrects many weaknesses of the traditional lecture-demonstration method, including the fragmentation of knowledge, student passivity, and the persistence of naïve beliefs about the physical world. From its inception, the Modeling Instruction program has been concerned with reforming high school physics teaching to make it more coherent and student-centered, and to incorporate the computer as an essential scientific tool.

In a series of intensive workshops over two years, high school teachers learn to be leaders in science teaching reform and technology infusion in their schools. They are equipped with a robust teaching methodology for developing student abilities to make sense of physical experience, to understand scientific claims, to articulate coherent opinions of their own and defend them with cogent arguments, and to evaluate evidence in support of justified belief.

More specifically, teachers learn to ground their teaching in a well-defined pedagogical framework (modeling theory; Hestenes, 1987), rather than following rules of thumb; to organize course content around scientific models as coherent units of structured knowledge; to engage students collaboratively in making and using models to describe, explain, predict, design, and control physical phenomena; to involve students in using computers as scientific tools for collecting, organizing, analyzing, visualizing, and modeling real data; to assess student understanding in more meaningful ways and experiment with more authentic means of assessment; to continuously improve and update instruction with new software, curriculum materials, and insights from educational research; and to work collaboratively in action research teams to mutually improve their teaching practice. Altogether, Modeling Workshops provide detailed implementation of the National Science Education Standards.

The modeling cycle, student conceptions, discourse

Instruction is organized into modeling cycles rather than traditional content units. This promotes an integrated understanding of modeling processes and the acquisition of coordinated modeling skills. The two main stages of this process are model development and model deployment.

The first stage—model development—typically begins with a demonstration and class discussion. This establishes a common understanding of a question to be asked of nature. Then, in small groups, students collaborate in planning and conducting experiments to answer or clarify the question. Students present and justify their conclusions in oral and written form, including the formulation of a model for the phenomena in question and an evaluation of the model by comparison with data. Technical terms and representational tools are introduced by the teacher as they are needed to sharpen models, facilitate modeling activities, and improve the quality of discourse. The teacher is prepared with a definite agenda for student progress and guides student inquiry and discussion in that direction with "Socratic" questioning

Spring 2008  Vol. 17, No. 1 11

and remarks. The teacher is equipped with a taxonomy of typical student misconceptions to be addressed as students are induced to articulate, analyze, and justify their personal beliefs (Halloun and Hestenes, 1985b; Hestenes et al., 1992).

During the second stage—model deployment—students apply their newly-discovered model to new situations to refine and deepen their understanding. Students work on challenging worksheet problems in small groups, and then present and defend their results to the class. This stage also includes quizzes, tests, and lab practicums. An example of the entire modeling cycle is outlined in Table 1.

Table 1.  Modeling Cycle Example:  The Constant Velocity Model

I. Model Development (Paradigm Lab)
   A. Pre-lab discussion
      Students observe battery-powered vehicles moving across the floor and describe their observations. The teacher guides them toward a laboratory investigation to determine whether the vehicle moves at constant speed, and to determine a mathematical model of the vehicle's motion.
   B. Lab investigation
      Students collect position and time data for the vehicles and analyze the data to develop a mathematical model. (In this case, the graph of position vs. time is linear, so they do a linear regression to determine the model.) Students then display their results on small whiteboards and prepare presentations.
   C. Post-lab discussion
      Students present the results of their lab investigations to the rest of the class and interpret what their model means in terms of the motion of the vehicle. After all lab groups have presented, the teacher leads a discussion of the models to develop a general mathematical model that describes constant-velocity motion.

II. Model Deployment
   A. Worksheets
      Working in small groups, students complete worksheets that ask them to apply the constant-velocity model to various new situations. They are asked to prepare whiteboard presentations of their problem solutions and present them to the class. The teacher's role at this stage is continual questioning of the students to encourage them to articulate what they know and how they know it, thereby correcting any lingering misconceptions.
   B. Quizzes
      In order to do mid-course progress checks for student understanding, the modeling materials include several short quizzes. Students are asked to complete these quizzes individually to demonstrate their understanding of the model and its application. Students are asked not only to solve problems, but also to provide brief explanations of their problem-solving strategies.
   C. Lab Practicum
      To further check for understanding, students are asked to complete a lab practicum in which they need to use the constant-velocity model to solve a real-world problem. Working in groups, they come to agreement on a solution and then test their solution with the battery-powered vehicles.
   D. Unit Test
      As a final check for understanding, students take a unit test. (The constant-velocity unit is the first unit of the curriculum. In later unit tests, students are required to incorporate models developed earlier in the course into their problem solving; this is an example of the spiral nature of the modeling curriculum.)
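The paradigm-lab analysis in Table 1 boils down to fitting a line x = v·t + x0 to position-time data, where the slope is the vehicle's velocity and the intercept its starting position. A minimal sketch of that least-squares fit (the data values and the function name here are illustrative, not from the article):

```python
def fit_line(t, x):
    """Ordinary least-squares fit of x = v*t + x0; returns (v, x0)."""
    n = len(t)
    mean_t = sum(t) / n
    mean_x = sum(x) / n
    # Slope: covariance of (t, x) divided by variance of t.
    v = (sum((ti - mean_t) * (xi - mean_x) for ti, xi in zip(t, x))
         / sum((ti - mean_t) ** 2 for ti in t))
    x0 = mean_x - v * mean_t  # intercept: line passes through the means
    return v, x0

# Hypothetical position-time data for a battery-powered vehicle
# (time in seconds, position in meters):
t = [0.0, 1.0, 2.0, 3.0, 4.0]
x = [0.10, 0.42, 0.71, 1.02, 1.30]
v, x0 = fit_line(t, x)  # slope = velocity (~0.30 m/s), intercept (~0.11 m)
```

A nearly constant slope across the run is what licenses the constant-velocity model; curvature in the residuals would send students back to the lab.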

That models and modeling should be central to an inquiry-based approach to the study of physics is no surprise. The NSES state, "Student inquiries should culminate in formulating an explanation or model… In the process of answering the questions, the students should engage in discussions and arguments that result in the revision of their explanations."

Traditional instruction often overlooks the crucial influence of students' personal beliefs on what they learn. Force Concept Inventory (FCI) data (Hestenes et al., 1992) show that students are not easily induced to discard their misconceptions in favor of Newtonian concepts. Some educational researchers have expended considerable effort in designing and testing teaching methods to deal with specific misconceptions. Although their outcomes have been decidedly better than the traditional ones, success has been limited. Many have concluded that student beliefs are so "deep-seated" that heavy instructional costs to unseat them are unavoidable. However, documented success with the Modeling method suggests that an indirect treatment of misconceptions is likely to be most efficient and effective (Wells et al., 1995).

Cognitive scientists have identified metaphors as a fundamental tool of human thought; we use metaphors so frequently and automatically that we seldom notice them unless they are called to our attention. Metaphors are used to structure our experience and thereby make it meaningful. A major objective of teaching should therefore be to help students "straighten out" their metaphors. In Modeling, instead of designing the course to address specific "naïve conceptions," the instructor focuses on helping students construct appropriate models to account for the phenomena they study. When students learn to correctly identify a physical system, represent it diagrammatically, and then apply the model to the situation they are studying, their misconceptions tend to fall away (Wells et al., 1995).

A key component of this approach is that it moves the teacher from the role of authority figure who provides the knowledge to that of a coach/facilitator who helps the students construct their own understanding. Since students systematically misunderstand most of what we tell them (due to the fact that what they hear is filtered through their existing mental structures), the emphasis is placed on student articulation of the concepts.

Laboratory experiences are centered on experiments that isolate one concept, with equipment that enables students to generate good data reliably. Students are given no pre-printed list of instructions for doing these experiments. Rather, the instructor introduces the class to the physical system to be investigated and engages students in describing the system until a consensus is achieved. The instructor elicits from students the appropriate dependent and independent variables to characterize the system. After obtaining reasoned defenses from the students for selection of these variables, the instructor asks the students to help design the experiment. A general procedure is negotiated so that students have a sense of why they are doing a particular procedure. Frequently, multiple procedures arise as students have access to equipment that allows them to explore the relationship in different ways. Lab teams are then allowed to begin collecting data.

Students have to make sense of the experiment themselves. The instructor must be prepared to allow them to fail. The apparatus should be available for several days, should they need it. Students use spreadsheet and graphing software to help them organize and analyze their data. After allowing time to prepare whiteboards to summarize their findings, the instructor selects certain groups to present an oral account of the group's experimental procedure and interpretation. Students use multiple representations to present their findings, including concise English sentences, graphs, diagrams, and algebraic expressions.

After some initial discomfort with making these oral presentations, students generally become more at ease and gradually develop the skills of making an effective presentation. They learn how to defend their views clearly and concisely. These

communication skills are valuable beyond their immediate application in the physics classroom.

Clearly, one role of assessment is to ascertain student mastery of the skills and understanding of the concepts in the unit. An equally important role is the feedback it provides the instructor about his or her curriculum and teaching methods. The Modeling method stresses developing a sound conceptual understanding through graphical and diagrammatic representations before moving on to an algebraic treatment of problem solving. To be consistent with this end, the assessment instruments test students' ability to interpret graphs and draw conclusions, as well as to solve quantitative problems.

In addition to lab write-ups, in which students report their findings using a format outlined in the curriculum materials, the lab practicum is used as a means to check student understanding. In the practicum, an application problem is posed to the class as a whole. The class has a fixed period of time to figure out what model is appropriate to describe the situation, decide what measurements to make, collect and analyze the data, and then prepare a solution to the problem. This is used as the culminating activity in the unit and usually helps the students review key principles for the unit test.

Whiteboarding is another major component of assessment. In whiteboarding, small groups of students write up their results from a lab or solutions to a worksheet problem on a small whiteboard. One student from the group presents the whiteboard to the class, responding to questions from the class and the instructor. Other group members are allowed to help if needed. This is an important assessment tool that helps the instructor determine how well students have mastered the concepts. When one hears fuzzy or incoherent explanations, one has the opportunity to help students deal with their incomplete conceptions before moving on to the next topic. The two questions teachers ask most frequently are, "Why do you say that?" and "How do you know that?" Students have to account for everything they do in solving a problem, ultimately appealing to models developed on the basis of experiments done in class. Instructors trained in the Modeling method do not take correct statements for granted; they always press for explicit articulation of understanding. This probing by the teacher often reveals that the student presenter does not fully understand the concept (even when answers are correct). The teacher then has the opportunity to help the student correct his or her conceptions through Socratic questioning.

Whiteboarding has other valuable uses as well. First, it gives students a chance to reinforce their understanding of concepts; students often don't know exactly what they think until they've heard themselves express the idea. Second, students are highly motivated to understand the question they are assigned to present. No one enjoys getting up before a group of peers with nothing to say. Additionally, during preparation, the instructor has the opportunity to help students if no one in the group knows how to do the problem.

Professional Development

Modeling Instruction places a strong emphasis on professional development, both during the workshops and afterward. The workshops are the beginning of a process of lifetime learning. Teachers are encouraged to network amongst themselves and to become leaders of reform in their schools and districts. The ASU web site and list-serve provide a forum for communication among Modelers across the nation and the world. Since "teachers teach as they have been taught," the workshops include extensive practice in implementing the curriculum as intended for high school classes. Participants rotate through roles of student and instructor as they practice techniques of guided inquiry and cooperative learning. Plans and techniques for raising the level of discourse in classroom discussions and student presentations are emphasized. The workshops immerse teachers in the physics content of the entire semester, thereby providing in-depth remediation for under-prepared teachers.

The great success and popularity of the Modeling Workshops has generated an overwhelming demand for additional courses. In response to this demand, in 2001 ASU approved a new graduate program to support sustained professional development of high school physics teachers. Modeling Workshops are the foundation of the program, which can lead to a Master of Natural Science (MNS) degree.

The National Science Education
Standards (NSES) emphasize that "coherent and integrated programs" supporting "lifelong professional development" of science teachers are essential for significant reform. They state that "The conventional view of professional development for teachers needs to shift from technical training for specific skills to opportunities for intellectual professional growth." The MNS program at ASU is designed to meet that need.

Evidence of Effectiveness

Evaluation of Physics Instruction. The Force Concept Inventory (FCI) was developed to compare the effectiveness of alternative methods of physics instruction (Halloun et al., 1985a; Hestenes et al., 1992). It has become the most widely used and influential instrument for assessing the effectiveness of introductory physics instruction, and has been cited as producing the most convincing hard evidence of the need to reform traditional physics instruction (Hake, 1998).

The FCI assesses students' conceptual understanding of the force concept, the key concept in mechanics. It consists of 30 multiple-choice questions, but there is one crucial difference between the FCI questions and traditional multiple-choice items: distracters are designed to elicit misconceptions known from the research base. A student must have a clear understanding of one of six fundamental aspects of the Newtonian force concept in order to select the correct response. The FCI reveals misconceptions that students bring as prior knowledge to a class, and it measures the conceptual gains of a class as a whole. The FCI is research-grounded, normed with thousands of students at diverse institutions. It is the product of many hours of interviews that validated distracters, and it has been subjected to intense peer review.

Including the survey by Hake (1998), we have FCI data on some 30,000 students of 1000 physics teachers in high schools, colleges and universities, throughout the world. This large data base presents a highly consistent picture, showing that the FCI provides statistically reliable measures of student concept understanding in mechanics. Results strongly support the following general conclusions:

• Before physics instruction, students hold naive beliefs about motion and force that are incompatible with Newtonian concepts.
• Such beliefs are a major determinant of student performance in introductory physics.
• Traditional (lecture-demonstration) physics instruction induces only a small change in student beliefs. This result is largely independent of the instructor's knowledge, experience and teaching style.
• Much greater changes in student beliefs can be induced with instructional methods derived from educational research in, for example, cognition, alternative conceptions, classroom discourse, and cooperative learning.

These conclusions can be quantified. The FCI is best used as a pre/post diagnostic. One way to quantify pre/post gains is to calculate the normalized gain (Hake, 1998). This is the actual gain (in percentage) divided by the total possible gain (also in percentage). Thus, normalized gain = (%post - %pre)/(100 - %pre). Hence, the normalized gain can range from zero (no gain) to 1 (greatest possible gain). This method of calculating the gains normalizes the index, so that gains of courses at different levels can be compared, even if their pretest scores differ widely.

From pre/post course FCI scores of 14 traditional courses in high schools and colleges, Hake found a mean normalized gain of 23%. In contrast, for 48 courses using interactive engagement teaching methods (minds-on always, and hands-on usually), he found a mean normalized gain of 48%. The difference is much greater than a standard deviation, a highly significant result. This would indicate that traditional instruction fails badly, and moreover, that this failure cannot be attributed to inadequacies of the students because some alternative methods of instruction can do much better. A challenge in physics education research for more than a decade has been to identify essential conditions for learning Newtonian physics and thereby devise more effective teaching methods. Modeling Instruction resulted from pursuing this challenge. Results from using the FCI to assess Modeling Instruction are given below.

How effective is modeling instruction?

Figure 1 summarizes data from a nationwide sample of 7500 high school physics students who participated in Leadership Modeling Workshops from 1995 to 1998. Teachers gave the FCI to their classes as a baseline posttest when they were teaching traditionally (before their first Modeling Workshop), and as a pretest and posttest thereafter.

The average FCI pretest score was about 26%, slightly above the random guessing level of 20%. Figure 1 shows that traditional high school instruction (lecture, demonstration, and standard laboratory activities) has little impact on student beliefs, with an average FCI posttest score of 42%, well below the 60% score which, for empirical reasons, can be regarded as a threshold in understanding Newtonian mechanics. This corresponds to a normalized gain of 22%, in agreement with Hake's results.

[Figure 1. FCI mean scores (%), pretest and posttest: Traditional, 26 and 42; Novice Modelers, 26 and 52; Expert Modelers, 29 and 69.]

After their first year of teaching with the Modeling method, posttest scores for 3394 students of 66 novice modelers were about 10 percentage points higher, as shown in Fig. 1. Students of expert modelers do much better. For 11 teachers identified as expert modelers after two years in the program, posttest scores for 647 students averaged 69%. This corresponds to a normalized gain of 56%, considerably more than double the gain under traditional instruction. After two years in the program, gains for under-prepared teachers were comparable to gains in one year for well-prepared teachers. Subsequent data have confirmed all these results for 20,000 students (Hestenes, 2000). Thus, student gains in understanding under expert modeling instruction are more than doubled compared to traditional instruction.

Student FCI gains for 100 Arizona teachers who took Modeling Workshops were almost as high as those for leading teachers nationwide, even though three-fourths of the participating Arizona teachers do not have a degree in physics. Teachers who implement the Modeling method most fully have the highest student posttest FCI mean scores and gains.

External evaluation of Modeling Instruction. The Modeling Instruction Program was assessed by Prof. Frances Lawrenz, an independent external evaluator for the National Science Foundation. We quote at length from the four-page report on her July 22, 1998 site visit, because it describes the character of the workshops very well:

"This site visit was one of several over the past four years to sites in the modeling project … I had the opportunity to observe the participants working in their concept groups and presenting to the full group. I also interviewed most of the participants, either as part of a small group or individually, and interviewed two coordinators of the workshop … The workshop appears to me to be a resounding success. My observations revealed a very conscientious, dedicated, and well-informed group of teachers actively involved in discussions about the most effective ways to present physics concepts to students. They were doing a marvelous job of combining what they knew about how children learn with what they knew about physics and the modeling approach. The discussions were very rich."

"The interviews confirmed my observations about the nature of the group. All the participants were articulate about physics instruction and the modeling approach. The participants reported being pleased with everything that was happening at the workshop and spoke in glowing terms about the facilitators. The participants felt that the workshop was well run and that the facilitators were extremely knowledgeable about how best to teach physics. They also commented that the group itself was an excellent resource. They all could help each other."

"I have almost never seen such overwhelming and consistent support for a teaching approach. It is especially surprising within the physics community, which is known for its critical analysis and slow acceptance of innovation. In short, the modeling approach presented by the project is sound and deserves to be spread nationally."

Recognition by U.S. Department of Education. In September 2000, the U.S. Department of Education announced that the Modeling Instruction Program at Arizona State University is one of seven K-12 educational technology programs designated as exemplary or promising, out of 134 programs submitted to the
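The normalized gains quoted above follow directly from Hake's formula applied to the Figure 1 class means; a minimal Python sketch (the means are the article's, the function name is mine):

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain: actual gain divided by maximum possible gain."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Class-mean FCI scores (%) reported in the article's Figure 1 discussion:
traditional = normalized_gain(26, 42)  # ~0.22, matching the text
novice = normalized_gain(26, 52)       # novice modelers, first year
expert = normalized_gain(29, 69)       # ~0.56, matching the text
```

Because the denominator shrinks as the pretest rises, the index rewards classes for closing the remaining gap rather than for raw point gains, which is why courses with very different pretest means can be compared at all.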
agency’s Expert Panel. Selections were based on the following criteria: (1) Quality of Program, (2) Educational Significance, (3) Evidence of Effectiveness, and (4) Usefulness to Others. In January 2001, a U.S. Department of Education Expert Panel in Science recognized the Modeling Instruction Program as one of only two exemplary K-12 science programs out of 27 programs evaluated (U.S. Department of Education, 2001).

Long-term implementation. In a follow-up survey of Leadership Modeling Workshop graduates, between one and three years after they had completed the program, 75% of them responded immediately and enthusiastically. More than 90% reported that the Workshops had a highly significant influence on the way they teach. 45% reported that their use of Modeling Instruction has continued at the same level, while another 50% reported an increase (Hestenes, 2000).

Final Thoughts

Instead of relying on lectures and textbooks, the Modeling Instruction program emphasizes active student construction of conceptual and mathematical models in an interactive learning community. Students are engaged with simple scenarios to learn to model the physical world. Modeling cultivates physics teachers as school experts on the use of technology in science teaching, and encourages teacher-to-teacher training in science teaching methods, thereby providing schools and school districts with a valuable resource for broader reform.

The Modeling method has a proven track record of improving student learning. Data on some 20,000 students show that those who have been through the Modeling program typically achieve twice the gains on a standard test of conceptual understanding as students who are taught conventionally. Further, the Modeling method is successful with students who have not traditionally done well in physics. Experienced modelers report increased enrollments in physics classes, parental satisfaction, and enhanced achievement in college courses across the curriculum.

Carmela Minaya of Honolulu, NSTA Shell Science Teaching Awardee, wrote: “The beauty of Modeling Instruction is that it creates an effective framework for teachers to incorporate many of the national standards into their teaching without having to consciously do so, because the method innately addresses many of the various standards. In a certain course, different students may learn the same material, except they never learn it in exactly the same way. This method appeals to all learning styles. Students cling to whatever works for them. Although the content is identical yearly, no two students ever are. The classroom becomes a dynamic center for student owned learning.”

References

[All references by David Hestenes are online in pdf format at http://modeling.]

Hake, R. (1998). Interactive-engagement vs. traditional methods: A six thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics 66, 64-74.

Halloun, I., & Hestenes, D. (1985a). Initial knowledge state of college physics students. American Journal of Physics 53, 1043-1055.

Halloun, I., & Hestenes, D. (1985b). Common sense concepts about motion. American Journal of Physics 53, 1056-1065.

Hestenes, D. (1987). Toward a modeling theory of physics instruction. American Journal of Physics 55, 440-454.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher 30, 141-158.

Hestenes, D. (1997). Modeling methodology for physics teachers. In E. Redish & J. Rigden (Eds.), The changing role of the physics department in modern universities. American Institute of Physics. Part II, 935-957.

Hestenes, D. (2000). Findings of the modeling workshop project, 1994-2000. One section of an NSF final report. Online in pdf format at http://modeling.

U.S. Department of Education (2001). 2001 Exemplary and Promising Science Programs. Retrieved Dec. 10, 2007 at OERI/ORAD/KAD/expert_panel/

Wells, M., Hestenes, D., & Swackhamer, G. (1995). A modeling method for high school physics instruction. American Journal of Physics 63, 606-619.

Jane Jackson is co-director, Modeling Instruction Program, Department of Physics, Arizona State University, PO Box 871504, Tempe, AZ 85287. Correspondence pertaining to this article may be sent to Jane.Jackson@

Larry Dukerich is director of Professional Development in Science, Center for Research in Education in Science, Mathematics, Engineering, and Technology (CRESMET), Arizona State University, Tempe, AZ.

David Hestenes is distinguished research professor, Department of Physics, Arizona State University, Tempe, AZ.

Spring 2008  Vol. 17, No. 1 17
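A note on the “normalized gain” the article reports: this is Hake’s statistic, the fraction of the possible improvement a class actually achieved, computed as (posttest % − pretest %) / (100 − pretest %). The sketch below pairs the reported 69% posttest with a hypothetical 30% pretest mean (the excerpt does not state the pretest score) to show how the 56% figure arises:

```python
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Hake's normalized gain: the fraction of possible improvement
    achieved, (post - pre) / (100 - pre), with scores in percent."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical 30% pretest mean paired with the reported 69% posttest:
print(round(normalized_gain(30.0, 69.0), 2))  # 0.56
```

A gain near 0.56 is considered high; traditionally taught classes in Hake’s survey cluster near 0.23.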


Students become sports broadcasters for PBS. They must compose a voice-over dub
for a sports video of their choice. Their video must convey the excitement of the
sport and the physics principles observed. To gain knowledge and understanding of
physics principles necessary to meet this challenge, students work collaboratively on
activities in which they apply concepts of Newton's laws, forces, friction, and
momentum to sporting events.

Students consider themselves members of an engineering team that is developing a
system that will communicate from one room in the school to another. The challenge
is to send and receive a message, then measure the speed of the transmission. The final
report must describe both the design and the physics of the system and discuss how the
system improves on the methods explored in the chapter. To gain
understanding of science principles necessary to meet this challenge, students work
collaboratively on activities with codes, electricity and magnetism, sound waves, and
light rays. They also use the iterative process of engineering design, refining designs
based on effectiveness and physics concepts.
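The “measure the speed of the transmission” step reduces to average speed, v = d/t: time how long the signal takes to cross a known distance. A minimal sketch, where the 25 m path and 0.073 s travel time are invented illustration values, not project data:

```python
def transmission_speed(distance_m: float, travel_time_s: float) -> float:
    """Average signal speed over a known path: v = d / t."""
    return distance_m / travel_time_s

# Hypothetical sound-based link: the signal crosses 25 m in 0.073 s
print(round(transmission_speed(25.0, 0.073)))  # 342 (m/s), near the speed of sound
```

Comparing the measured speed for sound, light, and electrical signals is what lets students argue which design is “better” in the final report.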

Students are challenged to design or build a safety device or system for protecting
automobile, airplane, bicycle, motorcycle, or train passengers. New laws, increased
awareness, and improved safety systems are explored as students work on this
challenge. They are also encouraged to design improvements to existing systems
and to find ways to minimize harm caused by accidents. To meet this challenge,
students engage in collaborative activities that explore motions and forces and the
principles of design technology.
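The physics behind most such safety devices is the impulse-momentum theorem: for a fixed momentum change m·Δv, the average force is F = m·Δv/Δt, so anything that stretches out the stopping time (crumple zones, airbags, helmets) lowers the force on the passenger. A sketch with made-up crash numbers:

```python
def average_force(mass_kg: float, delta_v: float, stop_time_s: float) -> float:
    """Impulse-momentum theorem: F_avg = m * delta_v / delta_t."""
    return mass_kg * delta_v / stop_time_s

# Hypothetical 1500 kg car stopping from 20 m/s:
print(round(average_force(1500.0, 20.0, 0.1)))  # 300000 N against a rigid barrier
print(round(average_force(1500.0, 20.0, 0.5)))  # 60000 N when crumpling stretches the stop
```

The momentum change is the same in both cases; only the stopping time, and hence the force, differs.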

Students are presented with a “myth” or story about some physical situation (for
example, “The winner of a tug-of-war is the strongest team.”). They will work in
teams to design, build, run, and analyze experiments that will test the physical ideas
central to the myth. Students will then extend their experiments, exploring the
physical limits of the experiment and applying it to real-world situations. Teams will
then present their findings to the class (and to the world!) in the form of a
“Mythbusters” video and make conclusions as to whether the situation described by
the myth is physically plausible or even possible.

Students will design and build their own musical instrument from available household
items. Students must analyze the tonal quality for the notes played and make a
comparison to the tonal quality of several musical instruments. To gain
understanding of science principles necessary to meet this challenge, students work
collaboratively on activities to learn about wave motion, standing waves, sound
waves, resonance, timbre, and hearing. They also learn to use the iterative process
of engineering design, refining designs based on the physics they learn.
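For string-style instruments, the standing-wave analysis comes down to the harmonic series: a string fixed at both ends resonates at f_n = n·v/(2L). A sketch, where the wave speed and string length are hypothetical illustration values:

```python
def string_harmonics(wave_speed: float, length_m: float, n_max: int = 4) -> list:
    """Standing-wave frequencies f_n = n * v / (2 * L) for a string
    fixed at both ends (n = 1 is the fundamental)."""
    return [n * wave_speed / (2 * length_m) for n in range(1, n_max + 1)]

# Hypothetical string: wave speed 220 m/s, vibrating length 0.25 m
print(string_harmonics(220.0, 0.25))  # [440.0, 880.0, 1320.0, 1760.0]
```

The relative loudness of these harmonics is what students compare when analyzing timbre across instruments.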
Dear Physics Students and Parents,

During this year, you will be learning information related to many topics in physics. In this course,
a non-traditional criterion-based assessment system is used. Rather than using a point system to
record scores on various assignments, quizzes and tests, you must demonstrate your level of
proficiency on various learning goals. For example, instead of getting one grade for a worksheet
that may cover many topics, you will be scored on individual learning goals such as, “I can
interpret/draw position vs. time graphs for an object moving with constant velocity.” Rubrics that
list the learning goals and define achievement levels of each will be provided to you and used for
grading.

Grades will be assigned based on descriptors of achievement only. Grades will not be affected by
issues such as lateness of work, effort, attitude, participation, and attendance. Those factors will be
reported separately. Even though there will be many opportunities for cooperative learning, you
will never be assigned group grades.

New information showing additional learning will replace old information. Grades will reflect the
trend in most recent learning. You may re-attempt assessments provided that you have
documented an effort to engage in additional learning (e.g., tutoring, additional practice, test
corrections, etc.). In addition, an alternative assessment may be accepted if the work is pre-
contracted between you (the student) and me (the teacher). There is no deadline for
reassessments. Any grade changes due to reassessments made after the quarter ends will be
reflected in the final year-end grade.

For each learning goal and unit of study, you will be scored using the following scale:

4 = Advanced. Indicators include:

• I understand the content/skills completely and can explain them in detail.
• I can explain/teach the skills to another student.
• I have high confidence on how to do the skills.
• I can have a conversation about the skills.
• I can independently demonstrate extensions of my knowledge.
• I can create analogies and/or find connections between different areas within the sciences or
between science and other areas of study.
• My responses demonstrate in-depth understanding of main ideas and of related details.

3 = Proficient. Indicators include:

• I understand the important things about the content/skills.
• I have confidence on how to do the skills on my own most of the time, but I need to continue
practicing some parts that still give me problems.
• I need my handouts and notes once in a while.
• I am proficient at describing terms and independently connecting them with concepts.
• I understand not just the “what,” but can correctly explain the “how” and “why” of scientific
concepts.
• My responses demonstrate in-depth understanding of main ideas.

2 = Developing. Indicators include:

• I have a general understanding of the content/skills, but I’m also confused about some important
parts.
• I need some help from my teacher (one-on-one or small group) to do the skills correctly.
• I do not feel confident enough to do the skills on my own.
• I need my handouts and notebook most of the time.
• I can correctly identify concepts and/or define vocabulary; however, I cannot make
connections among ideas and/or independently extend my own learning.
• My responses demonstrate basic understanding of some main ideas, but significant information is
missing.

1 = Beginning. Indicators include:
• I need lots of help from my teacher (one-on-one).
• I have low confidence on how to do the skills and need more instruction.
• I need my handouts and science notebook at all times.
• I do not understand the concept/skills.
• I cannot correctly identify concepts and/or define vocabulary.
• I cannot make connections among ideas or extend the information.
• My responses lack detail necessary to demonstrate basic understanding.

0 = No Basis
I do not provide any responses for which a judgment can be made about my understanding.

Near the end of each quarter, you (the student) will meet with me (the teacher) individually to
discuss your progress and to assign a quarterly grade. Quarterly grades will be given as follows:

Above Standard (98% / 95% / 92%): I am advanced in at least one unit of study and
proficient in all others.

At Standard (88% / 85% / 82%): I am proficient in most units of study.

Approaching Standard (78% / 75% / 72%): I am developing in most units of study and
proficient in at least one.

Far Below Standard (68% / 65%): I am basic in most units of study and proficient in at
least one.

Not Meeting Any Standard (F): I am not proficient in any unit of study.

Year-end course grades will be assigned the same way, looking at the final achievement on the
entire year’s worth of learning goals.
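The banding rules above can be sketched as a function of the 0-4 unit ratings. This is my illustration of the letter’s logic under stated assumptions ("most" meaning more than half; anything between bands is settled at the quarterly conference), not the actual procedure used in class:

```python
def quarterly_band(unit_scores: list) -> str:
    """Sketch of the letter's grade bands for a list of 0-4 unit ratings,
    assuming "most" means more than half of the units."""
    n = len(unit_scores)
    proficient = sum(s >= 3 for s in unit_scores)
    if any(s == 4 for s in unit_scores) and all(s >= 3 for s in unit_scores):
        return "Above Standard"      # advanced in one unit, proficient in the rest
    if proficient > n / 2:
        return "At Standard"         # proficient in most units
    if sum(s == 2 for s in unit_scores) > n / 2 and proficient >= 1:
        return "Approaching Standard"
    if sum(s == 1 for s in unit_scores) > n / 2 and proficient >= 1:
        return "Far Below Standard"
    if proficient == 0:
        return "Not Meeting Any Standard"
    return "Between bands - settled at the quarterly conference"

print(quarterly_band([4, 3, 3, 3]))  # Above Standard
print(quarterly_band([3, 2, 3, 3]))  # At Standard
```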

Please sign below to indicate you have read and understand this grading system. If you have any
questions, please email me, leave a voicemail at 763-7200 x9512, or simply write them in the
box below. Thank you.

I have read and understand the criterion-based grading system used in

Mr. Noschese’s class.

Student Name: _________________________________

Student Signature: _________________________________ Date: ___________

Parent Signature: _________________________________ Date: ___________


Design an experiment to determine the relationship between the
position of a toy buggy and time.

You have access to rulers, meter sticks, tape measures, and


___ Describe your experimental design with both a labeled diagram
and a verbal description.
___ State what the independent and dependent variables are.
___ Include how you will vary and measure your chosen parameters.

___ Perform the experiment and record your measurements in an
appropriate table.

___ Graph and determine the numerical model for your data set.

___ What pattern did you find from your observations? Write a verbal
and mathematical description.
___ What is the physical significance of the slope of the graph?
___ What is the physical significance of the y-intercept?
___ Turn your numerical model into a general model.

You will be assessed using the rubric on the back of this page.
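The graphing steps above — find the numerical model x = (slope)·t + (intercept), then interpret the slope as the buggy’s velocity and the intercept as its starting position — amount to a least-squares line fit. A minimal sketch; the position/time values below are invented illustration data, not real buggy measurements:

```python
def linear_fit(ts, xs):
    """Least-squares slope and intercept for the model x = m*t + b."""
    n = len(ts)
    mean_t, mean_x = sum(ts) / n, sum(xs) / n
    slope = (sum((t - mean_t) * (x - mean_x) for t, x in zip(ts, xs))
             / sum((t - mean_t) ** 2 for t in ts))
    return slope, mean_x - slope * mean_t

# Hypothetical buggy data: position (m) recorded every second
times = [0.0, 1.0, 2.0, 3.0, 4.0]
positions = [0.10, 0.45, 0.79, 1.16, 1.50]
m, b = linear_fit(times, positions)
print(round(m, 2), round(b, 2))  # 0.35 0.1  ->  x = 0.35*t + 0.10
```

The general model then replaces the fitted numbers with symbols: x = v·t + x₀, where v is the buggy’s constant velocity and x₀ its starting position.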


Each ability below is scored on four levels, listed from lowest to highest:

I can design a reliable experiment that investigates the phenomenon:
• My experiment does not investigate the phenomenon.
• My experiment may not yield any interesting patterns.
• Some important aspects of the phenomenon will not be observable.
• My experiment might yield interesting patterns relevant to the investigation of the
phenomenon.

I can decide what parameters are to be measured and can identify independent and
dependent variables:
• My parameters are irrelevant.
• Only some of my parameters are relevant.
• My parameters are relevant. However, independent and dependent variables are not
identified.
• My parameters are relevant and independent and dependent variables are identified.

I can communicate the details of an experimental procedure clearly and completely:
• My diagrams are missing and/or my experimental procedure is missing or extremely
vague.
• My diagrams are present but unclear and/or my experimental procedure is present but
important details are missing.
• My diagrams and/or my experimental procedure are present but with minor omissions
or vague details.
• My diagrams and/or my experimental procedure are clear and complete.

I can record and represent data in a meaningful way:
• My data are either absent or incomprehensible.
• Some important data are absent or incomprehensible.
• All important data are present, but recorded in a way that requires some effort to
comprehend.
• All important data are present, organized, and recorded clearly.

I can represent data graphically and use the graph to make predictions:
• No attempt is made to represent the data graphically.
• My graph has major errors/omissions or my numerical models are missing.
• My graph has minor errors/omissions or my numerical models are written incorrectly.
• My graph is complete and correct and my numerical models are written correctly.

I can identify a pattern in the data:
• No attempt is made to search for a pattern.
• My described pattern is irrelevant or inconsistent with my data.
• My pattern has minor errors or omissions.
• My pattern represents the relevant trend in the data.

I can represent a pattern mathematically and provide physical meaning to the slope and
y-intercept:
• No attempt is made to represent a pattern mathematically.
• My mathematical expression does not represent the trend or I do not provide any
physical meaning to the slope and intercept.
• My mathematical expression represents the trend, but I provide unreasonable physical
meaning to the slope and intercept.
• My mathematical expression represents the trend, and I provide reasonable physical
meaning to the slope and intercept.



LAB.1 – I can identify the hypothesis to be
tested, phenomenon to be investigated, or
the problem to be solved.
LAB.2 – I can design a reliable experiment
that tests the hypothesis, investigates the
phenomenon, or solves the problem.
LAB.3 – I can communicate the details of an experimental procedure clearly and
completely.
LAB.4 – I can revise the hypothesis when …

LAB.5 – I can decide what parameters are to be measured and can identify independent
and dependent variables.
LAB.6 – I can record and represent data in a meaningful way.
LAB.7 – I can analyze data appropriately.
LAB.8 – I can represent data graphically and use the graph to make predictions.
LAB.9 – I can identify a pattern in the data.
LAB.10 – I can represent a pattern mathematically and provide physical meaning to the
slope and y-intercept.




MOM.1 – I can determine whether interactions are present for a given situation by
considering the motion of objects.
MOM.2 – I can calculate the momentum of an object/system with direction and proper
units.
MOM.3 – I can draw an interaction diagram and specify the system and the
surroundings.
MOM.4 – I can draw and analyze momentum bar charts.
MOM.5 – I can use momentum conservation to solve different problems.
MOM.6 – I can apply the property of reciprocity to the interactions between objects.
MOM.7 – I can compute the momentum transfer into/out of an object/system (impulse)
with direction and proper units.
MOM.8 – I know the relationship between average force and the rate of momentum
transfer.
MOM.9 – I know the relationship between average force and time for a given
momentum transfer (impulse).
MOM.10 – I can use momentum transfer (impulse) to solve different problems.
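Goals MOM.5 and MOM.10 come down to applying conservation of momentum to collisions. A minimal sketch for the stick-together (perfectly inelastic) case, with made-up lab-cart values:

```python
def inelastic_final_velocity(m1, v1, m2, v2):
    """Momentum conservation when two objects stick together:
    v_f = (m1*v1 + m2*v2) / (m1 + m2)."""
    return (m1 * v1 + m2 * v2) / (m1 + m2)

# Hypothetical lab carts: a 2.0 kg cart at 3.0 m/s hits a 1.0 kg cart at rest
print(inelastic_final_velocity(2.0, 3.0, 1.0, 0.0))  # 2.0 (m/s)
```

Total momentum before (6.0 kg·m/s) equals total momentum after (3.0 kg × 2.0 m/s), which is exactly what a momentum bar chart (MOM.4) would show.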




The learning goals I need to improve most are: _________________________________


What I can do to improve: _________________________________________________


The learning goals I excel in are: ____________________________________________


Reasons why I excel in these areas: _________________________________________


Student Name: _________________________________

Student Signature: _________________________________ Date: ___________

Parent Signature: _________________________________ Date: ___________



