Teaching Educators Habits of Mind for Using Data Wisely
Teachers College Record, 117, 040304 (2015)
In a study of nine schools in two districts, teachers were more likely to use
interim assessment results to decide what content to review with students,
how to group students for instruction, and which students needed extra
support than to rethink how they taught (Goertz, Oláh, & Riggan, 2009).
Clearly, more needs to be done to support both novice and veteran educa-
tors in using data effectively. But what principles should guide the design
of that support? In hopes of generating discussion around this important
question, this article next describes the approach the Data Wise Project
takes to working with practitioners and graduate students.
For over a decade, the Data Wise Project at the Harvard Graduate
School of Education (HGSE) has been honing and teaching a model
that provides a systematic approach to improving classroom practice.
The work began when education economist Richard Murnane convened
a group comprising researchers from HGSE and school leaders from
three Boston public schools to work together to articulate what school
leaders need to know and do in order to use data to improve instruc-
tion. The findings were captured in Data Wise: A Step-by-Step Guide to
Using Assessment Results to Improve Teaching and Learning (Boudett, City, &
Murnane, 2013). Contributions from Murnane, statistician John Willett,
and psychometrician Daniel Koretz grounded the work in theory; con-
tributions from practitioners ensured that it would be accessible and rel-
evant to principals and teachers.
Figure 1 shows the Data Wise Improvement Process, which breaks the
work of improvement into three phases. The “prepare” phase involves
creating and maintaining a culture in which staff members can collabo-
rate effectively and use data responsibly. In the “inquire” phase, educa-
tors use a wide range of data sources, including student work and class-
room observations, to articulate a very specific problem of practice that
they are committed to solving. In the “act” phase, teams articulate how
they will learn about and employ instructional strategies to address this
problem and how they will assess the extent to which the plan improved
student learning. The model is characterized by an arrow that curves
back on itself because after educators assess the effectiveness of their ac-
tions, they are well positioned to determine the focus for the next cycle
of collaborative inquiry.
As they work their way through the process, teachers use field-tested
protocols to examine a wide range of data sources. They then develop
action plans that contain focused strategies for improving their teaching
practice. These plans almost inevitably involve giving students a more cen-
tral role in driving their own learning. Because teachers develop the plans
themselves, their commitment to enacting them is much stronger than it
would be if the central office or even school leaders had handed down the
plans. Because Data Wise explicitly supports and empowers teachers, it
can serve as a useful counterbalance to the anxiety produced by increas-
ing accountability pressures.
Figure 1 source: Boudett, K. P., City, E. A., & Murnane, R. J. (2013). Data Wise: A
step-by-step guide to using assessment results to improve teaching and learning (rev. and
expanded ed.). Cambridge, MA: Harvard Education Press.
The Data Wise Project was established to support educators in using this
process to organize the core work of schools around evidence of learning.
Over the past 10 years, the Project has taught more than 2,500 educators
worldwide who have enrolled in courses designed to either introduce the
process or coach teams of educators in working their way through it. Most
courses are designed for school- and district-based teams, although one
is geared specifically to students in graduate programs. Delivery models
range from entirely face-to-face programs to completely online ones, with
one course employing a hybrid strategy. There
are two different types of online courses: one that provides live custom
coaching to each team and another that involves asynchronous commu-
nication between coaches and participating teams, with a particular focus
on encouraging feedback among teams. Table 1 shows the purpose, audi-
ence, and delivery models for Data Wise courses.
Although all these courses use the Data Wise Improvement Process as the
lead framework, participants are told from the start that simply checking
off the steps of an improvement process will not by itself be enough to
bring about real changes in learning and teaching. For meaningful change
to occur, educators must bring a distinctive approach to their work. The
Data Wise Project captured this disciplined way of thinking in the “ACE
Habits of Mind,” in which each letter represents a habit:
A: Shared commitment to Action, Assessment, and Adjustment
C: Intentional Collaboration
E: Relentless focus on Evidence.3
Costa and Kallick (2008) defined habits of mind as what successful people
do “when they are confronted with problems to solve, decisions to make,
creative ideas to generate, and ambiguities to clarify” (p. 1). The ACE Habits
of Mind are the stances educators take as they approach data inquiry and
improvement: ways of thinking that help them achieve more productive
outcomes. Adding an emphasis on habits of mind expands data literacy
development beyond merely accumulating discrete knowledge and skills
or learning a process that becomes routine; it makes explicit that certain
ways of thinking about data separate emerging forms of data literacy from
sophisticated, well-developed approaches.
Costa and Kallick (2008) noted that to activate habits of mind, one must
know the contexts in which such habits are useful and also have
the capability and commitment to enact those habits. Because the ACE
Habits of Mind do not come naturally for many educators, the instruc-
tional design of Data Wise courses provides opportunities that allow edu-
cators to experience the habits firsthand and incorporate them into their
professional practice. Table 2 summarizes the teaching strategies that sup-
port the cultivation of each habit; however, it is important to note that
using any one or even a few strategies will not help educators acquire the
habits of mind automatically. The key is to explicitly teach the habits and
then create learning experiences that incorporate the habits holistically,
such that participants have practice activating the habits over time. The
following sections discuss each set of strategies in turn. These strategies
can be incorporated into any course about improvement, not just those
that use the Data Wise Improvement Process specifically. Taken together,
these strategies contribute to a robust instructional design regardless of
the particular inquiry model used.
A: SHARED COMMITMENT TO ACTION, ASSESSMENT, AND ADJUSTMENT
The first habit of mind that educators need to develop as they learn
how to engage in collaborative data inquiry is an orientation toward pur-
poseful, reflective action. When practiced broadly, this habit leads to or-
ganizational learning by creating a continuous feedback loop that allows
educators to improve how they improve. The importance of feedback loops
has been documented by organizational theorists, who argue that organi-
zational learning occurs when cycles of action and reflection are iterative
and recursive and build on one another over time (Argyris & Schön, 1996;
Edmondson, 2002). Effective teams engage in reflection, or discussing
new insights and learning, as well as action, or testing and implementing
new ideas, applying insight, and producing change (Edmondson, 2002).
Unfortunately, educators have several “bad” habits that work against this
habit of mind. They do not set out to practice bad habits; rather, the frag-
mented and rushed nature of much data work in schools fosters habits
created simply to manage the work. One such habit is the tendency to
create action plans without following up on the results of those actions,
without making “midcourse corrections,” and without reflecting
on progress and learning over time (Boudett & City, 2013, p. 1). Educators
are frequently asked to develop action plans for various reasons; for ex-
ample, leadership teams create action plans focused on improving their
schools’ performance on state assessments, and many educator evaluation
systems require individual teachers to create action plans detailing how
they will seek new professional development. However, they rarely have
time to thoughtfully reflect on the results of those action plans in a way
that produces learning and informs the next plan.
To be effective with data inquiry, educators need to complete the feed-
back loop and frequently gather evidence of progress and reflect on what
happened as a result of their actions. There are three pedagogical strate-
gies that institutions of higher education can use to help educators learn
to choose this path.
Organizing the Syllabus Around Actions That Result From Inquiry Process Steps
the ones before it and clarifies the outcome of the step that participants
are working toward. This emphasis on intermediate outcomes
helps participants see that action is required at every step. For example,
when organizing for collaborative work (Step 1), the objective is to put
into place the teams and structures that will support inquiry; when digging
into student data (Step 4), the objective is to articulate a statement of the
learner-centered problem that is supported by the data.
with a new guiding question or a new way of thinking about the current
issue. This written record can also serve as a tool for showing new team
members what has happened so far and for demonstrating to other audi-
ences at or beyond their school what the team has done.
C: INTENTIONAL COLLABORATION
Teaching and Modeling Tools for Designing and Facilitating Effective Meetings
the idea that consequential decisions about student learning should not
be based on the results of a single test (Koretz, 2009). Researchers docu-
ment instances where data teams meet to examine not only test scores but
student work produced in class and for homework, and they argue that
the most helpful sources of instructional information are assessments that
demonstrate student thinking, developmental pathways or approaches, or
misconceptions (Supovitz, 2012).
Practitioners often overlook data collected from classroom observa-
tions about student learning and teaching practice. Yet, being able to
accurately describe what is currently happening in classrooms is essen-
tial to figuring out what improvements to instruction are needed (City,
Elmore, Fiarman, & Teitel, 2009). When the data in question are obser-
vations about practice, it is particularly important to cultivate the habit
of maintaining a relentless focus on evidence. When an educator reports
that teachers in multiple classrooms failed to engage students in rigorous
work, a skillful facilitator will ask that person to provide the evidence sup-
porting his or her claim. A more useful data statement about classroom
practice might be: Most students answered teachers’ questions with one- or two-
word statements.
Strategies for helping educators develop a relentless focus on evidence
follow.
Research supports the idea that children learn best when abstract ideas
are put into a context that is meaningful to them (National Research
Council, 2000); the same is true for adults (Kolb & Kolb, 2005). Efforts
to teach educators how to read score reports or develop action plans can
fall flat if not embedded in case studies of real schools. Teaching educa-
tors data literacy through examples of student work or videos of classroom
instruction from real schools helps them learn new skills while putting
themselves in the shoes of the educators in the cases. It also increases
engagement by making it easier for participants to see how to transfer
their learning to their own situations. Finally, case studies demonstrate
how real educators gather a broad range of data to address one focus of
inquiry. By following a case study school throughout the entire inquiry
cycle, participants see firsthand how a school or educator team might start
by examining student performance data, then move to looking at student
work, and finally to observing classroom instruction. Each additional
source of data adds another layer of understanding about students’ and teachers’
strengths and challenges.
Once again, protocols can be quite useful for helping educators try out
habits that might at first feel unfamiliar. For example, many educators
think that when they observe a classroom, it is their job to identify what the
teacher is doing “right” and “wrong.” To counter this tendency, when they
observe a video of instruction, participants are asked to take descriptive,
specific notes about what they see and hear students and teachers doing. The “affinity” protocol
helps educator teams analyze the classroom observation data. This protocol
allows educators to work with colleagues to sort, categorize, and label this
observational evidence and come to a shared understanding of what is hap-
pening—and not happening—in classrooms. In addition, educators might
use the mental model of “the ladder of inference,” an idea developed by
Chris Argyris, Peter Senge, and others (Argyris & Schön, 1996; Senge et al.,
2000). The bottom rung of the ladder contains descriptive statements, but
as one climbs the ladder conceptually, the higher rungs lead to inferences,
conclusions, and actions. Participants learn how to classify their statements
and how to go up the ladder deliberately, making sure they have enough
evidence to support the climb. To avoid making judgmental statements, par-
ticipants might ask one another, “What evidence do you see that makes you
say that?” The metaphor of being “high on the ladder” and “coming down
the ladder” can make the process of developing the habit of evidence more
playful and easier to talk about and do.
On the last day of a Data Wise course, it is typical for educators to report
that the experience was not what they expected. One year, a student sum-
marized her learning by writing, “I used to think that it was all about the
data: its accuracy, validity, the amount we have. And now I think that to
achieve success in using data to affect change, the attitudes and skills of
the people implementing the change are more important.” Interestingly,
in the early years, our courses placed more emphasis on helping educators
to retrieve assessment results from databases and create formal presenta-
tions of their data. However, feedback from participants revealed that they
needed more than a toolkit of discrete skills, protocols, and forms—they
needed a way to understand the process of inquiry in its entirety and the
habits of mind that they could integrate across all their work with data.
Reciprocal relationships between institutions of higher education, pro-
fessional developers, and K–12 educators are essential to supporting edu-
cators to become data literate. As noted in prior research, very few teacher
candidates learn to engage deeply in data literacy for teaching in their
coursework. Further, even when teacher candidates are placed in schools
with a mentor teacher who can model and support the development of
good instructional practices, there is no guarantee that the novice teacher
will be exposed to a team that engages in data inquiry. Novice teachers are
socialized into the professional act of collaborative improvement vicari-
ously, if at all. As a result, we arrive at our current situation: Most educators
have not learned data literacy skills in preservice preparation, so schools
and districts must provide ongoing professional development to make up
for this gap.
Institutions of higher education and professional development providers
must encourage learning through clinical practice: courses that bring
practitioners and students together in real school settings and use
authentic examples. It is often said that institutions of higher
education have a responsibility to bridge what is known from research
with the everyday work of educators. The Data Wise Project sees
its responsibility as both bringing research to practitioners and translat-
ing knowledge from practitioners into frameworks, tools, and supports
for other educators. This involves creating instructional designs and peda-
gogy that are responsive to participants’ needs. Practitioners are therefore
involved in coauthoring materials, codesigning educational experiences,
and cofacilitating our courses. Their reflections and journey presenta-
tions demonstrate how educators apply the Data Wise Improvement
Process at their schools, including how they have integrated the process
with their other work. Practitioners’ questions inform the next iteration of
CONCLUDING THOUGHTS
Education has been assailed for lacking ways to socialize educators into
the profession, leading many to observe that teaching practice in schools
appears haphazard and largely attributable to each individual teacher’s
dispositions. In his call for defining and developing signature pedagogies
in education, Shulman (2005) described a vision in which education would
have a set of practices to induct novices into ways of “thinking like an edu-
cator,” practices shared by all institutions and programs that train and
support educators. These signature pedagogies would encourage
the development of “habits of the mind, habits of the heart, and habits
of the hand” common to all members in the profession (p. 59). As the
field continues in search of a signature pedagogy that might socialize new
teachers into a collaborative, data-literate profession, there must be less
emphasis on teaching data literacy as a set of isolated skills and practices,
and instead a focus on teaching data inquiry as a holistic process ground-
ed in habits of mind necessary for collaborative improvement. The goal
is not just getting teachers to be comfortable with data, but allowing the
profession to evolve to a place where understanding of data is thoroughly
integrated with the work of learning and teaching.
NOTES
1. In this article, we refer to “educators” instead of just teachers to include ad-
ministrators, specialists, and coaches.
2. For more information about the Data Wise Project, please visit http://www.
gse.harvard.edu/datawise.
3. For further discussion of the ACE Habits of Mind, see Boudett & City, 2013.
4. Instructions for all the protocols mentioned in this article can be found on-
line: Plus/Delta, Affinity, SUMI: http://www.gse.harvard.edu/datawise; Success
Analysis: http://www.tcpress.com/pdfs/mcdonaldprot.pdf; Compass Points:
http://www.schoolreforminitiative.org/.
5. For instructions, please see http://www.schoolreforminitiative.org/.
REFERENCES
Argyris, C., & Schön, D. A. (1996). Organizational learning II: Theory, methods and practice.
Reading, MA: Addison-Wesley.
Boudett, K. P., & City, E. A. (2013). Lessons from the Data Wise Project: Three habits of mind
for building a collaborative culture. Harvard Education Letter, 29(3), 1–2. Retrieved from
http://hepg.org/hel/article/567
Boudett, K. P., & City, E. A. (2014). Meeting wise: Making the most of collaborative time for educators.
Cambridge, MA: Harvard Education Press.
Boudett, K. P., City, E. A., & Murnane, R. J. (Eds.). (2013). Data Wise, revised and expanded
edition: A step-by-step guide to using assessment results to improve teaching and learning.
Cambridge, MA: Harvard Education Press.
City, E. A., Elmore, R. F., Fiarman, S. E., & Teitel, L. (2009). Instructional rounds in education: A
network approach to improving teaching and learning. Cambridge, MA: Harvard Education Press.
Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis.
Measurement, 9, 173–206.
Coburn, C. E., & Turner, E. O. (2012). The practice of data use: An introduction. American
Journal of Education, 118(2), 99–111.
Costa, A. L., & Kallick, B. (Eds.). (2008). Learning and leading with habits of mind: 16 essential
characteristics for success. Alexandria, VA: Association for Supervision and Curriculum
Development.
Edmondson, A. (1999). Psychological safety and learning behavior in work teams.
Administrative Science Quarterly, 44(2), 350–383.
Edmondson, A. (2002). The local and variegated nature of learning in organizations: A
group-level perspective. Organization Science, 13(2), 128–146.
Gallimore, R., Ermeling, B., Saunders, W., & Goldenberg, C. (2009). Moving the learning
of teaching closer to practice: Teacher education implications of school-based inquiry
teams. Elementary School Journal, 109(5), 537–553.
Goertz, M., Oláh, L., & Riggan, M. (2009, December). Can interim assessments be used for
instructional change? (CPRE Policy Briefs RB-51). Philadelphia, PA: Consortium for Policy
Research in Education.
Greenberg, J., & Walsh, K. (2012). What teacher preparation programs teach about K-12 assessment:
A review. Washington, DC: National Council on Teacher Quality. Retrieved from http://
www.nctq.org/p/publications/docs/assessment_report.pdf
Hackman, R. (2002). Leading teams: Setting the stage for great performances. Cambridge, MA:
Harvard Business School Press.
Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009).
Using student achievement data to support instructional decision making (NCEE 2009-4067).
Washington, DC: National Center for Education Evaluation and Regional Assistance,
Institute of Education Sciences, U.S. Department of Education. Retrieved from http://
ies.ed.gov/ncee/wwc/publications/practiceguides/
Hiebert, J., Gallimore, R., & Stigler, J. (2002). A knowledge base for the teaching profession:
What would it look like and how would we get one? Educational Researcher, 31(5), 3–15.
Jennings, J. L. (2012). The effects of accountability system design on teachers’ use of test
score data. Teachers College Record, 114(11), 1–23.
Katzenbach, J. R., & Smith, D. K. (2003). The wisdom of teams: Creating the high performance
organization. Cambridge, MA: Harvard Business School Press.
Kekahio, W., & Baker, M. (2013). Five steps for structuring data-informed conversations and action
in education (REL 2013–001). Washington, DC: U.S. Department of Education, Institute of
Education Sciences, National Center for Education Evaluation and Regional Assistance,
Regional Educational Laboratory Pacific. Retrieved from http://ies.ed.gov/ncee/edlabs
Kolb, A. Y., & Kolb, D.A. (2005). Learning styles and learning spaces: Enhancing experiential
learning in higher education. Academy of Management Learning and Education, 4(2), 193–212.
Koretz, D. (2009). Measuring up: What educational testing really tells us. Cambridge, MA:
Harvard University Press.
Lave, J. (1996). The practice of learning. In S. Chaiklin & J. Lave (Eds.), Understanding practice:
Perspectives on activity and context (pp. 3–32). Cambridge, England: Cambridge University Press.
Levine, T. H., & Marcus, A. S. (2010). How the structure and focus of teachers’ collaborative activities
facilitate and constrain teacher learning. Teaching and Teacher Education, 26(3), 389–398.
Little, J. W. (1982). Norms of collegiality and experimentation: Workplace conditions of
school success. American Educational Research Journal, 19, 325–340.
Little, J. W. (1990). The persistence of privacy: Autonomy and initiative in teachers’
professional relations. Teachers College Record, 91(4), 509–536.
Little, J. W. (2002). Locating learning in teachers’ communities of practice: Opening up problems
of analysis in records of everyday work. Teaching and Teacher Education, 18(7), 917–946.
Lortie, D. C. (1975). Schoolteacher: A sociological study. Chicago, IL: University of Chicago Press.
Love, N., Stiles, K. E., Mundry, S., & DiRanna, K. (2008). The data coach’s guide to improving learning
for all students: Unleashing the power of collaborative inquiry. Thousand Oaks, CA: Corwin Press.
Mandinach, E. B., Friedman, J. M., & Gummer, E. S. (2015). How can schools of education help to
build educators’ capacity to use data? A systemic view of the issue. Teachers College Record, 117(4).
Mandinach, E. B., & Gummer, E. S. (2013). Defining data literacy: A report on a convening
of experts. Journal of Educational Research and Policy Studies, 13(2), 6–28.
Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and
gaps. Teachers College Record, 114(11), 1–48.
McDonald, J. P., Mohr, N., Dichter, A., & McDonald, E. C. (2007). The power of protocols (2nd
ed.). New York, NY: Teachers College Press.
McLaughlin, M., & Talbert, J. E. (2002). Reforming districts: How districts support school reform.
Seattle: Center for the Study of Teaching and Policy, University of Washington.
Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform
instruction: Challenges and supports. Washington, DC: U.S. Department of Education
Office of Planning, Evaluation and Policy Development.
Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From
accountability to instructional improvement. Washington, DC: U.S. Department of Education
Office of Planning, Evaluation and Policy Development.
National Forum on Education Statistics. (2012). Forum guide to taking action with education data
(NFES 2013-801). U.S. Department of Education. Washington, DC: National Center for
Education Statistics.
National Research Council. (2000). How people learn: Brain, mind, experience, and school
(expanded ed.). Washington, DC: National Academies Press.
Nelson, T., & Slavit, D. (2008). Supported teacher collaborative inquiry. Teacher Education
Quarterly, 35(1), 99–116.
Park, V., & Datnow, A. (2009). Co-constructing distributed leadership: District and school
connections in data-driven decision-making. School Leadership & Management, 29(5), 477–494.
Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand Oaks, CA: Sage.
Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say
about research on teacher learning? Educational Researcher, 29(1), 4–15.
Saunders, W. M., Goldenberg, C. N., & Gallimore, R. (2009). Increasing achievement by focusing
grade-level teams on improving classroom learning: A prospective, quasi-experimental
study of Title I Schools. American Educational Research Journal, 46(4), 1006–1033.
Senge, P., Cambron-McCabe, N., Lucas, T., Smith, B., Dutton, J., & Kleiner, A. (2000).
Schools that learn: A Fifth Discipline fieldbook for educators, parents, and everyone who cares about
education. New York, NY: Crown Business.
Shulman, L. S. (2005). Signature pedagogies in the professions. Daedalus, 134(3), 52–59.
CANDICE BOCALA is the co-chair of the Data Wise Summer Institute and
a senior team member for the Data Wise Project. Since 2009, she has been
working with educators to use the Data Wise Improvement Process. She
also conducts research and program evaluation for WestEd, where her
research interests include data use, professional development, and school
improvement. Previously, she taught elementary school in Washington,
DC. Candice holds a BA in government from Cornell University, an MA
in policy analysis and evaluation from Stanford University, an MAT in
elementary education from American University, and an EdD from the
Harvard Graduate School of Education.