
Journal of English for Academic Purposes 5 (2006) 174–192
www.elsevier.com/locate/jeap

Evaluative review in materials development


Fredricka L. Stoller a,*, Bradley Horn a, William Grabe a, Marin S. Robinson b

a Department of English, Northern Arizona University, P.O. Box 6032, Flagstaff, AZ 86011-6032, USA
b Department of Chemistry and Biochemistry, Northern Arizona University, P.O. Box 5698, Flagstaff, AZ 86011-5698, USA

Abstract

English for Academic Purposes (EAP) professionals know that initial efforts to produce or adapt
materials generally require evaluative review and revision. A review process that solicits feedback
from teacher and student users is critical because materials writers often find it difficult to envision
the problems others may have with their materials. Despite the importance of such feedback, the
EAP literature provides few insights on how to engage in evaluative review to inform material
revisions. To fill this gap, we describe the evaluative review process that we developed as part of an
interdisciplinary textbook development project. The case study setting is described, to situate the
discussion, and includes an explanation of the scope of the project, the nature of the instructional
approach, and our rationale for materials assessment. We then describe instruments developed to
gather feedback from three participant groups, explain feedback-collection and data-analysis
procedures, and provide sample data to demonstrate the breadth, scope, and usefulness of our
evaluative review. We conclude with implications for EAP practitioners, with an emphasis on
implications that are pertinent to the overall materials evaluation process and to the design of
feedback-collection instruments and procedures.
© 2006 Elsevier Ltd. All rights reserved.

Keywords: Materials development; Pilot testing; Evaluation; Triangulation; Writing in the disciplines; English for
academic purposes

*Corresponding author. Tel.: +1 928 523 6272; fax: +1 928 523 7074.
E-mail addresses: Fredricka.Stoller@nau.edu (F.L. Stoller), bmh36@dana.ucc.nau.edu (B. Horn),
William.Grabe@nau.edu (W. Grabe), Marin.Robinson@nau.edu (M.S. Robinson).

1475-1585/$ - see front matter © 2006 Elsevier Ltd. All rights reserved.
doi:10.1016/j.jeap.2006.07.003

1. Introduction

English for Academic Purposes professionals often engage in materials development activities because commercial textbooks and other instructional materials fall short in
addressing their students’ specific language learning needs. Those who have engaged in
such efforts know that initial attempts to create or adapt materials generally require an
evaluative review and revision process. Whether materials are written for one’s own students or for a broader target audience, it is assumed that they will be adjusted over time in
response to evolving student populations, new research findings, current trends in the field,
and external mandates (e.g., from school administrators, governmental offices, or
publishers). Materials that undergo this evaluative review and revision process are likely
to serve target student and teacher audiences more effectively than materials that do not
(see Dudley-Evans & St. John, 1998; cf. Bell & Gower, 1998).
Relatively little information, however, is currently available about review and revision
processes that follow directly from large-scale materials development projects.1 To fill this
gap, we present the evaluative review process that we followed as part of an
interdisciplinary textbook development project. We begin with a brief review of the
literature, a depiction of the case study setting, and a description of the multidimensional
materials development process that we have engaged in. We conclude with implications for
EAP professionals who are engaged in both small- and larger-scale materials development
projects and who might benefit from having a series of formal steps to follow during the
materials evaluation and revision process.

2. Evaluation: an integral part of the materials development process

Many discussions of evaluation in language-teaching settings center on macro-evaluation, that is, curriculum and program-wide evaluation (e.g., Brown, 1995a, b;
Johnson, 1989; Richards, 2001; cf. Ellis, 1998). The model of curriculum design,
development, and evaluation set forth by Brown (1995a, b, 2003) illustrates the inextricable
links between regularly conducted formative evaluation and the five principal components
of his curriculum development framework: needs analysis; the specification of goals and
objectives; test development and improvement; materials adoption, adaptation, and
development; and teaching and teacher support. Richards (2001) also highlights the
essential role of evaluation in his multi-step model of curriculum development involving,
much like Brown, needs analysis, situation analysis, specification of goals and learning
outcomes, course planning and syllabus design, and teaching. In both Brown’s and
Richards’ models of curriculum development, the ongoing process of formative evaluation
serves important functions, including a monitoring of the effectiveness of materials for the
practical purposes of materials revision and improvement (e.g., Byrd, 1995; Flowerdew &
Peacock, 2001a; Jordan, 1997). When materials evaluation activities are followed by
proactive steps that bring about improvements in current and future practices (Brown,
1995b; Dudley-Evans & St. John, 1998), materials are more likely to achieve their "true purpose, that is, to help learners to learn effectively" (Jordan, 1997, p. 138).
1 Discussions of large-scale materials and curriculum development and evaluation projects were more common in the 1970s and 1980s, in now hard-to-find British Council ELT Documents (e.g., British Council, 1980).

Materials that are intended for a wider audience (beyond the local context) require an
evaluative review with additional dimensions. In such cases, there is a need to judge the
appropriateness of materials for a range of student groups, teachers, and possibly other
stakeholders, including publishers, at multiple institutions and in varied instructional or
professional contexts (see Byrd, 1995). One means of collecting data on material
effectiveness is pilot testing, also referred to as class testing or trialing (Donovan, 1998;
Jolly & Bolitho, 1998; Reid, 1995; Richards, 2001). Although pilot testing requires time
and a coordination of efforts, most would agree that the time is well invested and the
benefits worthwhile (Reid, 1995). The views contributed by pilot test participants
(instructors and students alike) can lead to a more balanced and comprehensive view of
the materials, in part because materials writers are often too closely aligned to the
materials to envision the difficulties that other teachers and their students might have with
certain types of materials (Reid, 1995; see also Jolly & Bolitho, 1998). For these reasons
(and others), trialing is usually required by publishers before materials go into production.
Pilot testing can provide insights that validate materials. Certain materials and activities
may turn out to be surprisingly effective across multiple student groups; in other cases,
materials may provide unanticipated benefits. Furthermore, pilot testing can (a) determine
the suitability of the overall scope, approach, level, organization, and progression of
content and tasks; (b) reveal appropriateness (or inappropriateness), ease of use, and
fit in different instructional settings; (c) expose inaccuracies in content and presentation;
and (d) reveal mismatches between course aims and the materials themselves
(Donovan, 1998; see also Littlejohn, 1998). The triangulated view that emerges from
feedback gathered from multiple user groups can serve as a sound foundation (and
justification) for materials production and improvement (Donovan, 1998; Flowerdew &
Peacock, 2001a).
Whether materials are evaluated to judge their effectiveness in local settings or broader
contexts, the evaluation process should begin by determining what information to gather,
how to gather it, and from whom. According to Dudley-Evans and St. John (1998), an
effective evaluation solicits information that (a) emphasizes the strengths and weaknesses
of the materials and (b) clarifies how materials are used (or not used). Yet, knowing how
well (or poorly) materials have worked is not sufficient on its own. Understanding why
materials are evaluated in certain ways enables materials writers to repeat successes and
improve less successful components in subsequent materials-revision stages. Because
piloters’ input is directly connected to the context in which materials are used, information
about pilot institutions and classrooms should also be collected and analyzed as part of the
materials-evaluation process (see Brown, 1995a; Richards, 2001).

3. Case study

To illustrate the value of piloting for the purposes of ongoing materials creation,
evaluation, and revision, we present a case study that highlights the steps taken to evaluate
our Write Like a Chemist materials. We include the "voices" of our evaluators as a way to
demonstrate the breadth, scope, and usefulness of our evaluative review. Although the
particular features of materials development projects vary, it is our hope that the case
study illustrates an evaluative review framework, guided by an established EAP materials
development orientation (Brown, 1995a; Dudley-Evans & St. John, 1998; Johns, 1997;
Richards, 2001; Swales, 1990), that can be adapted for other EAP material writing contexts.

3.1. Project background and description

The Write Like a Chemist project (see www4.nau.edu/chemwrite) was conceived as a response to a Northern Arizona University mandate to address the academic writing needs
of all junior-level students (native and nonnative alike). At the time, departments had
the option of either developing junior-level writing intensive courses of their own or
requiring students to take a writing intensive course in the English department.2 The
chemistry department chose to develop its own discipline-specific writing course; two
faculty members, a chemist and an applied linguist, have collaborated to develop the
course and corresponding materials, essentially building a "cross-disciplinary alliance"
(Wardle, 2004).
The interdisciplinary course-development team was expanded to include a graduate
student in TESL/applied linguistics (who served on the project for three years) and a post-
doctoral associate in chemistry (who served on the project for two years). The course-
development group evolved into a materials development team, with the goal of refining
in-house instructional materials so that they could be pilot tested beyond the confines of
Northern Arizona University. A separate project-assessment team, comprising an applied
linguistics faculty member and a doctoral student, was formed in the third year of the
project to assist with the evaluation of the overall project and with a 2-year pilot to assess
the effectiveness of Write Like a Chemist materials at institutions nationwide3 (see Stoller,
Horn, Grabe, & Robinson, 2005).
Core to the Write Like a Chemist project has been the analysis of the language of
chemistry in three professional genres (i.e., journal articles, scientific posters, and research
proposals), the development of a read–analyze–write approach to teaching disciplinary
writing, and the design and development of instructional materials that could be used by
chemistry faculty with little, if any, experience in teaching writing (see Stoller, Jones,
Costanza-Robinson, & Robinson, 2005). The materials (still in development at the time
this article was written) include three multiple-chapter writing modules (corresponding to
the three target genres) and three concluding chapters relevant to all three genres (i.e., on
the formatting of tables, figures, schemes, citations, and references, and on revision and
proofreading strategies). There are also sections devoted to self-study language tips
(related to audience and purpose, writing conventions, and grammar and mechanics), peer-
review, and grading rubrics. Supplementary materials beyond the text itself include an
instructors’ answer key and "canned" research projects. The canned research projects
(based on techniques commonly encountered in undergraduate chemistry curricula and
research areas popular among students) include information about the research area and
fictitious, but realistic, data. These resources provide support for students who do not have
the research experience and/or data to write a scientific paper or poster by themselves.
To introduce students to the reading and writing of the target genres, a read–analyze–
write approach was formalized to guide students in reading (and rereading) authentic texts
from the targeted genres, analyzing texts, and then writing (and rewriting) a piece of their

own following the writing conventions of the discipline.

2 These courses are required for both first language (L1) and second language (L2) students. L2 students in these courses are fully matriculated into the university; no specific provisions are made for their linguistic needs. Rather, the courses address the academic literacy needs of upper-division undergraduates in general.
3 Write Like a Chemist materials were piloted in eight US colleges and universities in 2004–2005 and another eight institutions in 2005–2006.

The excerpts from the targeted
genres (e.g., methods sections from journal articles) serve as models of preferential
patterns, discipline-specific expectations, and the interrelationships between language and
content. Excerpts were chosen to meet select criteria including topic, length, writing
conventions, and challenge. Topics were selected to represent a variety of areas within
chemistry and to be both of interest to students and within their intellectual grasp,
considering that they were presumed to have had only two prior years of university-level
chemistry course work. Short excerpts are used to highlight features of individual sections
of the target genres, while longer excerpts illustrate the interface between and among
different sections of the genres.
The materials are written to direct students’ attention to five essential writing
components (Table 1). By analyzing writing across these five dimensions, students gain
an appreciation for the complexity of the writing process and recognize the different
aspects of writing that need to coalesce for their work to sound professional. The five
essential components serve as a convenient mechanism for breaking down the larger
writing task into smaller, more achievable goals.

Table 1
The five essential writing components

Audience: Level of detail, Level of formality
Organization: Broad structure, Fine structure
Writing conventions: Formatting, Abbreviations and acronyms, Conciseness, Voice
Grammar and mechanics: Parallelism, Punctuation, Verb tense, Subject/verb agreement
Science content: Text, Graphics, Word usage

It should be noted that students’ genre analyses were preceded by our own analyses of
the three target genres. Our approach to genre analysis has been grounded in the work of
Bhatia (1993), Connor and Mauranen (1999), Hyland (2004a, b), and Swales (1990, 2004)
and augmented by the use of corpus linguistic tools to identify lexico-grammatical usage
and frequencies, contexts in which particular words and structures occur, and generalizable
linguistic patterns. (See Stoller et al., 2005, for a more detailed discussion of our genre
analysis activities.)
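The article does not name the corpus tools used for these frequency and context queries. By way of illustration only, the following minimal sketch (assuming a hypothetical folder of plain-text journal-article files called chem_articles; the file and function names are ours, not the project's) shows how word frequencies and simple key-word-in-context (KWIC) lines of the kind described above could be computed.

import re
from collections import Counter
from pathlib import Path

def tokenize(text):
    # Lowercase word tokens; hyphens are kept so compounds such as
    # "read-analyze-write" survive as single tokens.
    return re.findall(r"[a-z]+(?:-[a-z]+)*", text.lower())

def corpus_frequencies(corpus_dir):
    # Count word frequencies across all .txt files in the corpus directory.
    counts = Counter()
    for path in Path(corpus_dir).glob("*.txt"):
        counts.update(tokenize(path.read_text(encoding="utf-8")))
    return counts

def kwic(corpus_dir, target, window=5):
    # Key-word-in-context lines: `window` tokens on each side of the target word.
    lines = []
    for path in Path(corpus_dir).glob("*.txt"):
        tokens = tokenize(path.read_text(encoding="utf-8"))
        for i, tok in enumerate(tokens):
            if tok == target:
                left = " ".join(tokens[max(0, i - window):i])
                right = " ".join(tokens[i + 1:i + 1 + window])
                lines.append(f"{left} [{target}] {right}")
    return lines

if __name__ == "__main__":
    freqs = corpus_frequencies("chem_articles")  # hypothetical corpus folder
    for word in ("research", "researchers"):     # words discussed in a textbook sidebar
        print(word, freqs[word])
    for line in kwic("chem_articles", "researchers")[:10]:
        print(line)

A query of this sort would surface, for instance, how often novice-favored words actually occur in published articles and in what immediate contexts, which is the kind of observation reported in the corpus-linguistics sidebars mentioned later by pilot faculty.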

3.2. Rationale for materials assessment

Two major impetuses have driven our materials assessment activities. First, we had to be
mindful of our funding agency’s assessment and accountability expectations. The US
National Science Foundation (NSF) views evaluation as an integral part of the curriculum
and materials development process, rather than as an activity conducted solely at the end
of a project (Frechtling Westat, 2002). Furthermore, as recipients of NSF support, we are
obliged to submit annual progress reports. Our assessment activities, including the
collection of quantitative data about pilot sites as well as pre- and post-course student
ability measures, have facilitated the annual reporting process.
The second impetus, most important for us as materials developers, has been the need
for sufficient qualitative data to inform the ongoing process of materials creation, revision,
and improvement. Because our materials were designed for use by faculty and students
both within and beyond our home institution, it was important to gather a range of viewpoints—from instructors, students, and other potential adopters. We devised our data
collection instruments (following the recommendations of Dudley-Evans & St. John, 1998)
so that we could gain an understanding of (a) how materials were used and adapted in
different instructional settings and course contexts (e.g., writing intensive courses, lab
courses, seminars), (b) what specific activities and sections of the textbook were and were
not used (and for what reasons), and (c) how different user groups judged the overall
effectiveness of the materials.

3.3. In-house materials evaluation at early stages of materials development

When Write Like a Chemist materials were being developed for home-institution use in
the early stages of the project, evaluation was ongoing. The materials development team
met almost weekly to discuss, critique, and revise. Group deliberations led to the
realignment, augmentation, rewording, and sometimes removal of materials in addition to
the design of new materials. What was particularly valuable at this stage in the process
were the different orientations of the materials development team members. During our
most intensive year of meetings, one team member was actually using the materials in a
writing-intensive course for target students, and a second team member was observing the
class regularly to see how materials were being used and modified by the instructor and
how they were received by students. The remaining two team members were working on
continued materials development activities (and overseeing the entire project). To
supplement these materials-evaluation activities, students were interviewed at mid-term
and at the end of the semester for their reactions to the materials and to their (hopefully
improved) discipline-specific writing abilities. Students were also asked to complete an
anonymous evaluation form that linked elements of our materials with course objectives.
The insights gained from weekly meetings, student feedback, and continuous materials
development activities resulted in significant improvements in our materials.

3.4. External materials evaluation

When the project shifted to the piloting of materials at multiple external sites, we
developed a more formal set of evaluation procedures. Here we describe the instruments
developed to gather different types of feedback, the analyses of qualitative data that we
collected, and the feedback that informed subsequent materials improvement activities.
For the purposes of this discussion, we focus solely on the evaluation of materials from the
journal article module.
Four key objectives guided the design of our data-collection instruments. First, we
wanted to be able to triangulate data, both within and across three participant groups:
pilot faculty, pilot students, and potential adopters who served as external reviewers.4 In
this way, we were more likely to collect data that would represent diverse, but equally
important, perspectives. Second, as an extension of our triangulation goal, we wanted to
solicit varied forms of feedback (e.g., quantitative and qualitative) by different means (e.g.,
hard copy, online, and live via phone) from participant groups. Third, we wanted to collect
data that could provide insights on a range of materials components (e.g., excerpts from the primary literature, read–analyze–write activities, language tips, peer review tasks) and, at the same time, allow an in-depth examination of how materials were actually being used. Fourth, we wanted procedures that would ensure confidentiality; we hoped that such procedures would lead to more truthful feedback.

4 During our first pilot year (2004–2005), participants included pilot faculty (N = 8), pilot students (N = 155), and external reviewers (N = 11). Pilot faculty and external reviewers volunteered to be involved in the project largely because of their interests in incorporating writing into their curricula.
We created a set of instruments5 for each of the three participant groups (see Table 2).
With respect to pilot faculty, we developed three data-collection instruments. The online
Course Information Form elicited background data pertaining to class size, level of the
course, nature of the course (e.g., writing intensive, lab, seminar), status of the course (i.e.,
required or elective), and anticipated percentage of the course that would be devoted to
instruction using Write Like a Chemist materials. The Materials Evaluation Form, initially
provided in hard-copy format and later as an online survey, was used to gather mainly
qualitative feedback on individual chapters of the textbook and related materials (e.g.,
answer key, peer review forms, grading rubrics). The Faculty Interview, conducted as a
semi-structured phone interview toward the end of the pilot semester, facilitated the
collection of summative feedback on textbook materials and overall impressions that could
assist us with materials revisions.
For the pilot students, we created six data-collection tools. The online Student
Information Survey solicited personal and academic information (e.g., gender, major, year
in school, first and second language background). The Pre- and Post-Course Chemistry
Writing Assessment battery comprised five writing tasks that provided baseline and post-
course quantitative and qualitative data on students’ writing abilities and experiences with
Write Like a Chemist materials. As part of the pre/post assessment battery, students were
asked to (a) write an introduction to a chemistry journal article, (b) write the methods
section of a chemistry journal article, (c) create a table that could be placed in a results
section of a journal article,6 (d) proofread a discipline-specific passage and make
corrections where necessary, and (e) write an essay in which they reflected on their
current strengths and weaknesses as scientific writers. Pilot students were also asked to
submit the final version of their data-driven research paper, modeled after a journal article;
this paper was used as post-treatment evidence of student writing abilities.7 Toward the
end of the pilot course, students were asked to complete an online Final Textbook
Evaluation form, which solicited quantitative survey data about the extent to which
students perceived Write Like a Chemist materials to have helped them improve their
writing skills in different areas (e.g., grammar, chemistry-specific writing conventions,
organization, conciseness, formatting of tables and figures). The Student Interview,
conducted as a semi-structured phone interview with a subset of students, facilitated the
collection of more in-depth qualitative feedback on our materials.
We also solicited feedback from chemistry faculty who were not otherwise involved
with the project. These external reviewers were asked to review Write Like a Chemist
materials using a set of guiding questions. Their reviews provided a mechanism for receiving feedback from interested faculty who were potential adopters of the materials.

5 If interested, contact first author Fredricka L. Stoller (Fredricka.Stoller@nau.edu) for copies of instruments described in the remainder of this section.
6 For (a), (b), and (c), students were provided relevant (but fictitious) data and pertinent information needed to write a simulated journal article introduction, methods section, and table.
7 Eight external evaluators (all university chemistry faculty) participated in the evaluation component of our project. They assessed pilot students’ data-driven papers in an effort to determine student-writing outcomes and, indirectly, the effectiveness of instructional materials. A reporting of results is beyond the scope of this paper.
Table 2
Data collection instruments for the Write Like a Chemist materials development project

Pilot Faculty
- Course Information Form. Goal: to gather information about the structure of the course and its position within the larger departmental curriculum. Format: online survey. Time frame: beginning of course. Response rate: 100%.
- Materials Evaluation Form. Goal: to gather feedback on individual textbook chapters. Format: print/online survey. Time frame: ongoing (one form per chapter). Response rate: 55.4%.
- Interview. Goal: to gather feedback on the overall success and specific strengths and weaknesses of the textbook and accompanying materials. Format: scripted phone interview. Time frame: end of course. Response rate: 100%.

Pilot Students
- Student Information Survey. Goal: to gather information on student demographics and academic background. Format: online survey. Time frame: beginning of course. Response rate: 75.5%.
- Pre-Course Chemistry Writing Assessment. Goal: to gather baseline data on student writing ability. Format: 5-part written assessment. Time frame: beginning of course. Response rate: 97.4%.
- Post-Course Chemistry Writing Assessment. Goal: to gather post-treatment data on student writing ability. Format: 5-part written assessment. Time frame: end of course. Response rate: 89.0%.
- Final Draft of Paper. Goal: to gather post-treatment data on student writing ability. Format: data-driven research paper (modeled after a journal article). Time frame: end of course. Response rate: 89.7%.
- Final Textbook Evaluation. Goal: to gather feedback on the overall success of the textbook. Format: online survey. Time frame: end of course. Response rate: 46.5% (65.5%; see note).
- Interview. Goal: to gather feedback on the overall success and specific strengths and weaknesses of the textbook and accompanying materials. Format: scripted phone interview. Time frame: end of course. Response rate: 96.3%.

External Reviewers
- Textbook Manuscript Review. Goal: to gather feedback on the overall success and specific strengths and weaknesses of the textbook and instructor’s manual. Format: written review. Time frame: Fall/Winter 2004–2005. Response rate: 71.4%.

Note: Due to an unfortunate oversight, the Final Textbook Evaluation was not administered at one of the pilot sites. The figure in parentheses represents the response rate without that group.

The response rates varied across instruments, as indicated in Table 2. Overall, we considered the response rates to be quite high. The instrument with the lowest response rate
was the Materials Evaluation Form, which we thought had great potential for providing
useful feedback. The response rate may have been low because we asked pilot faculty to fill
out a form upon completion of each textbook chapter; the task was inherently labor intensive
in part because the original form solicited open-ended responses. We converted the form midway through the first pilot year into an online form with standardized responses to choose
from; respondents were given the option of supplementing their responses with open-ended
comments. The response rate improved when the form became easier to complete.

3.5. Data analyses

Quantitative and qualitative data collected by means of these instruments were compiled
and then analyzed in order to guide subsequent materials revision and improvement. The
quantitative data provided three types of information: (a) background information on the
pilot sites and students that allowed us to frame our subsequent analyses of qualitative
data, (b) pre- and post-course measures of student writing ability that could be used in
evaluating the overall effectiveness of our materials, and (c) annual reporting data to meet
funding agency requirements. The qualitative data, on the other hand, proved invaluable
in identifying specific features of our materials that should be kept, modified, realigned,
supplemented, or discarded to make the materials more effective.
In the remainder of this article, we focus on qualitative data stemming from (a) the
Materials Evaluation Form completed by pilot faculty, (b) Faculty and Student Interviews,
and (c) textbook manuscript reviews submitted by external reviewers. Data from each of
these qualitative instruments were analyzed by means of analytic induction (Creswell,
2003; LeCompte & Preissle, 2003). In the initial steps of this process, responses to each of
the instruments were read (by the first author) to develop a coding framework. This
entailed identifying categories of responses based upon (a) the frequency with which a
given topic recurred in the data set and (b) the perceived importance of a given topic/
comment. The preliminary framework was then utilized to code a subset of responses. The
framework and coded responses were then reviewed (by the second author) with an eye for
possible procedural modifications. Suggested modifications were discussed until consensus
was reached on a revised coding scheme. The entire data set was then recoded (by the first
author). Finally, the recoded data were reviewed (by the second author) to verify that the
responses had been coded properly.
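Once responses have been hand-coded in this way, tallying them is mechanical. The sketch below is not the authors' procedure; it simply illustrates, with hypothetical instrument names and codes, how coded feedback could be counted by frequency and checked for triangulation across instruments.

from collections import Counter

# Hypothetical coded responses: (instrument, code assigned during the coding passes)
coded_responses = [
    ("faculty_interview", "confirmation/approach"),
    ("student_interview", "concern/canned_research"),
    ("external_review", "suggestion/add_worked_examples"),
    ("materials_eval_form", "concern/canned_research"),
    ("student_interview", "confirmation/move_structures"),
]

def code_frequencies(responses):
    # How often each code recurs across the whole data set.
    return Counter(code for _, code in responses)

def codes_by_instrument(responses):
    # Which instruments contributed to each code (a rough triangulation check).
    sources = {}
    for instrument, code in responses:
        sources.setdefault(code, set()).add(instrument)
    return sources

if __name__ == "__main__":
    for code, n in code_frequencies(coded_responses).most_common():
        print(f"{n:2d}  {code}")
    for code, instruments in codes_by_instrument(coded_responses).items():
        print(code, "<-", ", ".join(sorted(instruments)))

Frequency counts of this kind address criterion (a) above (how often a topic recurs), while the per-code list of contributing instruments helps show whether a view converges across participant groups or comes from a single source.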

4. Results

Three overarching categories emerged from our analyses of participant feedback: (a)
confirmation of our pedagogical approach, (b) expressions of concern about some
aspect(s) of the materials, and (c) concrete suggestions for the improvement of the
materials. All three types of responses proved useful, though in different ways.

4.1. Confirmation of pedagogical approach

A detailed analysis of responses that offered support for our pedagogical approach
resulted in three subcategories. These subcategories reflected the appropriateness of our
approach (a) for targeted student audiences, (b) for targeted teacher audiences, and (c) at
practical as well as conceptual levels.
Responses that confirmed the suitability of our pedagogical approach for target student
audiences (L1 and L2 chemistry majors) revealed (a) the usefulness of our materials as a
primary classroom and out-of-class student-user resource, (b) the accessibility and
appropriateness of model texts from the primary literature, (c) the utility of the canned
research projects, and (d) the appropriateness of skill emphases. Comments like the ones
below affirmed some of the key features of our materials, confirming that we had judged
student needs appropriately.
I think it’s a pretty good book. It’s not something you read from cover to cover; it’s
more like a reference manual. I would go to it as a reference and use it to answer some
questions I had or clarify some things if I needed that. I used the poster section actually
to make my poster for the ACS convention, so I found that section to be very helpful.
[Student interview 101]
The book practices what it preaches. The audience is college undergraduates, so many
of the readings are on topics, like beer chemistry, which would appeal to them. [External
reviewer 201]
[The section on formatting graphics is a] good fit with students’ needs. I notice that
[students] are very good now at fixing these problems. [Pilot faculty Materials
Evaluation Form 301]
[A] number of … students were also in an advanced lab with me, and they were … noticing differences between what they were writing for a lab report—which is basically what they had been doing since they were freshmen—and the writing … and the thinking … that was expected of them here. [Pilot faculty interview 401]
Other responses, such as those listed below, revealed the appropriateness of the
approach for targeted instructors, that is, chemistry faculty with little, if any, experience
teaching and assessing writing. Numerous faculty mentioned how much they had actually
learned about writing in the discipline while teaching from the materials or reviewing them.
Such revelations were reassuring.
I enjoyed it. And, I’ve mentioned this to other people on campus, both in sciences and a
couple of people in the English department, and they’re really interested in it. I would
like to develop a science writing course. [Pilot faculty interview 401]
The benefit for faculty reading this manuscript should not be ignored. Although we are
required to write in our specific sub-disciplines, there is no guarantee that we write well.
Furthermore, there is no guarantee that we reflect [on] the writing process…. I am
certain that I will continue to use the textbook and my own writing will improve.
[External reviewer 202]
Other responses focused less on targeted audiences and more on the suitability of the
pedagogical approach at practical and more abstract levels. These comments (see below)
highlighted the appropriateness of what we considered to be some of the distinguishing
features of our materials (e.g., the use of excerpts from the primary literature to showcase
common writing conventions, a read–analyze–write instructional approach, and the use of
move structures to depict organizational patterns in different sections of the three targeted
genres; see Swales, 1990, 2004). It was encouraging to receive this feedback; it provided us
with the confidence to build upon our intended foci in subsequent revisions.

The repeated use of a few articles made it easier to see how [writing] changed from
section to section [in a journal article]. [External reviewer 203]
The move structures … were extremely useful. [They] told you exactly what
information you should be putting in each of the sections [of the journal article] and in
what order it would make the most sense. [The move structures] kind of helped make
your sections flow a bit better and [they] made sure you got all of the information in
there. [Student interview 102]
The concept of the "move" is a brilliant way to organize and focus advice [on writing].
[External reviewer 202]
In the past, when I had students write a secondary review paper, I always had them write
the introduction first. This was where they learned the topic. I polled the class about
whether they would have liked to write the intro first to learn about the topic. They
overwhelmingly said they liked the order in the [Write Like a Chemist materials]. They
said they could not have written the intro until they learned about the project in the
methods and results sections. [Pilot faculty Materials Evaluation Form 302]
I’ll tell you what we really like … those little sidebars … the computer database corpus linguistics sidebars. For example, one I think that was really memorable for students was this idea of the word[s] ‘research’ and ‘researchers’ … [and their] frequency … in the literature. Those are two words that novice writers—I mean I’ve seen this again and again reading lab reports—like to use…. Just to have that one little sidebar where it very succinctly says, "This is what we found out," sends a clear message. [Pilot faculty interview 402]
As these sample responses reveal, feedback submitted by our three participant groups,
by means of different instruments, affirmed the value of many practical as well as
conceptual features of our materials. As illustrated by the two comments on move
structures (above), there were many instances of convergent views, where different
respondents were basically in agreement on a given point, positive or negative. Convergent
views within and across participant groups proved useful in our revisions.

4.2. Expressions of concern

While some participant feedback explicitly endorsed our materials, other feedback
revealed concerns about (a) physical characteristics of the materials, (b) different textbook
features and emphases, and (c) the appropriateness and practicality of the materials for
targeted settings and audiences. These critical, but constructive, comments, individually
and in the aggregate, raised important issues for us later during revision stages. For
example, some participants commented on what they perceived to be unnecessary
repetition in the materials.
The scoring guides are quite repetitive. Is there a more concise way to present these?
[External reviewer 204]
Some of the [exercises] went a little too far—like you have to look up 10 articles. This
was busy work. [Student interview 103]
The biggest thing for me actually—I thought I would have an easy time of implementing
these materials—but I found that it got a little stale. So, the biggest challenge for me
was trying to find enough variety in the way that I was presenting material in class to … keep the students on their toes so it didn’t become … rote or repetitious. [Pilot faculty
interview 402]
Others expressed concerns about the feasibility of select aspects of the materials.
The main problem with canned research, in general, is, of course, since you haven’t done
the experiment yourself, you’re not exactly sure what’s going on and you’re just kind of
thrown into someone else’s notes. And, uranium induced DNA, I didn’t think it was
assembled very well. It’s about 20 pages, I think. It took me a half an hour to read it
from beginning to end and, after I’d finished reading it, I didn’t really get anything out
of it. I had to read it 2 or 3 times in order to get the gist. I mean the science itself was
fine—the DNA, the uranium—but the procedure, the experiment, was the problem. The
science made sense, but the experiment was kind of vague in my opinion. [Student
interview 101]
Canned research: Students struggled a bit with data manipulation in part because they
did not DO the work. Although [the canned research] gave them a chance to practice, it
still makes more sense to me for students to write up/manipulate lab work they’ve done.
[Pilot faculty Materials Evaluation Form 301]
Additional concerns about the usefulness and effectiveness of select features of our
materials (including authentic texts, exercises, and sidebars) were also expressed. At times,
participants expressed divergent views about the same feature (e.g., tense/voice
discussions), as seen below.
The analysis of tense and voice—students really seemed to struggle with this. I’m not
sure if y they don’t remember how to recognize [tense and voice] or if they’ve ever
truly been taught to analyze writing so closely. As a result, they are having a difficult
time incorporating [appropriate tense and voice] into their writing. [Pilot faculty
Materials Evaluation Form 301]
The voice/tense issues were seen as too easy or too trivial for students. [Pilot faculty
Materials Evaluation Form 303]
The introduction of the nominalization ‘‘concept’’ did not seem worth the effort.
[External reviewer 205]
Although the sidebars receive equal organizational weight, they are different in their
helpfulness and necessity.... There is too much heterogeneity in sidebar information.
[External reviewer 202]
Other evaluators expressed concerns about select emphases or their absence.
Being a physical/inorganic chemist, I felt a bit slighted by the emphasis on organic/
analytical readings. [External reviewer 205]
I felt that the section on results really missed a major point of chemical writing and one
that is almost invariably the most difficult for students: organizing the data. [External
reviewer 206]
Others were troubled by the perceived unsuitability of our materials for specific
instructional settings and audiences.
I have one overreaching concern: with the already crowded curriculum mandated by the
American Chemical Society Committee on Professional Training, and the constraints
placed by individual institutions with respect to general education—which I strongly
support—will there be time and credits available for students to fully benefit from the
materials presented? [External reviewer 207]
I had such a mix of the quality of the students in the class y some who were better
writers than others. So, for the ones who were better writers, they thought the
[textbook] was easy to understand, easy to work with. And, for those who were not very
good at writing, they struggled with the [textbook]. [Pilot faculty interview 403]
These types of feedback (including divergent views on the same feature) from different
participant groups have proven particularly instructive, obliging us to reevaluate our
materials in light of diverse perceptions. Our responses to such feedback have resulted in
some materials being discarded in their entirety; other items have been modified in major
ways or substituted with other materials. We intend to address most of these concerns
explicitly in the teacher’s manual, which will be developed in the final stages of the project.

4.3. Suggestions for improvement of materials

Concrete suggestions for the improvement of our materials, the third major category of
qualitative responses, were plentiful and came in many shapes and forms. Some
respondents suggested a reorganization of information, as indicated in these responses:
I would consider moving some basics of writing from appendices to the introduction.
[External reviewer 207]
Fig. 1.2 and Table 1.3 would read better if the order were reversed. [Pilot faculty
Materials Evaluation Form 301]
The problems that the students had [are tied to] the number of excerpts in the middle of
the chapters. They wanted those [excerpts] to be put at the end. They said they were
great, we appreciated them, but they got lost trying to follow the explanation of the
chapter with y so many excerpts in the middle. [Pilot faculty interview 403]
Others suggested that we add elements to our materials. A number of additions were
proposed by more than one of our participants, adding weight to the suggestions. In most
cases, the suggestions were common sense; we were simply "too close" to our materials to see the need. Once the suggestions were pointed out, it became apparent that the additions would
strengthen the effectiveness of our materials.
For some exercises in this chapter (and others as well), it may be helpful to include a
"worked example" so that students can see/understand the strategy required to
[complete] the exercise. [Pilot faculty Materials Evaluation Form 301]
The text would be substantially improved with two additions to each chapter. A detailed
list of the topics covered … at the beginning of each chapter would greatly help
students.... Along those lines, the text needs some sort of [bulleted] summary page....
For a student who has already worked through a chapter, having these two items will be
very helpful when they go back to write their own papers. [External reviewer 206]
I think that the activities y that are within the chapter follow the chapter very well.
And, in some chapters, there were end-of-chapter [exercises] that sort of brought it together…. I think there could be a few more end-of-chapter problems that could link up with some of the things that were done in class…. I think some professors might want
to see that. [Pilot faculty interview 401]

Other respondents suggested modifications for increased clarity and either more or less
detail.
What I don’t like about the book is that it doesn’t highlight what exactly is an
introduction or an abstract. It’s not like other [sections of the journal article] where
[the authors explain] the meaning of the [section]…. What does the introduction mean? … Highlight it … in a couple of sentences or even one sentence. [Student
interview 104]
I would suggest changing the icon [that refers readers] to Appendix A. It looks too
much like the icon for "Writing On Your Own." [External reviewer 208]
Many students did not understand what was being asked in exercise 1.15. Specifically,
many students thought that they should make up a move structure for each
acknowledgment … rather than for the genre. I would suggest rewording this
[exercise]. [Pilot faculty Materials Evaluation Form 303]
Numerous respondents offered suggestions by explaining how they had translated our
materials into practice, how they had adapted the materials for their instructional contexts,
and which external resources they had turned to for supplementation (see below). Some
respondents attached their own PowerPoint presentations, exercises, and worksheets for
our review. In the aggregate, suggestions of these types led to meaningful, though often
small, changes. Equally important, they have given us ideas for our teacher’s manual.
The move structures are very effective. We [compared] them to a chess match or
billiards game. We emphasized the idea of thinking about the next move. Very helpful in
providing students with a programmed approach to organization in writing. [Pilot
faculty Materials Evaluation Form 301]
I often refer students to the MIT Guide to Science and Engineering Communication for
ideas on poster/oral presentations. [External reviewer 209]
We interpreted feedback that was submitted in the form of explicit requests for teaching
tips and/or teaching resources as indirect recommendations for modifying and
supplementing materials so that they would be more accessible for faculty. In essence,
such comments (sometimes impassioned pleas) revealed that our materials were not
working entirely on their own. The pilot faculty, as well as external reviewers, expressed a
need for more guidance (see below), something we intend to address in our teacher’s
guidelines.
As an instructor, I would appreciate some suggestions on how [these materials] could
[actually] be used in a course. [External reviewer 204]
I still think that somewhere in the instructor’s manual some implementation tips would
be helpful. That’s the thing that I really think was missing. Or, maybe something at the
beginning of each section in the instructor’s manual that says, "Here are the take home messages—the most important things as far as we see them." I think that would help me
prioritize a little bit. [Pilot faculty interview 402]
The feedback that we have interpreted as "suggestions for improvement" has assisted us in refining our materials. Furthermore, it has helped us reconceptualize our teacher’s guidelines, which we now know need to include a discussion of reading and writing
connections, ideas for adding variety to the classroom and for varying the presentation of
textbook information, suggestions for student-group formation and different class
formats, and the presentation of potential problems with possible solutions (see Reid,
1995).

5. Implications

The feedback that we received from three participant groups, by means of multiple data
collection instruments, has provided us with a range of diverse perspectives on our
materials. Because it would have been difficult to distance ourselves from our materials
sufficiently to gain these insights on our own, the feedback has proven to be invaluable. It
has stimulated interesting discussions, serious reflection, and a combination of both minor
and substantive improvements in our materials.
First, it should be noted that the Write Like a Chemist materials development project,
despite its large scale and combined L1 and L2 student groups, shares similarities with
other EAP materials development projects. The most notable similarity is its underlying
EAP research and development framework (e.g., Brown, 1995a, b; Dudley-Evans & St.
John, 1998; Jordan, 1997; Richards, 2001). Furthermore, its attention to authenticity of
purpose, tasks, and texts; integrated reading and writing skills; strategy training; problem
solving; and self-reflection mirrors the priorities of many EAP materials development
projects (see Jordan, 1997). Our experience suggests numerous implications that can guide
other materials writers, whether they are involved in large-scale projects like ours or more
modest and typical EAP endeavors. We have divided these implications into two sections:
those that are pertinent to the overall materials evaluation process and those that are
particularly relevant to the design of materials evaluation instruments.

5.1. Implications for the overall materials evaluation process

1. During initial stages of materials development, it is important that team members engage in ongoing communication to evaluate materials. The consideration of different
viewpoints (e.g., from teacher users, classroom observers, and/or materials developers)
can reveal issues that need to be addressed. Later in the process, it is necessary to return
to these early evaluations and decide how (and if) these viewpoints are reflected in final
versions of the materials.
2. Soliciting feedback from others, beyond the materials development team, is a sensible
step to take. One of the most valuable forms of feedback comes from the actual trialing
of materials; the collection of reactions from both pilot teachers and students can be
instructive (Jolly & Bolitho, 1998; see Donovan, 1998, for sample pilot questionnaires).
It is also useful to solicit reviews from potential users (i.e., from those who are not
actually piloting materials, but rather simply reviewing them).
3. Whenever possible, it is worthwhile to triangulate data collection. Having multiple
sources of feedback, soliciting reactions from a range of teachers (experienced and
inexperienced), institutions, and students, and using a variety of instruments (e.g.,
questionnaires, classroom observations, interviews) can be quite revealing. Triangulated
data collection can lead to the confirmation of shared perspectives, the discovery of
different points of view, and the revelation of "unanticipated consequences" (Frechtling
Westat, 2002). Equally important, it can identify the strengths and weaknesses of
different components of a materials development project—an essential goal for
principled material improvements. The inherent complexities of EAP materials
(including their texts, tasks, directions, examples, sequencing, accessibility, flexibility, suitability, and language and content foci) merit triangulated evaluation.
4. In interdisciplinary materials development teams, it is helpful to establish a shared
vocabulary that can be used while conducting needs analyses, establishing goals and
objectives, developing materials, devising materials evaluation instruments, analyzing
data, and revising materials.
5. Though not always feasible, the materials evaluation process is best overseen by an
individual who is not actually one of the materials writers. In this way, data can be
compiled and forwarded to materials writers while ensuring the confidentiality of
evaluators.

5.2. Implications for the design of materials evaluation instruments

1. Materials evaluation instruments should be designed to solicit answers to what, how, and why questions. Feedback that identifies perceived strengths and
weaknesses of materials, that clarifies what and how materials have been used (or not
used), and that reveals why materials are perceived as they are can prove useful during
revision stages.
2. The collection of both quantitative and qualitative feedback is beneficial.
3. Quantitative data are particularly useful for materials development progress reports
(e.g., to program administrators, publishers, funding agencies, Ministries of Education).
However, quantitative data by themselves have limited value for meaningful material
improvements. Qualitative data, on the other hand, are invaluable for material
improvements; as a complement, quantitative data can often make qualitative data
more easily interpretable.
4. The use of multiple and varied assessment instruments (e.g., questionnaires, interviews,
Likert scale items), directed to different evaluator groups, can increase the chances for
convergent and divergent feedback as well as principled opportunities for meaningful
improvements in materials.
5. Response rates can be influenced by the types of instruments used and the time required
for their completion. Response rates for instruments that are quick and easy to
complete are likely to be higher than response rates for more complex, time-consuming
instruments. Online forms, with standardized responses to select from (with the option
of additional open-ended comments), are more user-friendly than laborious open-ended
paper and pen tasks. However, we have also found that leaving the question of when
(and if) to complete online forms to the discretion of individual piloters has often meant
that such instruments were simply forgotten (or ignored) as courses progressed and as student and instructor workloads increased. Therefore, more "controlled" instruments
(i.e., instruments administered to the students or instructor directly by the evaluator)
may be necessary to ensure that sufficient numbers of responses are gathered. The trade-
off, of course, is that controlled instruments can be time consuming to schedule,
administer, transcribe, and analyze.

The implications drawn from our materials development and evaluation experience are
relevant for other EAP material writers. Whether materials are written for one’s own EAP
classrooms, others’ classrooms, or much wider teacher-learner audiences (the latter, most
often in published form), materials ultimately are more effective when viewed through the
eyes of multiple user groups and then revised accordingly. That materials writers might
have a difficult time perceiving the problems that other users have with their materials is a
common phenomenon. Thus, it behooves us all to solicit feedback from relevant target
audiences.
The EAP literature provides insightful discussions on materials development (e.g., Ferris
& Hedgcock, 1998; Ferris & Tagg, 1996; Flowerdew, 1994; Flowerdew & Miller, 2005;
Flowerdew & Peacock, 2001b; Grabe & Stoller, 2001; Hyland, 2003; Jordan, 1997; Swales,
1995), but little has been said about how to engage in evaluative review to inform the
revision stages of the materials development process. It is our hope that this case study,
and corresponding implications, will assist materials developers in meeting the needs of
target students and their instructors.

Acknowledgements

The authors gratefully acknowledge the support of the US National Science Foundation
(DUE 0087570 and 0230913); we note that any opinions, findings, conclusions, and
recommendations expressed in this article are those of the authors and do not necessarily
reflect the views of the National Science Foundation. We thank J. K. Jones, M. S.
Costanza-Robinson, and chemistry faculty and students who have used, reviewed, or
contributed to Write Like a Chemist materials. We also thank JEAP editor Ken Hyland
and two anonymous reviewers for their insightful comments on an earlier draft of this
article.

References

Bell, J., & Gower, R. (1998). Writing course materials for the world: A great compromise. In B. Tomlinson (Ed.),
Materials development in language teaching (pp. 116–129). New York: Cambridge University Press.
Bhatia, V. (1993). Analyzing genre: Language use in professional settings. London: Longman.
British Council. (1980). Projects in materials design. London: The British Council.
Brown, J. D. (1995a). The elements of language curriculum: A systematic approach to program development.
Boston: Heinle & Heinle.
Brown, J. D. (1995b). Language program evaluation: Decisions, problems, and solutions. In W. Grabe (Ed.),
Annual Review of Applied Linguistics, Vol. 15 (pp. 227–248). New York: Cambridge University Press.
Brown, J. D. (2003). Creating a complete language-testing program. In C. A. Coombe, & N. J. Hubley (Eds.),
Assessment practices (pp. 9–23). Alexandria, VA: TESOL.
Byrd, P. (1995). Writing and publishing textbooks. In P. Byrd (Ed.), Materials writer’s guide (pp. 3–9). Boston:
Heinle & Heinle.
Connor, U., & Mauranen, A. (1999). Linguistic analysis of grant proposals: European Union research grants.
English for Specific Purposes, 18, 47–62.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.).
Thousand Oaks, CA: Sage.
Donovan, P. (1998). Piloting—A publisher’s view. In B. Tomlinson (Ed.), Materials development in language
teaching (pp. 149–189). New York: Cambridge University Press.
Dudley-Evans, T., & St. John, M. J. (1998). Developments in English for specific purposes: A multi-disciplinary
approach. New York: Cambridge University Press.
Ellis, R. (1998). The evaluation of communicative tasks. In B. Tomlinson (Ed.), Materials development in language
teaching (pp. 217–238). New York: Cambridge University Press.
Ferris, D., & Hedgcock, J. S. (1998). Teaching ESL composition: Purpose, process, and practice. Mahwah, NJ:
Lawrence Erlbaum.
Ferris, D., & Tagg, T. (1996). Academic oral communication needs of EAP learners: What subject-matter
instructors actually require. TESOL Quarterly, 30, 31–58.

Flowerdew, J. (Ed.). (1994). Academic listening: Research perspectives. New York: Cambridge University Press.
Flowerdew, J., & Miller, L. (2005). Second language listening: Theory and practice. New York: Cambridge
University Press.
Flowerdew, J., & Peacock, M. (2001a). The EAP curriculum: Issues, methods, and challenges. In J. Flowerdew, &
M. Peacock (Eds.), Research perspectives on English for academic purposes (pp. 177–194). New York:
Cambridge University Press.
Flowerdew, J., & Peacock, M. (Eds.). (2001b). Research perspectives on English for academic purposes. New York:
Cambridge University Press.
Frechtling Westat, J. (2002). The 2002 user-friendly handbook for project evaluation. Arlington, VA: National
Science Foundation.
Grabe, W., & Stoller, F. L. (2001). Reading for academic purposes: Guidelines for the ESL/EFL teacher. In M.
Celce-Murcia (Ed.), Teaching English as a second or foreign language (3rd ed., pp. 187–203). Boston: Heinle &
Heinle.
Hyland, K. (2003). Second language writing. New York: Cambridge University Press.
Hyland, K. (2004a). Disciplinary discourses: Social interactions in academic writing. Ann Arbor, MI: University of
Michigan Press.
Hyland, K. (2004b). Genre and second language writing. Ann Arbor, MI: University of Michigan Press.
Johns, A. (1997). Text, role, and context: Developing academic literacies. New York: Cambridge University Press.
Johnson, R. K. (1989). A decision-making framework for the coherent language curriculum. In R. K. Johnson
(Ed.), The second language curriculum (pp. 1–23). New York: Cambridge University Press.
Jolly, D., & Bolitho, R. (1998). A framework for materials writing. In B. Tomlinson (Ed.), Materials development
in language teaching (pp. 90–115). New York: Cambridge University Press.
Jordan, R. R. (1997). English for academic purposes: A guide and resource book for teachers. Cambridge, UK:
Cambridge University Press.
LeCompte, M. D., & Preissle, J. (2003). Ethnography and qualitative design in educational research (2nd ed.). San
Diego, CA: Academic Press.
Littlejohn, A. (1998). The analysis of language teaching materials: Inside the Trojan horse. In B. Tomlinson (Ed.),
Materials development in language teaching (pp. 190–216). New York: Cambridge University Press.
Reid, J. (1995). Developing ESL writing materials for publication or writing as a learning experience. In P. Byrd
(Ed.), Materials writer’s guide (pp. 64–78). Boston: Heinle & Heinle.
Richards, J. C. (2001). Curriculum development in language teaching. New York: Cambridge University Press.
Stoller, F. L., Horn, B., Grabe, W., & Robinson, M. S. (2005). Creating and validating assessment instruments
for a discipline-specific writing course: An interdisciplinary approach. Journal of Applied Linguistics, 2(1),
73–101.
Stoller, F. L., Jones, J. K., Costanza-Robinson, M. S., & Robinson, M. S. (2005, May 15). Demystifying
disciplinary writing: A case study in the writing of chemistry. In Across the disciplines: Interdisciplinary
perspectives on language, learning, and academic writing. Retrieved May 27, 2006, from http://wac.colostate.
edu/atd/lds/stoller.cfm
Swales, J. M. (1990). Genre analysis: English in academic and research settings. Cambridge: Cambridge University
Press.
Swales, J. M. (1995). English for academic purposes. In P. Byrd (Ed.), Materials writer’s guide (pp. 124–136).
Boston: Heinle & Heinle.
Swales, J. M. (2004). Research genres: Exploration and applications. Cambridge: Cambridge University
Press.
Wardle, E. A. (2004). Can cross-disciplinary links help us teach ‘‘academic discourse’’ in FYC? In Across the
disciplines: Interdisciplinary perspectives on language, learning, and academic writing, 1. Retrieved May 27,
2006, from http://wac.colostate.edu/atd/articles/wardle2004/

Fredricka L. Stoller is Professor of English at Northern Arizona University where she teaches in the MA TESL
and Ph.D. in Applied Linguistics programs. Her interests include English for Academic Purposes, literacy skills
development, content-based instruction, and disciplinary writing. She is co-Principal Investigator of the Write
Like a Chemist project.

Bradley Horn is a Ph.D. student in the applied linguistics program at Northern Arizona University. He
has served on the Write Like a Chemist assessment team since its inception. His research focuses largely
on the assessment of languages for specific purposes and the use of technology in discipline-specific writing
assessment.

William Grabe is Professor of English at Northern Arizona University where he teaches in the MA TESL and
PhD in Applied Linguistics programs. He has served on the Write Like a Chemist assessment team since its
inception. His interests include L1/L2 literacy, discourse analysis, and applied linguistics more generally.

Marin S. Robinson is Associate Professor of Chemistry and Environmental Sciences at Northern Arizona
University where she teaches organic chemistry and atmospheric chemistry and conducts research on atmospheric particulates. She is the Principal Investigator of the Write Like a Chemist project.
