
The Clearing House: A Journal of Educational Strategies,

Issues and Ideas

ISSN: 0009-8655 (Print) 1939-912X (Online) Journal homepage: https://www.tandfonline.com/loi/vtch20

Using Questionnaires in Teacher Research

Daniel Xerri

To cite this article: Daniel Xerri (2017) Using Questionnaires in Teacher Research, The
Clearing House: A Journal of Educational Strategies, Issues and Ideas, 90:3, 65-69, DOI:
10.1080/00098655.2016.1268033

To link to this article: https://doi.org/10.1080/00098655.2016.1268033

Published online: 12 Jan 2017.

THE CLEARING HOUSE, 2017, VOL. 90, NO. 3, 65–69
https://doi.org/10.1080/00098655.2016.1268033

Using Questionnaires in Teacher Research


Daniel Xerri
Centre for English Language Proficiency, University of Malta, Msida, Malta

ABSTRACT
Teacher research is described as being beneficial and yet it is hampered by teachers' lack of knowledge about research, including how to use popular research methods. Given that accounts by teachers describing their use of such methods in a systematic manner might prove useful for their peers, this article describes my experience of using a questionnaire in teacher research.

KEYWORDS
Teacher research; questionnaires; survey; professional development

CONTACT Daniel Xerri daniel.xerri@um.edu.mt Centre for English Language Proficiency, Ġuze Cassar Pullicino Building, University of Malta, Msida, Malta.
© 2017 Taylor & Francis Group, LLC

Introduction

Engaging in research is an important means by which teachers can enhance teaching and learning, and develop as professionals. Teacher research is defined as research conducted by practitioners in their own context and for the purpose of developing their understanding of their practices (Borg 2013). It "encourages teachers to reflect on their practice, and therefore leads to potential change. It plays an important part in reflective teaching, where personal and professional development occur when teachers review their experience in a systematic way" (Field 1997, 192). Despite its benefits, the main argument against teacher research concerns teachers' lack of research skills and training (Burns 2010b). One of the lacunae in teachers' knowledge about research is a lack of solid familiarity with how to use popular data collection methods. Despite the plethora of guides to different research instruments, these are not usually written by teachers and most often fail to take into consideration how teachers might need to use them. Given that teachers seem to learn best from other teachers (Margolis 2009), in this article I describe my experience of using a questionnaire in some of my teacher research. My account illustrates how I sought to use the questionnaire in a systematic fashion, this being one of the defining characteristics of teacher research (Henderson et al. 2012).

Surveying students

In one of my studies at the school where I used to teach, I incorporated a questionnaire into the mixed methods design as a means of investigating the interplay between students' attitudes, beliefs and practices in relation to some of the texts they studied in class. I chose a mixed methods approach because it enables teacher-researchers to combine different data sources and methods of analysis in order to enhance the completeness of description, and the accuracy and sensitivity of interpretation (McDonough and McDonough 1997).

In my experience, one of the attractions of using a questionnaire is that it gives teacher-researchers the opportunity "to gather information that learners are able to report about themselves, such as their beliefs and motivations about learning or their reactions to learning and classroom instruction and activities" (Mackey and Gass 2005, 77). However, I am also aware that questionnaires tend to provide a somewhat thin description of whatever is being investigated (Dörnyei 2007, 115) and are characterized by the possibly restricted scope of the data that are collected, as well as limited flexibility of response (Cohen, Manion, and Morrison 2007, 317). The main downside for teacher-researchers is that questionnaire data may sometimes provide a superficial assessment of a very complex construct (Wagner 2010, 26). In my study I sought to
counteract these problems by combining the questionnaire with semi-structured and focus group interviews, both of which have the capacity to provide rich data.

Teacher-researchers can design questionnaires to collect three kinds of information about students: factual, behavioral, and attitudinal (Dörnyei and Taguchi 2010, 5). The first item in my study's questionnaire asked for bio-data, whereas the rest was made up of selected-response items and open-ended questions focusing on attitudes, beliefs and practices. In this I followed McDonough and McDonough's (1997, 177) advice to choose a variety of question types to enhance the range and detail of the gathered information. That is why the questionnaire contained different kinds of questions, including multiple-choice, dichotomous (e.g. yes or no), rank-ordering and open-ended questions. One of the questions was in the form of a four-point Likert scale, which teacher-researchers might find useful for identifying students' opinions, views and judgements (Brown and Rodgers 2002, 120). I chose a four-point scale since I was aiming for a stronger level of commitment on students' part than that entailed by a more finely tuned five-point scale offering a "no opinion" option. Newby (2014, 308) claims that "when a mid-point is inserted, there is evidence that responses can gravitate towards it because there is comfort in the average and it is easier to check it rather than think deeply about the issue and decide on which side you sit." For similar reasons, the "don't know" option was not included in the selected-response items that formed part of my questionnaire. Teacher-researchers need to be aware that if too many students opt for this category, the results of their survey will not be statistically significant (Wagner 2010, 27). Perhaps that is why it is described as "a lazy option for some respondents" (Newby 2014, 314). I used open-ended questions whenever I could not predict all the possible answers that students might produce. According to Dörnyei (2007, 107), "By permitting greater freedom of expression, open-format items can provide a far greater richness than fully quantitative data"; however, they "work particularly well if they are not completely open but contain certain guidance." I found that the best way of providing this kind of guidance was to include clarification questions and sentence completion items in my questionnaire. Teacher-researchers also need to keep in mind that a questionnaire's brevity is very important as it facilitates response. I kept my questionnaire to a maximum of four pages because of the suggestion that anything much longer would be a heavy imposition on respondents (Dörnyei 2007, 110).

Besides paying attention to a questionnaire's design, teacher-researchers also need to be careful about how they administer it. I used my questionnaire to survey the entire A-level English student population at my school, which stood at 376 students (96 males and 280 females). By choosing to include the whole target population in the survey rather than just my students, I felt I could speak with a higher degree of certainty about their attitudes, beliefs and practices (Munn and Drever 2004, 19). I also realized that I would be able to compare my students' views with those of students taught by my colleagues, thus identifying important similarities and differences. Given my proximity to the target population, I was also able to administer the questionnaire personally. This is a good technique for those teacher-researchers who want "to get standardized information by offering everyone the same stimulus", especially since "the spoken presentation and the attitude of the presenter can have a marked effect on how questionnaires are completed" (Munn and Drever 2004, 35–36). When a teacher-researcher takes responsibility for questionnaire administration, this increases the response rate and the quality of the results (Wagner 2010, 30). A problem that is sure to affect teacher research based on questionnaire data is non-response bias. I chose to minimize this as much as possible by asking students who were absent during the administration of the questionnaire to complete a copy upon their return to school.

The next stage in the process of using a questionnaire in teacher research is analyzing the data. A questionnaire is likely to yield a substantial amount of numerical data, which has to be analyzed statistically. However, this does not necessarily require familiarity with dedicated software like SPSS. In my study, I used spreadsheets to make sense of the raw data (Munn and Drever 2004, 40). Closed-ended questions were pre-coded by means of numbers and letters, in line with the suggestion that "a coding frame is generally developed before the interviewing commences so that it can be printed into the questionnaire itself" (Cohen, Manion, and Morrison 2007, 348). In the case of open-ended questions, the coding frame was developed by examining a sample of responses and calculating a frequency tally, after which the validity of the coding frame was checked further by coding
a larger sample (Cohen, Manion, and Morrison 2007, 348). This method is in line with the recommendation to analyze open-ended questions "by describing the trends, themes or patterns of ideas you find in them" (Burns 2010a, 85). Teacher-researchers need to bear in mind that the main advantage of using categories derived from the data rather than pre-set categories is that one avoids the risk of imposing one's interests on the data (Munn and Drever 2004, 45). Moreover, when coding both closed and open questions, missing answers need to be taken into account, as do answers that clearly show students' disregard or misinterpretation of the instructions.

Piloting the questionnaire

If teacher-researchers wish to use questionnaires in a systematic manner, it is advisable to pilot the instrument. I found that when designing my questionnaire it was helpful to take into account a number of recommendations put forward in the literature, especially in relation to writing questions that match the students' vocabulary and ideas, making questions concise and straightforward, and having one issue per question (Newby 2014). The wording of the questions is particularly important when evaluating non-factual matters, such as students' attitudes, beliefs and practices (Dörnyei 2007, 103). Piloting the questionnaire will serve to address any problems in the wording and in other aspects of the instrument. In the piloting phase, teacher-researchers should aim "to increase the reliability, validity and practicability of the questionnaire" (Cohen, Manion, and Morrison 2007, 341) by checking whether it is "relatively easy to answer, easy to record and evaluate, user-friendly and unambiguous" (McDonough and McDonough 1997, 177).

The choice of participants for the piloting phase is an important consideration for teacher-researchers. In line with Brown and Rodgers's (2002, 143) recommendation to pilot a questionnaire with participants similar to those who will form part of the final study, I chose to pilot my questionnaire with a group of students attending an A-level English course at an institution similar to my school. Following Newby's (2014, 335) advice "to sit with respondents as they complete the questionnaire and ask them to comment on the questions as they answer them", I chose to administer the questionnaire personally to students in small groups so as to have the opportunity to tackle and make a note of their difficulties.

The success of the piloting phase will depend largely on the feedback provided by the students chosen by the teacher-researcher. By piloting the questionnaire with a group of students who were "sympathetic … but willing to give forthright comments and sharp criticism" (Munn and Drever 2004, 35), I was able to make a number of adjustments to ambiguous and misleading terms and phrases. For example, in three particular questions students were asked to indicate the effectiveness of their teacher's method when teaching a set text in lectures, and when conducting seminars and tutorials. The design of these questions was somewhat problematic, primarily because of the issue of "effectiveness." Noticing that some of the students had problems interpreting the term "effective", I asked them to explain to me what they considered to be effective or not. Most of them claimed that an effective method of teaching a set text is one in which they are provided with plenty of notes and background information. However, during seminars and tutorials a teacher's method is considered effective if students are allowed to "participate" and voice their views about the text in question. Therefore, I decided to replace the term "effective" with the less ambiguous "useful." Another problem presented by these three questions was the ambiguity surrounding the "somewhat effective" category. Even though this category can be interpreted as neutral, during my analysis of the data I tended to associate it with the negative categories rather than with the positive ones, given that on its own it told me nothing about the students' opinions. As mentioned above, a mid-point category attracts a lot of attention because many respondents find it comfortable not having to take a definite stand on a particular issue (Newby 2014). The data showed that this was certainly the case with seminars and tutorials, and when I questioned students about their choice, they conceded that this was somewhat true. Hence, I decided to omit the mid-point category from the revised questionnaire.

Besides dealing with issues concerning the wording of questions, teacher-researchers might sometimes have to change the format of certain items. For example, in another set of three items in my questionnaire students were asked whether they would like to see any changes in their set text lectures, seminars, and tutorials. A number of students merely answered
affirmatively or negatively and failed to elaborate. Hence, I decided to change these three items into binary questions and provide adequate space for students to write down reasons for their choice of answer.

Besides serving as an opportunity for redrafting some of the items in a questionnaire, the pilot offers teacher-researchers an avenue for data cross-checking. For instance, I found that the attitudes towards texts recorded by means of my questionnaire were largely also highlighted by the student interview guide, topic guide, and stimulus material. This is in line with the idea that the best way of assessing the validity of an instrument is by comparing its results with the data yielded by other instruments (McDonough and McDonough 1997, 179). For example, the questionnaire respondents' appreciation of a lecturing approach during lessons on a set text and a desire for more interaction during seminars and tutorials also emerged during the analysis of the data generated by the other instruments I used in my study. This meant that the findings generated by means of the different instruments in most cases corroborated each other. Moreover, as a teacher-researcher I was pleased to see that in completing the questionnaire students answered the questions in a consistent fashion; that is, the data yielded by one particular item verified that produced by other questions. For example, the percentage registered in relation to student satisfaction with the activities they did during their seminars was in line with the results yielded by the question on students' evaluation of a teacher's methodology during such seminars, and the question on students' enjoyment of said seminars.

Conclusion

After having used a questionnaire in one of my studies, I am convinced of the value of this instrument for teacher-researchers as long as it is deployed in the systematic manner I have outlined. My use of a questionnaire provided me with a broad picture of students' attitudes, beliefs and practices in relation to the texts studied in class. The students' views about the latter also offered me an indication of what went on during A-level English lessons at my school. The questionnaire helped me to develop an insight into which classroom practices students valued and found useful. The fact that I did not restrict the questionnaire solely to my students enabled me to compare myself as a teacher to my colleagues, learning from some of the differences to improve my teaching. Moving away from the insularity of focusing only on my students and myself, I was able to gain a better understanding of where I stood as a practitioner in my context. I consider this to have been the most important benefit of my use of a questionnaire.

The deployment of a questionnaire (in conjunction with other instruments) facilitated the process of addressing some of the problems I had noticed my students facing when reading texts in class for examination purposes. This is very much in line with the idea that "teacher-researchers raise questions about what they think and observe about their teaching and their students' learning" (MacLean and Mohr 1999, x). Knowing how to answer those questions systematically enables teacher-researchers to enhance their teaching and their students' learning.

References

Borg, S. 2013. Teacher research in language teaching: A critical analysis. Cambridge: Cambridge University Press.
Brown, J. D., and T. S. Rodgers. 2002. Doing second language research. Oxford: Oxford University Press.
Burns, A. 2010a. Doing action research in English language teaching: A guide for practitioners. New York: Routledge.
Burns, A. 2010b. Teacher engagement in research: Published resources for teacher researchers. Language Teaching 43 (4): 527–36.
Cohen, L., L. Manion, and K. Morrison. 2007. Research methods in education. 6th ed. London and New York: Routledge.
Dörnyei, Z. 2007. Research methods in applied linguistics. Oxford: Oxford University Press.
Dörnyei, Z., and T. Taguchi. 2010. Questionnaires in second language research: Construction, administration, and processing. 2nd ed. Abingdon and New York: Routledge.
Field, J. 1997. Key concepts in ELT: Classroom research. ELT Journal 51 (2): 192–3.
Henderson, B., D. Meier, G. Perry, and A. Stremmel. 2012. The nature of teacher research. Voices of Practitioners: 1–7. https://www.naeyc.org/files/naeyc/file/vop/Nature%20of%20Teacher%20Research.pdf
Mackey, A., and S. M. Gass. 2005. Second language research: Methodology and design. New Jersey: Lawrence Erlbaum Associates.
MacLean, M. S., and M. M. Mohr. 1999. Teacher-researchers at work. Berkeley, CA: National Writing Project.
Margolis, J. 2009. How teachers learn. Educational Leadership 66 (5). http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/How-Teachers-Lead-Teachers.aspx
McDonough, J., and S. McDonough. 1997. Research methods for English language teachers. London: Edward Arnold.
Munn, P., and E. Drever. 2004. Using questionnaires in small-scale research: A beginner's guide. Rev. ed. Glasgow: The SCRE Centre, University of Glasgow.
Newby, P. 2014. Research methods for education. 2nd ed. Abingdon and New York: Routledge.
Wagner, E. 2010. Survey research. In Continuum companion to research in applied linguistics, edited by B. Paltridge and A. Phakiti, 22–38. London and New York: Continuum.
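A note on the coding workflow. The article describes pre-coding closed-ended items with numbers, tallying frequencies in a spreadsheet, and keeping missing or off-instruction answers separate from the coded data. As a minimal sketch of that workflow in Python, with all labels, numeric codes, function names, and response data invented for illustration (a spreadsheet with COUNTIF-style formulas would do the same job):

```python
# Illustrative sketch only: pre-coding a four-point Likert item (no
# mid-point, as in the article's revised questionnaire) and tallying
# the coded responses while setting aside problem answers.
from collections import Counter

# Hypothetical coding frame, printed into the questionnaire in advance.
CODES = {"strongly agree": 4, "agree": 3, "disagree": 2, "strongly disagree": 1}

def code_responses(raw_responses):
    """Map raw answers to numeric codes; keep blank or off-instruction
    answers separate so they can be taken into account during analysis."""
    coded, problems = [], []
    for r in raw_responses:
        key = r.strip().lower() if isinstance(r, str) else ""
        if key in CODES:
            coded.append(CODES[key])
        else:
            problems.append(r)  # missing or invalid answer
    return coded, problems

def frequency_tally(coded):
    """Frequency of each code, highest category first."""
    counts = Counter(coded)
    return {c: counts.get(c, 0) for c in sorted(CODES.values(), reverse=True)}

# Invented example data for one questionnaire item:
raw = ["Agree", "strongly agree", "agree", "", "Disagree", "maybe"]
coded, problems = code_responses(raw)
print(frequency_tally(coded))  # {4: 1, 3: 2, 2: 1, 1: 0}
print(len(problems))           # 2 answers flagged for follow-up
```

Keeping the flagged answers rather than silently dropping them mirrors the article's advice that missing answers, and answers showing disregard of the instructions, need to be taken into account when coding.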