Developing Multiple Choice Questions

Multiple choice test items can be used to test factual recall, levels of understanding, and ability to apply
learning (analyzing and evaluating). Multiple choice tests can also provide an excellent pre-assessment
indicator of student knowledge as well as a source for a post-test discussion. A class discussion can
address why the incorrect responses were wrong as well as why the correct responses were right.
Unfortunately, multiple choice questions are difficult to construct well and can be challenging to select
when using a test bank. The following information is designed to guide you in developing and selecting
good multiple choice test items.

Multiple choice test items contain a stem, which presents the question or problem, and a set of response
choices, which contain the correct answer and several distractors. This guide provides some general
strategies and tips for designing stems and distractors.

General Strategies

1. Be transparent. Align questions with your learning outcomes and instructional activities. A good
question from a test bank is not always a good fit for your learning outcomes and teaching
activities.

2. Provide tips to students on how to write multiple choice tests, and allow the experience to be
an opportunity for students to “show what they know”. Student Learning Success Guides,
available through the Camosun library home page under Research Guides, are an excellent
source for tips on writing multiple choice tests.

3. Write questions throughout the term (a few each week) as you work through the content so
   that you are not left constructing them all at the end of term.

4. Instruct students to select the best answer. Distractors may have an element of truth but are
not the correct answer.

5. Use familiar language. Do not introduce new terminology or use expressions that may not be
familiar to all students in your class.

6. Avoid trick questions. Students who know the material should be able to determine the correct
answer.

7. Randomly distribute the correct responses. This avoids patterning and guessing.

Designing Stems

1. Express the full problem in the stem. Students who know the material should be able to answer
the question without looking at the response choices.

2. When possible, state the stem as a direct question. This will help reduce vagueness and
wordiness in the stem. Students shouldn’t have to read the response options to understand the
question.

3. Include any repeated words in the stem. This shortens reading time for the student and
   reduces confusion.

4. Avoid negative wording. Students often make mistakes on negatively worded questions,
especially double negatives. If you are using a negative statement in your stem, highlight the
negative word by underlining or capitalizing the word.

Poor example:
A nurse is assessing a client who has pneumonia. Which of these assessment findings indicates
that the client does not need to be suctioned?
a) Diminished breath sounds
b) Absence of adventitious breath sounds
c) Inability to cough up sputum
d) Wheezing following bronchodilator therapy
Better example:
Which of these assessment findings, if identified in a client who has pneumonia, indicates that
the client needs to be suctioned?
a) Absence of adventitious breath sounds
b) Respiratory rate of 18 breaths per minute
c) Inability to cough up sputum
d) Wheezing prior to bronchodilator therapy

Designing Distractors

1. Limit the number of distractors. Use between 3 and 5 options per question. It is difficult to
come up with good distractors. The use of additional distractors will increase reading time.
Research shows that three distractors are about as effective as four or five distractors.

2. Make the distractors appealing and plausible. If the distractors are far-fetched, students can
too easily locate the correct answer, even if they have little knowledge. The best distractors help
diagnose where students went wrong in their thinking.

3. Make sure there is only one correct answer. Avoid having two or more answers that are
arguably correct where one is more correct than the other. Note, this differs from distractors
that contain an element of truth.

4. Make the choices grammatically consistent with the stem. This improves readability and
reduces confusion.

5. Put the choices in meaningful order when possible. This could be numerical, chronological, or
conceptual order.

6. Avoid using “all of the above” or “none of the above”. An “all of the above” option means
   students must read every response, which increases test-taking time and penalizes those who
   read more slowly. If students know that two answers are correct, they may incorrectly select
   “all of the above”; if they know that one answer is incorrect, they also know that “all of the
   above” is incorrect. The option “none of the above” does not test whether the student knows
   the correct answer, only that the distractors are not correct.

7. Avoid using “which of the following” items. This increases test-taking time and penalizes
students who may take more time to read.

8. Avoid using words such as always, never, all, or none. Most students know that few things are
universally true or false so distractors with these words can easily be eliminated as plausible
answers.

9. Make the distractors mutually exclusive. Overlapping alternatives may easily be identified as
distractors. Furthermore, if the overlap includes the intended answer, there may be more than
one alternative that can be defended as being the answer.

Poor example:
How long does a biennial plant generally live?
a) It dies after the second year
b) It lives for many years
c) It lives for more than one year
d) It needs to be replanted every two years
Better example:
How long does a biennial plant generally live?
a) One year
b) Two years
c) Several years

10. Make distractors approximately equal in length. Students often select the longest option as the
correct answer.

For additional examples of multiple choice items that demonstrate poor and improved construction,
check out the following guide: How to Prepare Better Multiple-Choice Test Items: Guidelines for
University Faculty. The Centre for Excellence in Teaching and Learning also has resources that can help
with multiple choice test construction including Tools for Teaching by Barbara Gross Davis. There are
additional examples in the resources listed in the Further Reading section.

Further Reading

Bothell, T.D. (2001). 14 Rules for Writing Multiple-Choice Questions. Brigham Young University Faculty
Center. Retrieved from https://testing.byu.edu/handbooks/14%20Rules%20for%20Writing%20Multiple-Choice%20Questions.pdf (May 2019)

Burton, S.J., Sudweeks, R.R., Merrill, P.F., & Wood, B. (1991). How to Prepare Better Multiple-Choice
Test Items: Guidelines for University Faculty. Brigham Young University Testing Services and the
Department of Instructional Science. Retrieved from https://testing.byu.edu/handbooks/betteritems.pdf (Sept 2019)

Clay, B. (2001). Is This a Trick Question? A Short Guide to Writing Effective Test Questions. Retrieved
from https://www.k-state.edu/ksde/alp/resources/Handout-Module6.pdf (May 2019)

Davis, B. (2009). Tools for Teaching. San Francisco: Jossey-Bass.

Centre for Teaching Excellence, University of Waterloo. (n.d.). Designing Multiple-Choice Questions.
Retrieved from https://uwaterloo.ca/centre-for-teaching-excellence/teaching-resources/teaching-tips/developing-assignments/exams/questions-types-characteristics-suggestions (March 2019)

Poorvu Center for Teaching and Learning, Yale University. (n.d.). Designing Quality Multiple Choice
Questions. Retrieved from https://poorvucenter.yale.edu/MultipleChoiceQuestions (June 2019)

Fozzard, N., Pearson, A., du Toit, E., Naug, H., Wen, W., & Peak, I.R. (2018). Analysis of MCQ and
distractor use in a large first year Health Faculty Foundation Program: Assessing the effects of
changing from five to four options. BMC Medical Education, 18, Article 252.

Jacobson, M. (n.d.). Multiple Choice Item Construction. Werklund School of Education, University of
Calgary. Retrieved from https://people.ucalgary.ca/~dmjacobs/portage/index.htm (June 2019)

Suskie, L. (2009). Assessing Student Learning. San Francisco: Jossey-Bass.

Rodriguez, M.C. (2005). Three Options Are Optimal for Multiple-Choice Items: A Meta-Analysis of 80
Years of Research. Educational Measurement: Issues and Practice, 24(2), 3-13. Retrieved from
https://www.ernweb.com/educational-research-articles/the-optimal-number-of-choices-in-a-multiple-choice-test/ (Sept 2019)
