Developing Multiple Choice Questions

A multiple choice question (MCQ) is an assessment item consisting of a stem, which poses the question or problem, followed by a list of possible responses, also known as options or alternatives. One of the alternatives is the correct or best answer; the others, called distracters, are incorrect or less correct answers.


Stem:

The stem should be able to stand alone as a short-answer question without the alternatives. It can be written in either question format or completion format, though experts generally recommend the question format. When using a completion stem, place the blank to be answered at the end. The stem should always include a verb.

Distracters:

Distracters are intended to be plausible but inaccurate, incomplete, or less accurate answer options that a student who does not know the material, or understands it only partially, may select. A question typically has three or four distracters. Research suggests that using more than four provides little benefit, while using fewer than three increases the odds of guessing correctly. All distracters should be homogeneous in content, form, and grammatical structure.
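
As a rough illustration of the guessing odds mentioned above (assuming one correct answer and a purely random guess), the probability of guessing correctly is 1 / (d + 1), where d is the number of distracters: about 33% with two distracters, 25% with three, and 20% with four.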

Each distracter should be unique and plausible. If no one ever picks a distracter as the answer, it is not a good distracter. Do not rephrase one distracter to create another, and avoid using synonyms or similarly spelled words as a means of creating distracters.


Reducing guessing

Students have developed techniques for guessing correct answers. Here are some common guessing rules of thumb and how to defeat them:

Guessing rule of thumb | How to defeat it
Pick the longest answer | Make the longest answer correct only about 25% of the time when there are four alternatives, or make all answers the same length
Pick the “b” option | Make each answer position the correct one an equal number of times
Never pick an answer with “always” or “never” in it | Make answers with “always” or “never” correct as often as they are incorrect, or avoid these words
If two answers are exact opposites, one of them is correct | Offer opposites when neither is correct
If in doubt, guess | Increase the number of alternatives
Pick the scientific-sounding answer | Use scientific-sounding jargon in incorrect answers; make the simple answer correct
Pick a word related to the topic | Use terminology from the topic in the distracters too
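
Several of the defeats above can be checked mechanically before a test is released. As one sketch of such a check (the item format, field names, and the 1.5x length threshold are assumptions made for illustration), the following Python snippet flags items whose correct answer is noticeably longer than the average distracter, which would otherwise reward the “pick the longest answer” guess:

    # Sketch: flag items where the correct answer is much longer than the
    # distracters, which invites the "pick the longest answer" guess.
    # The item dictionaries and the 1.5x threshold are illustrative assumptions.
    items = [
        {
            "stem": "Which organ produces insulin?",
            "correct": "The pancreas",
            "distracters": ["The liver", "The spleen", "The gallbladder"],
        },
        # ... more items ...
    ]

    def flags_longest_answer(item, ratio=1.5):
        """Return True if the correct answer is more than `ratio` times as long
        as the average distracter."""
        avg_len = sum(len(d) for d in item["distracters"]) / len(item["distracters"])
        return len(item["correct"]) > ratio * avg_len

    for number, item in enumerate(items, start=1):
        if flags_longest_answer(item):
            print(f"Item {number}: correct answer is noticeably longer than the distracters")

A check like this is only a heuristic; trimming or padding the wording by hand is still the usual fix.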

Tips:

  • Avoid the use of “all of the above” or “none of the above” as alternatives. Test-takers quickly catch on when “all of the above” is usually the correct answer or is included for lack of better distracter ideas. The use of “none of the above” opens up the possibility of other “best” answers that you had not considered.
  • Provide instructions. Some distracters may be correct in certain circumstances. Instruct students to select the “best” answer, rather than the right answer. This eliminates the debate about a distracter being correct under limited or isolated conditions.
  • Be clear. Capitalize limiting and directive words such as NOT, PROHIBITED, ALL, MUST, and ALLOWED.
  • Grammar counts. Be sure to check spelling and punctuation. Avoid the use of contractions. Maintain grammatical continuity between the stem and alternatives.
  • Reduce the risk of guessing. Do not give away the answer to one question elsewhere in the test. Avoid extremes, nonsense words, nonsensical distracters, and unreasonable statements. Make all options the same length and grammatically consistent with the stem, particularly with respect to singular versus plural.
  • Randomly distribute the correct answer among the answer positions; a minimal sketch of one way to check this appears after this list.
  • Avoid writing questions that test the test-taker’s ability to take a test, also known as “trick questions.” Trick questions typically do not assess the student’s mastery of the learning objectives.
  • Letter the options. Avoid numbering the options. Lettering reduces confusion.
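
As the minimal sketch promised above (an illustration only; the item format and four-option layout are assumptions), the following Python snippet shuffles each item's options and then reports how often the correct answer lands in each lettered position, so the answer key can be checked for balance:

    import random
    from collections import Counter

    # Illustrative item format: one correct answer plus three distracters each.
    items = [
        {"correct": "mitochondrion", "distracters": ["ribosome", "nucleus", "lysosome"]},
        {"correct": "7", "distracters": ["5", "6", "8"]},
        # ... more items ...
    ]

    LETTERS = "abcd"

    def shuffle_item(item):
        """Return the options in random order and the letter of the correct one."""
        options = [item["correct"]] + item["distracters"]
        random.shuffle(options)
        return options, LETTERS[options.index(item["correct"])]

    position_counts = Counter()
    for item in items:
        _, letter = shuffle_item(item)
        position_counts[letter] += 1

    # A heavily skewed count suggests the answer key should be rebalanced.
    for letter in LETTERS:
        print(f"{letter}: {position_counts[letter]} correct answers")

Over a full test, the counts should come out roughly even; if they do not, reshuffle or adjust the key by hand.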

Additional resources:

  • Ebel, R. L., & Frisbie, D. A. (1986). Essentials of educational measurement (4th ed.). Englewood Cliffs, NJ: Prentice-Hall.
  • Haladyna, T. M., & Downing, S. M. (1989a). A taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 37-50.
  • Haladyna, T. M., & Downing, S. M. (1989b). Validity of a taxonomy of multiple-choice item-writing rules. Applied Measurement in Education, 2(1), 51-78.
  • Hopkins, C. D., & Antes, R. L. (1979). Classroom testing: Construction. Itasca, IL: F. E. Peacock.