Writing Effective Multiple Choice Questions

Components of a Multiple Choice Question

Student “Guessing Strategies”

Designing Questions to Promote Higher Order Thinking

Multiple Choice questions are popular due to the potential for automated grading, but writing good multiple choice questions can be a significant investment of time.



Components of a Multiple Choice Question

Each multiple choice question is composed of:

  • a stem (the question)
  • a key (the correct answer)
  • distractors (the incorrect choices)

Write a clear, simple stem.

The stem should:

  • Be meaningful by itself

  • Focus on a single idea or concept

  • Not include irrelevant material

  • Be negatively stated ONLY when significant learning outcomes require it

  • Be phrased as a question (preferable to a partial sentence)

Each question should examine a single idea or concept - this becomes the stem. Question stems that are not written with a specific learning objective in mind often end up measuring lower-level objectives exclusively, or covering trivial material of little educational worth. The question stem should be clear and straightforward - your goal is to assess student knowledge, not reading comprehension of a trickily worded question. Use negatives sparingly, and always avoid double negatives. The stem should be negatively stated only when significant learning outcomes require it - for instance, when identifying dangerous lab or clinical practices. In such cases, calling out the word ‘except’ in bold can improve clarity. A question stem is preferable to a partial sentence because it allows students to focus on answering the question (less cognitive load), rather than holding a partial sentence in working memory while trying to complete it.

Write plausible distractors.

Distractors should:

  • Be plausible

  • Be clear and concise

  • Be mutually exclusive

  • Avoid clues

  • Not include absolutes (always, never, all)

  • Avoid ‘all of the above’ and ‘none of the above’

  • Be presented in a logical order

All distractors should be plausible - avoid silly or ridiculous options. Do not use excessively wordy alternatives, and be mindful of spelling, grammar, and typos in distractors. Distractors should incorporate common student misunderstandings, where appropriate. Alternatives should be mutually exclusive and avoid overlapping content. For example, (a) 1-8 mSv and (b) 8-16 mSv overlap at the boundary and should be avoided. Your distractors and correct key should all be similar in length, complexity, and use of relevant terminology. Use parallel grammar when constructing distractors, so the question does not become a test of reading comprehension. Be mindful of absolutes like “always,” “never,” and “all.” “All of the above” and “none of the above” choices are problematic because they allow students with partial information to answer questions successfully. Last, alternatives should be presented in some logical order (e.g. alphabetical, numerical) to avoid a bias toward certain correct answer positions.
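The mutual-exclusivity check for numeric alternatives can even be automated. As a minimal sketch (the helper name `ranges_overlap` is hypothetical, not from any question-banking tool), this flags alternatives like 1-8 and 8-16 that share a boundary value:

```python
def ranges_overlap(ranges):
    """Return True if any two numeric-range alternatives overlap.

    Overlapping alternatives (e.g. 1-8 and 8-16, which both contain 8)
    are not mutually exclusive and should be rewritten.
    """
    ranges = sorted(ranges)  # sort by lower bound
    # Compare each range's upper bound with the next range's lower bound.
    return any(prev_hi >= lo for (_, prev_hi), (lo, _) in zip(ranges, ranges[1:]))

print(ranges_overlap([(1, 8), (8, 16)]))  # True: both ranges include 8
print(ranges_overlap([(1, 7), (8, 16)]))  # False: mutually exclusive
```

A fix for the overlapping example above would be (a) 1-7 mSv and (b) 8-16 mSv.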

Student “Guessing Strategies”

Make sure your questions are solid assessment tools by avoiding these common student strategies to defeat multiple choice questions.

“The longest answer is most likely to be correct”
Solution: Make all choices of similar length

“The correct answer is never first”
Solution: Present the choices in random order or a meaningful order (e.g. increasing numerically)

“Pick the scientific-sounding answer”
Solution: All choices should make use of relevant terminology

“Choose the one that uses the most important sounding term you remember”
Solution: Use key terms in both the key and the distractors.
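For delivery platforms that allow it, randomizing choice order can be scripted. A minimal sketch (the helper `shuffle_choices` is hypothetical, not a real LMS API) that shuffles the alternatives while tracking where the key lands:

```python
import random

def shuffle_choices(stem, key, distractors, rng=None):
    """Return the stem, its choices in random order, and the key's new index.

    Randomizing order defeats position-based guessing strategies such as
    "the correct answer is never first".
    """
    rng = rng or random.Random()
    choices = [key] + list(distractors)
    rng.shuffle(choices)  # shuffles in place
    return stem, choices, choices.index(key)

stem, choices, key_index = shuffle_choices(
    "Which organ produces insulin?",          # hypothetical example stem
    "Pancreas",
    ["Liver", "Spleen", "Gallbladder"],
)
assert choices[key_index] == "Pancreas"
```

Shuffling per student also reduces answer-copying, at the cost of making post-exam discussion by letter ("the answer was B") harder.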

Designing Questions to Promote Higher Order Thinking

Multiple choice questions can be a powerful tool, but we often see them employed to measure one of the simplest forms of learning: recall. Higher order questions may ask students to predict the outcomes of situations or select examples that illustrate an abstract principle.

Consider these examples of questions that address similar content but at different levels:

Question 1: Identify three symptoms of a cold.
This is the lowest level: knowledge, or recall of information presented in a lecture or textbook.

Question 2: Match symptoms with their associated ailment: cold or flu.
This asks for comprehension of the connection between symptoms and ailments.

Question 3: Select which procedure would be useful for determining if a patient has a cold or the flu.
This question asks for application of the ideas.

Creating Higher Order Questions

There are three key elements that distinguish higher order multiple-choice questions from lower level multiple-choice questions.

  • Require sequential reasoning

  • Are written at high cognitive levels, including application, analysis, synthesis, etc.

  • Use a unique or novel, realistic stimulus (case, scenario, chart, graph, etc.)


Stimuli can be either text-based or visual. Text-based stimuli can state a claim, quote a passage, provide a mini-case, present a report or data set, or describe an experiment. Visual stimuli can include a chart, graph, table, map, picture, model, diagram, drawing, schematic, spreadsheet, etc. Only your imagination limits you.

Types of Higher Order Thinking Questions

Two types of multiple choice question strategies encourage critical thinking:

Multiple Response Multiple Choice Questions

Also known as “select all that apply,” multiple response questions have multiple correct answers, all of which must be selected to receive credit.


Advantages:

  • Allow for several correct answers

  • Less opportunity for guessing

  • Avoid “all of the above”

  • Model NCLEX exam questions

Disadvantages:

  • Require more options (usually 5 or 6)

  • More distractors make these questions more difficult to construct
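The "all selections must be correct" rule amounts to all-or-nothing scoring, which can be sketched in a few lines (the function name `score_multiple_response` is hypothetical, not from a real grading platform):

```python
def score_multiple_response(selected, correct_keys):
    """All-or-nothing scoring for a "select all that apply" question.

    Credit is awarded only when the selected set exactly matches the
    set of keys - no partial credit for a subset or for extra picks.
    """
    return 1 if set(selected) == set(correct_keys) else 0

# A student who selects only some of the keys receives no credit:
print(score_multiple_response({"a", "c"}, {"a", "c", "e"}))       # 0
print(score_multiple_response({"a", "c", "e"}, {"a", "c", "e"}))  # 1
```

Some instructors soften this with partial-credit schemes (e.g. fraction of keys selected minus wrong picks), but all-or-nothing is what makes these items resistant to guessing.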

Multiple Multiple Choice Questions

A series or cluster of test questions presented under one stimulus.


Advantages:

  • Quick and easy to score

  • Can test a wide range of low AND higher-order thinking skills

  • Can cover a lot of content areas

Disadvantages:

  • Unprepared students could still guess

  • Including misinformation could influence subsequent thinking about the content

  • Takes time and skill to construct good questions

References and Further Readings

Billings, D. M., & Halstead, J. A. (2013). Teaching in nursing: A guide for faculty. Elsevier Health Sciences.

Bull, J. & McKenna, C. (2002). Computer Assisted Assessment Centre. Retrieved 8 February 2009.

Brown, G. & Pendlebury, M. (1992). Assessing Active Learning. Sheffield: CVCP, USDU.

Carnegie Mellon’s Eberly Center for Teaching Excellence.

Cohen, A., & Wollack, J. (2000). Handbook on test development: Helpful tips for creating reliable and valid classroom tests. Madison, WI: University of Wisconsin, Center for Placement Testing. Retrieved 13 October 2003.

Dewey, R. A. (1998, January 20). Writing multiple choice items which require comprehension.

Kehoe, J. (1995). Writing multiple-choice test items. Practical Assessment, Research & Evaluation, 4(9). Retrieved July 29, 2008.

McDonald, M. (2014). The nurse educator's guide to assessing learning outcomes. Jones & Bartlett Publishers.

Nedeau-Cayo, R., Laughlin, D., Rus, L., & Hall, J. (2013). Assessment of item-writing flaws in multiple-choice questions. Journal for nurses in professional development, 29(2), 52-57.

Nitko, A. J. (2001). Educational assessment of students. (3rd Ed.). Columbus, OH: Merrill Prentice Hall.

Owen, S. & Freeman, R. (1987). What's wrong with three option multiple items? Educational & Psychological Measurement, 47, 513-522.

Parkes, J. Multiple Choice Test. Retrieved 20 September 2005.

Tarrant, M., Knierim, A., Hayes, S. K., & Ware, J. (2006). The frequency of item writing flaws in multiple-choice questions used in high stakes nursing assessments. Nurse education in practice, 6(6), 354-363.

Zimmaro, D. (2004). Writing Good Multiple-Choice Exams. Measurement and Evaluation Center, University of Texas, Austin.