Multiple-choice question design, like any assessment design, starts with the student learning objectives (SLOs). It is important to align your specific questions with your SLOs and to recognize when a multiple-choice question is, and is not, appropriate.
The Knowledge and Comprehension categories of Bloom’s Taxonomy are fairly simple to measure with multiple-choice questions. The format lends itself well to asking students to “identify”, “distinguish”, “recognize”, “recall”, or “classify” something. It is considerably more difficult to write multiple-choice questions for the higher-order thinking categories of Application, Analysis, and Evaluation. To do so, you often have to create complicated stems (providing a reading passage or chart in the question portion) for students to “interpret”, “infer”, “predict”, or “conclude” from. Furthermore, it is impossible to write a multiple-choice question for Synthesis, since this category requires the student to create something new.
- Have your TAs review the principles linked here and write new questions each semester to add to your test bank
- For high-stakes exams (15% or more of the final grade), or whenever test questions account for more than 40% of the course grade, use a proctoring service such as ProctorU
- Ensure that the overall number of questions for each objective corresponds to the intended emphasis/importance of the objective (each objective may be of equal importance or there may be one or two dominant objectives being assessed)
- Use separate question banks for lower-order and higher-order questions, drawing some from each to assess how well students know the construct and to guide your recommendations for improvement
  - You will likely want to give more weight to higher-order questions
- Keep wording simple and clear; the goal is to assess what students know, and the assessment becomes less reliable if they are confused by the phrasing. For example:
  - Remove unnecessary information from both the question and the answer choices (unless the goal is for students to analyze the information in the stem to determine which part is pertinent)
  - Avoid negative wording such as “which one is not true” (if you must phrase a question this way, be sure to emphasize NOT or EXCEPT)
  - Avoid partial sentences with parts missing at the beginning or middle (the stem should be either a clear question or a partial sentence that students complete at the END with the answer choice)
- Tips for better distractor choices:
  - Make distractors plausible
    - Ex: 1 – 4 = ?
      - -3 (correct answer)
      - 0 (a student might select this if they think they cannot subtract a larger number from a smaller number)
      - 3 (the resulting mistake if the student subtracted 1 from 4)
      - 4 (highly unlikely as an answer; you could replace this distractor with 5, which would be the mistake of adding instead of subtracting)
  - Avoid making the correct answer consistently the longest choice
  - Avoid absolute terms such as All, Always, and Never
  - Order the choices alphabetically (or in another logical order, such as numbers in increasing value) to avoid unintentionally signaling the correct response
  - Check that the grammar of the stem agrees with every answer choice
- Write multiple (at least 3) questions for each learning objective to verify both the quality of the questions and student mastery of the construct
  - With multiple questions per objective, you can analyze the responses to each question to determine whether wrong answers stem from poorly worded questions or answer choices (including ineffective distractors) or reflect a true gap in learning.
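The response analysis described above can be started with a simple tally of how often each answer choice is selected. As a minimal Python sketch (the function name and the sample response data are illustrative assumptions, not part of any cited resource), this computes an item's difficulty and its per-option counts for the "1 – 4 = ?" example:

```python
from collections import Counter

def distractor_analysis(responses, correct):
    """Summarize how often each answer choice was selected for one question.

    responses: list of chosen options, e.g. ["-3", "3", "0"]
    correct:   the keyed correct option

    Returns (difficulty, counts): difficulty is the proportion of students
    who answered correctly; counts maps each option to how many chose it.
    """
    counts = Counter(responses)
    difficulty = counts[correct] / len(responses)
    return difficulty, dict(counts)

# Hypothetical responses to the "1 - 4 = ?" item above.
responses = ["-3", "-3", "3", "-3", "0", "-3", "3", "-3"]
difficulty, counts = distractor_analysis(responses, "-3")
# A distractor nobody selects (here, "4") adds nothing and is a candidate
# for replacement; a frequently chosen distractor like "3" points to a
# specific misconception (subtracting in the wrong order).
```

With real data, very low difficulty across several questions on the same objective suggests a learning gap, while low difficulty on only one of the questions suggests a wording or distractor problem with that question.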
- Cornell’s Test Construction Manual has information on planning, preparing, and analyzing/revising tests. It covers the basic principles of test construction and provides examples.
- The Writing Good Multiple Choice Test Questions resource from Vanderbilt provides a great overview of effective stem wording (how you phrase the question), distractors (the alternatives to the correct answer), and considerations for testing higher-order thinking.
- Considine, J., Botti, M., & Thomas, S. (2005). Design, format, validity and reliability of multiple choice questions for use in nursing research and education. Collegian, 12(1), 19–24.
  - Provides a fairly detailed summary of reliability and validity considerations, as well as a useful literature review of studies focusing on stem and distractor analysis.