Writing multiple choice questions – a handy guide

Multiple choice questions (MCQs) can save academics time because they are quick to mark, but they only benefit learners if they are well planned. Here are some pointers to consider when designing and implementing multiple choice quizzes, whether on paper or in Blackboard.

Initial planning

  • Is a multiple choice quiz the right way to test what your learners do and don’t understand? Think carefully so that you can avoid using tests to simply punish a lack of knowledge or reward cramming.
  • Does the assessment accurately reflect the course content?
  • Do you have a system in place to monitor how your learners have answered? Item analysis of MCQs can help identify whether what you thought was an easy question has proved difficult for learners, or vice versa. This will enable you to review, refine and re-use your questions (a minimal item-analysis sketch follows this list).
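
For illustration, here is one way to carry out the item analysis mentioned above, computing two standard statistics per question: the facility index (the proportion of learners answering correctly) and a point-biserial discrimination index (how strongly success on that question correlates with overall performance). This is a minimal sketch assuming a simple 0/1 response matrix; it is not tied to Blackboard's results format, and the function and variable names are hypothetical.

```python
import math

def item_analysis(responses):
    """responses[s][q] is 1 if student s answered question q correctly.

    Returns, per question, a facility index (proportion correct) and a
    point-biserial discrimination index against the total score."""
    n = len(responses)
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n
    sd_t = math.sqrt(sum((t - mean_t) ** 2 for t in totals) / n)

    results = []
    for q in range(len(responses[0])):
        item = [row[q] for row in responses]
        p = sum(item) / n  # facility: 1.0 means everyone answered correctly
        if 0 < p < 1 and sd_t > 0:
            m1 = sum(t for t, i in zip(totals, item) if i) / sum(item)
            m0 = sum(t for t, i in zip(totals, item) if not i) / (n - sum(item))
            r_pb = (m1 - m0) / sd_t * math.sqrt(p * (1 - p))
        else:
            r_pb = 0.0  # undefined when everyone, or no one, was correct
        results.append((q, round(p, 2), round(r_pb, 2)))
    return results

# Hypothetical results: 4 students, 3 questions (1 = correct answer chosen).
marks = [[1, 1, 0],
         [1, 0, 0],
         [1, 1, 1],
         [0, 0, 0]]
for q, facility, discrimination in item_analysis(marks):
    print(f"Q{q + 1}: facility {facility}, discrimination {discrimination}")
```

A question with low facility and low (or negative) discrimination often signals a flawed item rather than difficult material, and is a good candidate for review before re-use.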

Structure your assessment and feedback

  • Use frequent, small quizzes and tests rather than one or two monolithic exams per term.
  • Consider using small quizzes as formative assessments where learners can be given detailed feedback on all the answers, and can take the quizzes more than once.
  • Give learners instant feedback on their performance (for example, by displaying the correct answers and feedback on screen after the test has been submitted).
  • Consider allowing learners to take quizzes first as individuals and then the same quiz again in groups.
  • Multiple-choice questions are easiest to write when there is a definitively right or wrong answer.
  • Multiple-choice testing of more interpretive material should always include an appeal mechanism in which learners can and must make a written, evidence-supported case for their answer.

Writing a good MCQ test

Take your time

  • Do not write the test in one day; spread the work out over time. Questions demanding high-level thinking take longer to craft – professional item writers often produce only 3 or 4 per day. Write one or two questions after each lecture or seminar, and building an exam becomes a simple matter of assembling them (as sketched below).
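
If you keep those per-lecture questions in a simple question bank, the assembly step can even be automated. The sketch below draws questions per topic from a hypothetical bank and shuffles each question's answer options; the bank layout, function name and sample topics are illustrative assumptions, not features of Blackboard or any other quiz tool.

```python
import random

# Hypothetical bank: topic -> list of (stem, options, index of correct option).
bank = {
    "group dynamics": [
        ("Which of the following is a stage in Tuckman's model?",
         ["Norming", "Conforming", "Reforming", "Brainstorming"], 0),
    ],
    "cell biology": [
        ("Every organism is made of cells and every cell comes from "
         "another cell. This is the:",
         ["Cell Theory", "Evolution Theory", "Relativity Theory"], 0),
    ],
}

def assemble_exam(bank, per_topic=1, seed=None):
    """Draw per_topic questions from each topic and shuffle their options."""
    rng = random.Random(seed)
    exam = []
    for topic, items in bank.items():
        for stem, options, correct in rng.sample(items, min(per_topic, len(items))):
            order = list(range(len(options)))
            rng.shuffle(order)
            exam.append({"topic": topic,
                         "stem": stem,
                         "options": [options[i] for i in order],
                         "answer": order.index(correct)})  # key's new position
    rng.shuffle(exam)
    return exam

for q in assemble_exam(bank, seed=1):
    print(q["stem"], q["options"], "-> correct option:", q["answer"] + 1)
```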

Writing the stem

  • Phrase question stems as clearly as possible. Confusing questions can generate wrong answers from learners who do understand the material. For example, a confusing stem like: “According to Tuckman’s model, groups develop through several stages over time. Furthermore, it contradicts Poole’s activity-track model which has groups switching among several different linear sequences. Which of the following is not one of the stages identified in Tuckman’s model?” could be cleaned up to read: “Tuckman’s model of group development includes: [Select all that apply]”
  • Avoid extra language in the stem. Some think extraneous details make a question more complex; most often they just add to the learners’ reading time. This reduces the number of questions you can put on a test, thereby reducing the test’s reliability. For example, in the Tuckman question above, the information on Poole’s model had nothing to do with the information sought by the question.
  • Include any language in the stem that you would have to repeat in each answer option. For example, a stem such as “Biology is defined as the scientific study of:” keeps you from having to repeat “the scientific study of” at the beginning of each option.
  • Avoid, where possible, asking for a negative answer, for example: “Which of the following is NOT an explicit norm?”. Learners under the stress of a test can often misread this. If you do use a question asking for a negative answer, make this clear by highlighting the negative words in capitals or changing the font to bold.

Answer options

  • Avoid lifting phrases directly from texts or lecture notes. This becomes a simple recall activity for the learner. Use new language as frequently as possible.
  • Most literature recommends writing the correct answer before writing the distracters. This makes sure you pay enough attention to formulating the one clearly correct answer.
  • Answer options should be about the same length and parallel in grammatical structure. Too much detail or different grammatical structure can give the answer away.
    For example, the specificity and grammatical structure of the first option here are dead giveaways:

The term “side effect” of a drug:

1. refers to any action of a drug in the body other than the one the doctor wanted the drug to have.

2. is the chain effect of a drug.

3. additionally benefits the drug.

  • Limit the number of answer options. Research shows that three-choice items are about as effective as four-choice items; four- and five-choice items are the most popular. If you give more than five alternatives, consider carefully whether this is appropriate or whether it may just be adding to learners’ reading time (see the chance-score sketch at the end of this section).
  • Distracters must be incorrect, but plausible. If you can, include among the distracters options that contain common errors. Learners will then be motivated to listen to your explanations of why those options are incorrect.
  • To make distracters more plausible, use words that should be familiar to learners.
  • If a recognisable key word appears in the correct answer, it should appear in some or all of the distracters as well. Don’t let a verbal clue decrease the accuracy of your exam.
    For example, someone with no biology background would not have to think very hard to make a correct guess on this question:

Every organism is made of cells and every cell comes from another cell. This is the:

1. Relativity Theory

2. Evolution Theory

3. Heat Theory

4. Cell Theory

  • Help learners see crucial words in the question by marking them out in capitals, or by changing the font to bold or underlined. For example: “Which of the following is an EXPLICIT norm?”. Likewise, when you ask a similarly worded question about two different things, always highlight the difference between the questions.
  • It is often difficult to come up with 3 or 4 plausible distracters, and you may be tempted to add some that are not plausible, or even humorous. Be careful: if it is too easy to eliminate one or two options, the question loses much of its measurement value. If energy or time is limited and you must come up with one more distracter, consider offering either a true statement that does not answer the question or a jargon-ridden option that is meaningless to someone who understands the concept.
  • Use the following terms rarely:

1. Extreme words like all, always and never (generally a wrong answer).

2. Vague words or phrases like usually, typically and may be (generally a correct answer).

3. All of the above – eliminating one distracter immediately eliminates this, too.

4. None of the above – use only when an answer can be unambiguously correct, such as in mathematics, grammar, historical dates, geography, etc. Do not use it with negatively-stated stems, as the resulting double negative is confusing. Studies show that None of the above does make a question more difficult, and it is a better choice than a weak distracter.
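
On the question of how many options to offer, the arithmetic of blind guessing shows why extra distracters give diminishing returns: each guess succeeds with probability 1/k for k options. A quick sketch (the 30-question test length is an arbitrary assumption):

```python
# Expected marks from blind guessing on a test of n questions,
# where every question has k equally plausible options.
n_questions = 30
for k in (2, 3, 4, 5):
    expected = n_questions / k  # each guess succeeds with probability 1/k
    print(f"{k} options: chance score {expected:.1f}/{n_questions} ({100 / k:.0f}%)")
```

Moving from two to three options cuts the chance score from 50% to 33%, but moving from four to five only lowers it from 25% to 20% – a small gain that rarely justifies a weak fourth or fifth distracter.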

Effective feedback to encourage further study

If you are designing a formative quiz, consider giving pointers in your feedback to where learners can find the correct answer themselves, rather than a straight correct/incorrect statement.

Give meaningful feedback on all answers, even the correct ones. If a learner has guessed the correct answer but doesn’t know why it is correct, your feedback should be able to give them the background information they need to understand why their answer was the right one. You could also direct learners to other resources or areas for further study.
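
One simple way to guarantee feedback on every answer is to store an explanation, and a pointer for further study, alongside each option rather than a bare correct/incorrect flag. The record below is a hypothetical sketch (it reuses the Cell Theory example above, and the reading pointers are placeholders), not an import format for Blackboard or any other system.

```python
# Hypothetical question record: every option carries targeted feedback.
question = {
    "stem": "Every organism is made of cells and every cell comes from "
            "another cell. This is the:",
    "options": {
        "Cell Theory": (True, "See the week 1 reading on how this principle "
                              "was established."),
        "Evolution Theory": (False, "Evolution concerns change in populations "
                                    "over generations; revisit the notes on "
                                    "levels of biological organisation."),
        "Heat Theory": (False, "This is not a biological theory; review the "
                               "module glossary of key terms."),
    },
}

def feedback_for(question, chosen):
    """Return a full response whether the chosen option was right or wrong."""
    is_correct, pointer = question["options"][chosen]
    verdict = "Correct." if is_correct else "Incorrect."
    return f"{verdict} {pointer}"

print(feedback_for(question, "Evolution Theory"))
```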

References and further reading

– Bauer, D., Holzer, M., Kopp, V. and Fischer, M.R. 2011. Pick-N multiple choice exams: a comparison of scoring algorithms. Advances in Health Sciences Education: Theory and Practice, 16(2):211–221.

– Beullens, J., Struyf, E. and Van Damme, B. 2005. Do extended matching multiple choice items measure clinical reasoning? Medical Education, 39(4):410–417.

– Case, S.M. and Swanson, D.B. 2001. Constructing Written Test Questions for the Basic and Clinical Sciences. Third Edition (Revised).

– Case, S.M., Swanson, D.B. and Ripkey, D.R. 1994. Comparison of items in five-option and extended-matching formats for assessment of diagnostic skills. Academic Medicine, 69(10 Suppl):S1–S3.

– Chandratilake, M., Davis, M. and Ponnamperuma, G. 2011. Assessment of medical knowledge: the pros and cons of using true/false multiple choice questions. The National Medical Journal of India, 24(4):225.

– Collins, J. 2006. Writing Multiple-Choice Questions for Continuing Medical Education Activities and Self-Assessment Modules. RadioGraphics, 26:543–551.

– Fowell, S.L., Fewtrell, R. and McLaughlin, P.J. 2008. Estimating the minimum number of judges required for test-centred standard setting on written assessments: do discussion and iteration have an influence? Advances in Health Sciences Education: Theory and Practice, 13(1):11–24.

– George, S. 2003. Extended matching items (EMIs): solving the conundrum. The British Journal of Psychiatry, 27:230–232.

– Huba, M.E. and Freed, J.E. 2000. Experiencing a paradigm shift through assessment. Chapter 1 in Learner-Centered Assessment on College Campuses.

– Luckett, K. and Sutherland, L. 2000. Assessment practices that improve teaching and learning.

– McCowan, R. and McCowan, S. 1999. Item Analysis for Criterion-Referenced Tests. Available at: http://www.eric.ed.gov/ERICWebPortal/search/detailmini.jsp?_nfpb=true&_&ERICExtSearch_SearchValue_0=ED501716&ERICExtSearch_SearchType_0=no&accno=ED501716. Accessed 9 January 2012.

– Norcini, J. 2003. Setting standards on educational tests. Medical Education, 37:464–469.

– Phipps, S.D. and Brackbill, L. 2009. Relationship Between Assessment Item Format and Item Performance Characteristics. American Journal of Pharmaceutical Education, 73(8):146.

– Ripkey, D.R., Case, S.M. and Swanson, D.B. 1996. A “New” Item Format for Assessing Aspects of Clinical Competence. Academic Medicine, 71(10):34–37.

– Swanson, D.B., Holtzman, K.Z. and Allbee, K. 2008. Measurement characteristics of content-parallel single-best-answer and extended-matching questions in relation to number and source of options. Academic Medicine, 83(10 Suppl):S21–S24.

– Swanson, D.B., Holtzman, K.Z., Allbee, K. and Clauser, B.E. 2006. Psychometric characteristics and response times for content-parallel extended-matching and one-best-answer items in relation to number of options. Academic Medicine, 81(10 Suppl):S52–S55.

– Zurawski, R.M. 1998. Making the Most of Exams: Procedures for Item Analysis. The National Teaching and Learning Forum, 7(6).

Produced from initial research and documents written by Steve Wheeler