Authors: Jason Ridgeway*, Texas A&M University, Ben Sylvester*,
Topics: Higher Education
Keywords: education, testing, physical geography
Session Type: Paper
Presentation File: No File Uploaded
Testing using multiple-choice items is a well-established and effective tool for student assessment. The number of answer options used on multiple-choice tests, however, varies widely. A meta-analysis by Rodriguez (2005) suggests that items with three answer options are optimal for most purposes. Three-option items are faster to write and faster to read, and the rise in student scores due to increased successful guessing is slight. This paper examines whether the use of three-option items in a large-enrollment collegiate physical geography course results in significantly higher test scores compared to the use of four-option items. We administered two versions of a 50-item multiple-choice exam to two randomly selected samples from a population of 588 students. On each version of the exam, 25 multiple-choice items contained three answer options, while the other 25 items contained four answer options. We used two-tailed t-tests to compare the two groups on total scores, item difficulty, and item discrimination indices. Students scored, on average, 3.5% higher on the three-option items than on the four-option items. While this difference is statistically significant at the 0.05 level, in practice this slight rise in scores can be offset by writing slightly more difficult items. We found no significant difference in item difficulty or discrimination. We thus find support for Rodriguez's recommendation to use three-option items when there is no compelling reason to use more.
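As a minimal sketch of the comparison described above, the following computes a two-tailed independent-samples t statistic (Student's t with pooled variance) for two groups of exam scores. The score lists are hypothetical placeholders for illustration only, not the study's data, and the function name is ours, not the authors'.

```python
import math
import statistics

def two_tailed_t(sample_a, sample_b):
    """Student's t statistic for two independent samples (pooled variance)."""
    na, nb = len(sample_a), len(sample_b)
    mean_a, mean_b = statistics.fmean(sample_a), statistics.fmean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    # Pooled variance assumes roughly equal variances in the two groups
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    t = (mean_a - mean_b) / math.sqrt(pooled * (1 / na + 1 / nb))
    df = na + nb - 2  # degrees of freedom for the two-tailed test
    return t, df

# Hypothetical percentage scores: three-option vs. four-option items
group_three = [82, 78, 85, 74, 80, 79, 88, 76]
group_four = [77, 75, 81, 70, 78, 74, 83, 72]
t, df = two_tailed_t(group_three, group_four)
print(f"t = {t:.3f} on {df} degrees of freedom")
```

The resulting |t| would then be compared against the critical value for the chosen significance level (0.05 in the study) to decide whether the mean difference is significant.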