A Comparison of the Items and Specifications of Two English Tests in Multiple Choice and Short Answer Format Measuring the Same Behavior in Foreign Language Teaching

Adnan Kan, Ulaş Kayapınar

Abstract

The purpose of this study is to determine the differences between two English tests, one constructed in multiple-choice format and the other in short-answer format, that measure the same behavior, with respect to their item difficulty indexes, item discrimination indexes, test reliability indexes, criterion-related validity coefficients, and test score averages. Item and test statistics computed from the item scores of 139 students in 10 different English preparatory classes revealed that the two tests differ clearly in their item difficulty indexes, item discrimination indexes, and test reliability indexes, whereas there is no distinct difference in their test score averages. Moreover, it can be stated that the short-answer format test scores predict the final exam scores better than the multiple-choice format test scores.
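For readers unfamiliar with these classical test theory statistics, the sketch below (in Python, not part of the original study) illustrates one common way to compute item difficulty, a corrected item-total discrimination index, Cronbach's alpha as a reliability estimate, and a criterion-related validity coefficient. The array shapes, the choice of the corrected item-total correlation, and the function names are assumptions for illustration, not a description of the authors' exact procedure.

```python
import numpy as np

def item_statistics(scores):
    """Classical test theory item and test statistics (illustrative sketch).

    scores: 2D array (students x items), 1/0 for multiple-choice items or
            partial-credit values rescaled to [0, 1] for short-answer items.
    """
    scores = np.asarray(scores, dtype=float)
    n_students, n_items = scores.shape
    total = scores.sum(axis=1)

    # Item difficulty index: mean item score (proportion correct for 0/1 items).
    difficulty = scores.mean(axis=0)

    # Item discrimination index: correlation between each item and the total
    # of the remaining items (corrected item-total correlation).
    discrimination = np.empty(n_items)
    for j in range(n_items):
        rest = total - scores[:, j]
        discrimination[j] = np.corrcoef(scores[:, j], rest)[0, 1]

    # Test reliability: Cronbach's alpha (equivalent to KR-20 for 0/1 items).
    item_var = scores.var(axis=0, ddof=1)
    total_var = total.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_var.sum() / total_var)

    return difficulty, discrimination, alpha


def criterion_validity(test_total, criterion):
    """Criterion-related validity: Pearson correlation between test total
    scores and an external criterion (e.g., final exam scores)."""
    return np.corrcoef(test_total, criterion)[0, 1]
```

Applied to each of the two tests, these functions would yield the per-item difficulty and discrimination indexes, a reliability estimate, and, together with the final exam scores as the criterion, the validity coefficients compared in the study.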

Keywords

Multiple choice test, short answer test, item and test properties, validity, reliability

Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.