On Varying Item Difficulty by Changing the Response Format for a Mathematical Competence Test

Christine Hohensinn, Klaus D. Kubinger

    Publications: Contribution to journal › Article › Peer reviewed


    Educational and psychological aptitude and achievement tests employ a variety of response formats. Today the predominant format is multiple choice with a single correct option (out of altogether four answer options at most), because it allows fast, economical, and objective scoring. However, it is often argued that multiple-choice questions are easier than those with a constructed response format, in which the examinee must produce the solution without being given answer options. The present study investigates the influence of three different response formats on item difficulty, using stem-equivalent items in a mathematical competence test. The impact of the formats is modelled with the Linear Logistic Test Model (LLTM; Fischer, 1974), a model within Item Response Theory. In summary, the different response formats measure the same latent trait but bias the difficulty of the item.
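    The core idea of the LLTM mentioned above is that each item's Rasch difficulty is decomposed into a weighted sum of basic parameters, so a response-format effect can enter as one such parameter shared across stem-equivalent items. The following sketch illustrates that decomposition; all numerical values (stem difficulty, format effect, examinee ability) are invented for illustration and are not estimates from the study.

    ```python
    import numpy as np

    def rasch_prob(theta, beta):
        """Probability of a correct response under the Rasch model."""
        return 1.0 / (1.0 + np.exp(-(theta - beta)))

    # Design matrix Q: rows = items, columns = basic parameters
    # (column 1: effect of the common item stem,
    #  column 2: extra effect of the constructed-response format).
    Q = np.array([
        [1, 0],  # stem A, multiple-choice format
        [1, 1],  # same stem A, constructed-response format
    ])

    # Hypothetical basic parameters eta = (stem difficulty, format effect);
    # a positive format effect makes the constructed-response item harder.
    eta = np.array([0.2, 0.8])

    beta = Q @ eta   # LLTM: item difficulties as weighted sums of eta
    theta = 0.5      # ability of an example examinee

    for b in beta:
        print(round(rasch_prob(theta, b), 3))  # prints 0.574, then 0.378
    ```

    Under these made-up parameters, the same examinee is predicted to solve the multiple-choice version more often than the constructed-response version of the same stem, which is exactly the kind of format-induced difficulty shift the study models.
    
    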
    Pages (from - to): 231-239
    Journal: Austrian Journal of Statistics
    Publication status: Published - 2009

    ÖFOS 2012

    • 501005 Developmental Psychology