Peer-reviewed · Open access
  • The role of rapid guessing ...
    Nagy, Gabriel; Ulitzsch, Esther; Lindner, Marlit Annalena

    Journal of Computer Assisted Learning, June 2023, Volume 39, Issue 3
    Journal Article

    Background: Item response times in computerized assessments are frequently used to identify rapid guessing behaviour as a manifestation of response disengagement. However, non‐rapid responses (i.e., those with longer response times) are not necessarily engaged, which means that response‐time‐based procedures could overlook disengaged responses. Therefore, the identification of disengaged responses could be improved by considering additional indicators of disengagement. We investigated the extent to which decreases in individuals' item solution probabilities over the course of a test reflect disengaged response behaviour.

    Objectives: To disentangle different types of possibly disengaged responses and better understand non‐effortful test‐taking behaviour, we augmented response‐time‐based procedures for identifying rapid guessing with strategies for detecting disengaged responses on the basis of performance declines in non‐rapid responses.

    Methods: We combined item response theory (IRT) models for rapid guessing and test‐taking persistence to examine the capability of response times and item positions to capture response disengagement. We used a computerized assessment in which science items were randomly distributed across positions for each student. This allowed us to estimate individual differences in test‐taking persistence (i.e., the duration for which the initial level of performance is maintained) while accounting for rapid responses.

    Results and Conclusions: Response times did not fully explain disengagement; item responses reflected test‐taking persistence even when rapid responses were accounted for. This interpretation was supported by a strong correlation of test‐taking persistence with decreases in self‐reported test‐taking effort. Furthermore, our results suggest that IRT models for test‐taking persistence can effectively account for the undesirable impact of low test‐taking effort even when response times are unavailable.

    Practitioner Notes:
    • Assessments of proficiencies that attempt to quantify what individuals know and can do lead to biased results when individuals provide disengaged responses.
    • Item response times are frequently used to identify rapid guessing behaviour as a manifestation of response disengagement, but response‐time‐based procedures could overlook disengaged responses.
    • To disentangle different types of possibly disengaged responses and better understand non‐effortful test‐taking behaviour, we augmented response‐time‐based procedures for identifying rapid guessing with strategies for detecting disengaged responses on the basis of performance declines in non‐rapid responses.
    • In a sample of fifth and sixth graders, we found that response times did not fully explain disengagement, as many students showed performance declines in non‐rapid item responses.
    • Our results suggest that item response theory models for test‐taking persistence can effectively account for the undesirable impact of low test‐taking effort even when response times are unavailable.
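
    Illustrative sketch (not the authors' exact specification): the kind of model described in the Methods can be pictured as an effort‐moderated IRT model in which rapid responses are scored at a chance level and non‐rapid responses are allowed a person‐specific performance decline over item positions. All symbols below ($\theta_p$, $\delta_p$, $b_i$, $c_i$, $\tau_i$) are assumed notation for this sketch only.

    $$
    P(X_{pi} = 1) =
    \begin{cases}
    c_i, & \text{if } t_{pi} < \tau_i \ \text{(rapid guess)} \\[4pt]
    \operatorname{logit}^{-1}\!\left(\theta_p + \delta_p \,\mathrm{pos}_{pi} - b_i\right), & \text{otherwise (non\text{-}rapid response)}
    \end{cases}
    $$

    Here $\theta_p$ is the student's proficiency, $\delta_p \le 0$ is the student's persistence slope over item position $\mathrm{pos}_{pi}$ (a value near zero means the initial performance level is maintained throughout the test), $b_i$ is item difficulty, $c_i$ is the chance success probability of a rapid guess, and $\tau_i$ is a response‐time threshold separating rapid from non‐rapid responses. Randomly assigning items to positions, as in the study's design, is what allows the position slope $\delta_p$ to be separated from item difficulty.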