A comparability study of handwritten versus typed responses in high-stakes English language writing tests

  • Irene Stoukou, PeopleCert, Athens, Greece
  • Yiannis Papargyris, PeopleCert, Athens, Greece
  • David Coniam, PeopleCert, Athens, Greece
Keywords: test score comparability, writing tests, handwritten vs. typed, CEFR, high-stakes tests

Abstract

This paper investigates fairness in writing test scores for candidates who completed a writing test either by hand or typed on a computer. The data for this large-scale comparability study comprise candidates taking English language writing tests at four CEFR levels (B1 to C2) in the period 2019–2022. The data were analysed via effect size differences and equivalence tests. Measured by effect size, only a small difference in scores was apparent between the two production modes at B1, B2 and C1 levels. At C2 level, there was a medium effect size, indicating a difference in favour of computer-produced scripts. Differences observed on equivalence tests (an adaptation of the standard t-test) were not statistically significant. The contribution of the research to knowledge is that, with the exception of C2 level, candidates generally receive similar scores whether writing tests are completed by hand or on computer, despite a slight skew towards higher scores for computer-produced texts. Practically, candidates may elect to write either on paper or on computer without fear of bias.
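The abstract pairs effect size measures with equivalence testing. As a minimal sketch of how such a comparison can be run in practice, the snippet below computes Cohen's d (a standard effect size for two groups) and a two one-sided tests (TOST) equivalence procedure. The score samples are fabricated for illustration, not the study's data, and the TOST here uses a large-sample normal approximation rather than the exact t distribution:

```python
import math
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d with pooled standard deviation."""
    na, nb = len(a), len(b)
    sp = math.sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                   / (na + nb - 2))
    return (mean(a) - mean(b)) / sp

def tost_equivalence(a, b, margin):
    """Two one-sided tests (TOST) for equivalence within +/- margin.

    Uses a large-sample normal approximation to the t distribution,
    so this is only a sketch suitable for sizeable samples.
    Returns the larger of the two one-sided p-values; equivalence is
    concluded when that value is below the chosen alpha.
    """
    na, nb = len(a), len(b)
    se = math.sqrt(stdev(a) ** 2 / na + stdev(b) ** 2 / nb)
    diff = mean(a) - mean(b)
    z_lower = (diff + margin) / se   # H0: diff <= -margin
    z_upper = (diff - margin) / se   # H0: diff >= +margin
    p_lower = 1 - 0.5 * (1 + math.erf(z_lower / math.sqrt(2)))
    p_upper = 0.5 * (1 + math.erf(z_upper / math.sqrt(2)))
    return max(p_lower, p_upper)

# Illustrative (fabricated) band scores, not the study's data.
handwritten = [3.0, 3.5, 4.0, 3.0, 3.5, 4.5, 3.5, 4.0, 3.0, 3.5] * 10
typed = [3.5, 3.5, 4.0, 3.0, 4.0, 4.5, 3.5, 4.0, 3.5, 3.5] * 10

d = cohens_d(typed, handwritten)
p = tost_equivalence(typed, handwritten, margin=0.5)
print(f"Cohen's d = {d:.2f}, TOST p = {p:.4f}")
```

With an equivalence margin of half a band, both one-sided tests reject, so the two modes would be declared equivalent for these illustrative samples even though the typed mean is slightly higher, mirroring the pattern the abstract reports.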

Published
2023-03-31
How to Cite
Stoukou, I., Papargyris, Y., & Coniam, D. (2023). A comparability study of handwritten versus typed responses in high-stakes English language writing tests. ELT Forum: Journal of English Language Teaching, 12(1), 36-43. https://doi.org/10.15294/elt.v12i1.66354