Investigating the Authenticity of Computer- and Paper-Based ESL Writing Tests
Abstract
Technology has become a common medium in high-stakes language testing. Many scholars have claimed that computer-based (CB) writing tests are more authentic than paper-based (PB) tests, but with little empirical evidence (Lessien, 2013). Test authenticity refers to "the degree of correspondence of the characteristics of a given language test task (LTT) to the features of a target language use (TLU) task" (Bachman & Palmer, 1996, p. 23). Test authenticity influences the construct and consequential validity of a test (Bachman & Palmer, 1996; 2010), but no research has empirically compared the authenticity of CB and PB writing tests. This study addressed this gap by examining the effect of test medium on the situational and interactive authenticity of English as a second language (ESL) writing assessments. Guided by Bachman and Palmer's (1996) theoretical discussions of authenticity and Liu's (2005) conceptual model of authenticity, this study examined the authenticity of CB and PB ESL writing tests using an embedded correlational mixed-methods design. The results indicated a higher authenticity of the CB test, but also uncovered concerns about the effect of typing accuracy on CB test performance. Qualitative results suggested a need to hear test-takers' voices when choosing test mediums. The findings indicated that the CB test offers advantages for assessing postsecondary ESL students' writing proficiency, with a higher degree of authenticity than the traditional PB test. Nevertheless, the study also revealed potential issues with the construct validity, fairness, and consequential validity of the CB test for ESL writing assessments. This study extended test validation practices to include authenticity measurements. The findings have practical implications for test development, administration policies, and stakeholders in choosing test delivery mediums for ESL writing assessments.