The development and evaluation of a software prototype for computer-adaptive testing
This paper presents ongoing research at the University of Hertfordshire on the use of computer-adaptive tests (CATs) in Higher Education. A software prototype based on Item Response Theory has been developed and is described here. The application was designed to estimate the English proficiency of students whose first language is not English. Academic staff and students evaluated the prototype, and we summarise their attitudes towards its user interface and pedagogical aspects. Based on a comparison of performance between computer-adaptive and conventional computer-based tests, we provide evidence that learners are not disadvantaged by the CAT approach. A group of international students also took part in a focus group session after using the software. During this session, students discussed issues related to computer-adaptive tests, ranging from their perception that very easy tests are “meaningless” to their insights into the fairness of such computer-assisted assessments. In addition, this paper outlines how our current work will be developed further by implementing multimedia resources, developing more subjective tests and adding a stopping condition based on the standard error of the ability estimate. Finally, the benefits and potential limitations of this method of assessment are presented.
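To make the mechanism behind the prototype concrete, the following is a minimal sketch (not the authors' implementation) of an IRT-driven adaptive test loop under simplifying assumptions: a one-parameter (Rasch) model, maximum-information item selection, Newton-Raphson maximum-likelihood ability estimation, and the standard-error stopping condition mentioned above. All function names, the item bank, and the thresholds are illustrative.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model (assumed here)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability theta."""
    p = rasch_p(theta, b)
    return p * (1.0 - p)

def estimate_theta(responses, difficulties, theta=0.0, iters=20):
    """Newton-Raphson MLE of ability from (difficulty, 0/1-response) pairs."""
    for _ in range(iters):
        grad = sum(r - rasch_p(theta, b) for b, r in zip(difficulties, responses))
        info = sum(item_information(theta, b) for b in difficulties)
        if info < 1e-9:
            break
        theta += grad / info
        theta = max(-4.0, min(4.0, theta))  # bound estimate for all-correct/all-wrong patterns
    return theta

def run_cat(bank, answer_fn, se_target=0.4, max_items=20):
    """Adaptive loop: pick the most informative item, re-estimate ability,
    stop when the standard error falls below se_target or items run out."""
    theta = 0.0
    administered, responses = [], []
    available = list(bank)
    se = float("inf")
    while available and len(administered) < max_items:
        # Select the item whose difficulty maximises information at current theta.
        b = max(available, key=lambda d: item_information(theta, d))
        available.remove(b)
        administered.append(b)
        responses.append(answer_fn(b))
        theta = estimate_theta(responses, administered, theta)
        total_info = sum(item_information(theta, d) for d in administered)
        se = 1.0 / math.sqrt(total_info) if total_info > 0 else float("inf")
        if se <= se_target:
            break  # the stopping condition associated with the standard error
    return theta, se, len(administered)

# Hypothetical usage: a simulated student who answers correctly
# whenever item difficulty does not exceed their true ability of 1.0.
bank = [i / 4 for i in range(-12, 13)]  # difficulties from -3.0 to 3.0
theta, se, n = run_cat(bank, lambda b: 1 if b <= 1.0 else 0)
```

The key difference from a fixed computer-based test is visible in the loop: each item is chosen after the ability estimate is updated, so difficulty tracks the candidate rather than a fixed schedule.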