The provision of automated feedback for practical and essay tests
The recent National Student Survey showed that feedback to students is an ongoing problem in Higher Education. This paper reports on the extension of our research on objective testing into the provision of automated feedback for practical and essay questions. Recent research showed that learners and tutors valued our automated feedback approach based on objective tests and Computer Adaptive Testing. The automated feedback system has now been extended to cover practical tests and essay-type questions. The system, which can be used in any subject area, is based on a simple marking scheme created by the subject tutor. As the teacher awards marks for each question, an individual feedback file is created automatically for each learner. Teachers may also add and modify comments for each learner and save additional feedback to the database for later use. Each individual feedback file is emailed automatically to the learner. The paper explains the development of the system and reports on its testing and evaluation with 350 first-year, 120 second-year, and 100 final-year undergraduate Computer Science students. The time taken to mark practical and essay-type tests was reduced by more than 30% in all cases compared with previous years. More importantly, it was possible to provide good-quality individual feedback to learners rapidly. The system proved especially beneficial for end-of-module tests, where it had previously been difficult to provide feedback once a module had ended. Examples of the feedback provided are presented in the paper. Staff teaching on these modules and a sample of students took part in an evaluation of the system; the results were very positive and are reported in the paper. The provision of fast, effective feedback is vital, and this system was found to be an important addition to the tools available.
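The core workflow the abstract describes, in which a tutor-defined marking scheme drives the automatic assembly of an individual feedback file as per-question marks are entered, could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all names (`MARKING_SCHEME`, `build_feedback`, the 60% comment threshold) are hypothetical assumptions.

```python
# Hypothetical sketch of the marking-scheme-driven feedback workflow.
# The tutor defines, per question, a maximum mark and canned comments;
# entering marks then yields an individual feedback report per learner.

MARKING_SCHEME = {
    "Q1": {"max": 10, "low": "Revise SQL joins.", "high": "Good join usage."},
    "Q2": {"max": 5, "low": "Check loop bounds.", "high": "Correct iteration."},
}

def build_feedback(student_id, marks, tutor_comment=None):
    """Assemble an individual feedback report from per-question marks.

    marks: dict mapping question id -> mark awarded by the teacher.
    tutor_comment: optional free-text comment added by the teacher.
    """
    lines = [f"Feedback for {student_id}"]
    total = maximum = 0
    for q, scheme in MARKING_SCHEME.items():
        mark = marks.get(q, 0)
        total += mark
        maximum += scheme["max"]
        # Assumed rule: pick the positive comment at or above 60% of the max mark.
        note = scheme["high"] if mark >= 0.6 * scheme["max"] else scheme["low"]
        lines.append(f"{q}: {mark}/{scheme['max']} - {note}")
    if tutor_comment:
        lines.append("Tutor comments: " + tutor_comment)
    lines.append(f"Total: {total}/{maximum}")
    return "\n".join(lines)

report = build_feedback("s123", {"Q1": 8, "Q2": 2}, "See me about Q2.")
print(report)
```

In the system described in the paper, a report like this would then be saved to the database and emailed automatically to the learner.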