Show simple item record

dc.contributor.author	Barker, T.
dc.identifier.citation	Barker, T. (2011), 'An automated individual feedback and marking system: an empirical study', Electronic Journal of e-Learning, vol. 9, no. 1, pp. 1-114.
dc.identifier.other	PURE: 84529
dc.identifier.other	PURE UUID: a45de2f4-89f3-41ba-bf38-acd06e84cbb8
dc.identifier.other	dspace: 2299/5937
dc.identifier.other	Scopus: 84859942427
dc.description	Original article can be found at: Copyright Management Centre International Limited [Full text of this article is not available in the UHRA]
dc.description.abstract	The recent National Student Survey showed that feedback to students is an ongoing problem in Higher Education. This paper reports on the extension of our past research into the provision of automated feedback for objective testing. In the research presented here, the system has been further developed to mark practical and essay questions and to provide automated feedback. Recent research at the University of Hertfordshire showed that learners and tutors accept and value our automated feedback approach based on objective tests and Computer Adaptive Testing. The research reported in this paper is an important extension of that work: the automated feedback system developed for objective testing now also covers practical tests and essay-type questions. The system, which can be used in any subject area, is based on a simple marking scheme created by the subject tutor as a text file following a simple template. Marks for each option and a set of feedback statements are held in a database. As the teacher awards marks for each question, an individual feedback file is created automatically for each learner. Teachers may also add and modify comments for each learner and save additional feedback to the database for later use. Each individual feedback file was emailed automatically to learners. The development of the system is explained in the paper, and testing and evaluation with 350 first-year (one final practical test), 120 second-year (one written and one practical test) and 100 final-year (one final practical test) undergraduate Computer Science students is reported. The time taken to mark practical and essay-type tests was reduced by more than 30% in all cases compared with previous years. More importantly, it was possible to provide good-quality individual feedback to learners rapidly: feedback was delivered to all students within three weeks of the test submission date. This was especially beneficial for end-of-module tests, where it had previously proved difficult to provide feedback after modules had ended. Examples of the feedback provided are presented in the paper, and the development of the system using a user-centred approach based on student and staff evaluation is explained. Comments from staff teaching on these modules and from a sample of students who took part in this series of evaluations are presented. The results of these evaluations were very positive and are reported in the paper, showing the changes made to the system at each iteration of the development cycle. The provision of fast, effective feedback is vital, and this system was found to be an important addition to the tools available.
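The abstract describes a pipeline in which a tutor's marking scheme lives in a plain text file built from a simple template, marks and feedback statements are held in a database, and an individual feedback file is assembled and emailed to each learner. A minimal Python sketch of that workflow follows; the template format (`question|option|marks|feedback`), the function names, and the example content are all assumptions for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of the marking/feedback pipeline described in the
# abstract. The pipe-delimited template and all names here are assumed,
# not taken from the paper's system.
from email.message import EmailMessage

# Assumed marking-scheme template: one line per option,
# "question|option|marks|feedback"
SCHEME = """\
Q1|a|5|Correct: the loop invariant holds on entry and exit.
Q1|b|2|Partially correct: the invariant is stated but not proved.
Q2|a|10|Good use of normalisation to third normal form.
Q2|b|4|Tables identified, but functional dependencies are missing.
"""

def parse_scheme(text):
    """Parse the marking-scheme file into {(question, option): (marks, feedback)}."""
    scheme = {}
    for line in text.splitlines():
        if not line.strip():
            continue
        question, option, marks, feedback = line.split("|", 3)
        scheme[(question, option)] = (int(marks), feedback)
    return scheme

def build_feedback(scheme, awarded, extra_comments=""):
    """Assemble one learner's feedback text from the options the marker selected."""
    lines, total = [], 0
    for question, option in awarded:
        marks, feedback = scheme[(question, option)]
        total += marks
        lines.append(f"{question}: {marks} marks. {feedback}")
    if extra_comments:
        # Tutors may add individual comments on top of the canned statements.
        lines.append(f"Tutor comment: {extra_comments}")
    lines.append(f"Total: {total} marks")
    return "\n".join(lines)

def feedback_email(address, body):
    """Wrap the feedback text in an email message, ready to hand to an SMTP client."""
    msg = EmailMessage()
    msg["To"] = address
    msg["Subject"] = "Individual test feedback"
    msg.set_content(body)
    return msg
```

In this sketch the "database" is just the parsed dictionary; the real system would persist marks and tutor comments for reuse, as the abstract notes, and dispatch the messages via an SMTP server.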
dc.relation.ispartof	Electronic Journal of e-Learning
dc.subject	automated systems
dc.title	An automated individual feedback and marking system: an empirical study
dc.contributor.institution	School of Computer Science
dc.contributor.institution	Centre for Computer Science and Informatics Research
dc.description.status	Peer reviewed
dc.relation.school	School of Computer Science
rioxxterms.type	Journal Article/Review

Files in this item


There are no files associated with this item.
