University of Hertfordshire Research Archive

        Design and evaluation of an ontology-based tool for generating multiple-choice questions

        View/Open
        Final Accepted Version (PDF, 1 MB)
        Authors
        Cubric, Marija
        Tosic, M.
        Abstract
        Purpose: The recent rise in online knowledge repositories and the use of formalisms for structuring knowledge, such as ontologies, has provided the necessary conditions for the emergence of tools for generating knowledge assessments. These tools can be used in the context of interactive computer-assisted assessment (CAA) to provide a cost-effective solution for prompt feedback and increased learner engagement. The purpose of this paper is to describe and evaluate a tool developed by the authors, which generates test questions from an arbitrary domain ontology, based on sound pedagogical principles encapsulated in Bloom's taxonomy.
        Design/methodology/approach: This paper uses design science as a framework for presenting the research. A total of 5,230 questions were generated from 90 different ontologies, and 81 randomly selected questions were evaluated by 8 CAA experts. Data were analysed using descriptive statistics and the Kruskal–Wallis test for non-parametric analysis of variance.
        Findings: In total, 69 per cent of the generated questions were found to be usable for tests and 33 per cent to be of medium to high difficulty. Significant differences in the quality of generated questions were found across different ontologies, strategies for generating distractors and Bloom's question levels: the questions testing application of knowledge and the questions using semantic strategies were perceived to be of the highest quality.
        Originality/value: The paper extends current work in the area of automated test generation in three important directions: it introduces an open-source, web-based tool available to other researchers for experimentation; it recommends practical guidelines for the development of similar tools; and it proposes a set of criteria and a standard format for future evaluation of similar systems.
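        The abstract refers to strategies for generating distractors from an ontology. As a loose illustration only — not the authors' actual tool, data, or algorithm — the sketch below shows one widely used "sibling class" strategy on a hand-made toy ontology: the key is a subclass of the target class, and distractors are drawn from subclasses of the target's sibling classes. All names (`subclass_of`, `make_question`, the animal classes) are illustrative assumptions.

```python
import random

# Toy ontology as a child -> parent (subclass-of) map; purely illustrative.
subclass_of = {
    "Dog": "Mammal",
    "Cat": "Mammal",
    "Whale": "Mammal",
    "Eagle": "Bird",
    "Penguin": "Bird",
    "Mammal": "Animal",
    "Bird": "Animal",
}

def siblings(cls):
    """Classes sharing the same parent as `cls`, excluding `cls` itself."""
    parent = subclass_of.get(cls)
    return [c for c, p in subclass_of.items() if p == parent and c != cls]

def make_question(key, n_distractors=2, rng=random):
    """Build a multiple-choice item whose correct answer is `key`.

    Stem asks for an instance of `key`'s parent class; distractors are
    subclasses of that parent's sibling classes (so they are plausible
    but wrong).
    """
    parent = subclass_of[key]
    wrong = [c for c, p in subclass_of.items() if p in siblings(parent)]
    stem = f"Which of the following is a {parent}?"
    options = [key] + rng.sample(wrong, min(n_distractors, len(wrong)))
    rng.shuffle(options)
    return stem, options, key
```

        For example, `make_question("Dog")` yields the stem "Which of the following is a Mammal?" with "Dog" as the key and bird subclasses as distractors; richer semantic strategies of the kind evaluated in the paper would exploit further ontology relations beyond the class hierarchy.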
        Publication date
        2020-02-12
        Published in
        Interactive Technology and Smart Education
        Published version
        https://doi.org/10.1108/ITSE-05-2019-0023
        License
        http://creativecommons.org/licenses/by-nc/4.0/
        Other links
        http://hdl.handle.net/2299/22825
        Relations
        Hertfordshire Business School