University of Hertfordshire Research Archive

        Augmenting dementia cognitive assessment with instruction-less eye-tracking tests

        View/Open
        FINAL_VERSION.pdf (PDF, 4 MB)
        Authors
        Mengoudi, Kyriaki
        Ravi, Daniele
        Yong, Keir X.X.
        Primativo, Silvia
        Pavisic, Ivanna M.
        Brotherhood, Emilie
        Lu, Kirsty
        Schott, Jonathan M.
        Crutch, Sebastian J.
        Alexander, Daniel C.
        Handle
        2299/23813
        Abstract
        Eye-tracking technology is an innovative tool that holds promise for enhancing dementia screening. In this work, we introduce a novel way of extracting salient features directly from the raw eye-tracking data of a mixed sample of dementia patients during a novel instruction-less cognitive test. Our approach is based on self-supervised representation learning: by first training a deep neural network to solve a pretext task using well-defined, available labels (e.g. recognising distinct cognitive activities in healthy individuals), the network encodes high-level semantic information that is useful for solving other problems of interest (e.g. dementia classification). Inspired by previous work in explainable AI, we use the Layer-wise Relevance Propagation (LRP) technique to describe our network's decisions in differentiating between the distinct cognitive activities. The extent to which eye-tracking features of dementia patients deviate from healthy behaviour is then explored, followed by a comparison between self-supervised and handcrafted representations in discriminating between participants with and without dementia. Our findings not only reveal novel self-supervised learning features that are more sensitive than handcrafted features in detecting performance differences between participants with and without dementia across a variety of tasks, but also validate that instruction-less eye-tracking tests can detect oculomotor biomarkers of dementia-related cognitive dysfunction. This work highlights the contribution of self-supervised representation learning techniques in biomedical applications where the small number of patients, the non-homogeneous presentations of the disease and the complexity of the setting pose a challenge for state-of-the-art feature extraction methods.
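        The pretext-task strategy summarised in the abstract can be illustrated with a short, hedged sketch. The PyTorch snippet below is not the authors' network: the encoder architecture, feature dimension, number of cognitive activities and sequence length are placeholder assumptions, chosen only to show the general pattern of training on activity recognition and then reusing the learned representation for dementia vs. control classification.

        # A minimal sketch (not the authors' architecture) of the pretext-task idea:
        # a small 1-D CNN learns to recognise which cognitive activity produced a raw
        # eye-tracking sequence; its penultimate layer is then reused as a
        # self-supervised feature extractor for the downstream dementia classifier.
        import torch
        import torch.nn as nn

        class GazeEncoder(nn.Module):
            """Encodes a (batch, 2, T) sequence of raw gaze (x, y) samples."""
            def __init__(self, feat_dim=64, n_activities=4):   # both values are illustrative
                super().__init__()
                self.backbone = nn.Sequential(
                    nn.Conv1d(2, 32, kernel_size=7, padding=3), nn.ReLU(),
                    nn.MaxPool1d(2),
                    nn.Conv1d(32, feat_dim, kernel_size=7, padding=3), nn.ReLU(),
                    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                )
                # Pretext head: which of the n_activities tasks produced this trace?
                self.pretext_head = nn.Linear(feat_dim, n_activities)

            def forward(self, x):
                feats = self.backbone(x)          # self-supervised representation
                return feats, self.pretext_head(feats)

        # Pretext training on healthy participants (labels = activity shown on screen).
        encoder = GazeEncoder()
        optimiser = torch.optim.Adam(encoder.parameters(), lr=1e-3)
        loss_fn = nn.CrossEntropyLoss()
        gaze = torch.randn(8, 2, 500)             # dummy batch: 8 traces, 500 samples each
        activity = torch.randint(0, 4, (8,))      # dummy activity labels
        for _ in range(5):
            _, logits = encoder(gaze)
            loss = loss_fn(logits, activity)
            optimiser.zero_grad()
            loss.backward()
            optimiser.step()

        # Downstream step: reuse the frozen encoder's features for dementia vs. control.
        with torch.no_grad():
            features, _ = encoder(gaze)           # (8, 64) learned representations

        In the paper, LRP is additionally applied to the trained pretext network to explain which parts of the gaze trace drive each activity decision; that explanation step is omitted from this sketch.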
        Publication date
        2020-11
        Published in
        IEEE Journal of Biomedical and Health Informatics
        Published version
        https://doi.org/10.1109/JBHI.2020.3004686
        Other links
        http://hdl.handle.net/2299/23813