dc.contributor.author | Khan, Zohaib Ahmad | |
dc.contributor.author | Xia, Yuanqing | |
dc.contributor.author | Aurangzeb, Khursheed | |
dc.contributor.author | Khaliq, Fiza | |
dc.contributor.author | Alam, Mahmood | |
dc.contributor.author | Khan, Javed Ali | |
dc.contributor.author | Anwar, Muhammad Shahid | |
dc.date.accessioned | 2024-04-25T08:15:02Z | |
dc.date.available | 2024-04-25T08:15:02Z | |
dc.date.issued | 2024-03-29 | |
dc.identifier.citation | Khan, Z. A., Xia, Y., Aurangzeb, K., Khaliq, F., Alam, M., Khan, J. A. & Anwar, M. S. 2024, 'Emotion detection from handwriting and drawing samples using an attention-based transformer model', PeerJ Computer Science, vol. 10, e1887, pp. 1-23. https://doi.org/10.7717/peerj-cs.1887 | |
dc.identifier.issn | 2376-5992 | |
dc.identifier.other | ORCID: /0000-0003-3306-1195/work/158538218 | |
dc.identifier.other | PubMedCentral: PMC11041987 | |
dc.identifier.uri | http://hdl.handle.net/2299/27799 | |
dc.description | © 2024 The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/ | |
dc.description.abstract | Emotion detection (ED) involves the identification and understanding of an individual’s emotional state through various cues such as facial expressions, voice tone, physiological changes, and behavioral patterns. In this context, behavioral analysis is employed to observe actions and behaviors for emotional interpretation. This work specifically employs behavioral metrics such as drawing and handwriting to determine a person’s emotional state, recognizing these actions as physical functions that integrate motor and cognitive processes. The study proposes an attention-based transformer model as an innovative approach to identifying emotions from handwriting and drawing samples, thereby extending the capabilities of ED into the domains of fine motor skills and artistic expression. The raw data consist of a set of points corresponding to the handwriting or drawing strokes. Each stroke point is then fed to the attention-based transformer model, which embeds it into a high-dimensional vector space. Using self-attention, the model integrates the most salient components and patterns in the input sequence to predict the emotional state of the person who produced the sample. The proposed approach has a distinct advantage over conventional recurrent neural networks (RNNs) in its enhanced capacity to capture long-range dependencies. This characteristic makes it particularly well suited to the precise identification of emotions from handwriting and drawing samples, marking a notable advancement in the field of emotion detection. The proposed method achieved state-of-the-art results of 92.64% on the benchmark dataset EMOTHAW (Emotion Recognition via Handwriting and Drawing). | en |
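The abstract describes a pipeline in which each stroke point is embedded into a high-dimensional vector and a self-attention encoder aggregates the sequence to predict an emotional state. The Python (PyTorch) sketch below only illustrates that general idea; it is not the authors' published code, and the module name, per-point feature count, number of emotion classes, and mean-pooling choice are assumptions. Positional encoding is omitted for brevity.

    # Illustrative sketch only; names and hyperparameters are assumptions,
    # not the implementation reported in the paper.
    import torch
    import torch.nn as nn

    class StrokeEmotionTransformer(nn.Module):
        def __init__(self, stroke_features=4, d_model=128, nhead=8,
                     num_layers=4, num_emotions=3):
            super().__init__()
            # Embed each stroke point (e.g., x, y, pressure, pen state)
            # into a high-dimensional vector space.
            self.embed = nn.Linear(stroke_features, d_model)
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=d_model, nhead=nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer,
                                                 num_layers=num_layers)
            self.classifier = nn.Linear(d_model, num_emotions)

        def forward(self, strokes, padding_mask=None):
            # strokes: (batch, sequence_length, stroke_features)
            x = self.embed(strokes)
            x = self.encoder(x, src_key_padding_mask=padding_mask)
            # Mean-pool the attended sequence, then classify the emotion.
            return self.classifier(x.mean(dim=1))

    # Usage example: a batch of 2 samples, each with 256 stroke points.
    model = StrokeEmotionTransformer()
    logits = model(torch.randn(2, 256, 4))
    print(logits.shape)  # torch.Size([2, 3])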
dc.format.extent | 23 | |
dc.format.extent | 2394452 | |
dc.language.iso | eng | |
dc.relation.ispartof | PeerJ Computer Science | |
dc.subject | Behavioral biometrics | |
dc.subject | Emotion detection | |
dc.subject | Emotional intelligence | |
dc.subject | Emotional state recognition | |
dc.subject | Handwriting/Drawing analysis | |
dc.subject | Human-Computer Interaction | |
dc.subject | Transformer model | |
dc.subject | General Computer Science | |
dc.title | Emotion detection from handwriting and drawing samples using an attention-based transformer model | en |
dc.contributor.institution | School of Physics, Engineering & Computer Science | |
dc.contributor.institution | Biocomputation Research Group | |
dc.contributor.institution | Department of Computer Science | |
dc.contributor.institution | Cybersecurity and Computing Systems | |
dc.description.status | Peer reviewed | |
dc.identifier.url | http://www.scopus.com/inward/record.url?scp=85190278271&partnerID=8YFLogxK | |
rioxxterms.versionofrecord | 10.7717/peerj-cs.1887 | |
rioxxterms.type | Journal Article/Review | |
herts.preservation.rarelyaccessed | true | |