
dc.contributor.author: Dar, Muhammad Najam
dc.contributor.author: Akram, Muhammad Usman
dc.contributor.author: Khawaja, Sajid Gul
dc.contributor.author: Pujari, Amit N.
dc.date.accessioned: 2020-08-29T00:07:53Z
dc.date.available: 2020-08-29T00:07:53Z
dc.date.issued: 2020-08-14
dc.identifier.citation: Dar, M N, Akram, M U, Khawaja, S G & Pujari, A N 2020, 'CNN and LSTM-Based Emotion Charting Using Physiological Signals', Sensors, vol. 20, no. 16, 4551, pp. 1-26. https://doi.org/10.3390/s20164551
dc.identifier.issn: 1424-3210
dc.identifier.other: ORCID: /0000-0003-1688-4448/work/79522438
dc.identifier.uri: http://hdl.handle.net/2299/23093
dc.description.abstract: Novel trends in affective computing are based on reliable sources of physiological signals such as Electroencephalogram (EEG), Electrocardiogram (ECG), and Galvanic Skin Response (GSR). The use of these signals provides challenges of performance improvement within a broader set of emotion classes in a less constrained real-world environment. To overcome these challenges, we propose a computational framework of 2D Convolutional Neural Network (CNN) architecture for the arrangement of 14 channels of EEG, and a combination of Long Short-Term Memory (LSTM) and 1D-CNN architecture for ECG and GSR. Our approach is subject-independent and incorporates two publicly available datasets, DREAMER and AMIGOS, with low-cost, wearable sensors to extract physiological signals suitable for real-world environments. The results outperform state-of-the-art approaches for classification into four classes, namely High Valence—High Arousal, High Valence—Low Arousal, Low Valence—High Arousal, and Low Valence—Low Arousal. An average emotion elicitation accuracy of 98.73% is achieved with the ECG right-channel modality, 76.65% with the EEG modality, and 63.67% with the GSR modality for AMIGOS. The overall highest accuracy of 99.0% for the AMIGOS dataset and 90.8% for the DREAMER dataset is achieved with multi-modal fusion. A strong correlation between spectral- and hidden-layer feature analysis and classification performance suggests the efficacy of the proposed method for significant feature extraction and higher emotion elicitation performance in a broader context of less constrained environments.
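The abstract describes two input paths: 14 EEG channels arranged as a 2D input for a 2D CNN, and ECG/GSR time series processed by a 1D CNN combined with an LSTM. The following is a minimal, illustrative NumPy sketch of the shape handling involved; it is not the authors' code, and the window lengths and the 5-tap filter are hypothetical stand-ins for the paper's actual preprocessing.

```python
import numpy as np

def conv1d_valid(signal, kernel):
    """Plain 'valid'-mode 1D convolution (cross-correlation) in NumPy."""
    n = len(signal) - len(kernel) + 1
    return np.array([np.dot(signal[i:i + len(kernel)], kernel) for i in range(n)])

# EEG path: 14 channels x 128 samples, arranged as a 2D map with a
# leading channel axis, as a 2D CNN would expect.
eeg = np.random.randn(14, 128)
eeg_2d_input = eeg[np.newaxis, ...]        # shape (1, 14, 128)

# ECG/GSR path: one 1D time series; a single convolution stands in for
# the 1D-CNN feature extractor whose output would feed the LSTM.
ecg = np.random.randn(256)
kernel = np.ones(5) / 5                    # hypothetical 5-tap smoothing filter
features = conv1d_valid(ecg, kernel)       # length 256 - 5 + 1 = 252

print(eeg_2d_input.shape)                  # (1, 14, 128)
print(features.shape)                      # (252,)
```

In the paper's full pipeline these two feature streams are fused for the final four-class valence/arousal classification; here the sketch only shows how the raw modalities are shaped before entering their respective networks.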
dc.format.extent: 26
dc.format.extent: 1210456
dc.language.iso: eng
dc.relation.ispartof: Sensors
dc.subject: Convolutional neural network (CNN)
dc.subject: Deep neural network
dc.subject: ECG
dc.subject: EEG
dc.subject: Emotion recognition
dc.subject: GSR
dc.subject: Long short-term memory (LSTM)
dc.subject: Physiological signals
dc.subject: Analytical Chemistry
dc.subject: Biochemistry
dc.subject: Atomic and Molecular Physics, and Optics
dc.subject: Instrumentation
dc.subject: Electrical and Electronic Engineering
dc.title: CNN and LSTM-Based Emotion Charting Using Physiological Signals
dc.contributor.institution: School of Physics, Engineering & Computer Science
dc.contributor.institution: Department of Engineering and Technology
dc.contributor.institution: Centre for Engineering Research
dc.contributor.institution: BioEngineering
dc.description.status: Peer reviewed
dc.identifier.url: http://www.scopus.com/inward/record.url?scp=85089387052&partnerID=8YFLogxK
rioxxterms.versionofrecord: 10.3390/s20164551
rioxxterms.type: Journal Article/Review
herts.preservation.rarelyaccessed: true