dc.contributor.author | Zhang, Mengxian | |
dc.contributor.author | Wang, Chunhua | |
dc.contributor.author | Sun, Yichuang | |
dc.contributor.author | Li, Tao | |
dc.date.accessioned | 2024-03-25T13:32:31Z | |
dc.date.available | 2024-03-25T13:32:31Z | |
dc.date.issued | 2024-03-30 | |
dc.identifier.citation | Zhang, M., Wang, C., Sun, Y. & Li, T. 2024, 'Memristive PAD three-dimensional emotion generation system based on D-S evidence theory', Nonlinear Dynamics, vol. 112, no. 6, pp. 4841-4861. https://doi.org/10.1007/s11071-023-09264-2 | |
dc.identifier.issn | 0924-090X | |
dc.identifier.uri | http://hdl.handle.net/2299/27557 | |
dc.description | © 2024, The Author(s), under exclusive licence to Springer Nature B.V. This is the accepted manuscript version of an article which has been published in final form at https://doi.org/10.1007/s11071-023-09264-2 | |
dc.description.abstract | In this work, a Pleasure–Arousal–Dominance (PAD) three-dimensional brain-like emotion generation system is proposed by simulating the brain tissue structures involved in emotion generation in the brain’s limbic system. The system utilizes volatile memristors to simulate the activation and recovery process of neurons, and non-volatile memristors to simulate synaptic weight changes. It combines the brain emotion learning model and the biological long short-term memory model to simulate the emotion generation process in the brain. The system employs Dempster–Shafer (D–S) evidence theory for multimodal feature fusion, ultimately representing the generated human-like emotions in the PAD three-dimensional emotion expression space. Considering the differences in emotional information represented in each dimension of the PAD emotion expression space, this work proposes using D–S evidence theory to calculate the weight values of the multimodal evidence and of each dimension of the emotion signals. The system then performs weighted summation for multimodal feature fusion, which is more biologically plausible and realistic. As a result, the generated emotion signals are more accurate, and the PAD three-dimensional emotion expression model enhances the capability and richness of emotion expression. The system processes multimodal input signals (text, speech, and visual signals) to generate three-dimensional emotion signals (pleasure, arousal, and dominance), which correspond to specific emotions in a three-dimensional space. These signals can be visually represented as facial images using MATLAB. The PSPICE simulation results indicate a nonlinear mapping relationship between the system’s input and output and show that different inputs can generate distinct human-like emotions. | en
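The abstract describes fusing multimodal evidence with D–S evidence theory before a weighted summation. As a point of reference only, the sketch below shows the generic Dempster–Shafer combination rule for two mass functions over a toy two-hypothesis emotion frame; the hypothesis names and mass values are illustrative assumptions and do not reflect the paper's memristive circuit-level implementation.

```python
# A minimal sketch of Dempster's rule of combination for two evidence sources
# (e.g. speech and visual modalities). All names and numbers are assumptions.
from itertools import product

def ds_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1, m2: dict mapping frozenset(hypotheses) -> mass, each summing to 1.
    Returns the combined assignment, normalised by the conflict factor K.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

# Illustrative masses over a toy frame {positive, negative}.
speech = {frozenset({"positive"}): 0.6,
          frozenset({"negative"}): 0.1,
          frozenset({"positive", "negative"}): 0.3}
visual = {frozenset({"positive"}): 0.5,
          frozenset({"negative"}): 0.2,
          frozenset({"positive", "negative"}): 0.3}

fused = ds_combine(speech, visual)
for hypothesis, mass in fused.items():
    print(sorted(hypothesis), round(mass, 3))
```

In the paper's setting, the combined masses would serve as weights in the subsequent weighted summation across the pleasure, arousal, and dominance dimensions; that mapping is not reproduced here.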
dc.format.extent | 21 | |
dc.format.extent | 2262588 | |
dc.language.iso | eng | |
dc.relation.ispartof | Nonlinear Dynamics | |
dc.subject | D–S evidence theory | |
dc.subject | Emotion generation | |
dc.subject | Memristor | |
dc.subject | PAD model | |
dc.subject | Mechanical Engineering | |
dc.subject | Aerospace Engineering | |
dc.subject | Ocean Engineering | |
dc.subject | Applied Mathematics | |
dc.subject | Electrical and Electronic Engineering | |
dc.subject | Control and Systems Engineering | |
dc.title | Memristive PAD three-dimensional emotion generation system based on D-S evidence theory | en |
dc.contributor.institution | School of Physics, Engineering & Computer Science | |
dc.contributor.institution | Department of Engineering and Technology | |
dc.contributor.institution | Centre for Engineering Research | |
dc.contributor.institution | Centre for Future Societies Research | |
dc.contributor.institution | Communications and Intelligent Systems | |
dc.description.status | Peer reviewed | |
dc.date.embargoedUntil | 2025-01-28 | |
dc.identifier.url | http://www.scopus.com/inward/record.url?scp=85183341821&partnerID=8YFLogxK | |
dc.identifier.url | https://link.springer.com/article/10.1007/s11071-023-09264-2#citeas | |
rioxxterms.versionofrecord | 10.1007/s11071-023-09264-2 | |
rioxxterms.type | Journal Article/Review | |
herts.preservation.rarelyaccessed | true | |