Utilizing emotion recognition technology to enhance user experience in real-time

  • Yuanyuan Xu College of Design and Innovation, Tongji University, Shanghai 200092, China
  • Yin-Shan Lin Khoury College of Computer Science, Northeastern University, Boston, MA 02115, United States
  • Xiaofan Zhou Department of Computer and Information Science and Engineering, University of Florida, Gainesville, FL 32611, United States
  • Xinyang Shan College of Design and Innovation, Tongji University, Shanghai 200092, China
Article ID: 1388
Keywords: emotion recognition; user experience; human-computer interaction

Abstract

In recent years, advancements in human-computer interaction (HCI) have led to the emergence of emotion recognition technology as a crucial tool for enhancing user engagement and satisfaction. This study investigates the application of emotion recognition technology in real-time environments to monitor and respond to users’ emotional states, creating more personalized and intuitive interactions. The research employs convolutional neural networks (CNNs) and long short-term memory (LSTM) networks to analyze facial expressions and voice emotions. The experimental design includes an experimental group that uses an emotion recognition system, which dynamically adjusts learning content based on detected emotional states, and a control group that uses a traditional online learning platform. The results show that real-time emotion monitoring and dynamic content adjustment significantly improve user experience, with the experimental group demonstrating better engagement, learning outcomes, and overall satisfaction. Quantitative results indicate that the emotion recognition system reduced task completion time by 14.3%, lowered error rates by 50%, and increased user satisfaction by 18.4%. These findings highlight the potential of emotion recognition technology to enhance user experiences. However, challenges remain, including the complexity of multimodal data integration, real-time processing requirements, and privacy and data security concerns. Addressing these challenges is crucial for the successful implementation and widespread adoption of this technology. The paper concludes that emotion recognition technology, by providing personalized and adaptive interactions, holds significant promise for improving user experience and offers valuable insights for future research and practical applications.
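As a rough illustration of the closed-loop adaptation the abstract describes (detected emotional state drives an adjustment of the learning content), the logic can be sketched as follows. This is not the authors' implementation: the state labels, confidence threshold, and difficulty scale are all illustrative assumptions.

```python
# Hypothetical sketch of emotion-driven content adjustment.
# Labels, threshold (0.6), and the 1-5 difficulty scale are assumptions,
# not details taken from the paper.

from dataclasses import dataclass


@dataclass
class EmotionEstimate:
    label: str         # e.g. "frustrated", "bored", "engaged"
    confidence: float  # fused score from the facial (CNN) and voice (LSTM) models


def adjust_content(estimate: EmotionEstimate, difficulty: int) -> int:
    """Return the next difficulty level (1-5) given the current emotion estimate."""
    if estimate.confidence < 0.6:
        return difficulty              # low confidence: leave the content unchanged
    if estimate.label == "frustrated":
        return max(1, difficulty - 1)  # ease off when the learner struggles
    if estimate.label == "bored":
        return min(5, difficulty + 1)  # raise the challenge to re-engage
    return difficulty                  # "engaged": stay the course


print(adjust_content(EmotionEstimate("frustrated", 0.9), 3))  # 2
print(adjust_content(EmotionEstimate("bored", 0.8), 3))       # 4
```

In a real system the `EmotionEstimate` would come from fusing the facial-expression and voice-emotion classifiers in each monitoring interval; the control group in the study simply never executes such an adjustment step.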

Published
2024-06-14
How to Cite
Xu, Y., Lin, Y.-S., Zhou, X., & Shan, X. (2024). Utilizing emotion recognition technology to enhance user experience in real-time. Computing and Artificial Intelligence, 2(1), 1388. https://doi.org/10.59400/cai.v2i1.1388