[introductory/intermediate] Vision-based Emotion AI
Emotions play a key role in human-human interaction and are becoming a key focus of future artificial intelligence. There is a growing need for emotionally intelligent interfaces that can read users' emotions and adapt their operation accordingly. Application areas include human-robot interaction, emotional chatbots, health and medicine, online learning, user and customer analysis, and security and safety. This lecture provides an introduction to emotionally intelligent interfaces and an overview of progress in related research, covering visual cues for emotion analysis ranging from expressed to suppressed and unseen: facial expression recognition, analysis of micro-expressions and micro-gestures, remote heart rate measurement from videos, and potential applications. Finally, some future challenges are outlined.
- Introduction to emotion AI
- Expressed facial expression analysis
- Suppressed visual cue study for hidden emotion understanding
- Remote heart rate measurement from video analysis and applications
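As a minimal illustration of the remote heart rate topic above (a classical rPPG baseline, not the end-to-end deep learning solution covered in the reading list): subtle periodic color changes of facial skin caused by blood flow can be recovered from video and the pulse frequency estimated from their spectrum. The sketch below assumes a precomputed per-frame mean green-channel trace of a face region; the function name and parameters are illustrative.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from a 1-D trace of per-frame
    mean green-channel values of a facial region of interest."""
    signal = np.asarray(green_means) - np.mean(green_means)  # remove DC level
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Restrict to the physiologically plausible band (~42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0

# Synthetic demo: 10 s of video at 30 fps with a 1.2 Hz (72 BPM) pulse
np.random.seed(0)
fps = 30
t = np.arange(0, 10, 1.0 / fps)
trace = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(len(t))
print(round(estimate_heart_rate(trace, fps)))  # 72
```

Real pipelines add face detection and tracking, skin segmentation, bandpass filtering, and robustness to motion and compression artifacts, which is where the deep learning methods in the references come in.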
- Xiaohua Huang, Abhinav Dhall, Roland Goecke, Matti Pietikäinen, and Guoying Zhao. Multi-modal Framework for Analyzing the Affect of a Group of People. IEEE Transactions on Multimedia, 20(10): 2706-2721, 2018.
- Xiaobai Li, Xiaopeng Hong, Antti Moilanen, Xiaohua Huang, Tomas Pfister, Guoying Zhao, and Matti Pietikäinen. Towards Reading Hidden Emotions: A Comparative Study of Spontaneous Micro-expression Spotting and Recognition Methods. IEEE Transactions on Affective Computing, 9(4): 563-577, 2018.
- Xianye Ben, Yi Ren, Junping Zhang, Su-Jing Wang, Kidiyo Kpalma, Weixiao Meng, and Yong-Jin Liu. Video-based Facial Micro-Expression Analysis: A Survey of Datasets, Features and Algorithms. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
- Yante Li, Xiaohua Huang, and Guoying Zhao. Joint Local and Global Information Learning with Single Apex Frame Detection for Micro-expression Recognition. IEEE Transactions on Image Processing, 30: 249-263, 2020.
- Zitong Yu, Xiaobai Li, and Guoying Zhao. Facial Video-based Physiological Signal Measurement: Recent Advances and Affective Applications. IEEE Signal Processing Magazine.
- Zitong Yu, Wei Peng, Xiaobai Li, Xiaopeng Hong, Guoying Zhao. Remote Heart Rate Measurement from Highly Compressed Facial Videos: an End-to-end Deep Learning Solution with Video Enhancement. ICCV 2019.
- Henglin Shi, Wei Peng, Haoyu Chen, Xin Liu, and Guoying Zhao. Multi-scale 3D Shift Graph Convolution Network for Emotion Recognition from Human Actions. IEEE Intelligent Systems, 2022.
- Xin Liu, Henglin Shi, Haoyu Chen, Zitong Yu, Xiaobai Li, Guoying Zhao. iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis. CVPR, 2021.
Basic mathematical skills at the bachelor's level, and basic knowledge of machine learning and computer vision.
Guoying Zhao received her PhD degree from the Chinese Academy of Sciences, Beijing, China, in 2005. She is a full professor (tenured since 2017) at the University of Oulu and currently an Academy Professor. She has authored or co-authored more than 260 papers in journals and conferences, with 16,400+ citations in Google Scholar and an h-index of 60. She is co-program chair of the ACM International Conference on Multimodal Interaction (ICMI 2021), was co-publicity chair of FG 2018, general chair of the International Conference on Biometric Engineering and Applications (ICBEA 2019, 2020), and co-chair for Late Breaking Results of ICMI 2019. She has served as an area chair for several conferences and is an associate editor for the Pattern Recognition, IEEE Transactions on Circuits and Systems for Video Technology, and Image and Vision Computing journals. She has lectured tutorials at FG 2018, ICPR 2006, ICCV 2009, and SCIA 2013, and has authored or edited three books and nine special issues in journals. Dr. Zhao has co-chaired many international workshops at ICCV, CVPR, ECCV, ACCV, and BMVC. Her research has been reported by Finnish TV programs, newspapers, and MIT Technology Review. She is an IAPR Fellow and AAIA Fellow.