Abstract
Empathy is an important factor in human communication. For a robot to express a matching emotion in human–robot communication, it must be able to understand human feelings. In this study, we therefore aimed to improve the human impression of a robot by having the robot display human-like facial expressions that synchronize with the person's biological information and change in real time. We first measured and estimated human emotions using emotion estimation based on biological information (brain waves and heartbeats). Three emotion estimation methods were proposed and evaluated in a preliminary experiment, and the one that yielded the highest impression rating, the method based on the emotional value in each measurement cycle, was selected for the second experiment. We then developed a robot that displays expressions in two patterns: (1) synchronized emotion (the same emotion the subject conveyed) and (2) inverse emotion (the opposite of the subject's emotion). Subjects rated the robot's expression under both patterns using the semantic differential (SD) method while their biological information was measured and their emotion was estimated with the method selected in the preliminary experiment. Both the SD ratings and the biological information showed that when a person felt happiness and the robot synchronized with and expressed the same emotion, intimacy between the human and the robot increased. These results suggest that the impression created by a robot's expressions can be improved by using biological information.
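To make the procedure concrete, the following is a minimal sketch of the expression-selection step, assuming that each measurement cycle yields a valence/arousal estimate that is classified into one of the four quadrants of Russell's circumplex model of affect (happy, angry, sad, relaxed), and that the inverse pattern displays the diagonally opposite quadrant. The names, thresholds, and signal-to-axis mapping below (BioSample, estimate_emotion, robot_expression) are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch of synchronized / inverse expression selection from per-cycle
# biosignal estimates. The mapping from sensors to the valence-arousal
# plane is hypothetical; the paper's calibration is not reproduced here.

from dataclasses import dataclass

# Quadrants of Russell's circumplex model of affect,
# keyed by (valence is positive, arousal is high).
QUADRANTS = {
    (True,  True):  "happy",    # positive valence, high arousal
    (False, True):  "angry",    # negative valence, high arousal
    (False, False): "sad",      # negative valence, low arousal
    (True,  False): "relaxed",  # positive valence, low arousal
}

# Diagonally opposite quadrant, used for the "inverse" pattern.
INVERSE = {"happy": "sad", "sad": "happy",
           "angry": "relaxed", "relaxed": "angry"}

@dataclass
class BioSample:
    valence: float  # e.g., derived from EEG indices; assumed range [-1, 1]
    arousal: float  # e.g., derived from heart rate; assumed range [-1, 1]

def estimate_emotion(sample: BioSample) -> str:
    """Classify one measurement cycle into a circumplex quadrant."""
    return QUADRANTS[(sample.valence >= 0.0, sample.arousal >= 0.0)]

def robot_expression(sample: BioSample, synchronized: bool = True) -> str:
    """Choose the facial expression the robot displays for this cycle."""
    emotion = estimate_emotion(sample)
    return emotion if synchronized else INVERSE[emotion]

if __name__ == "__main__":
    cycle = BioSample(valence=0.6, arousal=0.4)        # hypothetical reading
    print(robot_expression(cycle, synchronized=True))   # -> happy
    print(robot_expression(cycle, synchronized=False))  # -> sad
```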
About this article
Cite this article
Sripian, P., Mohd Anuardi, M.N.A., Kajihara, Y. et al. Empathetic robot evaluation through emotion estimation analysis and facial expression synchronization from biological information. Artif Life Robotics 26, 379–389 (2021). https://doi.org/10.1007/s10015-021-00696-w