
A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI

Journal of Intelligent & Robotic Systems

Abstract

In Human-Robot Interaction (HRI), robots should be socially intelligent. They should be able to respond appropriately to human affective and social cues in order to engage effectively in bi-directional communication. Social intelligence would allow a robot to relate to, understand, and interact with people, and to share information with them, in real-world human-centered environments. This survey paper presents an encompassing review of existing automated affect recognition and classification systems for social robots engaged in various HRI settings. Human-affect detection from facial expressions, body language, voice, and physiological signals is investigated, as well as detection from combinations of these modes. The automated systems are described in terms of their corresponding robotic and HRI applications, the sensors they employ, and the feature-detection techniques and affect-classification strategies they utilize. The paper also discusses pertinent future research directions for promoting the development of socially intelligent robots capable of recognizing, classifying, and responding to human affective states during real-time HRI.
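Many of the surveyed systems share a common two-stage structure: affective features are first extracted from a sensing mode (face, body, voice, or physiology), then mapped to affect categories or dimensions by a trained classifier. The sketch below illustrates that structure in Python with scikit-learn; the feature values, label set, and the choice of a support-vector machine are assumptions for demonstration only, not the method of any particular surveyed system.

```python
# Illustrative sketch (not from the paper) of the two-stage pipeline common
# to the surveyed systems: hand-crafted affective features are extracted
# from a sensing mode, then mapped to discrete affect labels by a classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical feature vectors, e.g., facial action-unit intensities,
# prosodic statistics, or skin-conductance measures (two samples per class).
X_train = np.array([
    [0.80, 0.10, 0.20], [0.75, 0.15, 0.25],  # -> "happy"
    [0.10, 0.90, 0.30], [0.15, 0.85, 0.35],  # -> "angry"
    [0.20, 0.20, 0.90], [0.25, 0.15, 0.85],  # -> "surprised"
])
y_train = ["happy", "happy", "angry", "angry", "surprised", "surprised"]

# Standardize the features, then train a support-vector classifier --
# one of several classifier families reviewed in the survey.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)

# Classify a new observation.
print(model.predict(np.array([[0.70, 0.20, 0.30]])))  # e.g., ['happy']
```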



Author information


Corresponding author

Correspondence to Beno Benhabib.


About this article


Cite this article

McColl, D., Hong, A., Hatakeyama, N. et al. A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI. J Intell Robot Syst 82, 101–133 (2016). https://doi.org/10.1007/s10846-015-0259-2

