Link to original content: https://api.crossref.org/works/10.20965/JRM.2020.P0097
{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,8,7]],"date-time":"2024-08-07T01:19:29Z","timestamp":1722993569173},"reference-count":40,"publisher":"Fuji Technology Press Ltd.","issue":"1","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["JRM","J. Robot. Mechatron."],"published-print":{"date-parts":[[2020,2,20]]},"abstract":"When a robot works among people in a public space, \tits behavior can make \tsome people feel uncomfortable. \tOne of the reasons for this \tis that it is difficult for people to understand \tthe robot\u2019s intended behavior based on its appearance. \tThis paper presents \ta \tnew intention expression method \tusing a three dimensional computer graphics (3D CG) face model. \tThe 3D CG face model is displayed on a flat panel screen \tand has two eyes and a head that can be rotated freely. \tWhen the mobile robot is about to change its traveling direction, \tthe robot rotates its head and eyes in the direction \tit intends to go, \tso that \tan oncoming person \tcan know the robot\u2019s intention \tfrom this previous announcement. \tThree main types of experiment were conducted, to confirm \tthe validity and effectiveness of \tour proposed previous announcement method \tusing the face interface. \tFirst, an appropriate timing for the previous announcement \twas determined from impression evaluations \tas a preliminary experiment. \tSecondly, \tdifferences between two experiments, \tin which a pedestrian and the robot \tpassed each other in a corridor \tboth with and without the previous announcement, \twere evaluated \tas main experiments of this study. \tFinally, \tdifferences between our proposed face interface \tand the conventional robot head were analyzed \tas a reference experiments. \tThe experimental results confirmed the validity and effectiveness of \tthe proposed method.<\/jats:p>","DOI":"10.20965\/jrm.2020.p0097","type":"journal-article","created":{"date-parts":[[2020,2,19]],"date-time":"2020-02-19T10:02:06Z","timestamp":1582106526000},"page":"97-112","source":"Crossref","is-referenced-by-count":4,"title":["Previous Announcement Method Using 3D CG Face Interface for Mobile Robot"],"prefix":"10.20965","volume":"32","author":[{"given":"Masahiko","family":"Mikawa","sequence":"first","affiliation":[]},{"given":"Jiayi","family":"Lyu","sequence":"additional","affiliation":[]},{"given":"Makoto","family":"Fujisawa","sequence":"additional","affiliation":[]},{"given":"Wasuke","family":"Hiiragi","sequence":"additional","affiliation":[]},{"given":"Toyoyuki","family":"Ishibashi","sequence":"additional","affiliation":[]},{"name":"University of Tsukuba 1-2 Kasuga, Tsukuba, Ibaraki 305-8550, Japan","sequence":"additional","affiliation":[]},{"name":"Chubu University 1200 Matsumoto-cho, Kasugai, Aichi 487-8501, Japan","sequence":"additional","affiliation":[]},{"name":"Gifu Women\u2019s University 80 Taromaru, Gifu, Gifu 501-2592, Japan","sequence":"additional","affiliation":[]}],"member":"8550","published-online":{"date-parts":[[2020,2,20]]},"reference":[{"key":"key-10.20965\/jrm.2020.p0097-1","unstructured":"T. Shibata, \u201cTherapeutic seal robot as biofeedback medical device: Qualitative and quantitative evaluations of robot therapy in dementia care,\u201d Proc. of the IEEE, Vol.100, No.8, pp. 2527-2538, 2012."},{"key":"key-10.20965\/jrm.2020.p0097-2","doi-asserted-by":"crossref","unstructured":"T. Shibata, \u201cAibo: Toward the era of digital creatures,\u201d The Int. J. 
of Robotics Research, Vol.20, No.10, pp. 781-794, 2001.","DOI":"10.1177\/02783640122068092"},{"key":"key-10.20965\/jrm.2020.p0097-3","unstructured":"D. Wooden, M. Malchano, K. Blankespoor, A. Howardy, A. A. Rizzi, and M. Raibert, \u201cAutonomous navigation for BigDog,\u201d Proc. of 2010 IEEE Int. Conf. on Robotics and Automation, pp. 4736-4741, 2010."},{"key":"key-10.20965\/jrm.2020.p0097-4","doi-asserted-by":"crossref","unstructured":"J. Jones, \u201cRobots at the tipping point: the road to irobot roomba,\u201d IEEE Robotics Automation Magazine, Vol.13, No.1, pp. 76-78, 2006.","DOI":"10.1109\/MRA.2006.1598056"},{"key":"key-10.20965\/jrm.2020.p0097-5","doi-asserted-by":"crossref","unstructured":"S. Cremer, L. Mastromoro, and D. O. Popa, \u201cOn the performance of the baxter research robot,\u201d Proc. of 2016 IEEE Int. Symp. on Assembly and Manufacturing (ISAM 2016), pp. 106-111, 2016.","DOI":"10.1109\/ISAM.2016.7750722"},{"key":"key-10.20965\/jrm.2020.p0097-6","doi-asserted-by":"crossref","unstructured":"A. K. Pandey and R. Gelin, \u201cA mass-produced sociable humanoid robot: Pepper: The first machine of its kind,\u201d IEEE Robotics Automation Magazine, Vol.25, No.3, pp. 40-48, 2018.","DOI":"10.1109\/MRA.2018.2833157"},{"key":"key-10.20965\/jrm.2020.p0097-7","doi-asserted-by":"crossref","unstructured":"T. Kanda, \u201cEnabling Harmonized Human-Robot Interaction in a Public Space,\u201d pp. 115-137, Springer Japan, 2017.","DOI":"10.1007\/978-4-431-56535-2_4"},{"key":"key-10.20965\/jrm.2020.p0097-8","unstructured":"E. Guizzo, \u201cCynthia Breazeal unveils Jibo, a social robot for the home,\u201d IEEE SPECTRUM, July 16, 2014."},{"key":"key-10.20965\/jrm.2020.p0097-9","unstructured":"J. K. Westlund et al., \u201cTega: A social robot,\u201d Proc. of 2016 11th ACM\/IEEE Int. Conf. on Human-Robot Interaction (HRI 2016), pp. 561-561, 2016."},{"key":"key-10.20965\/jrm.2020.p0097-10","unstructured":"K. Kaneko, H. Kaminaga, T. Sakaguchi, S. Kajita, M. Morisawa, I. Kumagai, and F. Kanehiro, \u201cHumanoid robot HRP-5P: An electrically actuated humanoid robot with high-power and wide-range joints,\u201d IEEE Robotics and Automation Letters, Vol.4, No.2, pp. 1431-1438, 2019."},{"key":"key-10.20965\/jrm.2020.p0097-11","unstructured":"N. A. Radford et al., \u201cValkyrie: NASA\u2019s first bipedal humanoid robot,\u201d J. of Field Robotics, Vol.32, No.3, pp. 397-419, 2015."},{"key":"key-10.20965\/jrm.2020.p0097-12","doi-asserted-by":"crossref","unstructured":"M. Mikawa, Y. Yoshikawa, and M. Fujisawa, \u201cExpression of intention by rotational head movements for teleoperated mobile robot,\u201d Proc. of 2018 IEEE 15th Int. Workshop on Advanced Motion Control (AMC2018), pp. 249-254, 2018.","DOI":"10.1109\/AMC.2019.8371097"},{"key":"key-10.20965\/jrm.2020.p0097-13","unstructured":"D. Todorovi\u0107, \u201cGeometrical basis of perception of gaze direction,\u201d Vision Research, Vol.46, No.21, pp. 3549-3562, 2006."},{"key":"key-10.20965\/jrm.2020.p0097-14","unstructured":"I. Kawaguchi, H. Kuzuoka, and Y. Suzuki, \u201cStudy on gaze direction perception of face image displayed on rotatable flat display,\u201d Proc. of the 33rd Annual ACM Conf. on Human Factors in Computing Systems (CHI\u201915), pp. 1729-1737, 2015."},{"key":"key-10.20965\/jrm.2020.p0097-15","unstructured":"D. Miyauchi, A. Nakamura, and Y. Kuno, \u201cBidirectional eye contact for human-robot communication,\u201d IEICE \u2013 Trans. on Information and Systems, Vol.E88-D, No.11, pp. 
2509-2516, 2005."},{"key":"key-10.20965\/jrm.2020.p0097-16","doi-asserted-by":"crossref","unstructured":"I. Shindev, Y. Sun, M. Coovert, J. Pavlova, and T. Lee, \u201cExploration of intention expression for robots,\u201d Proc. of the 7th ACM\/IEEE Int. Conf. on Human-Robot\tInteraction (HRI2012), pp. 247-248, 2012.","DOI":"10.1145\/2157689.2157778"},{"key":"key-10.20965\/jrm.2020.p0097-17","unstructured":"T. Matsumaru, \u201cComparison of Displaying with Vocalizing on Preliminary-Announcement of Mobile Robot Upcoming Operation,\u201d C. Ciufudean and L. Garcia (Eds.), \u201cAdvances in Robotics \u2013 Modeling, Control and Applications,\u201d pp. 133-147, iConcept Press, 2013."},{"key":"key-10.20965\/jrm.2020.p0097-18","doi-asserted-by":"crossref","unstructured":"S. Andrist, X. Z. Tan, M. Gleicher, and B. Mutlu, \u201cConversational gaze aversion for humanlike robots,\u201d Proc. of the 2014 ACM\/IEEE Int. Conf. on Human-robot Interaction, pp. 25-32, 2014.","DOI":"10.1145\/2559636.2559666"},{"key":"key-10.20965\/jrm.2020.p0097-19","doi-asserted-by":"crossref","unstructured":"H. Admoni and B. Scassellati, \u201cSocial eye gaze in human-robot interaction: A review,\u201d J. of Human-Robot Interaction, Vol.6, No.1, pp. 25-63, 2017.","DOI":"10.5898\/JHRI.6.1.Admoni"},{"key":"key-10.20965\/jrm.2020.p0097-20","doi-asserted-by":"crossref","unstructured":"J. J. Gibson and A. D. Pick, \u201cPerception of another person\u2019s looking behavior,\u201d The American J. of Psychology, Vol.76, No.3, pp. 386-394, 1963.","DOI":"10.2307\/1419779"},{"key":"key-10.20965\/jrm.2020.p0097-21","doi-asserted-by":"crossref","unstructured":"S. M. Anstis, J. W. Mayhew, and T. Morley, \u201cThe perception of where a face or television \u2018portrait\u2019 is looking,\u201d The American J. of Psychology, Vol.82, No.4, pp. 474-489, 1969.","DOI":"10.2307\/1420441"},{"key":"key-10.20965\/jrm.2020.p0097-22","doi-asserted-by":"crossref","unstructured":"H. Hecht, E. Boyarskaya, and A. Kitaoka, \u201cThe Mona Lisa effect: Testing the limits of perceptual robustness vis-\u00e0-vis slanted images,\u201d Psihologija, Vol.47, pp. 287-301, 2014.","DOI":"10.2298\/PSI1403287H"},{"key":"key-10.20965\/jrm.2020.p0097-23","unstructured":"N. L. Kluttz, B. R. Mayes, R. W. West, and D. S. Kerby, \u201cThe effect of head turn on the perception of gaze,\u201d Vision Research, Vol.49, No.15, pp. 1979-1993, 2009."},{"key":"key-10.20965\/jrm.2020.p0097-24","unstructured":"S. A. Moubayed, J. Edlund, and J. Beskow, \u201cTaming Mona Lisa: Communicating gaze faithfully in 2d and 3d facial projections,\u201d ACM Trans. on Interactive Intelligent Systems (TiiS), Vol.1, No.2, pp. 11:1-11:25, 2012."},{"key":"key-10.20965\/jrm.2020.p0097-25","doi-asserted-by":"crossref","unstructured":"M. Gonzalez-Franco and P. A. Chou, \u201cNon-linear modeling of eye gaze perception as a function of gaze and head direction,\u201d Proc. of 2014 6th Int. Workshop on Quality of Multimedia Experience (QoMEX), pp. 275-280, 2014.","DOI":"10.1109\/QoMEX.2014.6982331"},{"key":"key-10.20965\/jrm.2020.p0097-26","unstructured":"J. Rollo, \u201cTracking for a roboceptionist,\u201d Master\u2019s thesis, School of Computer Science, Computer Science Department, Carnegie Mellon University, 2007."},{"key":"key-10.20965\/jrm.2020.p0097-27","doi-asserted-by":"crossref","unstructured":"D. Fox, W. Burgardy, and S. Thrun, \u201cThe dynamic window approach to collision avoidance,\u201d IEEE Robotics Automation Magazine, Vol.4, No.1, pp. 
23-33, 1997.","DOI":"10.1109\/100.580977"},{"key":"key-10.20965\/jrm.2020.p0097-28","doi-asserted-by":"crossref","unstructured":"E. Pacchierotti, H. I. Christensen, and P. Jensfelt, \u201cEvaluation of passing distance for social robots,\u201d Proc. of the 15th IEEE Int. Symp. on Robot and Human Interactive Communication (RO-MAN 2006), pp. 315-320, 2006.","DOI":"10.1109\/ROMAN.2006.314436"},{"key":"key-10.20965\/jrm.2020.p0097-29","unstructured":"J. Redmon and A. Farhadi, \u201cYOLOv3: An incremental improvement,\u201d arXiv, abs\/1804.02767, 2018."},{"key":"key-10.20965\/jrm.2020.p0097-30","unstructured":"E. T. Hall, \u201cThe Hidden Dimension,\u201d Anchor Books, 1966."},{"key":"key-10.20965\/jrm.2020.p0097-31","doi-asserted-by":"crossref","unstructured":"J. Lyu, M. Mikawa, M. Fujisawa, and W. Hiiragi, \u201cMobile robot with previous announcement of upcoming operation using face interface,\u201d Proc. of 2019 IEEE\/SICE Int. Symp. on System Integration (SII2019), pp. 782-787, 2019.","DOI":"10.1109\/SII.2019.8700334"},{"key":"key-10.20965\/jrm.2020.p0097-32","doi-asserted-by":"crossref","unstructured":"J. Minguez and L. Montano, \u201cNearness diagram (nd) navigation: collision avoidance in troublesome scenarios,\u201d IEEE Trans. on Robotics and Automation, Vol.20, No.1, pp. 45-59, 2004.","DOI":"10.1109\/TRA.2003.820849"},{"key":"key-10.20965\/jrm.2020.p0097-33","doi-asserted-by":"crossref","unstructured":"S. Masuko and J. Hoshino, \u201cHead-eye animation corresponding to a conversation for cg characters,\u201d Computer Graphics Forum, Vol.26, No.3, pp. 303-312, 2007.","DOI":"10.1111\/j.1467-8659.2007.01052.x"},{"key":"key-10.20965\/jrm.2020.p0097-34","unstructured":"T. Matsumaru, S. Kudo, T. Kusada, K. Iwase, K. Akiyama, and T. Ito, \u201cSimulation of preliminary-announcement and display of mobile robot\u2019s following action by lamp, party-blowouts, or beam-light,\u201d Proc. of IEEE\/ASME Int. Conf. on Advanced Intelligent Mechatronics (AIM 2003), Vol.2, pp. 771-777, 2003."},{"key":"key-10.20965\/jrm.2020.p0097-35","doi-asserted-by":"crossref","unstructured":"J. Xu and A. M. Howard, \u201cThe Impact of First Impressions on Human-Robot Trust During Problem-Solving Scenarios,\u201d Proc. of 2018 27th IEEE Int. Symp. on Robot and Human Interactive Communication (RO-MAN), pp. 435-441, 2018.","DOI":"10.1109\/ROMAN.2018.8525669"},{"key":"key-10.20965\/jrm.2020.p0097-36","doi-asserted-by":"crossref","unstructured":"K. Bergmann, F. Eyssel, and S. Kopp, \u201cA Second Chance to Make a First Impression? How Appearance and Nonverbal Behavior Affect Perceived Warmth and Competence of Virtual Agents over Time,\u201d Proc. of Int. Conf. on Intelligent Virtual Agents, pp. 126-138, 2012.","DOI":"10.1007\/978-3-642-33197-8_13"},{"key":"key-10.20965\/jrm.2020.p0097-37","doi-asserted-by":"crossref","unstructured":"C. Bartneck, E. Croft, and D. Kulic, \u201cMeasurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots,\u201d Int. J. of Social Robotics, Vol.1, No.1, pp. 71-81, 2009.","DOI":"10.1007\/s12369-008-0001-3"},{"key":"key-10.20965\/jrm.2020.p0097-38","doi-asserted-by":"crossref","unstructured":"E. Goffman, \u201cRelations in public, Chapter The Individual as a Unit,\u201d pp. 3-27, Harper & Row, Publishers, Inc., 1971.","DOI":"10.4324\/9781315128337-1"},{"key":"key-10.20965\/jrm.2020.p0097-39","unstructured":"E. Goffman, \u201cBehavior in Public Places, Chapter Face Engagements,\u201d pp. 
83-88, The Free Press Publishers, Inc., 1963."},{"key":"key-10.20965\/jrm.2020.p0097-40","doi-asserted-by":"crossref","unstructured":"M. Makatchev and R. Simmons, \u201cIncorporating a user model to improve detection of unhelpful robot answers,\u201d Proc. of Int. Symp. on Robot and Human Interactive Communication (RO-MAN 2009), pp. 973-978, 2009.","DOI":"10.1109\/ROMAN.2009.5326140"}],"container-title":["Journal of Robotics and Mechatronics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.fujipress.jp\/main\/wp-content\/themes\/Fujipress\/phyosetsu.php?ppno=ROBOT003200010010","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2020,2,19]],"date-time":"2020-02-19T10:04:08Z","timestamp":1582106648000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.fujipress.jp\/jrm\/rb\/robot003200010097"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,2,20]]},"references-count":40,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2020,2,20]]},"published-print":{"date-parts":[[2020,2,20]]}},"URL":"http:\/\/dx.doi.org\/10.20965\/jrm.2020.p0097","relation":{},"ISSN":["1883-8049","0915-3942"],"issn-type":[{"type":"electronic","value":"1883-8049"},{"type":"print","value":"0915-3942"}],"subject":[],"published":{"date-parts":[[2020,2,20]]}}}
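
For reference, the record above can be retrieved and parsed programmatically from the Crossref REST API endpoint shown in the link at the top of this page (https://api.crossref.org/works/{DOI}). The following is a minimal Python sketch, assuming the requests library is available and that network access to api.crossref.org is permitted; the field names it reads (DOI, title, author, reference-count) are exactly those present in the JSON above.

import requests

# DOI of the article described by the Crossref record above.
DOI = "10.20965/jrm.2020.p0097"

# Fetch the work record from the Crossref REST API (same endpoint as the
# "Link to original content" URL at the top of this page).
response = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=10)
response.raise_for_status()
message = response.json()["message"]

# Pull out a few of the fields shown in the record above. The "author" array
# mixes personal names with institution-only entries, so keep only entries
# that have a family name.
title = message["title"][0]
authors = [f"{a.get('given', '')} {a.get('family', '')}".strip()
           for a in message["author"] if "family" in a]
reference_count = message["reference-count"]

print(title)               # Previous Announcement Method Using 3D CG Face Interface for Mobile Robot
print(", ".join(authors))  # Mikawa, Lyu, Fujisawa, Hiiragi, Ishibashi
print(reference_count)     # 40

For anything beyond a one-off lookup, Crossref asks clients to identify themselves (for example, a contact mailto in the User-Agent header); that is omitted here for brevity.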