Printed from https://ideas.repec.org/a/igg/jse000/v3y2012i2p1-30.html

Towards Natural Emotional Expression and Interaction: Development of Anthropomorphic Emotion Expression and Interaction Robots

Authors

Listed:
  • Atsuo Takanishi (Waseda University, Japan)
  • Nobutsuna Endo (Waseda University, Japan)
  • Klaus Petersen (Waseda University, Japan)

Abstract

In the present research, the advanced fundamental mechanical capabilities of the anthropomorphic robots developed at the Takanishi Laboratory at Waseda University are enhanced to enable these robots to interact with humans in a natural way. The anthropomorphic robot KOBIAN is able to express human-like facial expressions and whole-body gestures. It is equipped with vision and audio sensors that allow it to react to interaction input from human partners and to generate an appropriate emotional expression in response. Furthermore, the anthropomorphic flute-playing robot WF-4RVI is technically able to perform on a musical wind instrument at the level of an intermediate human player. Building on this fundamental technical capability, the authors implemented a musical-based interaction system (MbIS) that enables the robot to play collaboratively with human musicians in a natural way. For both of the introduced interaction systems, the authors present and discuss the results of various experiments conducted to examine how closely interaction with a robot resembles realistic human-to-human interaction.
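The abstract describes a loop in which sensed interaction cues drive the selection of an emotional expression. As a purely illustrative sketch (not the authors' KOBIAN or MbIS implementation, whose details are not given here), one common way to realize such a mapping is to estimate the partner's affect on a valence/arousal plane and quantize it to a discrete expression label; the thresholds below are arbitrary assumptions.

```python
# Hypothetical sketch: mapping a 2D affect estimate to a discrete
# expression label. This is NOT the method from the article; it only
# illustrates the kind of sensing-to-expression mapping the abstract
# alludes to. Thresholds (0.3, 0.7) are arbitrary.

def select_expression(valence: float, arousal: float) -> str:
    """Pick a facial-expression label from a valence/arousal estimate.

    valence: -1.0 (negative) .. 1.0 (positive)
    arousal:  0.0 (calm)     .. 1.0 (excited)
    """
    if arousal < 0.3:          # too little activation to react to
        return "neutral"
    if valence >= 0.3:         # positive affect
        return "happiness" if arousal < 0.7 else "surprise"
    if valence <= -0.3:        # negative affect
        return "sadness" if arousal < 0.7 else "anger"
    return "neutral"           # ambiguous valence
```

In a real system the valence/arousal estimate would come from the vision and audio sensors mentioned in the abstract, and the label would drive the robot's facial and whole-body gesture controllers.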

Suggested Citation

  • Atsuo Takanishi & Nobutsuna Endo & Klaus Petersen, 2012. "Towards Natural Emotional Expression and Interaction: Development of Anthropomorphic Emotion Expression and Interaction Robots," International Journal of Synthetic Emotions (IJSE), IGI Global, vol. 3(2), pages 1-30, July.
  • Handle: RePEc:igg:jse000:v:3:y:2012:i:2:p:1-30
    Download full text from publisher

    File URL: http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/jse.2012070101
    Download Restriction: no