
An Automatic Mechanism to Recognize and Generate Emotional MIDI Sound Arts Based on Affective Computing Techniques

Author

Listed:
  • Hao-Chiang Koong Lin (National University of Tainan, Tainan, Taiwan)
  • Cong Jie Sun (National Taiwan Normal University, Taipei, Taiwan)
  • Bei Ni Su (National University of Tainan, Tainan, Taiwan)
  • Zu An Lin (National University of Tainan, Tainan, Taiwan)

Abstract

All kinds of art can be represented in digital form, and one of them is sound art, including orally transmitted ballads, classical music, religious music, popular music, and emerging computer music. Recently, affective computing has drawn a great deal of attention in academia; it has two aspects: physiological and psychological. Through a variety of sensing devices, the authors can capture behaviors that express feelings and emotions, and thus not only identify but also understand human emotions. This work focuses on exploring and producing a MAX/MSP computer program that generates emotional music automatically. It can also recognize the emotion conveyed when users play MIDI instruments and create corresponding visual effects. The authors pursue two major goals: (1) producing an art performance that combines dynamic visuals with auditory tune, and (2) making computers understand human emotions and interact through music by means of affective computing. The results of this study are as follows: (1) the authors design a mechanism that maps musical tone to recognized human emotion; (2) the authors develop a combination of affective computing and an automatic music generator; (3) the authors design a music system that can be used with a MIDI instrument and incorporated with other musical effects to enhance musicality; and (4) the authors assess and complete the emotion-discrimination mechanism so that mood music can be fed back accurately. The authors make computers simulate (or even possess) human emotion and obtain a relevant basis for more accurate sound feedback. The authors use the System Usability Scale to analyze and discuss the usability of the system. The average score of each item is clearly higher than the midpoint score (four points) for both the overall response and the musical performance when the “auto mood music generator” is used, and the average is above five points in each part of the Interaction and Satisfaction Scale. Subjects are willing to accept this interactive work, which shows that the work is usable and has potential for further development.
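The mechanism itself is implemented by the authors as a MAX/MSP patch, which has no textual source to quote here. Purely as an illustration of the kind of tone-to-emotion mapping the abstract describes, the following Python sketch classifies a played MIDI phrase into one of four coarse emotion quadrants using a valence proxy (major versus non-major pitch content) and an arousal proxy (note velocity and duration). Every feature, threshold, and label in it is an assumption made for illustration, not the authors' actual mechanism.

# Illustrative sketch only: one plausible way to map simple MIDI note features
# onto valence/arousal emotion quadrants. The thresholds, the assumed key of C,
# and the four labels are hypothetical, not taken from the paper.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Note:
    pitch: int       # MIDI note number, 0-127
    velocity: int    # MIDI velocity, 0-127
    duration: float  # note length in seconds

MAJOR_PITCH_CLASSES = {0, 2, 4, 5, 7, 9, 11}  # C-major scale degrees (assumed key)

def classify_emotion(notes: list[Note]) -> str:
    """Map a played phrase to one of four coarse emotion labels."""
    if not notes:
        return "calm"
    # Arousal proxy: louder and shorter (faster) notes push arousal up.
    arousal = mean(n.velocity for n in notes) / 127 - mean(n.duration for n in notes) / 2
    # Valence proxy: proportion of notes falling inside the assumed major scale.
    valence = sum(n.pitch % 12 in MAJOR_PITCH_CLASSES for n in notes) / len(notes)
    if valence >= 0.5:
        return "happy" if arousal > 0 else "calm"
    return "angry" if arousal > 0 else "sad"

if __name__ == "__main__":
    phrase = [Note(60, 100, 0.25), Note(64, 110, 0.25), Note(67, 105, 0.5)]
    print(classify_emotion(phrase))  # loud, fast major triad -> "happy"

In the paper's setting, a label produced by such a classifier would drive both the automatic music generator and the visual effects; here it is only printed, and the valence/arousal formulas are deliberately minimal stand-ins.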

Suggested Citation

  • Hao-Chiang Koong Lin & Cong Jie Sun & Bei Ni Su & Zu An Lin, 2013. "An Automatic Mechanism to Recognize and Generate Emotional MIDI Sound Arts Based on Affective Computing Techniques," International Journal of Online Pedagogy and Course Design (IJOPCD), IGI Global, vol. 3(3), pages 62-75, July.
  • Handle: RePEc:igg:jopcd0:v:3:y:2013:i:3:p:62-75

    Download full text from publisher

    File URL: http://services.igi-global.com/resolvedoi/resolve.aspx?doi=10.4018/ijopcd.2013070104
    Download Restriction: no

