
Auditory-Visual Object Recognition Time Suggests Specific Processing for Animal Sounds

Author

Listed:
  • Clara Suied
  • Isabelle Viaud-Delmon

Abstract

Background: Recognizing an object requires binding together several cues, which may be distributed across different sensory modalities, and ignoring competing information originating from other objects. In addition, knowledge of the semantic category of an object is fundamental to determine how we should react to it. Here we investigate the role of semantic categories in the processing of auditory-visual objects.

Methodology/Findings: We used an auditory-visual object-recognition task (go/no-go paradigm). We compared recognition times for two categories: a biologically relevant one (animals) and a non-biologically relevant one (means of transport). Participants were asked to react as fast as possible to target objects, presented in the visual and/or the auditory modality, and to withhold their response for distractor objects. A first main finding was that, when participants were presented with unimodal or bimodal congruent stimuli (an image and a sound from the same object), similar reaction times were observed for all object categories. Thus, there was no advantage in the speed of recognition for biologically relevant compared to non-biologically relevant objects. A second finding was that, in the presence of a biologically relevant auditory distractor, the processing of a target object was slowed down, whether or not it was itself biologically relevant. It seems impossible to effectively ignore an animal sound, even when it is irrelevant to the task.

Conclusions/Significance: These results suggest a specific and mandatory processing of animal sounds, possibly due to phylogenetic memory and consistent with the idea that hearing is particularly efficient as an alerting sense. They also highlight the importance of taking into account the auditory modality when investigating the way object concepts of biologically relevant categories are stored and retrieved.
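The analysis logic described above can be illustrated with a minimal sketch: group go/no-go trials by target category and distractor type, then compare mean reaction times across conditions. The trial records, category labels, and timing values below are invented for illustration only; this is not the authors' data or analysis code.

```python
# Hypothetical go/no-go reaction-time comparison, mirroring the two findings:
# (1) no RT advantage for animal vs. transport targets without distractors,
# (2) slower RTs whenever the auditory distractor is an animal sound.
from statistics import mean

# Each trial: (target_category, distractor_category or None, rt_ms, responded)
trials = [
    ("animal", None, 452, True),
    ("transport", None, 449, True),
    ("animal", "animal", 501, True),      # animal-sound distractor
    ("transport", "animal", 498, True),   # animal-sound distractor
    ("animal", "transport", 455, True),
    ("transport", "transport", 450, True),
]

def mean_rt(condition):
    """Mean RT over correct go trials matching a (target, distractor) filter."""
    return mean(rt for tgt, dis, rt, go in trials if go and condition(tgt, dis))

# No-distractor baseline: animal vs. transport targets (similar RTs expected)
baseline_animal = mean_rt(lambda t, d: t == "animal" and d is None)
baseline_transport = mean_rt(lambda t, d: t == "transport" and d is None)

# Interference: targets paired with animal vs. transport sound distractors
with_animal_distractor = mean_rt(lambda t, d: d == "animal")
with_transport_distractor = mean_rt(lambda t, d: d == "transport")

slowdown = with_animal_distractor - with_transport_distractor
print(baseline_animal, baseline_transport, slowdown)
```

In the real study the comparison would of course be run per participant with appropriate statistics; the sketch only shows how the conditions are partitioned.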

Suggested Citation

  • Clara Suied & Isabelle Viaud-Delmon, 2009. "Auditory-Visual Object Recognition Time Suggests Specific Processing for Animal Sounds," PLOS ONE, Public Library of Science, vol. 4(4), pages 1-9, April.
  • Handle: RePEc:plo:pone00:0005256
    DOI: 10.1371/journal.pone.0005256

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0005256
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0005256&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0005256?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Pascal Belin & Robert J. Zatorre & Philippe Lafaille & Pierre Ahad & Bruce Pike, 2000. "Voice-selective areas in human auditory cortex," Nature, Nature, vol. 403(6767), pages 309-312, January.
    2. Israel Nelken & Yaron Rotman & Omer Bar Yosef, 1999. "Responses of auditory-cortex neurons to structural features of natural sounds," Nature, Nature, vol. 397(6715), pages 154-157, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sukhbinder Kumar & Klaas E Stephan & Jason D Warren & Karl J Friston & Timothy D Griffiths, 2007. "Hierarchical Processing of Auditory Objects in Humans," PLOS Computational Biology, Public Library of Science, vol. 3(6), pages 1-9, June.
    2. Sam V Norman-Haignere & Josh H McDermott, 2018. "Neural responses to natural and model-matched stimuli reveal distinct computations in primary and nonprimary auditory cortex," PLOS Biology, Public Library of Science, vol. 16(12), pages 1-46, December.
    3. Patrícia Vanzella & E Glenn Schellenberg, 2010. "Absolute Pitch: Effects of Timbre on Note-Naming Ability," PLOS ONE, Public Library of Science, vol. 5(11), pages 1-7, November.
    4. Marie-Lou Barnaud & Jean-Luc Schwartz & Pierre Bessière & Julien Diard, 2019. "Computer simulations of coupled idiosyncrasies in speech perception and speech production with COSMO, a perceptuo-motor Bayesian model of speech communication," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-34, January.
    5. Jacob N Oppenheim & Pavel Isakov & Marcelo O Magnasco, 2013. "Degraded Time-Frequency Acuity to Time-Reversed Notes," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-6, June.
    6. Julie E Elie & Frédéric E Theunissen, 2019. "Invariant neural responses for sensory categories revealed by the time-varying information for communication calls," PLOS Computational Biology, Public Library of Science, vol. 15(9), pages 1-43, September.
    7. Emmanuel Bigand & Charles Delbé & Yannick Gérard & Barbara Tillmann, 2011. "Categorization of Extremely Brief Auditory Stimuli: Domain-Specific or Domain-General Processes?," PLOS ONE, Public Library of Science, vol. 6(10), pages 1-6, October.
    8. Daniel E Re & Jillian J M O'Connor & Patrick J Bennett & David R Feinberg, 2012. "Preferences for Very Low and Very High Voice Pitch in Humans," PLOS ONE, Public Library of Science, vol. 7(3), pages 1-8, March.
    9. Lingyun Zhao & Li Zhaoping, 2011. "Understanding Auditory Spectro-Temporal Receptive Fields and Their Changes with Input Statistics by Efficient Coding Principles," PLOS Computational Biology, Public Library of Science, vol. 7(8), pages 1-16, August.
    10. Kazuo Imaizumi & Nicholas J Priebe & Tatyana O Sharpee & Steven W Cheung & Christoph E Schreiner, 2010. "Encoding of Temporal Information by Timing, Rate, and Place in Cat Auditory Cortex," PLOS ONE, Public Library of Science, vol. 5(7), pages 1-15, July.
    11. Mina Sadeghi & Xiu Zhai & Ian H Stevenson & Monty A Escabí, 2019. "A neural ensemble correlation code for sound category identification," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-41, October.
    12. Carolin Brück & Christina Gößling-Arnold & Jürgen Wertheimer & Dirk Wildgruber, 2016. "The Inner Theater," SAGE Open, vol. 6(1), March.
    13. Klaus Wimmer & K Jannis Hildebrandt & R Matthias Hennig & Klaus Obermayer, 2008. "Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2," PLOS Computational Biology, Public Library of Science, vol. 4(9), pages 1-18, September.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0005256. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.