
Visuo-perceptual capabilities predict sensitivity for coinciding auditory and visual transients in multi-element displays

Author

Listed:
  • Hauke S Meyerhoff
  • Nina A Gehrer

Abstract

To obtain a coherent representation of the outside world, auditory and visual information is integrated during human information processing. Observers vary remarkably in their capability to integrate auditory and visual information. Here, we propose that visuo-perceptual capabilities predict detection performance for audiovisually coinciding transients in multi-element displays, owing to severe capacity limitations in audiovisual integration. In the reported experiment, we employed an individual-differences approach to investigate this hypothesis. To this end, we measured performance in a useful-field-of-view task that captures detection performance for briefly presented stimuli across a large perceptual field. Within the same participants, we also measured sensitivity for visual direction changes that coincide with tones. Our results show that individual differences in visuo-perceptual capabilities predicted sensitivity for the presence of audiovisually synchronous events among competing visual stimuli. To ensure that this correlation does not stem from superordinate factors, we also tested performance in an unrelated working memory task; performance in this task was independent of sensitivity for the presence of audiovisually synchronous events. Our findings strengthen the proposed link between visuo-perceptual capabilities and audiovisual integration. The results also suggest that basic visuo-perceptual capabilities provide the basis for the subsequent integration of auditory and visual information.

Suggested Citation

  • Hauke S Meyerhoff & Nina A Gehrer, 2017. "Visuo-perceptual capabilities predict sensitivity for coinciding auditory and visual transients in multi-element displays," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
  • Handle: RePEc:plo:pone00:0183723
    DOI: 10.1371/journal.pone.0183723

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0183723
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0183723&type=printable
    Download Restriction: no



    Most related items

These are the items that most often cite the same works as, and are cited by the same works as, this one.
    1. Pavel Kraikivski, 2022. "A Dynamic Mechanistic Model of Perceptual Binding," Mathematics, MDPI, vol. 10(7), pages 1-12, April.
    2. Yoshiko Yabe & Hama Watanabe & Gentaro Taga, 2011. "Treadmill Experience Alters Treadmill Effects on Perceived Visual Motion," PLOS ONE, Public Library of Science, vol. 6(7), pages 1-9, July.
    3. Pierre Mégevand & Sophie Molholm & Ashabari Nayak & John J Foxe, 2013. "Recalibration of the Multisensory Temporal Window of Integration Results from Changing Task Demands," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
    4. Igor S. Utochkin & Vladislav A. Khvostov & Yulia M. Stakina, 2017. "Ensemble-Based Segmentation in the Perception of Multiple Feature Conjunctions," HSE Working papers WP BRP 78/PSY/2017, National Research University Higher School of Economics.
    5. Jastrzębski, Jan & Ciechanowska, Iwona & Chuderski, Adam, 2018. "The strong link between fluid intelligence and working memory cannot be explained away by strategy use," Intelligence, Elsevier, vol. 66(C), pages 44-53.
    6. Alice Masini & Marcello Lanari & Sofia Marini & Alessia Tessari & Stefania Toselli & Rita Stagni & Maria Cristina Bisi & Laura Bragonzoni & Davide Gori & Alessandra Sansavini & Andrea Ceciliani & Laur, 2020. "A Multiple Targeted Research Protocol for a Quasi-Experimental Trial in Primary School Children Based on an Active Break Intervention: The Imola Active Breaks (I-MOVE) Study," IJERPH, MDPI, vol. 17(17), pages 1-16, August.
    7. Aki Kondo & Jun Saiki, 2012. "Feature-Specific Encoding Flexibility in Visual Working Memory," PLOS ONE, Public Library of Science, vol. 7(12), pages 1-8, December.
    8. Hongwei Tan & Sebastiaan van Dijken, 2023. "Dynamic machine vision with retinomorphic photomemristor-reservoir computing," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    9. Robert W. Faff & Sebastian Kernbach, 2021. "A visualisation approach for pitching research," Accounting and Finance, Accounting and Finance Association of Australia and New Zealand, vol. 61(4), pages 5177-5197, December.
    10. Maori Kobayashi & Wataru Teramoto & Souta Hidaka & Yoichi Sugita, 2012. "Sound Frequency and Aural Selectivity in Sound-Contingent Visual Motion Aftereffect," PLOS ONE, Public Library of Science, vol. 7(5), pages 1-6, May.
    11. Yuri A. Markov & Natalia A. Tiurina & Igor S. Utochkin, 2018. "Different features are stored independently in visual working memory but mediated by object-based representations," HSE Working papers WP BRP 101/PSY/2018, National Research University Higher School of Economics.
    12. Benjamin de Haas & Roberto Cecere & Harriet Cullen & Jon Driver & Vincenzo Romei, 2013. "The Duration of a Co-Occurring Sound Modulates Visual Detection Performance in Humans," PLOS ONE, Public Library of Science, vol. 8(1), pages 1-8, January.
    13. Tullo, Domenico & Faubert, Jocelyn & Bertone, Armando, 2018. "The characterization of attention resource capacity and its relationship with fluid reasoning intelligence: A multiple object tracking study," Intelligence, Elsevier, vol. 69(C), pages 158-168.
    14. Brian D Glass & W Todd Maddox & Bradley C Love, 2013. "Real-Time Strategy Game Training: Emergence of a Cognitive Flexibility Trait," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-7, August.
    15. Alessio Fracasso & Stefano Targher & Massimiliano Zampini & David Melcher, 2013. "Fooling the Eyes: The Influence of a Sound-Induced Visual Motion Illusion on Eye Movements," PLOS ONE, Public Library of Science, vol. 8(4), pages 1-8, April.
    16. Aaron V Berard & Matthew S Cain & Takeo Watanabe & Yuka Sasaki, 2015. "Frequent Video Game Players Resist Perceptual Interference," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-10, March.
    17. Wendy J Adams, 2016. "The Development of Audio-Visual Integration for Temporal Judgements," PLOS Computational Biology, Public Library of Science, vol. 12(4), pages 1-17, April.
    18. Yasuhiro Takeshima & Jiro Gyoba, 2015. "Different Effects of Attentional Mechanisms between Visual and Auditory Cueing," International Journal of Psychological Studies, Canadian Center of Science and Education, vol. 7(3), pages 176-176, September.
    19. Jifan Zhou & Jun Yin & Tong Chen & Xiaowei Ding & Zaifeng Gao & Mowei Shen, 2011. "Visual Working Memory Capacity Does Not Modulate the Feature-Based Information Filtering in Visual Working Memory," PLOS ONE, Public Library of Science, vol. 6(9), pages 1-10, September.
    20. Stephanie Carlson & Yuichi Shoda & Ozlem Ayduk & Lawrence Aber & Catherine Schaefer & Anita Sethi & Nicole Wilson & Philip Peake & Walter Mischel, 2017. "Cohort Effects in Children's Delay-of-Gratification," Working Papers 2017-077, Human Capital and Economic Opportunity Working Group.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.