Printed from https://ideas.repec.org/a/plo/pone00/0204217.html

What you saw is what you will hear: Two new illusions with audiovisual postdictive effects

Author

Listed:
  • Noelle R B Stiles
  • Monica Li
  • Carmel A Levitan
  • Yukiyasu Kamitani
  • Shinsuke Shimojo

Abstract

Neuroscience investigations are most often focused on the prediction of future perception or decisions based on prior brain states or stimulus presentations. However, the brain can also process information retroactively, such that later stimuli impact conscious percepts of stimuli that have already occurred (called “postdiction”). Postdictive effects have thus far been mostly unimodal (such as apparent motion), and the models for postdiction have accordingly been limited to early sensory regions of one modality. We have discovered two related multimodal illusions in which audition instigates postdictive changes in visual perception. In the first illusion (called the “Illusory Audiovisual Rabbit”), the location of an illusory flash is influenced by an auditory beep-flash pair that follows the perceived illusory flash. In the second illusion (called the “Invisible Audiovisual Rabbit”), a beep-flash pair following a real flash suppresses the perception of the earlier flash. Thus, we showed experimentally that these two effects are influenced significantly by postdiction. The audiovisual rabbit illusions indicate that postdiction can bridge the senses, uncovering a relatively neglected yet critical type of neural processing underlying perceptual awareness. Furthermore, these two new illusions broaden the Double Flash Illusion, in which a single real flash is doubled by two sounds. Whereas the double flash indicated that audition can create an illusory flash, these rabbit illusions expand audition’s influence on vision to the suppression of a real flash and the relocation of an illusory flash. These new additions to auditory-visual interactions indicate a spatio-temporally fine-tuned coupling of the senses to generate perception.

Suggested Citation

  • Noelle R B Stiles & Monica Li & Carmel A Levitan & Yukiyasu Kamitani & Shinsuke Shimojo, 2018. "What you saw is what you will hear: Two new illusions with audiovisual postdictive effects," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-22, October.
  • Handle: RePEc:plo:pone00:0204217
    DOI: 10.1371/journal.pone.0204217

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0204217
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0204217&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0204217?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Ladan Shams & Yukiyasu Kamitani & Shinsuke Shimojo, 2000. "What you see is what you hear," Nature, Nature, vol. 408(6814), pages 788-788, December.
    2. Marc O. Ernst & Martin S. Banks, 2002. "Humans integrate visual and haptic information in a statistically optimal fashion," Nature, Nature, vol. 415(6870), pages 429-433, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wendy J Adams, 2016. "The Development of Audio-Visual Integration for Temporal Judgements," PLOS Computational Biology, Public Library of Science, vol. 12(4), pages 1-17, April.
    2. Simon Weiler & Vahid Rahmati & Marcel Isstas & Johann Wutke & Andreas Walter Stark & Christian Franke & Jürgen Graf & Christian Geis & Otto W. Witte & Mark Hübener & Jürgen Bolz & Troy W. Margrie & Kn, 2024. "A primary sensory cortical interareal feedforward inhibitory circuit for tacto-visual integration," Nature Communications, Nature, vol. 15(1), pages 1-24, December.
    3. Loreen Hertäg & Katharina A. Wilmes & Claudia Clopath, 2025. "Uncertainty estimation with prediction-error circuits," Nature Communications, Nature, vol. 16(1), pages 1-15, December.
    4. Catarina Mendonça & Pietro Mandelli & Ville Pulkki, 2016. "Modeling the Perception of Audiovisual Distance: Bayesian Causal Inference and Other Models," PLOS ONE, Public Library of Science, vol. 11(12), pages 1-18, December.
    5. Alice Masini & Marcello Lanari & Sofia Marini & Alessia Tessari & Stefania Toselli & Rita Stagni & Maria Cristina Bisi & Laura Bragonzoni & Davide Gori & Alessandra Sansavini & Andrea Ceciliani & Laur, 2020. "A Multiple Targeted Research Protocol for a Quasi-Experimental Trial in Primary School Children Based on an Active Break Intervention: The Imola Active Breaks (I-MOVE) Study," IJERPH, MDPI, vol. 17(17), pages 1-16, August.
    6. Wen-Hao Zhang & Si Wu & Krešimir Josić & Brent Doiron, 2023. "Sampling-based Bayesian inference in recurrent circuits of stochastic spiking neurons," Nature Communications, Nature, vol. 14(1), pages 1-19, December.
    7. Benjamin de Haas & Roberto Cecere & Harriet Cullen & Jon Driver & Vincenzo Romei, 2013. "The Duration of a Co-Occurring Sound Modulates Visual Detection Performance in Humans," PLOS ONE, Public Library of Science, vol. 8(1), pages 1-8, January.
    8. Adam N Sanborn & Ulrik R Beierholm, 2016. "Fast and Accurate Learning When Making Discrete Numerical Estimates," PLOS Computational Biology, Public Library of Science, vol. 12(4), pages 1-28, April.
    9. Seth W. Egger & Stephen G. Lisberger, 2022. "Neural structure of a sensory decoder for motor control," Nature Communications, Nature, vol. 13(1), pages 1-13, December.
    10. Yasuhiro Takeshima & Jiro Gyoba, 2015. "Different Effects of Attentional Mechanisms between Visual and Auditory Cueing," International Journal of Psychological Studies, Canadian Center of Science and Education, vol. 7(3), pages 176-176, September.
    11. Xiaochen Zhang & Lingling Jin & Jie Zhao & Jiazhen Li & Ding-Bang Luh & Tiansheng Xia, 2022. "The Influences of Different Sensory Modalities and Cognitive Loads on Walking Navigation: A Preliminary Study," Sustainability, MDPI, vol. 14(24), pages 1-14, December.
    12. Johannes Burge & Priyank Jaini, 2017. "Accuracy Maximization Analysis for Sensory-Perceptual Tasks: Computational Improvements, Filter Robustness, and Coding Advantages for Scaled Additive Noise," PLOS Computational Biology, Public Library of Science, vol. 13(2), pages 1-32, February.
    13. Yingjie Lai & Chaemoon Yoo & Xiaomin Zhou & Younghwan Pan, 2023. "Elements of Food Service Design for Low-Carbon Tourism-Based on Dine-In Tourist Behavior and Attitudes in China," Sustainability, MDPI, vol. 15(9), pages 1-21, May.
    14. Brocas, Isabelle & Carrillo, Juan D., 2012. "From perception to action: An economic model of brain processes," Games and Economic Behavior, Elsevier, vol. 75(1), pages 81-103.
    15. Jean-François Patri & Pascal Perrier & Jean-Luc Schwartz & Julien Diard, 2018. "What drives the perceptual change resulting from speech motor adaptation? Evaluation of hypotheses in a Bayesian modeling framework," PLOS Computational Biology, Public Library of Science, vol. 14(1), pages 1-38, January.
    16. Florent Meyniel, 2020. "Brain dynamics for confidence-weighted learning," PLOS Computational Biology, Public Library of Science, vol. 16(6), pages 1-27, June.
    17. Jennifer Laura Lee & Wei Ji Ma, 2021. "Point-estimating observer models for latent cause detection," PLOS Computational Biology, Public Library of Science, vol. 17(10), pages 1-29, October.
    18. Anna Lambrechts & Vincent Walsh & Virginie van Wassenhove, 2013. "Evidence Accumulation in the Magnitude System," PLOS ONE, Public Library of Science, vol. 8(12), pages 1-10, December.
    19. Sarah E Donohue & Lawrence G Appelbaum & Christina J Park & Kenneth C Roberts & Marty G Woldorff, 2013. "Cross-Modal Stimulus Conflict: The Behavioral Effects of Stimulus Input Timing in a Visual-Auditory Stroop Task," PLOS ONE, Public Library of Science, vol. 8(4), pages 1-13, April.
    20. Elina Stengård & Ronald van den Berg, 2019. "Imperfect Bayesian inference in visual perception," PLOS Computational Biology, Public Library of Science, vol. 15(4), pages 1-27, April.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0204217. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.