
Degraded Time-Frequency Acuity to Time-Reversed Notes

Author

Listed:
  • Jacob N Oppenheim
  • Pavel Isakov
  • Marcelo O Magnasco

Abstract

Time-reversal symmetry breaking is a key feature of many classes of natural sounds, originating in the physics of sound production. While attention has been paid to the response of the auditory system to “natural stimuli,” very few psychophysical tests have been performed. We conduct psychophysical measurements of time-frequency acuity for stylized representations of “natural”-like notes (sharp attack, long decay) and the time-reversed versions of these notes (long attack, sharp decay). Our results demonstrate significantly greater precision, arising from enhanced temporal acuity, for such sounds over their time-reversed versions, without a corresponding decrease in frequency acuity. These data inveigh against models of auditory processing that include tradeoffs between temporal and frequency acuity, at least in the range of notes tested, and suggest the existence of statistical priors for notes with a sharp attack and a long decay. We are additionally able to calculate a minimal theoretical bound on the sophistication of the nonlinearities in auditory processing. We find that among the best-studied classes of nonlinear time-frequency representations, only matching pursuit, spectral derivatives, and reassigned spectrograms are able to satisfy this criterion.
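
To make the stimulus contrast concrete, the sketch below (not the authors' stimulus code; the sample rate, carrier frequency, and decay constant are illustrative assumptions) builds a “natural”-like note as an exponentially damped sinusoid and time-reverses it. Time reversal leaves the magnitude spectrum unchanged, so the forward and reversed notes differ only in their temporal/phase structure.

    # Minimal sketch with assumed parameters (Python/NumPy), not the study's exact stimuli:
    # a damped sinusoid ("natural"-like: sharp attack, long decay) and its time reversal.
    import numpy as np

    fs = 44100                   # sample rate, Hz (assumed)
    dur = 0.5                    # note duration, s (assumed)
    f0 = 1000.0                  # carrier frequency, Hz (assumed)
    tau = 0.05                   # amplitude decay constant, s (assumed)

    t = np.arange(int(fs * dur)) / fs
    note = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)   # sharp attack, long decay
    reversed_note = note[::-1]                             # long attack, sharp decay

    # The two magnitude spectra are identical up to floating-point roundoff,
    # so any perceptual asymmetry cannot come from spectral content alone.
    diff = np.max(np.abs(np.abs(np.fft.rfft(note)) -
                         np.abs(np.fft.rfft(reversed_note))))
    print(f"max magnitude-spectrum difference: {diff:.2e}")

For reference, a fixed linear analysis (a windowed spectrogram) is subject to the Gabor uncertainty relation Δt·Δf ≥ 1/(4π); joint acuity exceeding such a fixed trade-off is the standard argument for nonlinear analyses such as reassigned spectrograms or matching pursuit, and is presumably the kind of criterion the abstract's “minimal theoretical bound” refers to.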

Suggested Citation

  • Jacob N Oppenheim & Pavel Isakov & Marcelo O Magnasco, 2013. "Degraded Time-Frequency Acuity to Time-Reversed Notes," PLOS ONE, Public Library of Science, vol. 8(6), pages 1-6, June.
  • Handle: RePEc:plo:pone00:0065386
    DOI: 10.1371/journal.pone.0065386

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0065386
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0065386&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. Evan C. Smith & Michael S. Lewicki, 2006. "Efficient auditory coding," Nature, Nature, vol. 439(7079), pages 978-982, February.
    2. Israel Nelken & Yaron Rotman & Omer Bar Yosef, 1999. "Responses of auditory-cortex neurons to structural features of natural sounds," Nature, Nature, vol. 397(6715), pages 154-157, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lingyun Zhao & Li Zhaoping, 2011. "Understanding Auditory Spectro-Temporal Receptive Fields and Their Changes with Input Statistics by Efficient Coding Principles," PLOS Computational Biology, Public Library of Science, vol. 7(8), pages 1-16, August.
    2. Noga Mosheiff & Haggai Agmon & Avraham Moriel & Yoram Burak, 2017. "An efficient coding theory for a dynamic trajectory predicts non-uniform allocation of entorhinal grid cells to modules," PLOS Computational Biology, Public Library of Science, vol. 13(6), pages 1-19, June.
    3. Sam V Norman-Haignere & Josh H McDermott, 2018. "Neural responses to natural and model-matched stimuli reveal distinct computations in primary and nonprimary auditory cortex," PLOS Biology, Public Library of Science, vol. 16(12), pages 1-46, December.
    4. Jonathan J Hunt & Peter Dayan & Geoffrey J Goodhill, 2013. "Sparse Coding Can Predict Primary Visual Cortex Receptive Field Changes Induced by Abnormal Visual Input," PLOS Computational Biology, Public Library of Science, vol. 9(5), pages 1-17, May.
    5. Lubomir Kostal & Petr Lansky & Jean-Pierre Rospars, 2008. "Efficient Olfactory Coding in the Pheromone Receptor Neuron of a Moth," PLOS Computational Biology, Public Library of Science, vol. 4(4), pages 1-11, April.
    6. Julie E Elie & Frédéric E Theunissen, 2019. "Invariant neural responses for sensory categories revealed by the time-varying information for communication calls," PLOS Computational Biology, Public Library of Science, vol. 15(9), pages 1-43, September.
    7. Jonathan Schaffner & Sherry Dongqi Bao & Philippe N. Tobler & Todd A. Hare & Rafael Polania, 2023. "Sensory perception relies on fitness-maximizing codes," Nature Human Behaviour, Nature, vol. 7(7), pages 1135-1151, July.
    8. Gonzalo H Otazu & Christian Leibold, 2011. "A Corticothalamic Circuit Model for Sound Identification in Complex Scenes," PLOS ONE, Public Library of Science, vol. 6(9), pages 1-15, September.
    9. Clara Suied & Isabelle Viaud-Delmon, 2009. "Auditory-Visual Object Recognition Time Suggests Specific Processing for Animal Sounds," PLOS ONE, Public Library of Science, vol. 4(4), pages 1-9, April.
    10. Tomas Barta & Lubomir Kostal, 2019. "The effect of inhibition on rate code efficiency indicators," PLOS Computational Biology, Public Library of Science, vol. 15(12), pages 1-21, December.
    11. Mina Sadeghi & Xiu Zhai & Ian H Stevenson & Monty A Escabí, 2019. "A neural ensemble correlation code for sound category identification," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-41, October.
    12. Oded Barzelay & Miriam Furst & Omri Barak, 2017. "A New Approach to Model Pitch Perception Using Sparse Coding," PLOS Computational Biology, Public Library of Science, vol. 13(1), pages 1-36, January.
    13. Klaus Wimmer & K Jannis Hildebrandt & R Matthias Hennig & Klaus Obermayer, 2008. "Adaptation and Selective Information Transmission in the Cricket Auditory Neuron AN2," PLOS Computational Biology, Public Library of Science, vol. 4(9), pages 1-18, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0065386. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.