
Preferences for Very Low and Very High Voice Pitch in Humans

Author

Listed:
  • Daniel E Re
  • Jillian J M O'Connor
  • Patrick J Bennett
  • David R Feinberg

Abstract

Manipulations of voice pitch have been shown to alter attractiveness ratings, but whether preferences extend to very low or very high voice pitch is unknown. Here, we manipulated voice pitch in averaged men's and women's voices in 2 Hz intervals to create male and female voices speaking monophthong vowel sounds and spanning frequencies from normal to very low and very high pitch. With these voices, we used the method of constant stimuli to measure preferences for voice pitch. Nineteen university students (ages 20–25) participated in three experiments. On average, men preferred high-pitched women's voices to low-pitched women's voices across all frequencies tested. On average, women preferred men's voices lowered in pitch, but did not prefer very low men's voices. The results of this study may reflect selection pressures on men's and women's voices, and shed light on a perceptual link between voice pitch and vocal attractiveness.
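The kind of stimulus continuum described in the abstract (averaged voices shifted in 2 Hz steps) can be illustrated with a short sketch. The paper's own stimulus-generation code is not reproduced here; the sketch below assumes a Praat-style PSOLA pitch shift via the parselmouth Python bindings, and the input file name, pitch floor/ceiling, and number of 2 Hz steps are illustrative assumptions rather than details taken from the study.

# Minimal sketch (not the authors' exact procedure): build a continuum of
# pitch-shifted voice stimuli in 2 Hz steps using Praat's overlap-add (PSOLA)
# resynthesis through the parselmouth Python bindings.
import parselmouth
from parselmouth.praat import call

def shift_pitch_hz(sound, shift_hz, floor=75, ceiling=600):
    """Return a copy of `sound` with its F0 contour shifted by `shift_hz` Hz."""
    manipulation = call(sound, "To Manipulation", 0.01, floor, ceiling)
    pitch_tier = call(manipulation, "Extract pitch tier")
    call(pitch_tier, "Shift frequencies", sound.xmin, sound.xmax, shift_hz, "Hertz")
    call([pitch_tier, manipulation], "Replace pitch tier")
    return call(manipulation, "Get resynthesis (overlap-add)")

# Hypothetical input file and step range; the study's actual frequency range differs.
base = parselmouth.Sound("averaged_male_voice.wav")
for step in range(-20, 21):               # -40 Hz ... +40 Hz in 2 Hz intervals
    shifted = shift_pitch_hz(base, step * 2.0)
    shifted.save(f"male_shift_{step * 2:+d}Hz.wav", "WAV")

Stimuli generated this way could then be paired in a method-of-constant-stimuli task, where listeners choose the more attractive voice from each pair.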

Suggested Citation

  • Daniel E Re & Jillian J M O'Connor & Patrick J Bennett & David R Feinberg, 2012. "Preferences for Very Low and Very High Voice Pitch in Humans," PLOS ONE, Public Library of Science, vol. 7(3), pages 1-8, March.
  • Handle: RePEc:plo:pone00:0032719
    DOI: 10.1371/journal.pone.0032719

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0032719
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0032719&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0032719?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sukhbinder Kumar & Klaas E Stephan & Jason D Warren & Karl J Friston & Timothy D Griffiths, 2007. "Hierarchical Processing of Auditory Objects in Humans," PLOS Computational Biology, Public Library of Science, vol. 3(6), pages 1-9, June.
    2. Sam V Norman-Haignere & Josh H McDermott, 2018. "Neural responses to natural and model-matched stimuli reveal distinct computations in primary and nonprimary auditory cortex," PLOS Biology, Public Library of Science, vol. 16(12), pages 1-46, December.
    3. Patrícia Vanzella & E Glenn Schellenberg, 2010. "Absolute Pitch: Effects of Timbre on Note-Naming Ability," PLOS ONE, Public Library of Science, vol. 5(11), pages 1-7, November.
    4. Marie-Lou Barnaud & Jean-Luc Schwartz & Pierre Bessière & Julien Diard, 2019. "Computer simulations of coupled idiosyncrasies in speech perception and speech production with COSMO, a perceptuo-motor Bayesian model of speech communication," PLOS ONE, Public Library of Science, vol. 14(1), pages 1-34, January.
    5. Emmanuel Bigand & Charles Delbé & Yannick Gérard & Barbara Tillmann, 2011. "Categorization of Extremely Brief Auditory Stimuli: Domain-Specific or Domain-General Processes?," PLOS ONE, Public Library of Science, vol. 6(10), pages 1-6, October.
    6. Clara Suied & Isabelle Viaud-Delmon, 2009. "Auditory-Visual Object Recognition Time Suggests Specific Processing for Animal Sounds," PLOS ONE, Public Library of Science, vol. 4(4), pages 1-9, April.
    7. Kazuo Imaizumi & Nicholas J Priebe & Tatyana O Sharpee & Steven W Cheung & Christoph E Schreiner, 2010. "Encoding of Temporal Information by Timing, Rate, and Place in Cat Auditory Cortex," PLOS ONE, Public Library of Science, vol. 5(7), pages 1-15, July.
    8. Carolin Brück & Christina Gößling-Arnold & Jürgen Wertheimer & Dirk Wildgruber, 2016. "The Inner Theater," SAGE Open, , vol. 6(1), pages 21582440166, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0032719. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.