
Layer-specific, retinotopically-diffuse modulation in human visual cortex in response to viewing emotionally expressive faces

Author

Listed:
  • Tina T. Liu (National Institute of Mental Health, NIH)
  • Jason Z Fu (National Institute of Mental Health, NIH)
  • Yuhui Chai (National Institute of Mental Health, NIH)
  • Shruti Japee (National Institute of Mental Health, NIH)
  • Gang Chen (National Institute of Mental Health, NIH)
  • Leslie G. Ungerleider (National Institute of Mental Health, NIH)
  • Elisha P. Merriam (National Institute of Mental Health, NIH)

Abstract

Viewing faces that are perceived as emotionally expressive evokes enhanced neural responses in multiple brain regions, a phenomenon thought to depend critically on the amygdala. This emotion-related modulation is evident even in primary visual cortex (V1), providing a potential neural substrate by which emotionally salient stimuli can affect perception. How does emotional valence information, computed in the amygdala, reach V1? Here we use high-resolution functional MRI to investigate the layer profile and retinotopic distribution of neural activity specific to emotional facial expressions. Across three experiments, human participants viewed centrally presented face stimuli varying in emotional expression and performed a gender judgment task. We found that facial valence sensitivity was evident only in superficial cortical layers and was not restricted to the retinotopic location of the stimuli, consistent with diffuse feedback-like projections from the amygdala. Together, our results provide a feedback mechanism by which the amygdala directly modulates activity at the earliest stage of visual processing.

Suggested Citation

  • Tina T. Liu & Jason Z Fu & Yuhui Chai & Shruti Japee & Gang Chen & Leslie G. Ungerleider & Elisha P. Merriam, 2022. "Layer-specific, retinotopically-diffuse modulation in human visual cortex in response to viewing emotionally expressive faces," Nature Communications, Nature, vol. 13(1), pages 1-15, December.
  • Handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-33580-7
    DOI: 10.1038/s41467-022-33580-7

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-022-33580-7
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-022-33580-7?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Xilin Zhang & Shruti Japee & Zaid Safiullah & Nicole Mlynaryk & Leslie G Ungerleider, 2016. "A Normalization Framework for Emotional Attention," PLOS Biology, Public Library of Science, vol. 14(11), pages 1-25, November.
    2. Ralph Adolphs & Frederic Gosselin & Tony W. Buchanan & Daniel Tranel & Philippe Schyns & Antonio R. Damasio, 2005. "A mechanism for impaired fear recognition after amygdala damage," Nature, Nature, vol. 433(7021), pages 68-72, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Martin Wegrzyn & Maria Vogt & Berna Kireclioglu & Julia Schneider & Johanna Kissler, 2017. "Mapping the emotional face. How individual face parts contribute to successful emotion recognition," PLOS ONE, Public Library of Science, vol. 12(5), pages 1-15, May.
    2. Ilicic, Jasmina & Baxter, Stacey M. & Kulczynski, Alicia, 2016. "White eyes are the window to the pure soul: Metaphorical association and overgeneralization effects for spokespeople with limbal rings," International Journal of Research in Marketing, Elsevier, vol. 33(4), pages 840-855.
    3. Feng Zhou & Weihua Zhao & Ziyu Qi & Yayuan Geng & Shuxia Yao & Keith M. Kendrick & Tor D. Wager & Benjamin Becker, 2021. "A distributed fMRI-based signature for the subjective experience of fear," Nature Communications, Nature, vol. 12(1), pages 1-16, December.
    4. Yung-Hao Yang & Su-Ling Yeh, 2018. "Can emotional content be extracted under interocular suppression?," PLOS ONE, Public Library of Science, vol. 13(11), pages 1-18, November.
    5. Yao Song & Yan Luximon, 2019. "Design for Sustainability: The Effect of Lettering Case on Environmental Concern from a Green Advertising Perspective," Sustainability, MDPI, vol. 11(5), pages 1-15, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:natcom:v:13:y:2022:i:1:d:10.1038_s41467-022-33580-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.