
A hyperconformal dual-modal metaskin for well-defined and high-precision contextual interactions

Author

Listed:

  • Shifan Yu (Xiamen University, Department of Electronic Science)
  • Zhenzhou Ji (Xiamen University, Department of Electronic Science)
  • Lei Liu (Xiamen University, Department of Electronic Science)
  • Zijian Huang (Xiamen University, Department of Electronic Science)
  • Yanhao Luo (Xiamen University, Department of Electronic Science)
  • Huasen Wang (Xiamen University, Department of Electronic Science)
  • Ruize Wangyuan (Xiamen University, Department of Electronic Science)
  • Ziquan Guo (Xiamen University, Department of Electronic Science)
  • Zhong Chen (Xiamen University, Department of Electronic Science)
  • Qingliang Liao (University of Science and Technology Beijing, Academy for Advanced Interdisciplinary Science and Technology, Key Laboratory of Advanced Materials and Devices for Post-Moore Chips, Ministry of Education; University of Science and Technology Beijing, Beijing Key Laboratory for Advanced Energy Materials and Technologies, School of Materials Science and Engineering)
  • Yuanjin Zheng (Nanyang Technological University, School of Electrical and Electronic Engineering)
  • Xinqin Liao (Xiamen University, Department of Electronic Science)

Abstract

Proprioception and touch serve as complementary sensory modalities that coordinate hand kinematics and convey users’ intent for precise interactions. However, current motion-tracking electronics remain bulky and insufficiently precise, and accurately decoding both modalities is challenging owing to mechanical crosstalk between endogenous and exogenous deformations. Here, we report a hyperconformal dual-modal (HDM) metaskin for interactive hand motion interpretation. The metaskin integrates a strongly coupled hydrophilic interface with a two-step transfer strategy to minimize interfacial mechanical losses. The 10-μm-scale hyperconformal film is highly sensitive to intricate skin stretches while minimizing signal distortion. It accurately tracks skin stretches as well as touch locations and translates them into polar signals that are individually salient. This approach enables a differentiable signaling topology within a single data channel without adding structural complexity to the metaskin. When combined with temporal differential calculations and a time-series machine learning network, the metaskin extracts interactive context and action cues from the low-dimensional data. This capability is further exemplified through demonstrations of contextual navigation, typing and control integration, and multi-scenario object interaction. We demonstrate this fundamental approach in advanced skin-integrated electronics, highlighting its potential for instinctive interaction paradigms and paving the way for augmented somatosensation recognition.
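The record itself carries no code, but the decoding pipeline the abstract outlines (a single-channel polar signal, a temporal differential calculation, and a time-series network) can be illustrated with a minimal sketch. Everything below, including the window length, class labels, and the CNN-GRU architecture, is an illustrative assumption and not the authors' implementation:

```python
# Minimal sketch (not the authors' code) of the pipeline the abstract
# describes: one data channel carrying polar signals, a temporal
# differential step, and a small time-series classification network.
import torch
import torch.nn as nn

WINDOW = 200      # assumed number of samples per gesture window
N_CLASSES = 4     # hypothetical classes, e.g., two stretch and two touch cues

def temporal_difference(x: torch.Tensor) -> torch.Tensor:
    """First-order temporal difference of a (batch, 1, time) signal.

    Stretch and touch are assumed to appear with opposite polarity in
    the single channel, so the signed derivative keeps them separable."""
    d = x[..., 1:] - x[..., :-1]
    return nn.functional.pad(d, (1, 0))  # left-pad to keep the length

class TimeSeriesNet(nn.Module):
    """A small 1D-CNN + GRU classifier over the differenced signal."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
        )
        self.gru = nn.GRU(32, 64, batch_first=True)
        self.head = nn.Linear(64, N_CLASSES)

    def forward(self, x):                  # x: (batch, 1, WINDOW)
        x = temporal_difference(x)
        h = self.conv(x).transpose(1, 2)   # (batch, WINDOW, 32)
        _, hn = self.gru(h)                # final hidden state
        return self.head(hn[-1])           # (batch, N_CLASSES)

# Usage on a dummy single-channel window:
signal = torch.randn(8, 1, WINDOW)
logits = TimeSeriesNet()(signal)           # (8, N_CLASSES)
```

In this sketch the signed first difference preserves the opposite polarities of stretch and touch events, which is what allows one channel to carry both modalities to the classifier without extra wiring.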

Suggested Citation

  • Shifan Yu & Zhenzhou Ji & Lei Liu & Zijian Huang & Yanhao Luo & Huasen Wang & Ruize Wangyuan & Ziquan Guo & Zhong Chen & Qingliang Liao & Yuanjin Zheng & Xinqin Liao, 2025. "A hyperconformal dual-modal metaskin for well-defined and high-precision contextual interactions," Nature Communications, Nature, vol. 16(1), pages 1-15, December.
  • Handle: RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-65624-z
    DOI: 10.1038/s41467-025-65624-z

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41467-025-65624-z
    File Function: Abstract
    Download Restriction: no

    File URL: https://libkey.io/10.1038/s41467-025-65624-z?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
