
Learning the signatures of the human grasp using a scalable tactile glove

Author

Listed:
  • Subramanian Sundaram (Massachusetts Institute of Technology; Boston University; Harvard University)
  • Petr Kellnhofer (Massachusetts Institute of Technology)
  • Yunzhu Li (Massachusetts Institute of Technology)
  • Jun-Yan Zhu (Massachusetts Institute of Technology)
  • Antonio Torralba (Massachusetts Institute of Technology)
  • Wojciech Matusik (Massachusetts Institute of Technology)

Abstract

Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material properties while applying the right amount of force—a challenging set of tasks for a modern robot [1]. Mechanoreceptor networks that provide sensory feedback and enable the dexterity of the human grasp [2] remain difficult to replicate in robots. Whereas computer-vision-based robot grasping strategies [3–5] have progressed substantially with the abundance of visual data and emerging machine-learning tools, there are as yet no equivalent sensing platforms and large-scale datasets with which to probe the use of the tactile information that humans rely on when grasping objects. Studying the mechanics of how humans grasp objects will complement vision-based robotic object handling. Importantly, the inability to record and analyse tactile signals currently limits our understanding of the role of tactile information in the human grasp itself—for example, how tactile maps are used to identify objects and infer their properties is unknown [6]. Here we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight and explore the typical tactile patterns that emerge while grasping objects. The sensor array (548 sensors) is assembled on a knitted glove, and consists of a piezoresistive film connected by a network of conductive thread electrodes that are passively probed. Using a low-cost (about US$10) scalable tactile glove sensor array, we record a large-scale tactile dataset with 135,000 frames, each covering the full hand, while interacting with 26 different objects. This set of interactions with different objects reveals the key correspondences between different regions of a human hand while it is manipulating objects. Insights from the tactile signatures of the human grasp—through the lens of an artificial analogue of the natural mechanoreceptor network—can thus aid the future design of prosthetics [7], robot grasping tools and human–robot interactions [1,8–10].
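To make the sensing-plus-learning pipeline concrete, the following is a minimal sketch in Python/PyTorch (not the authors' released code) of classifying a single tactile frame into one of the 26 object classes described above. It assumes each frame is rasterized to a 32 x 32 pressure map, with the 548 sensors embedded in a regular readout grid; the architecture and layer sizes are illustrative assumptions, not the paper's model.

    import torch
    import torch.nn as nn

    class TactileCNN(nn.Module):
        """Small CNN over single tactile frames (assumed 1 x 32 x 32)."""
        def __init__(self, num_classes: int = 26):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),   # -> 32 x 32 x 32
                nn.ReLU(),
                nn.MaxPool2d(2),                              # -> 32 x 16 x 16
                nn.Conv2d(32, 64, kernel_size=3, padding=1),  # -> 64 x 16 x 16
                nn.ReLU(),
                nn.MaxPool2d(2),                              # -> 64 x 8 x 8
            )
            self.classifier = nn.Linear(64 * 8 * 8, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(start_dim=1))

    # Usage on one normalized frame. In practice, such a model would be
    # trained on the 135,000-frame dataset, and predictions would likely be
    # pooled across several frames of a grasp rather than taken from a
    # single snapshot.
    model = TactileCNN()
    frame = torch.rand(1, 1, 32, 32)    # batch of one 32x32 pressure map
    logits = model(frame)               # shape: (1, 26)
    predicted_class = logits.argmax(dim=1)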

Suggested Citation

  • Subramanian Sundaram & Petr Kellnhofer & Yunzhu Li & Jun-Yan Zhu & Antonio Torralba & Wojciech Matusik, 2019. "Learning the signatures of the human grasp using a scalable tactile glove," Nature, Nature, vol. 569(7758), pages 698-702, May.
  • Handle: RePEc:nat:nature:v:569:y:2019:i:7758:d:10.1038_s41586-019-1234-z
    DOI: 10.1038/s41586-019-1234-z

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41586-019-1234-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41586-019-1234-z?utm_source=ideas
LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item.

As access to this document is restricted, you may want to search for a different version of it.

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

1. Min Chen & Jingyu Ouyang & Aijia Jian & Jia Liu & Pan Li & Yixue Hao & Yuchen Gong & Jiayu Hu & Jing Zhou & Rui Wang & Jiaxi Wang & Long Hu & Yuwei Wang & Ju Ouyang & Jing Zhang & Chong Hou & Lei Wei, 2022. "Imperceptible, designable, and scalable braided electronic cord," Nature Communications, Nature, vol. 13(1), pages 1-10, December.
    2. Rui Chen & Tao Luo & Jincheng Wang & Renpeng Wang & Chen Zhang & Yu Xie & Lifeng Qin & Haimin Yao & Wei Zhou, 2023. "Nonlinearity synergy: An elegant strategy for realizing high-sensitivity and wide-linear-range pressure sensing," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    3. Zhongda Sun & Minglu Zhu & Xuechuan Shan & Chengkuo Lee, 2022. "Augmented tactile-perception and haptic-feedback rings as human-machine interfaces aiming for immersive interactions," Nature Communications, Nature, vol. 13(1), pages 1-13, December.
    4. Shijing Zhang & Yingxiang Liu & Jie Deng & Xiang Gao & Jing Li & Weiyi Wang & Mingxin Xun & Xuefeng Ma & Qingbing Chang & Junkao Liu & Weishan Chen & Jie Zhao, 2023. "Piezo robotic hand for motion manipulation from micro to macro," Nature Communications, Nature, vol. 14(1), pages 1-12, December.
    5. Yijia Lu & Han Tian & Jia Cheng & Fei Zhu & Bin Liu & Shanshan Wei & Linhong Ji & Zhong Lin Wang, 2022. "Decoding lip language using triboelectric sensors with deep learning," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    6. Yiyue Luo & Chao Liu & Young Joong Lee & Joseph DelPreto & Kui Wu & Michael Foshey & Daniela Rus & Tomás Palacios & Yunzhu Li & Antonio Torralba & Wojciech Matusik, 2024. "Adaptive tactile interaction transfer via digitally embroidered smart gloves," Nature Communications, Nature, vol. 15(1), pages 1-13, December.
    7. Hyung Woo Choi & Dong-Wook Shin & Jiajie Yang & Sanghyo Lee & Cátia Figueiredo & Stefano Sinopoli & Kay Ullrich & Petar Jovančić & Alessio Marrani & Roberto Momentè & João Gomes & Rita Branquinho & Um, 2022. "Smart textile lighting/display system with multifunctional fibre devices for large scale smart home and IoT applications," Nature Communications, Nature, vol. 13(1), pages 1-9, December.
    8. Jayraj V. Vaghasiya & Carmen C. Mayorga-Martinez & Jan Vyskočil & Martin Pumera, 2023. "Black phosphorous-based human-machine communication interface," Nature Communications, Nature, vol. 14(1), pages 1-8, December.
    9. Haojie Lu & Yong Zhang & Mengjia Zhu & Shuo Li & Huarun Liang & Peng Bi & Shuai Wang & Haomin Wang & Linli Gan & Xun-En Wu & Yingying Zhang, 2024. "Intelligent perceptual textiles based on ionic-conductive and strong silk fibers," Nature Communications, Nature, vol. 15(1), pages 1-9, December.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nat:nature:v:569:y:2019:i:7758:d:10.1038_s41586-019-1234-z. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.nature.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.