
A digital apprentice for chatter detection in machining via human–machine interaction

Author

Listed:
  • Xiaoliang Yan

    (Georgia Institute of Technology)

  • Shreyes Melkote

    (Georgia Institute of Technology)

  • Anant Kumar Mishra

    (Siemens Corporate Technology)

  • Sudhir Rajagopalan

    (Siemens Corporate Technology)

Abstract

Regenerative chatter in machining operations such as milling is a common process anomaly that limits productivity and part quality, leading in turn to increased manufacturing costs. The industrial relevance of the problem has sparked many research efforts in recent decades, with growing interest in real-time chatter detection and suppression. Inspired by learning-from-human-demonstration frameworks, this paper proposes a new approach to milling chatter detection via effective human–machine interaction, which facilitates knowledge transfer from an experienced machine tool operator to a “Digital Apprentice.” The proposed chatter detection approach acquires chatter-specific knowledge through a learnable skill primitive (LSP) algorithm designed to establish a robust chatter detection threshold from few-shot real-time demonstrations by an experienced human operator. In this work, digital audio data were acquired from milling experiments through a microphone mounted inside the milling machine. During the training phase, data on the human operator’s natural reaction to chatter were collected via a specially designed human–machine interface. The learned chatter detection thresholds were obtained via the LSP algorithm by temporally mapping the reaction time data to the audio signal. During the testing phase, experiments were conducted to validate the detection accuracy and detection speed of the learned chatter detection thresholds under different cutting conditions. The experimental validation results indicate an average chatter detection accuracy of 94.4%, with 55.6% of chatter cases detected before chatter marks are produced on a 4140 steel workpiece, demonstrating the effectiveness of human–machine interaction in chatter detection.
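
The LSP algorithm itself is not specified in this abstract. The minimal Python sketch below only illustrates the general idea described above, namely deriving a detection threshold by temporally aligning operator reaction timestamps with a feature computed from the machining audio. The RMS feature, the fixed reaction-latency compensation, and all function names and parameter values are illustrative assumptions, not the authors' method.

    # Illustrative sketch only; NOT the paper's LSP algorithm.
    import numpy as np

    def audio_feature(signal, fs, frame_len=2048, hop=512):
        # signal: 1-D NumPy array of microphone samples, fs: sampling rate in Hz.
        # Returns the short-time RMS energy (one value per frame) and the frame period.
        frames = [signal[i:i + frame_len]
                  for i in range(0, len(signal) - frame_len, hop)]
        feats = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
        return feats, hop / fs

    def learn_threshold(signal, fs, reaction_times, reaction_latency=0.3):
        # reaction_times: operator reaction timestamps in seconds.
        # reaction_latency is a placeholder for the delay between chatter onset
        # and the operator's reaction; the paper's actual compensation is not given here.
        feats, frame_period = audio_feature(signal, fs)
        onset_feats = []
        for t in reaction_times:
            onset = max(t - reaction_latency, 0.0)            # estimated chatter onset
            idx = min(int(onset / frame_period), len(feats) - 1)
            onset_feats.append(feats[idx])
        # Conservative choice: smallest feature value observed at the estimated onsets.
        return float(np.min(onset_feats))

    def detect_chatter(signal, fs, threshold):
        # Flag frames whose feature value exceeds the learned threshold.
        feats, _ = audio_feature(signal, fs)
        return feats > threshold

In the paper's framework the learned threshold is applied in real time to the audio stream; the simple minimum-over-onsets rule above stands in for the few-shot learning step only.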

Suggested Citation

  • Xiaoliang Yan & Shreyes Melkote & Anant Kumar Mishra & Sudhir Rajagopalan, 2023. "A digital apprentice for chatter detection in machining via human–machine interaction," Journal of Intelligent Manufacturing, Springer, vol. 34(7), pages 3039-3052, October.
  • Handle: RePEc:spr:joinma:v:34:y:2023:i:7:d:10.1007_s10845-022-01992-3
    DOI: 10.1007/s10845-022-01992-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10845-022-01992-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10845-022-01992-3?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joinma:v:34:y:2023:i:7:d:10.1007_s10845-022-01992-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.