Printed from https://ideas.repec.org/a/gam/jdataj/v4y2019i1p40-d213447.html

LNSNet: Lightweight Navigable Space Segmentation for Autonomous Robots on Construction Sites

Author

Listed:
  • Khashayar Asadi

    (Department of Civil, Construction, and Environmental Engineering, North Carolina State University, 2501 Stinson Dr, Raleigh, NC 27606, USA)

  • Pengyu Chen

    (Department of Computer Science, Columbia University in the City of New York, Mudd Building, 500 W 120th St, New York, NY 10027, USA)

  • Kevin Han

    (Department of Civil, Construction, and Environmental Engineering, North Carolina State University, 2501 Stinson Dr, Raleigh, NC 27606, USA)

  • Tianfu Wu

    (Department of Electrical and Computer Engineering, North Carolina State University, 890 Oval Drive, Raleigh, NC 27606, USA)

  • Edgar Lobaton

    (Department of Electrical and Computer Engineering, North Carolina State University, 890 Oval Drive, Raleigh, NC 27606, USA)

Abstract

An autonomous robot that monitors a construction site should be able to contextually detect its surrounding environment by recognizing objects and making decisions based on its observations. Real-time pixel-wise semantic segmentation is vital to building an autonomous mobile robot. However, the size of learning models and the high memory usage associated with real-time segmentation are the main challenges for mobile robotic systems with limited computing resources. To overcome these challenges, this paper presents an efficient semantic segmentation method named LNSNet (lightweight navigable space segmentation network) that can run on embedded platforms to determine navigable space in real time. The core of the model architecture is a new block based on separable convolution, which compresses the parameters of the standard residual block while maintaining accuracy and performance. LNSNet is faster, has fewer parameters and a smaller model size, and provides accuracy similar to that of existing models. A new pixel-level annotated dataset for real-time, mobile navigable space segmentation in construction environments has been constructed for the proposed method. The results demonstrate the effectiveness and efficiency necessary for the future development of autonomous robotic systems.
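The parameter compression from separable convolution that the abstract mentions can be illustrated with a simple count. The sketch below compares a standard k×k convolution against a depthwise-separable one (a depthwise k×k convolution followed by a 1×1 pointwise convolution); the function names and channel sizes are illustrative assumptions, not taken from the LNSNet architecture itself, whose exact block design is described in the full paper.

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution layer (biases omitted)."""
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    """Depthwise k x k convolution (one k x k filter per input channel)
    followed by a 1 x 1 pointwise convolution mixing channels."""
    return k * k * c_in + c_in * c_out

# Example: a 3x3 layer with 64 input and 64 output channels.
std = standard_conv_params(3, 64, 64)   # 3*3*64*64 = 36864
sep = separable_conv_params(3, 64, 64)  # 3*3*64 + 64*64 = 4672
print(f"standard: {std}, separable: {sep}, reduction: {std / sep:.1f}x")
```

For this configuration the separable form needs roughly an eighth of the weights, which is the kind of saving that makes real-time segmentation feasible on the embedded platforms the abstract targets.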

Suggested Citation

  • Khashayar Asadi & Pengyu Chen & Kevin Han & Tianfu Wu & Edgar Lobaton, 2019. "LNSNet: Lightweight Navigable Space Segmentation for Autonomous Robots on Construction Sites," Data, MDPI, vol. 4(1), pages 1-17, March.
  • Handle: RePEc:gam:jdataj:v:4:y:2019:i:1:p:40-:d:213447

    Download full text from publisher

    File URL: https://www.mdpi.com/2306-5729/4/1/40/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2306-5729/4/1/40/
    Download Restriction: no
