
A Deep Joint Network for Monocular Depth Estimation Based on Pseudo-Depth Supervision

Author

Listed:
  • Jiahai Tan

    (School of Optoelectronic Engineering, Xi’an Technological University, Xi’an 710021, China
    State Key Laboratory of Transient Optics and Photonics, Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi’an 710119, China)

  • Ming Gao

    (School of Optoelectronic Engineering, Xi’an Technological University, Xi’an 710021, China)

  • Tao Duan

    (State Key Laboratory of Transient Optics and Photonics, Xi’an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi’an 710119, China)

  • Xiaomei Gao

    (Xi’an Mapping and Printing of China National Administration of Coal Geology, Xi’an 710199, China)

Abstract

Depth estimation from a single image is an important task. Although deep learning methods hold great promise in this area, they still face several challenges, including limited modeling of nonlocal dependencies, the lack of effective joint optimization of loss functions, and difficulty in accurately estimating object edges. To further improve prediction accuracy, this research proposes a new network structure and training method for single-image depth estimation. A pseudo-depth network is first deployed to generate a single-image depth prior; by constructing connecting paths between multi-scale local features with the proposed up-mapping and jumping modules, the network integrates representations and recovers fine details. A deep network is also designed to capture and convey global context, using the Transformer Conv module and Unet Depth net to extract and refine global features. Together, the two networks provide complementary coarse and fine features for predicting high-quality depth images from single RGB images. In addition, multiple joint losses are utilized to enhance the training model. A series of experiments confirms the efficacy of the method: on the NYU Depth V2 and KITTI depth estimation benchmarks, the proposed method outperforms the state-of-the-art DPT by 10% and 3.3% in root mean square error in log space (RMSE(log)) and by 1.7% and 1.6% in squared relative difference (SRD), respectively.
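For reference, the two evaluation metrics cited in the abstract are typically defined as follows in the monocular depth estimation literature (the record itself does not reproduce them, so these are the standard formulas, not the paper's own notation), where d_i is the predicted depth at pixel i, d_i^* the ground-truth depth, and N the number of valid pixels:

\[
\mathrm{RMSE(log)} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(\log d_i - \log d_i^{*}\bigr)^{2}},
\qquad
\mathrm{SRD} = \frac{1}{N}\sum_{i=1}^{N}\frac{\bigl(d_i - d_i^{*}\bigr)^{2}}{d_i^{*}}.
\]

Lower values indicate better predictions for both metrics, which is why the reported percentage reductions relative to DPT correspond to improved accuracy.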

Suggested Citation

  • Jiahai Tan & Ming Gao & Tao Duan & Xiaomei Gao, 2023. "A Deep Joint Network for Monocular Depth Estimation Based on Pseudo-Depth Supervision," Mathematics, MDPI, vol. 11(22), pages 1-19, November.
  • Handle: RePEc:gam:jmathe:v:11:y:2023:i:22:p:4645-:d:1279911

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/11/22/4645/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/11/22/4645/
    Download Restriction: no

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:11:y:2023:i:22:p:4645-:d:1279911. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.