
Vision-Controlled autonomous navigation in unstructured environments: Integrating image processing, path planning, and trajectory control in robotic systems

Authors

Listed:
  • Pengyuan Wang
  • Haipeng Yu
  • Shuqing Wang

Abstract

Advancements in artificial intelligence (AI) have driven robotics to the forefront of technological innovation, enhancing productivity and safety across industries. Autonomous navigation, especially in unstructured environments with irregular terrains and dynamic obstacles, remains a key challenge. This paper introduces a vision-controlled autonomous navigation framework that enables robots to traverse complex environments using only vision sensors and image processing. The system integrates visual segmentation, optimized path planning, and advanced trajectory tracking. Key contributions include: (1) Semantic Mapping and Localization – A target detection network generates a global semantic map from local views, enhancing perception without external markers; (2) Improved Path Planning – The RRT-Connect algorithm is refined for safer, adaptive navigation in unpredictable terrains; (3) Accurate Trajectory Control – A Soft Actor-Critic (SAC)-based model reduces tracking errors and enhances path-following precision; (4) Empirical Validation – Experiments with a magnetic miniature robot in unstructured environments confirm the system’s robustness and accuracy. The proposed framework addresses existing limitations, paving the way for more autonomous and resilient robotic systems in complex environments.
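The abstract's second contribution refines the RRT-Connect planner; the paper's specific refinements are not reproduced on this page. For orientation only, below is a generic textbook-style RRT-Connect sketch in a hypothetical 2-D configuration space. All names, parameters, and the obstacle interface (`is_free`) are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def rrt_connect(start, goal, is_free, step=0.5, max_iters=2000,
                bounds=(0.0, 10.0), seed=0):
    """Bidirectional RRT: grow one tree from `start` and one from `goal`,
    extending toward random samples and greedily toward each other."""
    rng = random.Random(seed)
    # Each tree maps node -> parent; the roots map to None.
    tree_a, tree_b = {start: None}, {goal: None}

    def nearest(tree, q):
        return min(tree, key=lambda n: math.dist(n, q))

    def steer(q_from, q_to):
        # Move from q_from toward q_to by at most `step`.
        d = math.dist(q_from, q_to)
        if d <= step:
            return q_to
        t = step / d
        return (q_from[0] + t * (q_to[0] - q_from[0]),
                q_from[1] + t * (q_to[1] - q_from[1]))

    def extend(tree, q_target):
        q_near = nearest(tree, q_target)
        q_new = steer(q_near, q_target)
        if not is_free(q_new):
            return None              # blocked: extension fails
        if q_new != q_near:
            tree[q_new] = q_near
        return q_new

    def path_to_root(tree, node):
        out = []
        while node is not None:
            out.append(node)
            node = tree[node]
        return out

    for _ in range(max_iters):
        q_rand = (rng.uniform(*bounds), rng.uniform(*bounds))
        q_new = extend(tree_a, q_rand)
        if q_new is None:
            continue
        # CONNECT step: grow the goal tree toward the new start-tree node
        # until it reaches q_new or is blocked by an obstacle.
        q_conn = extend(tree_b, q_new)
        while q_conn is not None and q_conn != q_new:
            q_conn = extend(tree_b, q_new)
        if q_conn == q_new:
            # The trees met at q_new: stitch start -> q_new -> goal.
            return (path_to_root(tree_a, q_new)[::-1]
                    + path_to_root(tree_b, q_new)[1:])
    return None  # no path found within the iteration budget
```

This sketch omits the tree-swapping of the canonical algorithm and any of the paper's safety or adaptivity refinements; collision checking is delegated entirely to the caller-supplied `is_free` predicate.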

Suggested Citation

  • Pengyuan Wang & Haipeng Yu & Shuqing Wang, 2026. "Vision-Controlled autonomous navigation in unstructured environments: Integrating image processing, path planning, and trajectory control in robotic systems," PLOS ONE, Public Library of Science, vol. 21(3), pages 1-22, March.
  • Handle: RePEc:plo:pone00:0341589
    DOI: 10.1371/journal.pone.0341589

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0341589
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0341589&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0341589?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0341589. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.