



Tightly-coupled ultra-wideband-aided monocular visual SLAM with degenerate anchor configurations

Autonomous Robots

Abstract

This paper proposes an enhanced tightly-coupled sensor fusion scheme using a monocular camera and ultra-wideband (UWB) ranging sensors for simultaneous localization and mapping. By leveraging UWB data, the method achieves metric-scale, drift-reduced odometry and builds a map consisting of visual landmarks and UWB anchors, without prior knowledge of the anchor positions. Firstly, the UWB configuration accommodates degenerate cases in which the number of anchors is insufficient for 3D triangulation (\(N\le 3\) with no height data). Secondly, a practical model for the UWB measurement is used, yielding more accurate estimates for all the states. Thirdly, selected prior range measurements, including anchor-to-world-origin and anchor-to-anchor ranges, are utilized to relax the requirement of good initial guesses for the anchor positions. Lastly, a monitoring scheme decides when to fix the scale factor, so as to maintain a smooth trajectory, and when to fix the UWB anchor positions, so that camera and UWB measurements can be fused in the bundle adjustment. Extensive experiments are carried out to showcase the effectiveness of the proposed system.
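To illustrate the core idea the abstract describes (using UWB ranges to recover metric scale and anchor positions for a monocular trajectory), here is a minimal, self-contained sketch. It is not the authors' implementation: it assumes a simplified bias-free range model, synthetic data, a single anchor, and a plain nonlinear least-squares solve over the scale factor and anchor position, whereas the paper fuses these residuals inside a full bundle adjustment.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical setup: an up-to-scale monocular trajectory p_k and UWB
# ranges d_k to one anchor. Jointly estimate the metric scale s and the
# anchor position a by minimizing r_k = ||s * p_k - a|| - d_k
# (a simplified range model without bias terms).

rng = np.random.default_rng(0)
true_scale = 2.0
true_anchor = np.array([1.0, -2.0, 0.5])

p = rng.uniform(-3, 3, size=(60, 3))               # scaled (monocular) positions
d = np.linalg.norm(true_scale * p - true_anchor, axis=1)
d += rng.normal(0.0, 0.02, size=d.shape)           # simulated ranging noise

def residuals(x):
    """Range residuals for state x = [scale, anchor_x, anchor_y, anchor_z]."""
    s, a = x[0], x[1:]
    return np.linalg.norm(s * p - a, axis=1) - d

# Rough initial guess: unit scale, anchor at the world origin.
x0 = np.concatenate(([1.0], np.zeros(3)))
sol = least_squares(residuals, x0)
s_hat, a_hat = sol.x[0], sol.x[1:]
```

Note that with range-only measurements the problem has a sign ambiguity ((s, a) and (-s, -a) produce identical residuals), which is one reason the paper's prior-range constraints and monitoring scheme matter in practice; starting the solver from a positive scale keeps it on the physical branch here.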


Figures 1–14 appear in the full article.


Notes

  1. https://www.intelrealsense.com/.

  2. https://humatics.com/.

  3. https://www.vicon.com/.

  4. https://leica-geosystems.com/.


Acknowledgements

This work was supported by Delta-NTU Corporate Lab through the National Research Foundation Corporate Lab@University Scheme.

Author information

Corresponding author

Correspondence to Thien-Minh Nguyen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Nguyen, T.H., Nguyen, T.M. & Xie, L. Tightly-coupled ultra-wideband-aided monocular visual SLAM with degenerate anchor configurations. Auton Robot 44, 1519–1534 (2020). https://doi.org/10.1007/s10514-020-09944-7

Download citation


  • DOI: https://doi.org/10.1007/s10514-020-09944-7

Keywords

Navigation