The Use of Multi-Scale Fiducial Markers To Aid Takeoff and Landing Navigation by Rotorcraft (2309.08769v3)
Abstract: This paper quantifies the performance of visual SLAM that leverages multi-scale fiducial markers (i.e., artificial landmarks that can be detected over a wide range of distances) to show its potential for reliable takeoff and landing navigation in rotorcraft. Prior work has shown that square markers with a black-and-white pattern of grid cells can be used to improve the performance of visual SLAM with color cameras. We extend this prior work to allow nested marker layouts. We evaluate performance during semi-autonomous takeoff and landing operations in a variety of environmental conditions, flown by a DJI Matrice 300 RTK rotorcraft equipped with two FLIR Blackfly color cameras, using RTK GNSS to obtain ground-truth pose estimates. Performance measures include absolute trajectory error and the fraction of total frames for which a pose was estimated. We release all of our results, including our dataset and our implementation of visual SLAM with fiducial markers, to the public as open source.
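The multi-scale idea is that each square marker has a known physical size, so large markers remain detectable from far away while small, nested markers take over as the camera approaches the pad. As a minimal illustration, the sketch below detects square markers with OpenCV's ArUco module (version 4.7 or later) and estimates one camera-frame pose per marker from its known side length. The marker dictionary, the ID-to-size table, and the function name are illustrative assumptions, not the paper's released implementation.

```python
import cv2
import numpy as np

# Hypothetical mapping from marker ID to physical side length in meters.
# Large markers are visible from far away; small nested markers remain
# within the field of view during the final phase of landing.
MARKER_SIZE_M = {0: 1.20, 1: 0.30, 2: 0.075}

def detect_marker_poses(gray, camera_matrix, dist_coeffs):
    """Detect square fiducial markers and estimate one pose per marker.

    Returns a dict {marker_id: (rvec, tvec)} in the camera frame.
    Requires OpenCV >= 4.7 for the ArucoDetector class.
    """
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)

    poses = {}
    if ids is None:
        return poses
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        size = MARKER_SIZE_M.get(int(marker_id))
        if size is None:  # marker is not part of the known layout
            continue
        # 3D corners of a square marker centered at its own origin,
        # ordered to match the IPPE_SQUARE convention.
        half = size / 2.0
        object_points = np.array(
            [[-half,  half, 0], [ half,  half, 0],
             [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)
        ok, rvec, tvec = cv2.solvePnP(
            object_points, marker_corners.reshape(4, 2).astype(np.float32),
            camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses
```

In a marker-aided visual SLAM system such as those cited below, per-marker observations of this kind would typically be fused with keypoint features in the estimation back end.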
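The two performance measures are standard: absolute trajectory error (ATE) compares the estimated trajectory with the RTK GNSS ground truth after a rigid alignment, and the second measure is simply the share of camera frames for which the SLAM system produced a pose. The sketch below shows one common way to compute them, assuming time-associated position pairs; the Horn/Umeyama alignment without scale and the function names are our assumptions rather than the paper's evaluation code.

```python
import numpy as np

def absolute_trajectory_error(est_xyz, gt_xyz):
    """Root-mean-square translational error after rigid SE(3) alignment.

    est_xyz, gt_xyz: (N, 3) arrays of time-associated estimated and
    ground-truth positions (e.g., SLAM poses vs. RTK GNSS fixes).
    Uses the closed-form Horn/Umeyama alignment without scale.
    """
    est_mean, gt_mean = est_xyz.mean(axis=0), gt_xyz.mean(axis=0)
    est_c, gt_c = est_xyz - est_mean, gt_xyz - gt_mean
    # Cross-covariance and SVD-based rotation mapping estimate onto truth.
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = gt_mean - R @ est_mean
    aligned = est_xyz @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt_xyz) ** 2, axis=1))))

def pose_estimation_fraction(num_estimated_poses, num_total_frames):
    """Fraction of camera frames for which the SLAM system produced a pose."""
    return num_estimated_poses / num_total_frames
```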
- Lim, H., and Lee, Y. S., “Real-time single camera SLAM using fiducial markers,” 2009 ICCAS-SICE, 2009, pp. 177–182.
- Yamada, T., Yairi, T., Bener, S. H., and Machida, K., “A study on SLAM for indoor blimp with visual markers,” 2009 ICCAS-SICE, 2009, pp. 647–652.
- Pfrommer, B., and Daniilidis, K., “TagSLAM: Robust SLAM with fiducial markers,” arXiv preprint arXiv:1910.00679, 2019.
- Munoz-Salinas, R., and Medina-Carnicer, R., “UcoSLAM: Simultaneous localization and mapping by fusion of keypoints and squared planar markers,” Pattern Recognition, Vol. 101, 2020, p. 107193.
- Lee, J., Choi, S. Y., Hanley, D., and Bretl, T., “Comparative Study of Visual SLAM-Based Mobile Robot Localization Using Fiducial Markers,” 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) Workshop on Closing the Loop on Localization, 2023.
- Krogius, M., Haggenmiller, A., and Olson, E., “Flexible Layouts for Fiducial Tags,” 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2019, pp. 1898–1903. 10.1109/IROS40897.2019.8967787.
- Springer, J., and Kyas, M., “Autonomous Drone Landing with Fiducial Markers and a Gimbal-Mounted Camera for Active Tracking,” 2022 IEEE International Conference on Robotic Computing (IRC), 2022, pp. 243–247. 10.1109/IRC55401.2022.00047.
- Federal Aviation Administration, “Vertiport Design,” Engineering Brief No. 105, https://www.faa.gov/airports/engineering/engineering_briefs/engineering_brief_105_vertiport_design, 2023. Last updated: March 13, 2023.
- Mur-Artal, R., and Tardós, J. D., “ORB-SLAM2: An open-source SLAM system for monocular, stereo, and RGB-D cameras,” IEEE Transactions on Robotics, Vol. 33, No. 5, 2017, pp. 1255–1262.
- Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. J., and Marín-Jiménez, M. J., “Automatic generation and detection of highly reliable fiducial markers under occlusion,” Pattern Recognition, Vol. 47, No. 6, 2014, pp. 2280–2292.
- Sola, J., Vallvé, J., Casals, J., Deray, J., Fourmy, M., Atchuthan, D., Corominas-Murtra, A., and Andrade-Cetto, J., “WOLF: A modular estimation framework for robotics based on factor graphs,” IEEE Robotics and Automation Letters, Vol. 7, No. 2, 2022, pp. 4710–4717.
- Kalaitzakis, M., Cain, B., Carroll, S., Ambrosi, A., Whitehead, C., and Vitzilaios, N., “Fiducial markers for pose estimation: Overview, applications and experimental comparison of the ARTag, AprilTag, ArUco and STag markers,” Journal of Intelligent & Robotic Systems, Vol. 101, 2021, pp. 1–26.
- Furgale, P., Rehder, J., and Siegwart, R., “Unified temporal and spatial calibration for multi-sensor systems,” 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, 2013, pp. 1280–1286.
- Campos, C., Elvira, R., Rodríguez, J. J. G., Montiel, J. M., and Tardós, J. D., “ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM,” IEEE Transactions on Robotics, Vol. 37, No. 6, 2021, pp. 1874–1890.
- Lee, J., Hanley, D., and Bretl, T., “Extrinsic calibration of multiple inertial sensors from arbitrary trajectories,” IEEE Robotics and Automation Letters, Vol. 7, No. 2, 2022, pp. 2055–2062.