
The Use of Multi-Scale Fiducial Markers To Aid Takeoff and Landing Navigation by Rotorcraft (2309.08769v3)

Published 15 Sep 2023 in cs.CV and cs.RO

Abstract: This paper quantifies the performance of visual SLAM that leverages multi-scale fiducial markers (i.e., artificial landmarks that can be detected at a wide range of distances) to show its potential for reliable takeoff and landing navigation in rotorcraft. Prior work has shown that square markers with a black-and-white pattern of grid cells can be used to improve the performance of visual SLAM with color cameras. We extend this prior work to allow nested marker layouts. We evaluate performance during semi-autonomous takeoff and landing operations in a variety of environmental conditions by a DJI Matrice 300 RTK rotorcraft with two FLIR Blackfly color cameras, using RTK GNSS to obtain ground-truth pose estimates. Performance measures include absolute trajectory error and the fraction of frames for which a pose estimate is produced. We release all of our results, including our dataset and our implementation of visual SLAM with fiducial markers, as open source.
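The abstract's first performance measure, absolute trajectory error (ATE), is conventionally computed as the RMSE of position error after rigidly aligning the estimated trajectory to the ground-truth trajectory. A minimal sketch in Python/NumPy is below; the function names are illustrative, and the paper's exact evaluation procedure (e.g., whether scale is also estimated during alignment) may differ.

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid alignment (rotation + translation, no scale)
    of estimated positions to ground-truth positions via the Kabsch
    algorithm. Both inputs are (N, 3) arrays of 3D positions sampled
    at corresponding timestamps. Returns the aligned estimates."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det(R) = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return est @ R.T + t

def absolute_trajectory_error(est, gt):
    """RMSE of per-frame position error after rigid alignment."""
    aligned = align_rigid(est, gt)
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))

def pose_fraction(num_estimated_poses, num_frames):
    """The abstract's second measure: fraction of frames for which
    the SLAM system produced a pose estimate."""
    return num_estimated_poses / num_frames
```

For example, a trajectory that differs from ground truth only by a rigid transform yields an ATE of (numerically) zero, since the alignment step removes exactly that transform.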
