Camera Motion Estimation from RGB-D-Inertial Scene Flow (2404.17251v1)
Abstract: In this paper, we introduce a novel formulation for camera motion estimation that integrates RGB-D images and inertial data through scene flow. Our goal is to accurately estimate the camera motion in a rigid 3D environment, together with the state of the inertial measurement unit (IMU). The proposed method can operate as a multi-frame optimization or marginalize older data, thereby making effective use of past measurements. To assess its performance, we evaluate on both synthetic sequences from the ICL-NUIM dataset and real sequences from the OpenLORIS-Scene dataset. Our results show that fusing the two sensors improves the accuracy of camera motion estimation compared to using visual data alone.
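The abstract describes jointly estimating camera motion from scene-flow and inertial terms in one optimization. Below is a minimal, self-contained sketch of that idea, not the paper's actual formulation: it replaces the paper's dense RGB-D scene flow with synthetic 3D point correspondences under a rigid-scene assumption, and reduces IMU preintegration (which in practice carries velocity and bias states) to a simple relative-pose prior with scalar weights. All names, weights, and the noise model are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def residuals(x, p1, p2, imu_dt, imu_dR, w_flow, w_imu):
    """Stacked residuals; x = [rotation vector (3), translation (3)]."""
    R, t = Rotation.from_rotvec(x[:3]), x[3:]
    # Rigid scene-flow term: transformed frame-1 points should land on
    # their frame-2 correspondences (rigid-scene assumption).
    r_flow = w_flow * (R.apply(p1) + t - p2).ravel()
    # Simplified IMU terms: penalize deviation from a hypothetical
    # preintegrated relative translation and rotation (biases, gravity,
    # and velocity states are omitted in this sketch).
    r_dt = w_imu * (t - imu_dt)
    r_dR = w_imu * (R * imu_dR.inv()).as_rotvec()
    return np.concatenate([r_flow, r_dt, r_dR])


# Toy two-frame example with a known ground-truth motion.
rng = np.random.default_rng(0)
R_gt = Rotation.from_rotvec([0.02, -0.01, 0.03])
t_gt = np.array([0.10, 0.00, 0.05])

p1 = rng.uniform(-2.0, 2.0, size=(200, 3))        # frame-1 3D points
p2 = R_gt.apply(p1) + t_gt + 0.005 * rng.standard_normal(p1.shape)

# Hypothetical preintegrated IMU measurements: ground truth plus noise.
imu_dt = t_gt + 0.01 * rng.standard_normal(3)
imu_dR = Rotation.from_rotvec(R_gt.as_rotvec() + 0.002 * rng.standard_normal(3))

sol = least_squares(residuals, np.zeros(6),
                    args=(p1, p2, imu_dt, imu_dR, 1.0, 10.0))
print("estimated rotation vector:", sol.x[:3])
print("estimated translation   :", sol.x[3:])
```

With a larger IMU weight the solution is pulled toward the inertial prior, which mimics the benefit the abstract reports: when visual correspondences are noisy, the inertial term regularizes the motion estimate. A multi-frame version would stack such residuals over a window of frames and marginalize the oldest states instead of discarding them.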