Uncertainty-Aware Visual-Inertial SLAM with Volumetric Occupancy Mapping
title: Uncertainty-Aware Visual-Inertial SLAM with Volumetric Occupancy Mapping
publish date: 2024-09-18
authors: Jaehyung Jung et al.
paper id: 2409.12051v1
download: https://arxiv.org/abs/2409.12051v1
abstract:
We propose a visual-inertial simultaneous localization and mapping system that tightly couples sparse reprojection errors, inertial measurement unit pre-integrals, and relative pose factors with dense volumetric occupancy mapping. In this framework, depth predictions from a deep neural network are fused in a fully probabilistic manner. Specifically, our method is rigorously uncertainty-aware: first, we use depth and uncertainty predictions from a deep network not only on the robot's stereo rig, but also probabilistically fuse motion stereo, which provides depth information across a range of baselines and therefore drastically increases mapping accuracy. Next, predicted and fused depth uncertainty propagates not only into occupancy probabilities but also into alignment factors between the generated dense submaps that enter the probabilistic nonlinear least-squares estimator. This submap representation offers globally consistent geometry at scale. Our method is thoroughly evaluated on two benchmark datasets, yielding localization and mapping accuracy that exceeds the state of the art while simultaneously providing volumetric occupancy directly usable for downstream robotic planning and control in real time.
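The abstract's key mechanism, fusing depth maps of differing uncertainty and letting that uncertainty weight the occupancy evidence, can be sketched as below. This is a minimal illustration under an independent-Gaussian assumption; the function names and the inverse sensor model are hypothetical, not the authors' implementation.

```python
import numpy as np

def fuse_depths(depth_maps, variance_maps):
    """Precision-weighted (inverse-variance) fusion of per-pixel depth
    estimates, e.g. one static-stereo map plus several motion-stereo maps.
    Assumes independent Gaussian depth errors per pixel (illustrative)."""
    d = np.asarray(depth_maps, dtype=np.float64)    # shape (K, H, W)
    v = np.asarray(variance_maps, dtype=np.float64)
    w = 1.0 / v                                     # per-pixel precisions
    fused_var = 1.0 / w.sum(axis=0)
    fused_depth = fused_var * (w * d).sum(axis=0)
    return fused_depth, fused_var

def occupancy_log_odds_update(log_odds, cell_range, z, sigma):
    """Uncertainty-weighted log-odds update for one voxel along a ray:
    the larger the fused depth std sigma, the flatter the occupancy
    probability and the weaker the evidence added to the voxel.
    A hypothetical Gaussian inverse sensor model, for illustration only."""
    p_occ = 0.5 + 0.35 * np.exp(-0.5 * ((cell_range - z) / sigma) ** 2)
    return log_odds + np.log(p_occ / (1.0 - p_occ))

# Usage: fuse a stereo-rig depth map with one motion-stereo map.
stereo_d = np.full((2, 2), 4.0); stereo_v = np.full((2, 2), 0.20)
motion_d = np.full((2, 2), 4.2); motion_v = np.full((2, 2), 0.05)
fused_d, fused_v = fuse_depths([stereo_d, motion_d], [stereo_v, motion_v])
# fused_d == 4.16 everywhere; fused_v == 0.04, below both input variances.
```

Free-space carving along the ray is omitted for brevity; the point is only that lower-variance depth (e.g. from a wider motion-stereo baseline) dominates the fusion and contributes stronger occupancy evidence.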
QA:
coming soon
Edited by: wanghaisheng · Last updated: September 23, 2024