Tightly-Coupled Aided Inertial Navigation with Point and Plane Features
Yulin Yang∗, Patrick Geneva††, Xingxing Zuo†, Kevin Eckenhoff∗, Yong Liu†, and Guoquan Huang∗
This paper presents a tightly-coupled aided inertial navigation system (INS) with point and plane features, a general sensor fusion framework applicable to any visual and depth sensor (e.g., RGBD, LiDAR) configuration, in which the camera is used for point feature tracking and the depth sensor for plane extraction. The proposed system exploits the geometrical structures (planes) of the environment and adopts the closest point (CP) for plane parameterization. Moreover, we distinguish planar point features from non-planar point features in order to enforce point-on-plane constraints in our state estimator, thus further exploiting structural information from the environment. We also introduce a simple but effective plane feature initialization algorithm for feature-based simultaneous localization and mapping (SLAM). In addition, we perform online spatial calibration between the IMU and the depth sensor, as it is difficult to obtain this critical calibration parameter with high precision. Both Monte-Carlo simulations and real-world experiments are performed to validate the proposed approach.
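The closest-point (CP) parameterization mentioned above encodes a plane by the single 3-vector pointing from the origin to the closest point on the plane: for a plane {x : nᵀx = d} with unit normal n and origin distance d, the CP vector is d·n. A minimal sketch of the encoding and its inverse (function names are illustrative, not from the paper):

```python
import math

def plane_to_cp(n, d):
    """Encode a plane {x : n.x = d} (n a unit normal, d > 0 the
    distance to the origin) as its CP vector Pi = d * n, i.e. the
    point on the plane closest to the origin."""
    return [d * ni for ni in n]

def cp_to_plane(cp):
    """Recover (n, d) from a CP vector (assumes the plane does not
    pass through the origin, so d > 0)."""
    d = math.sqrt(sum(c * c for c in cp))
    n = [c / d for c in cp]
    return n, d

# Example: the plane z = 2 has unit normal (0, 0, 1) and distance 2,
# so its CP vector is (0, 0, 2).
cp = plane_to_cp([0.0, 0.0, 1.0], 2.0)
n, d = cp_to_plane(cp)
```

A practical appeal of this representation is that it is minimal (3 parameters for 3 degrees of freedom), which avoids the singular information matrices that over-parameterized plane representations produce in the estimator.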