Abstract:
High-precision positioning is required for lunar surface exploration missions, yet current visual-inertial positioning methods suffer from accumulated error that cannot be mitigated without prior environmental information. Radio positioning based on the principle of trilateration determines a target's position by measuring the distances between a receiver and at least three different transmitters; however, the need for three or more base stations makes it difficult to meet large-scale positioning needs in the lunar environment. This paper proposes a fusion positioning method that combines wireless ranging from base stations with visual-inertial odometry to address the limitations of existing positioning methods. Ranging information from Wi-Fi, UHF, and UWB radio base stations is fused with an environment model constructed by visual-inertial SLAM, and the location of the lunar rover is then determined through joint calculation of the ranging results and visual-IMU data. Finally, simulation and verification were conducted on the MATLAB and Gazebo software platforms, yielding a positioning error of 0.92 meters at a distance of 1500 meters from the base station. The experimental results demonstrate that the proposed method satisfies the requirements for precise positioning in lunar surface exploration missions.
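The trilateration principle referenced above can be illustrated with a minimal sketch: given the known positions of at least three base stations and the measured ranges to each, the receiver's position follows from a linearized least-squares solve. This is only an illustration of the ranging principle under idealized noise-free assumptions, not the paper's fusion algorithm; the function name and 2-D setup are hypothetical.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Estimate a 2-D position from ranges to >= 3 base stations.

    Subtracting the first range equation |x - p_0|^2 = d_0^2 from each
    of the others cancels the quadratic term |x|^2, leaving a linear
    system A x = b that is solved in the least-squares sense.
    """
    anchors = np.asarray(anchors, dtype=float)  # shape (n, 2), n >= 3
    dists = np.asarray(dists, dtype=float)      # shape (n,)
    p0, d0 = anchors[0], dists[0]
    # Each row i: 2 (p_i - p_0) . x = d_0^2 - d_i^2 + |p_i|^2 - |p_0|^2
    A = 2.0 * (anchors[1:] - p0)
    b = (d0**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three base stations, receiver at the unknown point (3, 4).
anchors = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]
truth = np.array([3.0, 4.0])
dists = [np.linalg.norm(truth - a) for a in anchors]
print(trilaterate(anchors, dists))  # -> approximately [3. 4.]
```

With noisy range measurements (as in the Wi-Fi/UHF/UWB case), the same least-squares formulation averages out errors when more than three stations are available, which is why the paper fuses these ranges with visual-inertial data rather than relying on them alone.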