CLC number: TP242.6
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2023-07-03
Gengyu GE, Yi ZHANG, Wei WANG, Lihe HU, Yang WANG, Qin JIANG. Visual-feature-assisted mobile robot localization in a long corridor environment[J]. Frontiers of Information Technology & Electronic Engineering, 2023, 24(6): 876-889.
@article{ge2023visual,
  title="Visual-feature-assisted mobile robot localization in a long corridor environment",
  author="Gengyu GE and Yi ZHANG and Wei WANG and Lihe HU and Yang WANG and Qin JIANG",
  journal="Frontiers of Information Technology & Electronic Engineering",
  volume="24",
  number="6",
  pages="876-889",
  year="2023",
  publisher="Zhejiang University Press & Springer",
  doi="10.1631/FITEE.2200208"
}
Abstract: Localization plays a vital role in a mobile robot navigation system and is a fundamental capability for autonomous movement. In indoor environments, the current mainstream localization scheme uses two-dimensional (2D) light detection and ranging (LiDAR) to build an occupancy grid map with simultaneous localization and mapping (SLAM) technology, and then locates the robot on the known grid map. However, such solutions work effectively only in areas with salient geometrical features. For areas with repeated, symmetrical, or similar structures, such as a long corridor, the conventional particle filtering method fails. To solve this crucial problem, this paper presents a novel coarse-to-fine paradigm that uses visual features to assist mobile robot localization in a long corridor. First, the mobile robot is remote-controlled to move from the starting position to the end along the middle line of the corridor. During this motion, a grid map is built using the laser-based SLAM method; at the same time, a visual map consisting of keyframes is created according to a keyframe selection strategy, and each keyframe is associated with the robot's pose through its timestamp. Second, a moving strategy based on the range features extracted from the laser scans is proposed to decide on an initial rough position; this step is vital because it tells the robot where to move to adjust its pose. Third, the mobile robot captures images from a proper perspective according to the moving strategy and matches them against the visual map to achieve coarse localization. Finally, an improved particle filtering method is presented to achieve fine localization. Experimental results show that our method is effective and robust for global localization: the localization success rate reaches 98.8%, while the average moving distance is only 0.31 m. In addition, the method works well when the mobile robot is kidnapped to another position in the corridor.
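The coarse-to-fine idea described in the abstract can be summarized as follows: a match against the visual keyframe map yields a rough pose, and a particle filter then refines that pose against the occupancy grid map using the laser scan. The sketch below is a minimal illustration of this flow, not the authors' implementation; the keyframe-map layout and the helper functions matcher and scan_likelihood are hypothetical placeholders, and the filter shown is a plain Monte Carlo localization loop rather than the paper's improved variant.

# Minimal sketch (not the authors' code) of the coarse-to-fine localization idea:
# a visual keyframe match supplies a rough pose, and a small particle filter
# refines it against the occupancy grid map. All helper names are assumptions.
import numpy as np

def coarse_pose_from_keyframes(query_desc, keyframe_map, matcher):
    """Return the pose stored with the best-matching keyframe.

    keyframe_map: list of (descriptors, pose) pairs built during mapping,
    where each pose was associated with its keyframe via the timestamp.
    matcher(a, b) is assumed to return a similarity score (higher = better).
    """
    scores = [matcher(query_desc, desc) for desc, _ in keyframe_map]
    return np.asarray(keyframe_map[int(np.argmax(scores))][1])  # (x, y, theta)

def refine_with_particle_filter(coarse_pose, scan, scan_likelihood,
                                n_particles=500, n_iters=10, rng=None):
    """Monte Carlo refinement: spawn particles around the coarse pose,
    weight them by how well the laser scan fits the grid map, resample.
    scan_likelihood(pose, scan) is assumed to return a non-negative score."""
    rng = rng or np.random.default_rng()
    # Tight prior around the visually estimated pose (x, y in m, theta in rad).
    particles = coarse_pose + rng.normal(0.0, [0.3, 0.3, 0.1],
                                         size=(n_particles, 3))
    for _ in range(n_iters):
        w = np.array([scan_likelihood(p, scan) for p in particles])
        w /= w.sum()                                         # normalize weights
        idx = rng.choice(n_particles, size=n_particles, p=w)  # resample
        particles = particles[idx] + rng.normal(0.0, [0.05, 0.05, 0.02],
                                                size=(n_particles, 3))
    # Naive averaging of theta is acceptable here because the spread is small.
    return particles.mean(axis=0)

Seeding the particles tightly around the visually matched pose is what lets the filter converge in a corridor whose laser scans alone are ambiguous; without that prior, particles in geometrically identical sections would receive nearly equal weights.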