CLC number: TP242.6

On-line Access: 2023-07-03

Received: 2022-05-14

Revision Accepted: 2022-11-15

Crosschecked: 2023-07-03

 ORCID:

Gengyu GE

https://orcid.org/0000-0001-9913-0785

Yi ZHANG

https://orcid.org/0000-0001-6935-5721

Frontiers of Information Technology & Electronic Engineering 

Accepted manuscript available online (unedited version)


Visual-feature-assisted mobile robot localization in a long corridor environment


Author(s):  Gengyu GE1,3, Yi ZHANG2, Wei WANG1, Lihe HU1, Yang WANG1, Qin JIANG1

Affiliation(s):  1School of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; 2School of Advanced Manufacturing Engineering, Chongqing University of Posts and Telecommunications, Chongqing 400065, China; 3School of Information Engineering, Zunyi Normal University, Zunyi 563006, China

Corresponding email(s):  gegengyu_2021@163.com, zhangyi@cqupt.edu.cn

Key Words:  Mobile robot; Localization; Simultaneous localization and mapping (SLAM); Corridor environment; Particle filter; Visual features


Gengyu GE, Yi ZHANG, Wei WANG, Lihe HU, Yang WANG, Qin JIANG. Visual-feature-assisted mobile robot localization in a long corridor environment[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2200208

@article{FITEE.2200208,
title="Visual-feature-assisted mobile robot localization in a long corridor environment",
author="Gengyu GE, Yi ZHANG, Wei WANG, Lihe HU, Yang WANG, Qin JIANG",
journal="Frontiers of Information Technology & Electronic Engineering",
year="in press",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2200208"
}

%0 Journal Article
%T Visual-feature-assisted mobile robot localization in a long corridor environment
%A Gengyu GE
%A Yi ZHANG
%A Wei WANG
%A Lihe HU
%A Yang WANG
%A Qin JIANG
%J Frontiers of Information Technology & Electronic Engineering
%P 876-889
%@ 2095-9184
%D in press
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2200208

TY - JOUR
T1 - Visual-feature-assisted mobile robot localization in a long corridor environment
A1 - Gengyu GE
A1 - Yi ZHANG
A1 - Wei WANG
A1 - Lihe HU
A1 - Yang WANG
A1 - Qin JIANG
J0 - Frontiers of Information Technology & Electronic Engineering
SP - 876
EP - 889
SN - 2095-9184
Y1 - in press
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.2200208
ER -


Abstract: 
Localization plays a vital role in mobile robot navigation systems and is a fundamental capability for autonomous movement. In indoor environments, the current mainstream localization scheme uses a two-dimensional (2D) light detection and ranging (LiDAR) sensor to build an occupancy grid map with simultaneous localization and mapping (SLAM) technology, and then locates the robot on the known grid map. However, such solutions work effectively only in areas with salient geometric features. In areas with repeated, symmetrical, or otherwise similar structures, such as a long corridor, the conventional particle filtering method fails. To solve this problem, this paper presents a novel coarse-to-fine paradigm that uses visual features to assist mobile robot localization in a long corridor. First, the mobile robot is remote-controlled to move along the middle line of the corridor from the starting position to the end. During this run, a grid map is built using a laser-based SLAM method; at the same time, a visual map consisting of keyframe images is created according to a keyframe selection strategy, and the keyframes are associated with the robot's poses through timestamps. Second, a moving strategy based on range features extracted from the laser scans is proposed to determine an initial rough position. This step is vital because it tells the robot where to move to adjust its pose. Third, the mobile robot captures images from a suitable perspective according to the moving strategy and matches them against the visual map to achieve coarse localization. Finally, an improved particle filtering method is presented to achieve fine localization. Experimental results show that our method is effective and robust for global localization: the localization success rate reaches 98.8%, while the average moving distance is only 0.31 m. In addition, the method still works when the mobile robot is kidnapped to another position in the corridor.
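
To make the coarse-to-fine idea concrete, below is a minimal, hypothetical Python sketch. It is not the authors' implementation: the descriptor model, the scan likelihood, and all names (KeyframeMap, refine_with_particle_filter, etc.) are illustrative assumptions. It shows the two stages the abstract describes: a coarse pose obtained by matching a query image descriptor against the timestamp-associated keyframe map, followed by a particle-filter refinement seeded around that coarse pose.

import numpy as np

rng = np.random.default_rng(0)

class KeyframeMap:
    """Visual map: keyframe image descriptors associated, via timestamps,
    with the poses at which they were captured during the mapping run."""
    def __init__(self):
        self.descriptors = []   # one feature vector per keyframe (e.g., ORB-like)
        self.poses = []         # (x, y, theta) recorded at the same timestamp

    def add(self, descriptor, pose):
        self.descriptors.append(descriptor)
        self.poses.append(pose)

    def coarse_localize(self, query):
        # Nearest-descriptor match yields a rough pose hypothesis.
        dists = [np.linalg.norm(query - d) for d in self.descriptors]
        return self.poses[int(np.argmin(dists))]

def refine_with_particle_filter(coarse_pose, scan_likelihood, n=500, iters=20):
    """Fine localization: seed particles around the coarse pose, then repeat
    importance weighting and resampling under a laser-scan likelihood."""
    particles = coarse_pose + rng.normal(0.0, 0.5, size=(n, 3))
    for _ in range(iters):
        w = np.array([scan_likelihood(p) for p in particles]) + 1e-12  # avoid all-zero weights
        w /= w.sum()
        particles = particles[rng.choice(n, size=n, p=w)]      # resample
        particles += rng.normal(0.0, 0.02, size=(n, 3))        # small diffusion
    return particles.mean(axis=0)

# Toy usage: a corridor mapped by 10 keyframes; the true pose is near x = 6.3.
kf_map = KeyframeMap()
for x in range(10):
    kf_map.add(rng.normal(float(x), 0.05, size=32), np.array([float(x), 0.0, 0.0]))

true_pose = np.array([6.3, 0.0, 0.0])
coarse = kf_map.coarse_localize(rng.normal(6.3, 0.05, size=32))  # query image descriptor
fine = refine_with_particle_filter(
    coarse, lambda p: np.exp(-np.sum((p - true_pose) ** 2) / 0.1))
print("coarse:", coarse, "fine:", np.round(fine, 2))

In the paper's actual setting, the likelihood would come from matching 2D LiDAR scans against the occupancy grid map and the descriptors from real keyframe images; the Gaussian toys above only stand in for those models.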


