CLC number: P232

On-line Access: 2021-07-12

Received: 2020-03-07

Revision Accepted: 2020-07-02

Crosschecked: 2021-06-01


 ORCID:

Zhilu Yuan

https://orcid.org/0000-0002-7431-6599

Shengjun Tang

https://orcid.org/0000-0002-8262-7397


Frontiers of Information Technology & Electronic Engineering 

Accepted manuscript available online (unedited version)


A survey on indoor 3D modeling and applications via RGB-D devices


Author(s):  Zhilu Yuan, You Li, Shengjun Tang, Ming Li, Renzhong Guo, Weixi Wang

Affiliation(s):  School of Architecture and Urban Planning, Research Institute for Smart Cities, Shenzhen University & China Guangdong–Hong Kong–Macau Joint Laboratory for Smart Cities & Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources, Shenzhen 518060, China; State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan 430079, China

Corresponding email(s):  shengjuntang@szu.edu.cn

Key Words:  3D indoor mapping, RGB-D, indoor localization, construction monitoring, emergency evacuation



Zhilu Yuan, You Li, Shengjun Tang, Ming Li, Renzhong Guo, Weixi Wang. A survey on indoor 3D modeling and applications via RGB-D devices[J]. Frontiers of Information Technology & Electronic Engineering, in press. https://doi.org/10.1631/FITEE.2000097

@article{title="A survey on indoor 3D modeling and applications via RGB-D devices",
author="Zhilu Yuan, You Li, Shengjun Tang, Ming Li, Renzhong Guo, Weixi Wang",
journal="Frontiers of Information Technology & Electronic Engineering",
year="in press",
publisher="Zhejiang University Press & Springer",
doi="https://doi.org/10.1631/FITEE.2000097"
}

%0 Journal Article
%T A survey on indoor 3D modeling and applications via RGB-D devices
%A Zhilu Yuan
%A You Li
%A Shengjun Tang
%A Ming Li
%A Renzhong Guo
%A Weixi Wang
%J Frontiers of Information Technology & Electronic Engineering
%P 815-826
%@ 2095-9184
%D in press
%I Zhejiang University Press & Springer
%R https://doi.org/10.1631/FITEE.2000097

TY - JOUR
T1 - A survey on indoor 3D modeling and applications via RGB-D devices
A1 - Zhilu Yuan
A1 - You Li
A1 - Shengjun Tang
A1 - Ming Li
A1 - Renzhong Guo
A1 - Weixi Wang
J0 - Frontiers of Information Technology & Electronic Engineering
SP - 815
EP - 826
SN - 2095-9184
Y1 - in press
PB - Zhejiang University Press & Springer
DO - https://doi.org/10.1631/FITEE.2000097
ER -


Abstract: 
With the rapid development of consumer-level RGB-D cameras, real-world indoor three-dimensional (3D) scene modeling and robotic applications are attracting increasing attention. Indoor 3D scene modeling nevertheless remains challenging, because interior objects can be structurally complex and the RGB-D data acquired by consumer-level sensors are often of limited quality; a large body of recent research addresses these problems. In this survey, we provide an overview of recent advances in indoor scene modeling methods, of public indoor datasets and libraries that facilitate experiments and evaluations, and of typical applications of RGB-D devices, including indoor localization and emergency evacuation.
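
To make the data flow concrete, the sketch below illustrates the basic operation underlying most RGB-D indoor mapping pipelines: back-projecting a depth image into a colored 3D point cloud with the pinhole camera model. This is a minimal illustration only, not a method from the surveyed papers; the function name, the Kinect-style intrinsics (fx, fy, cx, cy), and the depth scale are hypothetical placeholders.

# Minimal sketch (hypothetical parameters): turn one aligned RGB-D frame into
# a colored point cloud in the camera frame using the pinhole camera model.
import numpy as np

def depth_to_point_cloud(depth, color, fx, fy, cx, cy, depth_scale=1000.0):
    # depth: (H, W) raw depth values (e.g., millimeters as uint16).
    # color: (H, W, 3) RGB image aligned to the depth image.
    # fx, fy, cx, cy: pinhole intrinsics in pixels (hypothetical values below).
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))    # pixel column/row indices
    z = depth.astype(np.float64) / depth_scale        # depth in meters
    valid = z > 0                                     # discard missing depth
    x = (u - cx) * z / fx                             # X = (u - cx) * Z / fx
    y = (v - cy) * z / fy                             # Y = (v - cy) * Z / fy
    points = np.stack([x, y, z], axis=-1)[valid]      # (N, 3) camera-frame points
    colors = color[valid]                             # (N, 3) RGB per point
    return points, colors

if __name__ == "__main__":
    # Synthetic frame with hypothetical Kinect-like intrinsics (fx = fy = 525).
    depth = np.full((480, 640), 1500, dtype=np.uint16)   # flat surface at 1.5 m
    color = np.zeros((480, 640, 3), dtype=np.uint8)
    pts, cols = depth_to_point_cloud(depth, color, fx=525.0, fy=525.0,
                                     cx=319.5, cy=239.5)
    print(pts.shape, float(pts[:, 2].mean()))            # (307200, 3) 1.5

Full mapping systems build on this step by estimating the camera pose of each frame and fusing the per-frame point clouds into a single indoor model.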

基于RGB-D传感器的室内三维建模及应用研究综述 (A survey of indoor 3D modeling and applications based on RGB-D sensors)

Zhilu Yuan¹, You Li¹, Shengjun Tang¹, Ming Li², Renzhong Guo¹, Weixi Wang¹
¹School of Architecture and Urban Planning, Research Institute for Smart Cities, Shenzhen University; China Guangdong–Hong Kong–Macau Joint Laboratory for Smart Cities; Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources; Shenzhen 518060, China
²State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan 430079, China

Abstract: With the rapid development of consumer-level RGB-D cameras, real-world indoor 3D scene modeling and robotic applications are receiving increasing attention. Indoor 3D scene modeling nevertheless remains challenging, because interior objects can be highly complex in structure and the quality of RGB-D data acquired by consumer-level sensors still needs improvement. In recent years, much noteworthy research has addressed improving the quality of such data. This paper reviews recent advances in indoor scene modeling methods, public indoor datasets and libraries, and typical applications of RGB-D devices, including indoor localization and emergency evacuation.

Key words: 3D indoor mapping; RGB-D; indoor localization; construction monitoring; emergency evacuation


