CLC number: TP23
On-line Access: 2014-07-10
Received: 2013-10-23
Revision Accepted: 2014-01-22
Crosschecked: 2014-06-16
Jie Chen, Can-jun Yang, Jens Hofschulte, Wan-li Jiang, Cha Zhang. A robust optical/inertial data fusion system for motion tracking of the robot manipulator[J]. Journal of Zhejiang University Science C, 2014, 15(7): 574-583.
@article{chen2014robust,
title="A robust optical/inertial data fusion system for motion tracking of the robot manipulator",
author="Jie Chen, Can-jun Yang, Jens Hofschulte, Wan-li Jiang, Cha Zhang",
journal="Journal of Zhejiang University Science C",
volume="15",
number="7",
pages="574-583",
year="2014",
publisher="Zhejiang University Press & Springer",
doi="10.1631/jzus.C1300302"
}
%0 Journal Article
%T A robust optical/inertial data fusion system for motion tracking of the robot manipulator
%A Jie Chen
%A Can-jun Yang
%A Jens Hofschulte
%A Wan-li Jiang
%A Cha Zhang
%J Journal of Zhejiang University SCIENCE C
%V 15
%N 7
%P 574-583
%@ 1869-1951
%D 2014
%I Zhejiang University Press & Springer
%R 10.1631/jzus.C1300302
TY - JOUR
T1 - A robust optical/inertial data fusion system for motion tracking of the robot manipulator
A1 - Jie Chen
A1 - Can-jun Yang
A1 - Jens Hofschulte
A1 - Wan-li Jiang
A1 - Cha Zhang
JO - Journal of Zhejiang University Science C
VL - 15
IS - 7
SP - 574
EP - 583
SN - 1869-1951
Y1 - 2014
PB - Zhejiang University Press & Springer
DO - 10.1631/jzus.C1300302
ER -
Abstract: We present an optical/inertial data fusion system for motion tracking of a robot manipulator, which proves more robust and accurate than a conventional optical tracking system (OTS). By fusing the OTS data with measurements from an inertial measurement unit (IMU), both the robustness and the accuracy of the OTS are improved. A Kalman filter is used for the data fusion, and the error distribution of the OTS provides an important reference for estimating the filter's measurement noise. With a proper system setup and an effective method of coordinate frame synchronization, experimental results show a significant improvement in robustness and position accuracy.
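The fusion scheme the abstract describes can be pictured with a minimal Kalman-filter sketch: the IMU acceleration drives the prediction step, while the OTS position (when the optical markers are visible) drives the correction step, and the OTS error distribution informs the measurement-noise covariance R. This is an illustrative sketch only, not the authors' implementation; the one-dimensional state, sample period, and all noise values below are assumptions.

```python
# Minimal sketch of optical/inertial Kalman-filter fusion (illustrative only).
# State: [position, velocity]; IMU acceleration is the control input,
# OTS position is the measurement. All numeric values are assumed.
import numpy as np

dt = 0.01                                # assumed sample period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition model
B = np.array([[0.5 * dt**2], [dt]])      # control matrix for IMU acceleration
H = np.array([[1.0, 0.0]])               # OTS measures position only
Q = 1e-4 * np.eye(2)                     # process noise (IMU drift), assumed
R = np.array([[1e-3]])                   # OTS measurement noise, e.g. set from its error distribution

x = np.zeros((2, 1))                     # state estimate [position; velocity]
P = np.eye(2)                            # estimate covariance

def fuse_step(x, P, imu_accel, ots_pos):
    """One predict/update cycle of the optical/inertial fusion."""
    # Predict with the IMU acceleration as control input.
    x = F @ x + B * imu_accel
    P = F @ P @ F.T + Q
    # Correct with the OTS position when a valid measurement is available
    # (e.g., markers not occluded).
    if ots_pos is not None:
        y = np.array([[ots_pos]]) - H @ x      # innovation
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, P
```

When the optical markers are occluded, the update step is skipped and the filter runs on IMU prediction alone, which is what gives such a fused system its robustness advantage over a pure OTS.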