Upper limb motion tracking attracts attention from both academia and industry due to its value in a wide range of applications. Although existing optical tracking techniques can provide accurate results, their cost and complexity keep them out of most daily-life applications. Recently, low-cost inertial measurement units (IMUs) and the Kinect have offered a feasible and economical solution to such trajectory tracking problems, although each still has its own limitations. In this paper, we investigate how to fuse data from the internal sensors of an IMU, and how to fuse IMU data with Kinect data, in order to provide robust hand position estimates that compensate for the limitations of these sensors. Position is computed using three fusion strategies in sequence: double integration of the IMU's internal sensor data, IMU internal sensor fusion with geometrical constraints, and unscented Kalman filter (UKF) based fusion of IMU and Kinect data. Experimental results show that the first two approaches suffer from drift, while the proposed IMU-Kinect fusion method provides drift-free and smooth results. Compared with using the Kinect alone, this approach achieves better results in terms of both accuracy and robustness.
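As a rough illustration of why the first strategy drifts, the following Python sketch (not from the paper; the sampling rate, constant-bias model, and variable names are assumptions for illustration) double-integrates world-frame accelerometer readings after gravity removal, showing how a small uncorrected acceleration bias grows quadratically into position error.

```python
import numpy as np

def double_integrate(acc_world, dt):
    """Naively integrate world-frame acceleration (gravity already removed)
    twice to obtain velocity and position. Any residual bias in acc_world
    accumulates linearly in velocity and quadratically in position (drift)."""
    vel = np.cumsum(acc_world * dt, axis=0)
    pos = np.cumsum(vel * dt, axis=0)
    return vel, pos

# Hypothetical example: 10 s of stationary data with a 0.02 m/s^2 bias on one axis.
dt = 0.01                          # assumed 100 Hz IMU sampling
acc = np.zeros((1000, 3))
acc[:, 0] += 0.02                  # small uncompensated accelerometer bias
_, pos = double_integrate(acc, dt)
print(pos[-1, 0])                  # roughly 1 m of drift after only 10 s
```

This kind of unbounded error growth is what motivates constraining the IMU estimate, and ultimately fusing it with the drift-free but noisier Kinect measurements.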
Keywords