Hand motion is an important component of human motion, playing
a central role in communication. However, it is difficult to capture
hand motion optically, especially in conjunction with full body motion.
Due to a lack of appropriate calibration methods, data gloves
also do not provide sufficiently accurate hand motion. In this paper,
we present a novel glove calibration approach that can map
raw sensor readings to hand motion data with both accurate joint
rotations and fingertip positions. Our method elegantly handles the
sensor coupling problem by treating calibration as a flexible mapping
from sensor readings to joint rotations. A sampling process
collects data tuples according to accuracy requirements, and organizes
all the tuples in a training set. From these data, a specially
designed Gaussian Process Regression model is trained to infer the
calibration function, and the learned model can be used to calibrate
new sensor readings. For real-time hand motion capture, a sparse
approximation of the model is used to enhance performance. Evaluation
experiments demonstrate that, compared to other calibration methods,
our approach yields significantly more accurate hand shapes and
fingertip positions.
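The core idea, a learned mapping from raw sensor readings to joint rotations, can be sketched with an off-the-shelf Gaussian Process Regression model. This is a minimal illustration, not the paper's implementation: the data are synthetic, the 5-sensor/3-joint dimensions are arbitrary, and scikit-learn's `GaussianProcessRegressor` stands in for the specially designed model and its sparse approximation.

```python
# Hypothetical sketch: calibrating glove sensor readings to joint
# rotations with Gaussian Process Regression (scikit-learn assumed).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic training tuples: 5 raw bend-sensor channels -> 3 joint
# angles. A real sampling process would collect such tuples from a
# set of calibration poses chosen to meet accuracy requirements.
X_train = rng.uniform(0.0, 1.0, size=(80, 5))
true_map = rng.normal(size=(5, 3))          # unknown ground-truth coupling
Y_train = X_train @ true_map + 0.01 * rng.normal(size=(80, 3))

# An RBF kernel models a smooth sensor-to-rotation mapping (which can
# absorb cross-sensor coupling); WhiteKernel accounts for sensor noise.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X_train, Y_train)

# Calibrate new raw readings: each row of `angles` is a joint-rotation
# estimate for one frame of sensor data.
X_new = rng.uniform(0.0, 1.0, size=(4, 5))
angles = gpr.predict(X_new)
print(angles.shape)  # -> (4, 3)
```

For real-time use as described above, the full GP would be replaced by a sparse approximation (e.g. an inducing-point method), trading a small amount of accuracy for constant-time prediction.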