The SDK hand tracking module provides real-time 3D hand motion tracking using a single depth sensor. The hand module can track one or two hands, providing precise joint-level positions. The module can also identify "gestures": significant hand postures or motions, such as a wave, a tap, or a thumbs-up sign.
Using the hand module, you can enable your application to be controlled by hand motions alone, without a touch interface. For instance, you can interpret a hand tap as a selection, a hand swipe as scrolling, and so on.
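One common pattern for this kind of control is to route each recognized gesture to an application action. The sketch below shows a minimal gesture-to-action dispatcher; the gesture names ("tap", "swipe") and the dispatcher itself are illustrative assumptions, not part of the SDK API.

```python
from typing import Callable, Dict


class GestureDispatcher:
    """Routes recognized gesture names to application callbacks.

    Hypothetical helper for illustration; the SDK reports gestures,
    and the application decides what each one means.
    """

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def on(self, gesture: str, handler: Callable[[], None]) -> None:
        """Bind a callback to a gesture name."""
        self._handlers[gesture] = handler

    def dispatch(self, gesture: str) -> bool:
        """Invoke the handler for `gesture`; return False if none is bound."""
        handler = self._handlers.get(gesture)
        if handler is None:
            return False
        handler()
        return True


# Example wiring: interpret a tap as selection and a swipe as scrolling.
events = []
dispatcher = GestureDispatcher()
dispatcher.on("tap", lambda: events.append("select"))
dispatcher.on("swipe", lambda: events.append("scroll"))
dispatcher.dispatch("tap")    # appends "select"
dispatcher.dispatch("swipe")  # appends "scroll"
```

Keeping the gesture-to-action mapping in one place makes it easy to rebind gestures (for accessibility or user preference) without touching the tracking code.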
Tracking Modes
The hand module has three main tracking modes, which differ in the information they provide and the computational resources they require:
• Cursor mode – returns a single point on the hand, enabling very accurate and responsive tracking and basic gestures.
• Extremities mode – returns the general location of the hand, its silhouette, and the extremities of the hand: the hand's top-most, bottom-most, right-most, left-most, center, and closest (to the sensor) points.
• Full-hand mode – returns the full 3D skeleton of the hand, including all 22 joints, finger information, gestures, and more.
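To make the extremities mode concrete, the sketch below computes the six extremity points described above from a set of 3D hand points. The coordinate conventions (x increasing rightward, y increasing upward, z as distance from the sensor) are assumptions for illustration, not the SDK's definitions.

```python
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z); z = distance from sensor (assumed)


def extremities(points: List[Point3D]) -> Dict[str, Point3D]:
    """Return the six extremity points of a 3D point set.

    Illustrative computation only; the SDK derives these points
    internally from the segmented hand.
    """
    n = len(points)
    center = (
        sum(p[0] for p in points) / n,
        sum(p[1] for p in points) / n,
        sum(p[2] for p in points) / n,
    )
    return {
        "left":    min(points, key=lambda p: p[0]),   # left-most (smallest x)
        "right":   max(points, key=lambda p: p[0]),   # right-most (largest x)
        "bottom":  min(points, key=lambda p: p[1]),   # bottom-most (smallest y)
        "top":     max(points, key=lambda p: p[1]),   # top-most (largest y)
        "closest": min(points, key=lambda p: p[2]),   # smallest sensor distance
        "center":  center,                            # centroid of the points
    }


# Example: four points standing in for a segmented hand silhouette.
hand = [(0.0, 0.0, 1.0), (2.0, 0.0, 2.0), (1.0, 3.0, 0.5), (1.0, -1.0, 4.0)]
ext = extremities(hand)
```

Extremities mode is a good middle ground when you need the hand's rough pose and reach but not the full 22-joint skeleton.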
While the hand module is the best choice when hand-specific features are required, it is not the only option for incorporating hand tracking into your application. If your application only needs to track a moving object, not necessarily a hand, you may prefer the blob module (see Blob Tracking). If you specifically require hand identification and more detailed information, such as hand side, joint positions, and gesture recognition, you will need the hand module.