More recently, Microsoft Research's MirageTable was proposed in May 2012 by Benko et al. [5] at the CHI conference. This interactive system was designed to merge the physical and virtual worlds into a single spatially registered experience on top of a table. The setup consists of a single depth camera, a stereoscopic projector, and a curved screen. Color and depth images are captured in each frame, and the depth camera performs real-time capture of both the shape and appearance of any object placed in front of it. Because the user's eyes are occluded by the shutter glasses, the system tracks the location of the glasses in the depth image rather than the eyes themselves, and uses that position to compute the user's viewpoint. This enables perspective-correct stereoscopic 3D visualization for a single user. Figure 4 shows the experimental results. The user can interact with virtual objects through freehand actions, without handheld devices, which demonstrates the potential of a projector/depth-camera system to simulate such scenarios.
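The viewpoint-estimation step described above can be illustrated with a minimal sketch: given the pixel location of the tracked glasses in the depth image and its measured depth, the standard pinhole camera model recovers a 3D viewpoint in camera space. The intrinsic parameters and pixel coordinates below are illustrative assumptions, not values from the MirageTable paper.

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a depth-image pixel (u, v) with depth in metres
    into a 3D point in camera space via the pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical example: glasses detected at pixel (320, 200) at 0.8 m,
# with Kinect-like intrinsics (assumed values for illustration only).
viewpoint = deproject(320, 200, 0.8, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```

The resulting 3D point would then serve as the eye position for rendering the perspective-correct stereoscopic views.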