Abstract
This paper proposes a real-time image-processing approach and a virtual dressing room application that enable users to try on virtual garments and shoes in front of a virtual mirror. A virtual representation of the user appears in a virtual changing room, and the user’s hand motions select clothes from an on-screen list. The selected virtual clothing then appears on a humanoid model in the virtual mirror. To align the 3D garments and shoes with the model, the 3D locations of the skeletal joints are used for positioning, scaling, and rotation. Skin-colour detection is then applied to the video stream to handle unwanted occlusions between the user and the model. To create a more realistic effect, the system uses different images of the clothes for different human poses and movements. Optional mirror-selection buttons make it possible to view the model from multiple angles. Additionally, we developed an algorithm that keeps the motions of the virtual clothes synchronized with those of the model. In this study, we use the Microsoft Kinect SDK to track the user’s movements, coordinate the try-on of the selected clothes, and provide a depth-sorting effect between the human body and the clothes.
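The core alignment step described above, placing, scaling, and rotating a garment from tracked joint locations, can be illustrated with a minimal sketch. The function below is hypothetical (the paper does not give its exact formulas): it assumes two shoulder-joint positions projected to image coordinates and derives an anchor point from their midpoint, a scale factor from the shoulder span, and a roll angle from the shoulder line.

```python
import math

def align_garment(l_shoulder, r_shoulder, garment_width):
    """Hypothetical alignment sketch: derive position, scale, and roll
    angle for a garment overlay from two tracked shoulder joints
    given in image coordinates (x, y)."""
    lx, ly = l_shoulder
    rx, ry = r_shoulder
    # Anchor the garment at the midpoint between the shoulders.
    anchor = ((lx + rx) / 2.0, (ly + ry) / 2.0)
    # Scale so the garment's native width matches the shoulder span.
    span = math.hypot(rx - lx, ry - ly)
    scale = span / garment_width
    # Roll angle of the shoulder line, in degrees.
    angle = math.degrees(math.atan2(ry - ly, rx - lx))
    return anchor, scale, angle
```

In a full system these parameters would be recomputed every frame from the Kinect skeleton stream, so the overlay follows the user's movements.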