Abstract
The computational capability of mobile phones has been rapidly increasing, to the point where augmented reality has become feasible on cell phones. We present an approach to indoor localization and pose estimation in order to support augmented reality applications on a mobile phone platform. Using the embedded camera, the application localizes the device in a familiar environment and determines its orientation. Once the 6 DOF pose is determined, 3D virtual objects from a database can be projected into the image and displayed for the mobile user. Off-line data acquisition consists of acquiring images at different locations in the environment. The online pose estimation is done by a feature-based matching between the cell phone image and an image selected from the precomputed database using the phone's sensors (accelerometer and magnetometer). The application enables the user both to visualize virtual objects in the camera image and to localize themselves in a familiar environment. We describe in detail the process of building the database and the pose estimation algorithm used on the mobile phone. We evaluate the algorithm performance as well as its accuracy in terms of reprojection distance of the 3D virtual objects in the cell phone image.
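To make the final step of the pipeline concrete, the following is a minimal sketch of how 3D virtual object points could be projected into the phone image once the 6 DOF pose is available. The intrinsic matrix K, the rotation R, the translation t, and all numeric values are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project 3D world points into the image using an estimated 6 DOF pose.

    points_3d : (N, 3) array of virtual-object vertices in world coordinates
    K         : (3, 3) camera intrinsic matrix (hypothetical calibration)
    R, t      : rotation (3, 3) and translation (3,) of the camera pose
    """
    # Transform world points into the camera frame: X_cam = R @ X_world + t
    cam = points_3d @ R.T + t
    # Apply the pinhole projection and normalize by depth
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Hypothetical usage: a unit cube placed 5 m in front of the camera
if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R = np.eye(3)                      # identity rotation, for illustration only
    t = np.array([0.0, 0.0, 5.0])      # 5 m along the optical axis
    cube = np.array([[x, y, z] for x in (0, 1)
                     for y in (0, 1) for z in (0, 1)], dtype=float)
    print(project_points(cube, K, R, t))
```

The reprojection distance used in the evaluation can then be read as the pixel-space difference between such projected points under the estimated pose and under a ground-truth pose.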