Orientation estimation and pedestrian tracking on mobile devices have been studied extensively in recent years. Despite sensor filtering, sensor fusion, and even extended Kalman filters, orientation estimation remains problematic. Although MEMS-based gyroscopes are relatively cheap, many phones still lack them; to address this, we present an optical-flow-based solution. In this paper we demonstrate the theory and feasibility of an optical-flow-based virtual gyroscope on an Android handset. We examined two feature-detection methods, measured tracking accuracy, and fused the results with readings from the regular inertial sensors. In this way, we have not only substituted a virtual gyroscope for a real one, but also created a fused gyroscope sensor. Test recordings were made using a robotic arm to ensure reliable, repeatable results.
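To illustrate the idea behind a virtual gyroscope, the following is a minimal sketch, not the paper's actual implementation: under a small-angle, pure-rotation assumption, the horizontal pixel displacement of tracked features between two frames is approximately u ≈ f·ω·Δt (with f the focal length in pixels), so the yaw rate can be recovered from the mean flow and then blended with an inertial rate estimate via a simple complementary filter. All function names, the weighting constant `alpha`, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def virtual_gyro_rate(dx, focal_px, dt):
    """Estimate yaw rate (rad/s) from horizontal optical-flow displacements.

    Small-angle model for pure rotation about the vertical axis:
    pixel shift u ~= f * omega * dt, hence omega ~= mean(u) / (f * dt).
    """
    return float(np.mean(dx)) / (focal_px * dt)

def complementary_fuse(w_virtual, w_inertial, alpha=0.7):
    """Blend the optical-flow rate with an inertial rate estimate.

    alpha is an illustrative weight favoring the inertial sensor.
    """
    return alpha * w_inertial + (1.0 - alpha) * w_virtual

# Synthetic check: true rate 0.5 rad/s, f = 500 px, 30 fps camera.
f_px, dt, w_true = 500.0, 1.0 / 30.0, 0.5
rng = np.random.default_rng(0)
# Noisy per-feature horizontal flows for 50 tracked features.
dx = f_px * w_true * dt + rng.normal(0.0, 0.2, size=50)

w_virtual = virtual_gyro_rate(dx, f_px, dt)
w_fused = complementary_fuse(w_virtual, w_inertial=0.48)
```

In a real pipeline the per-feature displacements `dx` would come from a feature tracker (e.g. pyramidal Lucas-Kanade optical flow) rather than synthetic data, and the fusion weight would be tuned against the noise characteristics of both sources.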