When a scene is captured at different times, 3D elements are mapped into corresponding pixels in the images. Thus, if image features are not occluded, they can be related to each other and motion can be characterized as a collection of displacements in the image plane. The displacement corresponds to the projected movement of the objects in the scene and it is referred to as the optical flow. If you were to take an image and its optical flow, you should be able to construct the next frame in the image sequence. Optical flow is therefore akin to a measurement of velocity: the movement in pixels per unit of time or, more simply, pixels per frame. Optical flow can be found by looking for corresponding features in images. We can consider alternative features such as points, pixels, curves or complex descriptions of objects.

The problem of finding correspondences in images has motivated the development of many techniques that can be distinguished by the features used, the constraints imposed and the optimization or searching strategy (Dhond and Aggarwal, 1989). When the features are pixels, the correspondence can be found by observing the similarities between intensities in image regions (local neighbourhoods). This approach is known as area-based matching and it is one of the most common techniques used in computer vision (Barnard and Fischler, 1987). In general, pixels in non-occluded regions can be related to each other by means of a general transformation of the form