Errors in Movement
How image movement is estimated

It is believed that the human visual system computes a representation of image motion by comparing sequential images and estimating how patterns move between them. This representation is referred to as optical flow. The optical flow at a point is the movement of that point from the first image to the second image. It can only be computed where there is detail, or edges, in the image, and estimating it requires two computational stages.
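
To make this concrete, here is a minimal Python sketch (not part of the original article; the image sizes and shift are made up) with two synthetic frames in which a bright square shifts by a few pixels. The frames differ only near the square's edges, which is why motion can only be measured where there is detail.

```python
import numpy as np

# Two synthetic frames: a bright square that shifts by 2 pixels right
# and 1 pixel down between frame1 and frame2 (hypothetical values).
frame1 = np.zeros((32, 32))
frame1[10:20, 10:20] = 1.0
frame2 = np.zeros((32, 32))
frame2[11:21, 12:22] = 1.0   # same square, moved 2 right and 1 down

# The optical flow of a point on the square is the vector (2, 1) that carries
# it from frame1 to frame2.  In the uniform interior of the square and in the
# flat background the two frames look locally identical, so the motion is only
# observable where the intensity changes, i.e. at the edges.
diff = np.abs(frame2 - frame1)
print("pixels where motion is observable:", np.count_nonzero(diff))
```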

First, using only local information, we compute how image features move. The situation is illustrated in the following figure. If we view the image through a small aperture, we cannot tell exactly where a point (xi, yi) on an edge moves to. We can only compute the component of the motion vector perpendicular to the edge; we call it the normal flow. In other words, local information only tells us the line - we call it the constraint line - on which the optical flow vector lies.

Figure 1: The Aperture Problem
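
As a hedged illustration of the aperture problem, the following Python sketch (the edge normal n and normal flow magnitude b are hypothetical values) shows that every flow vector u satisfying n · u = b is consistent with what is seen through the aperture; the motion component along the edge is unobservable.

```python
import numpy as np

# Edge normal and measured normal flow magnitude seen through the aperture
# (hypothetical values: a vertical edge moving 2 pixels to the right).
n = np.array([1.0, 0.0])
b = 2.0

def on_constraint_line(u, n=n, b=b, tol=1e-9):
    """True if the candidate flow u is consistent with the local measurement."""
    return abs(np.dot(n, u) - b) < tol

# Both of these flows produce exactly the same normal flow through the aperture,
# so local information alone cannot distinguish them.
print(on_constraint_line(np.array([2.0, 0.0])))   # True
print(on_constraint_line(np.array([2.0, 5.0])))   # True
```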

To estimate the movement of points, in a second stage we combine the motion components from differently oriented edges within a small patch. Assuming that the optical flow is constant within the patch, this amounts to intersecting the constraint lines. Since the normal flow vectors are noisy, we estimate the optical flow vector that is closest to all the constraint lines, as illustrated below.

Figure 2: Estimation of Optical Flow as the Intersection of the Constraint Lines
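
In the noise-free case the combination step is simply the intersection of two constraint lines. A small sketch, under the same assumptions as above (unit edge normals n_i and normal flow magnitudes b_i, with made-up values):

```python
import numpy as np

# Two differently oriented edges in the patch, each contributing a constraint
# line n_i . u = b_i.  Without noise the lines meet in a single point, which is
# the optical flow of the patch.
n1, b1 = np.array([1.0, 0.0]), 2.0   # vertical edge: u_x = 2
n2, b2 = np.array([0.0, 1.0]), 1.0   # horizontal edge: u_y = 1

N = np.vstack([n1, n2])
b = np.array([b1, b2])
u = np.linalg.solve(N, b)            # exact intersection of the two lines
print(u)                             # [2. 1.]
```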

Without knowing the noise, the best we can do is to define "closest" as minimizing the sum of squared distances from the lines. Algebraically, this amounts to solving an over-determined system of equations using least squares. Its solution is biased; that is, the estimated optical flow does not, in general, correspond to the actual flow. The error depends on the features in the patch, that is, on the texture.
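
For illustration, a minimal least-squares version of this computation; the edge orientations, noise level, and true flow below are made up, and the noise model is simplified (noise only in the normal flow magnitudes), so the sketch shows the least-squares solve itself rather than the bias discussed above.

```python
import numpy as np

# Each edge i in the patch contributes a constraint n_i . u = b_i.  With noisy
# normal flows the lines no longer meet in one point, so we take the flow u
# minimizing the sum of squared distances to the lines, i.e. the least-squares
# solution of the over-determined system N u = b.
rng = np.random.default_rng(0)
true_u = np.array([2.0, 1.0])

angles = rng.uniform(0, np.pi, size=20)           # edge normal orientations
N = np.column_stack([np.cos(angles), np.sin(angles)])
b = N @ true_u + rng.normal(scale=0.2, size=20)   # noisy normal flow magnitudes

u_hat, *_ = np.linalg.lstsq(N, b, rcond=None)
print("estimated flow:", u_hat, " true flow:", true_u)
```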