An Evaluation Approach for Scene Flow with Decoupled Motion and Position
Andreas Wedel, Tobi Vaudrey, Annemarie Meissner, Clemens Rabe, Thomas Brox, Uwe Franke, Daniel Cremers
This chapter presents a technique for estimating the three-dimensional displacement vector field that describes the motion of each visible scene point. This displacement field consists of the change in image position, also known as optical flow, and the change in disparity; together, these are called the scene flow. The technique uses two consecutive image pairs from a stereo sequence. The main contribution is to decouple the estimation of the disparity (3D position) from that of the scene flow (optical flow and disparity change). We thus present a technique that estimates a dense scene flow field with a variational approach, using the gray-value images and a given stereo disparity map as input.
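As a concrete illustration of the quantities involved, the sketch below converts an image-space scene flow vector (u, v, Δd) at a pixel with disparity d into a 3D displacement via standard pinhole stereo triangulation. The camera parameters (focal length `f`, baseline `b`, principal point `cx`, `cy`) and function names are illustrative assumptions, not values or code from the chapter.

```python
import numpy as np

def triangulate(x, y, d, f, b, cx, cy):
    """Reconstruct a 3D point from pixel (x, y) and disparity d
    under a rectified pinhole stereo model: depth Z = f * b / d."""
    Z = f * b / d
    X = (x - cx) * Z / f
    Y = (y - cy) * Z / f
    return np.array([X, Y, Z])

def scene_flow_to_world(x, y, d, u, v, dd,
                        f=1000.0, b=0.3, cx=320.0, cy=240.0):
    """3D displacement of a scene point whose image position moves by
    (u, v) and whose disparity changes by dd between two time instances."""
    p0 = triangulate(x, y, d, f, b, cx, cy)            # point at time t
    p1 = triangulate(x + u, y + v, d + dd, f, b, cx, cy)  # point at time t+1
    return p1 - p0
```

With the assumed parameters, a point at the principal point with disparity 10 has depth 30 m; doubling its disparity (dd = 10) halves the depth, giving a displacement of 15 m toward the camera.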
Although the two subproblems, disparity and scene flow estimation, are decoupled, we enforce the scene flow to yield consistent displacement vectors across all four stereo images involved at the two time instances. The decoupling strategy has two benefits: first, the disparity estimation technique can be chosen independently and may yield either sparse or dense correspondences; second, frame rates of 5 fps can be achieved on standard consumer hardware. The approach is then extended to real-world displacements, and two metrics are presented that define likelihoods of movement with respect to the background. Furthermore, an evaluation approach is presented for comparing scene flow algorithms on long image sequences, using synthetic data as ground truth.
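As a minimal sketch of how a comparison against synthetic ground truth might look, the function below computes a root-mean-square endpoint error between an estimated and a ground-truth scene flow field. The chapter's actual evaluation metrics are not detailed in this abstract, so this is only one plausible measure, with an assumed field layout of shape (H, W, 3) holding (u, v, disparity change) per pixel.

```python
import numpy as np

def rms_scene_flow_error(est, gt, valid=None):
    """RMS endpoint error between estimated and ground-truth scene flow
    fields of shape (H, W, 3). `valid` is an optional boolean mask
    selecting pixels with known ground truth (e.g. non-occluded pixels)."""
    diff = est - gt
    err = np.sqrt(np.sum(diff ** 2, axis=-1))  # per-pixel Euclidean error
    if valid is not None:
        err = err[valid]
    return float(np.sqrt(np.mean(err ** 2)))
```

Averaging such a measure over every frame of a long synthetic sequence would give one scalar per algorithm, allowing scene flow methods to be ranked on the same data.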