Use of Motion from Orthographic Projections to Prune 3-D Point Matches.

01 January 1989


A new method for pruning 3-D point matches is presented in this paper. The method is specially designed to handle cases where the point data are contaminated by large depth errors. It discards the depth data and uses only the x- and y-data of the points (equivalent to orthographically projecting the points onto the x-y plane) to compute the motion parameters. Experimental results show that this new method has much stronger pruning power than traditional techniques that use the complete 3-D data to compute motion. On the theoretical side, it is shown that the determination of rotation and translation can be decoupled, and that the least-squares error of a match is controlled only by the rotation parameters. In the two-view motion case, the trajectory of the rotation parameters is a great circle lying at the intersection of a plane with the unit 4-D sphere. This geometrical property provides a new way to identify consistent pairings.
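The intuition behind discarding depth can be illustrated with a small numerical sketch. This is not the paper's algorithm; it is a hypothetical comparison, assuming a rotation about the optical (z) axis, that fits a least-squares rotation once from the full 3-D data (standard Kabsch/Procrustes) and once from only the orthographically projected x-y data, then compares the match residuals when the depth coordinate carries large errors:

```python
import numpy as np

def best_rotation(P, Q):
    """Least-squares (Kabsch/Procrustes) rotation R with Q ~= P @ R.T.

    P, Q are (n, d) arrays of centered, corresponding points.
    """
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)
    # Correct an improper (reflection) solution if one arises.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [float(d)])
    return Vt.T @ D @ U.T

rng = np.random.default_rng(0)
P = rng.normal(size=(20, 3))

# Hypothetical ground-truth motion: a rotation about the z-axis.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
Q = P @ R_true.T

# Corrupt only the depth (z) coordinate with large errors.
Q_noisy = Q.copy()
Q_noisy[:, 2] += rng.normal(scale=5.0, size=len(Q))

# Residual using the full 3-D data: depth noise dominates the fit.
Pc, Qc = P - P.mean(0), Q_noisy - Q_noisy.mean(0)
R3 = best_rotation(Pc, Qc)
res_3d = np.linalg.norm(Pc @ R3.T - Qc)

# Residual using only x-y data (orthographic projection): depth
# errors are thrown out, so a correct match fits almost exactly.
P2, Q2 = P[:, :2] - P[:, :2].mean(0), Q_noisy[:, :2] - Q_noisy[:, :2].mean(0)
R2 = best_rotation(P2, Q2)
res_xy = np.linalg.norm(P2 @ R2.T - Q2)

print(f"3-D residual: {res_3d:.3f}, x-y residual: {res_xy:.3f}")
```

Under these assumptions a correct match yields a near-zero x-y residual while the 3-D residual is inflated by the depth noise, which is why a threshold on the projected residual can prune false matches more sharply.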