Published November 1994
Book Section - Chapter (Open Access)

Three dimensional transparent structure segmentation and multiple 3D motion estimation from monocular perspective image sequences


A three dimensional scene can be segmented using different cues, such as boundaries, texture, motion, discontinuities of the optical flow, stereo, models for structure, etc. We investigate segmentation based upon one of these cues, namely three dimensional motion. If the scene contains transparent objects, the two dimensional (local) cues are inconsistent, since neighboring points with similar optical flow can correspond to different objects. We present a method for performing three dimensional motion-based segmentation of (possibly) transparent scenes, together with recursive estimation of the motion of each independent rigid object, from monocular perspective images. Our algorithm is based on a recently proposed method for rigid motion reconstruction and on a validation test that allows us to initialize the scheme and detect outliers during the motion estimation procedure. The scheme is tested on challenging real and synthetic image sequences. Segmentation is performed for Ullman's experiment of two transparent cylinders rotating about the same axis in opposite directions.
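The hypothesize-and-validate idea behind motion-based segmentation of transparent scenes can be illustrated in a heavily simplified form. The sketch below is not the authors' recursive estimator for monocular perspective sequences; it is a minimal 2D toy, assuming point correspondences between two frames and pure rotations about a known origin, which greedily extracts rigid-motion groups: hypothesize a motion from one correspondence, validate it against all remaining points, keep the largest consensus set, and repeat on the outliers. This mirrors how spatially interleaved points (as in the two counter-rotating cylinders) can be separated by motion alone, since neighboring points belonging to different objects fail each other's validation test.

```python
import math
import random

def rotate(p, theta):
    """Rotate point p = (x, y) about the origin by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

def fit_rotation(p0, p1):
    """Angle of the rotation about the origin mapping p0 toward p1,
    normalized to (-pi, pi]."""
    d = math.atan2(p1[1], p1[0]) - math.atan2(p0[1], p0[0])
    return (d + math.pi) % (2 * math.pi) - math.pi

def residual(p0, p1, theta):
    """Distance between the predicted and observed second-frame point."""
    q = rotate(p0, theta)
    return math.hypot(q[0] - p1[0], q[1] - p1[1])

def sequential_ransac(pairs, tol=1e-3, min_inliers=3, trials=50, seed=0):
    """Greedy multi-motion segmentation: each round hypothesizes a
    rotation from a random correspondence, validates it against all
    unassigned pairs, and removes the largest consensus set.
    Returns a list of (theta, inlier_pairs) groups."""
    rng = random.Random(seed)
    remaining = list(pairs)
    groups = []
    while len(remaining) >= min_inliers:
        best, best_theta = [], 0.0
        for _ in range(trials):
            p0, p1 = rng.choice(remaining)       # motion hypothesis
            theta = fit_rotation(p0, p1)
            inliers = [(a, b) for a, b in remaining
                       if residual(a, b, theta) < tol]  # validation test
            if len(inliers) > len(best):
                best, best_theta = inliers, theta
        if len(best) < min_inliers:
            break                                # rest are outliers
        groups.append((best_theta, best))
        remaining = [x for x in remaining if x not in best]
    return groups

# Toy version of the counter-rotating experiment: two interleaved point
# sets rotating by +0.3 and -0.3 radians between frames.
rng = random.Random(1)
pairs = []
for theta in (0.3, -0.3):
    for _ in range(10):
        p = (rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0))
        pairs.append((p, rotate(p, theta)))

groups = sequential_ransac(pairs)
```

On this noise-free toy data the two motions are recovered exactly; with real correspondences the tolerance `tol` and the per-group validation step are what reject outliers and points belonging to the other object.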

Additional Information

© 1994 IEEE. Date of Current Version: 06 August 2002. Research funded by the California Institute of Technology, a scholarship from the University of Padova, a fellowship from the "A. Gini" Foundation, an AT&T Foundation Special Purpose grant, ONR grant N0014-93-1-0990, grant ASI-RS-103 from the Italian Space Agency and the NSF National Young Investigator Award (P.P.). This work is registered as CDS Technical Report CIT-CDS 93-022. California Institute of Technology, 1993.

Attached Files

Published - SOAwmnaro94.pdf



Additional details

August 20, 2023