Published August 1999
Book Section - Chapter

Creating generative models from range images


We describe a new approach for creating concise high-level generative models from range images or other approximate representations of real objects. Using data from a variety of acquisition techniques and a user-defined class of models, our method produces a compact object representation that is intuitive and easy to edit. The algorithm has two inter-related phases: recognition, which chooses an appropriate model within a user-specified hierarchy, and parameter estimation, which adjusts the model to best fit the data. Since the approach is model-based, it is relatively insensitive to noise and missing data. We describe practical heuristics for automatically making tradeoffs between simplicity and accuracy to select the best model in a given hierarchy. We also describe a general and efficient technique for optimizing a model by refining its constituent curves. We demonstrate our approach for model recovery using both real and synthetic data and several generative model hierarchies.
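The abstract describes heuristics that trade off simplicity against accuracy when choosing a model from a hierarchy. As a rough, hypothetical illustration of that idea (not the paper's actual method, which operates on user-defined generative-model hierarchies fit to range data), one can score each candidate model by its fitting residual plus a penalty per parameter, so a richer model is selected only when its extra degrees of freedom are justified by the data:

```python
# Hypothetical sketch of a simplicity-vs-accuracy model-selection
# tradeoff, using polynomial degree as a stand-in for position in a
# model hierarchy. All names and the penalty weight are illustrative.
import numpy as np

def fit_and_score(points, degree, penalty=0.05):
    """Fit a polynomial of the given degree to 2D profile points and
    return (score, coefficients), where score = RMS residual plus a
    complexity penalty per parameter."""
    x, y = points[:, 0], points[:, 1]
    coeffs = np.polyfit(x, y, degree)
    residual = np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))
    return residual + penalty * (degree + 1), coeffs

def select_model(points, max_degree=5):
    """Pick the degree minimizing the penalized score, mimicking a
    search over increasingly complex models in a hierarchy."""
    scored = [(fit_and_score(points, d)[0], d) for d in range(max_degree + 1)]
    return min(scored)[1]

# Synthetic noisy samples from a quadratic profile curve.
rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 100)
ys = 0.5 * xs**2 - 0.2 * xs + 0.1 + rng.normal(0.0, 0.01, xs.size)
pts = np.column_stack([xs, ys])
print(select_model(pts))  # a degree near 2 should win
```

Because the penalty grows with parameter count while the residual stops improving once the model matches the data's true structure, the quadratic is chosen over both under- and over-fitting alternatives; the paper's heuristics apply the same principle to full generative models rather than polynomial degrees.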

Additional Information

© 1999 ACM. Special thanks to Jean-Yves Bouguet for reviewing early drafts, and for help with data acquisition. Preliminary discussions with Al Barr were of immense help. We are also grateful to the anonymous SIGGRAPH reviewers (especially #2) and committee for their helpful comments, and to members of the graphics groups at Caltech and Stanford for their support. This work was supported by the NSF Science and Technology Center for Computer Graphics and Scientific Visualization (ASC-8920219), an Army Research Office Young Investigator award (DAAH04-96-100077), the Alfred P. Sloan Foundation, and a Reed-Hodgson Stanford Graduate Fellowship. All opinions, findings, conclusions, or recommendations expressed here are those of the authors only and do not necessarily reflect the views of the sponsoring agencies and individuals.
