A Bayesian framework for simultaneous matting and 3D reconstruction
Conventional approaches to 3D scene reconstruction often treat matting and reconstruction as two separate problems, with matting a prerequisite to reconstruction. The problem with such an approach is that it requires making irreversible decisions at the first stage, which may translate into reconstruction errors at the second stage. In this paper, we propose an approach which attempts to solve both problems jointly, thereby avoiding this limitation. A general Bayesian formulation for estimating opacity and depth with respect to a reference camera is developed. In addition, it is demonstrated that in the special case of binary opacity values (background/foreground) and discrete depth values, a global solution can be obtained via a single graph-cut computation. We demonstrate the application of the method to novel view synthesis in the case of a large-scale outdoor scene. An experimental comparison with a two-stage approach based on chroma-keying and shape-from-silhouette illustrates the advantages of the new method.

This document was originally published in Proc. of the 6th International Conference on 3D Digital Imaging and Modeling (3DIM'07), August 21-23 2007, Montreal, Quebec, Canada.
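The binary special case mentioned in the abstract can be illustrated with a toy sketch. This is not the paper's implementation: it labels a single 1-D row of pixels as foreground or background by building the standard s-t graph (t-link capacities from hypothetical per-pixel data costs, n-link capacities from a pairwise smoothness term) and computing one min cut with a small Edmonds-Karp max-flow solver. All costs and function names here are illustrative assumptions.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow; returns total flow and the source-side min-cut set."""
    n = len(cap)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        parent = [-1] * n          # BFS for an augmenting path in the residual graph
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break                  # no augmenting path left: flow is maximal
        bottleneck, v = float("inf"), t
        while v != s:
            bottleneck = min(bottleneck, cap[parent[v]][v] - flow[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:              # push the bottleneck flow along the path
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck
    reach, q = {s}, deque([s])     # nodes still reachable from s: source side of the cut
    while q:
        u = q.popleft()
        for v in range(n):
            if v not in reach and cap[u][v] - flow[u][v] > 0:
                reach.add(v)
                q.append(v)
    return total, reach

def label_opacity(fg_cost, bg_cost, smoothness):
    """Globally optimal binary foreground/background labels for a 1-D pixel row."""
    n = len(fg_cost)
    s, t = n, n + 1
    cap = [[0] * (n + 2) for _ in range(n + 2)]
    for i in range(n):
        cap[s][i] = bg_cost[i]     # paid if pixel i ends up background (sink side)
        cap[i][t] = fg_cost[i]     # paid if pixel i ends up foreground (source side)
        if i + 1 < n:              # pairwise smoothness between neighbouring pixels
            cap[i][i + 1] = cap[i + 1][i] = smoothness
    energy, reach = max_flow(cap, s, t)
    labels = [1 if i in reach else 0 for i in range(n)]  # 1 = foreground
    return labels, energy

labels, energy = label_opacity([1, 1, 9, 9], [9, 9, 1, 1], 2)
print(labels, energy)  # [1, 1, 0, 0] 6
```

The min-cut value equals the labeling energy: here the data costs of the chosen labels (1+1+1+1) plus one smoothness penalty (2) at the single foreground/background boundary, which is why a single cut yields the global optimum in this binary setting.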
White Paper copyright
© BBC. All rights reserved. Except as provided below, no part of a White Paper may be reproduced in any material form (including photocopying or storing it in any medium by electronic means) without the prior written permission of BBC Research except in accordance with the provisions of the (UK) Copyright, Designs and Patents Act 1988.
The BBC grants permission to individuals and organisations to make copies of any White Paper as a complete document (including the copyright notice) for their own internal use. No copies may be published, distributed or made available to third parties whether by paper, electronic or other means without the BBC's prior written permission.