Structured Indoor Modeling

Satoshi Ikehata   Hang Yan   Yasutaka Furukawa 

Washington University in St. Louis   

ICCV 2015, Santiago, Chile

Abstract

This paper presents a novel 3D modeling framework that reconstructs an indoor scene as a structured model from panorama RGBD images. The scene geometry is represented as a graph, where nodes correspond to structural elements such as rooms, walls, and objects. The approach devises a structure grammar that defines how a scene graph can be manipulated. The grammar then drives a principled new reconstruction algorithm, in which the grammar rules are sequentially applied to recover a structured model. The paper also proposes a new room segmentation algorithm and an offset-map reconstruction algorithm that are used in the framework and enforce architectural shape priors far beyond the existing state of the art. The structured scene representation enables a variety of novel applications, ranging from indoor scene visualization and automated floorplan generation to inverse CAD and more. We have tested our framework and algorithms on six synthetic and five real datasets with qualitative and quantitative evaluations.
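As a rough illustration of the structured representation described above, the sketch below builds a tiny scene graph by "applying" rules that attach child elements (a wall to a room, a room to the scene) to a parent node. This is a minimal MATLAB sketch under our own assumptions, not the released code; the field names and rules are hypothetical.

    % Minimal sketch of a scene graph (hypothetical field names, not the
    % released implementation): nodes are structural elements, and grammar
    % rules attach child elements to a parent node.
    scene = struct('type', 'scene', 'children', {{}});
    room  = struct('type', 'room',  'children', {{}});
    wall  = struct('type', 'wall',  'children', {{}});

    % "Apply" two rules: room -> room + wall, then scene -> scene + room.
    room.children{end+1}  = wall;
    scene.children{end+1} = room;

    fprintf('scene has %d room(s); room 1 has %d wall(s)\n', ...
            numel(scene.children), numel(scene.children{1}.children));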

 

 

Paper and Presentation Materials

 

Floored Panorama RGB-D Dataset

 

Datasets: Apartment1 (142MB), Apartment2 (132MB), Apartment3 (123MB), Office1 (289MB), Office2 (293MB)

Data format: readme.txt


Source Code

 

- MATLAB and MEX: 0.9.1 (currently, detail and object reconstruction are not included)


- 2016/01/19: 0.9.1 released; small bug fix for Unix compatibility

 

Acknowledgements

This research is supported by the National Science Foundation under grant IIS-1540012. We thank Floored Inc. for providing the dataset and for their support, and Eric Turner and Claudio Mura for running their code on our datasets for evaluation.