Thesis type
(Thesis) M.Sc.
Date created
2021-07-27
Authors/Contributors
Author: Vidanapathirana, Madhawa
Abstract
We address the Plan2Scene task: converting a floorplan and associated photos into a textured 3D mesh model of a residence. Our method 1) lifts a floorplan image to a 3D mesh, 2) synthesizes textures for observed surfaces based on input photos, and 3) generates textures for unobserved surfaces using a graph neural network architecture. We address the challenge of producing tileable textures for all architectural surfaces (floors, walls, and ceilings) from a sparse set of photos that only partially cover a house. To train and evaluate our system, we curate two texture datasets and extend a dataset of floorplans + photos from prior work with rectified surface crops and additional annotations. Our system produces realistic 3D models that outperform baseline approaches, as identified by a holistic user study and quantified by a suite of texture quality metrics. We release all our code, data, and trained models to the community.
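The abstract describes a three-stage pipeline: floorplan-to-mesh lifting, texture synthesis for photographed surfaces, and graph-neural-network texture propagation to unobserved surfaces. As a rough, hedged illustration of how these stages compose, the minimal Python sketch below shows one possible structure. All class names, function names, and data fields here are placeholder assumptions for exposition; they are not the actual Plan2Scene API.

```python
"""Minimal sketch of a three-stage floorplan-to-textured-mesh pipeline,
loosely following the stages named in the abstract. Everything here is a
placeholder assumption, not the thesis's released code."""

from dataclasses import dataclass, field


@dataclass
class HouseMesh:
    """Untextured 3D mesh lifted from a floorplan image (placeholder)."""
    surfaces: list = field(default_factory=list)  # e.g. floors, walls, ceilings


def lift_floorplan_to_mesh(floorplan_image_path: str) -> HouseMesh:
    # Stage 1 (hypothetical): extrude rooms and walls from the floorplan
    # image into an untextured 3D mesh.
    return HouseMesh(surfaces=["floor_0", "wall_0", "ceiling_0"])


def synthesize_observed_textures(mesh: HouseMesh, photos: list) -> dict:
    # Stage 2 (hypothetical): for surfaces visible in the input photos,
    # synthesize tileable textures from rectified surface crops.
    return {s: f"texture_from_photos({s})" for s in mesh.surfaces[: len(photos)]}


def propagate_unobserved_textures(mesh: HouseMesh, observed: dict) -> dict:
    # Stage 3 (hypothetical): a graph neural network over the room/surface
    # graph predicts textures for surfaces not covered by any photo.
    textures = dict(observed)
    for s in mesh.surfaces:
        textures.setdefault(s, "texture_from_gnn")
    return textures


if __name__ == "__main__":
    mesh = lift_floorplan_to_mesh("floorplan.png")
    observed = synthesize_observed_textures(mesh, photos=["photo_0.jpg"])
    textured = propagate_unobserved_textures(mesh, observed)
    print(textured)  # every surface now carries a texture assignment
```

The point of the sketch is only the data flow: observed textures come from the photos, and the remaining surfaces are filled in afterwards so that the final mesh is fully textured.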
Document
Identifier
etd21480
Copyright statement
Copyright is held by the author(s).
Supervisor or Senior Supervisor
Thesis advisor: Savva, Manolis
Language
English
Member of collection
| Download file | Size |
|---|---|
| etd21480.pdf | 21.33 MB |