Neural radiance fields (NeRF) have recently achieved impressive results in novel view synthesis. However, previous works on NeRF mainly focus on object-centric scenarios. In this work, we propose 360Roam, a novel scene-level NeRF system that can synthesize images of large-scale indoor scenes in real time and support VR roaming. Our system first builds an omnidirectional neural radiance field, 360NeRF, from multiple input 360° images. Using 360NeRF, we then progressively estimate a 3D probabilistic occupancy map that represents the scene geometry in the form of spatial density. Skipping empty spaces and upsampling occupied voxels essentially allows us to accelerate volume rendering by using 360NeRF in a geometry-aware fashion. Furthermore, we use an adaptive divide-and-conquer strategy to slim and fine-tune the radiance fields for further improvement. The floorplan of the scene extracted from the occupancy map provides guidance for ray sampling and facilitates a realistic roaming experience. To show the efficacy of our system, we collect a 360° image dataset covering a large variety of scenes and conduct extensive experiments. Quantitative and qualitative comparisons against baselines illustrate our superior performance in novel view synthesis for complex indoor scenes.
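The geometry-aware acceleration described above can be illustrated with a minimal sketch. This is a hypothetical ray marcher, not the paper's implementation: it strides through voxels the occupancy map marks as empty using a coarse step, and samples occupied voxels at a finer step, which is the essence of empty-space skipping with occupancy upsampling.

```python
import numpy as np

def march_ray_with_occupancy(origin, direction, occupancy, voxel_size,
                             t_max, coarse_step, fine_factor=4):
    """Collect sample distances t along a ray, skipping empty voxels.

    occupancy : bool 3D grid (True = likely occupied), a stand-in for
                the probabilistic occupancy map thresholded at 0.5.
    Returns the list of distances at which the radiance field would
    actually be queried.
    """
    samples = []
    t = 0.0
    fine_step = coarse_step / fine_factor
    shape = np.array(occupancy.shape)
    while t < t_max:
        p = origin + t * direction
        idx = np.floor(p / voxel_size).astype(int)
        inside = np.all(idx >= 0) and np.all(idx < shape)
        if inside and occupancy[tuple(idx)]:
            samples.append(t)      # dense sampling in occupied space
            t += fine_step
        else:
            t += coarse_step       # skip empty space with a large stride
    return samples

# Example: a single occupied slab at x in [4, 5); the marcher only
# spends fine samples inside that slab.
occ = np.zeros((8, 8, 8), dtype=bool)
occ[4, :, :] = True
ts = march_ray_with_occupancy(np.array([0.0, 0.5, 0.5]),
                              np.array([1.0, 0.0, 0.0]),
                              occ, voxel_size=1.0,
                              t_max=8.0, coarse_step=1.0)
# ts → [4.0, 4.25, 4.5, 4.75]
```

In the actual system the samples concentrated in occupied voxels are the ones fed to 360NeRF, so the network is evaluated far fewer times per ray than with uniform sampling.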
Results
360Roam demos in different scenes.
Roaming comparisons
Dataset (available soon)
Bar
Base
Cafe
Canteen
Center
Corridor
Inno
Lab
Library
Office
Citation
@article{huang2022360roam,
title = {360Roam: Real-Time Indoor Roaming Using Geometry-Aware 360$^\circ$ Radiance Fields},
author = {Huang, Huajian and Chen, Yingshu and Zhang, Tianjia and Yeung, Sai-Kit},
journal = {arXiv preprint arXiv:2208.02705},
year = {2022}
}
Or you can cite the SIGGRAPH Asia 2022 Technical Communications version:
@inproceedings{huang2022tc360roam,
title = {Real-Time Omnidirectional Roaming in Large Scale Indoor Scenes},
author = {Huang, Huajian and Chen, Yingshu and Zhang, Tianjia and Yeung, Sai-Kit},
booktitle = {SIGGRAPH Asia 2022 Technical Communications},
url = {https://doi.org/10.1145/3550340.3564222},
year = {2022},
publisher = {Association for Computing Machinery}
}
Acknowledgements
This research project is partially supported by an internal grant from HKUST (R9429) and the Innovation and Technology Support Programme of the Innovation and Technology Fund (Ref: ITS/200/20FP).