Please use this identifier to cite or link to this item:
http://dx.doi.org/10.25673/101943
Title: | 3D Scene Reconstruction with Neural Radiance Fields (NeRF) Considering Dynamic Illumination Conditions
Author(s): | Kolodiazhna, Olena; Savin, Volodymyr; Uss, Mykhailo; Kussul, Nataliia
Granting Institution: | Hochschule Anhalt |
Issue Date: | 2023 |
Extent: | 1 online resource (6 pages)
Language: | English |
Abstract: | This paper addresses the problem of novel view synthesis with Neural Radiance Fields (NeRF) for scenes with dynamic illumination. NeRF training uses a photometric consistency loss, i.e. pixel-wise consistency between a set of scene images and the intensity values rendered by NeRF. For reflective surfaces, image intensity depends on the viewing angle; this effect is accounted for by feeding the ray direction to NeRF as an input. For scenes with dynamic illumination, image intensity depends not only on position and viewing direction but also on time. We show that this factor disrupts NeRF training with the standard photometric loss, degrading the quality of both image and depth rendering. To cope with this problem, we propose adding time as an additional NeRF input. Experiments on the ScanNet dataset demonstrate that NeRF with the modified input outperforms the original model and renders more consistent 3D structures. The results of this study could be used to improve training-data augmentation for depth prediction models (e.g. depth-from-stereo models) on scenes with non-static illumination.
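The abstract's core idea, conditioning NeRF on time in addition to position and viewing direction, can be sketched as follows. This is a minimal illustrative example in PyTorch, not the authors' implementation: the module structure, layer sizes, and encoding frequencies are all assumptions made for the sketch.

```python
# Illustrative sketch (not the paper's code): a NeRF-style MLP that takes
# time t as an extra input alongside position x and view direction d, so
# dynamic illumination can be absorbed by the radiance (color) head.
import torch
import torch.nn as nn

def positional_encoding(x: torch.Tensor, n_freqs: int) -> torch.Tensor:
    """Standard NeRF frequency encoding: concat of x, sin(2^k x), cos(2^k x)."""
    feats = [x]
    for k in range(n_freqs):
        feats.append(torch.sin((2.0 ** k) * x))
        feats.append(torch.cos((2.0 ** k) * x))
    return torch.cat(feats, dim=-1)

class TimeConditionedNeRF(nn.Module):
    # Frequency counts are assumptions; 10/4 follow the original NeRF defaults.
    def __init__(self, pos_freqs: int = 10, dir_freqs: int = 4, time_freqs: int = 4):
        super().__init__()
        self.pos_freqs, self.dir_freqs, self.time_freqs = pos_freqs, dir_freqs, time_freqs
        pos_dim = 3 * (1 + 2 * pos_freqs)
        dir_dim = 3 * (1 + 2 * dir_freqs)
        time_dim = 1 * (1 + 2 * time_freqs)
        # Density depends on position only; color additionally sees the view
        # direction (reflectance) and the time input (dynamic illumination).
        self.trunk = nn.Sequential(nn.Linear(pos_dim, 256), nn.ReLU(),
                                   nn.Linear(256, 256), nn.ReLU())
        self.sigma_head = nn.Linear(256, 1)
        self.color_head = nn.Sequential(
            nn.Linear(256 + dir_dim + time_dim, 128), nn.ReLU(),
            nn.Linear(128, 3), nn.Sigmoid())

    def forward(self, x: torch.Tensor, d: torch.Tensor, t: torch.Tensor):
        # x: (N, 3) sample positions, d: (N, 3) ray directions, t: (N, 1) time.
        h = self.trunk(positional_encoding(x, self.pos_freqs))
        sigma = torch.relu(self.sigma_head(h))
        rgb = self.color_head(torch.cat(
            [h,
             positional_encoding(d, self.dir_freqs),
             positional_encoding(t, self.time_freqs)], dim=-1))
        return rgb, sigma

# Photometric consistency loss: pixel-wise MSE between colors rendered by
# volume rendering (loop omitted here) and the observed image pixels.
# loss = torch.mean((rendered_rgb - observed_rgb) ** 2)
```

In this sketch the time input only reaches the color head, so the recovered geometry (density) stays static while appearance is free to vary over time; that design choice matches the abstract's observation that illumination, not scene structure, changes between frames.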
URI: | https://opendata.uni-halle.de//handle/1981185920/103896 http://dx.doi.org/10.25673/101943 |
Open Access: | Open access publication |
License: | Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0)
Appears in Collections: | International Conference on Applied Innovations in IT (ICAIIT) |
Files in This Item:
File | Size | Format
---|---|---
4_6 ICAIIT_2023_paper_4186.pdf | 2.1 MB | Adobe PDF