Proceedings Paper

Deep Parametric Indoor Lighting Estimation

Publisher

IEEE
DOI: 10.1109/ICCV.2019.00727

Funding

  1. NSERC
  2. REPARTI Strategic Network
  3. NSERC Discovery Grant [RGPIN-2014-05314]
  4. MITACS
  5. Prompt-Québec
  6. E Machine Learning
  7. Adobe

Abstract

We present a method to estimate lighting from a single image of an indoor scene. Previous work has used an environment map representation that does not account for the localized nature of indoor lighting. Instead, we represent lighting as a set of discrete 3D lights with geometric and photometric parameters. We train a deep neural network to regress these parameters from a single image, on a dataset of environment maps annotated with depth. We propose a differentiable layer to convert these parameters to an environment map to compute our loss; this bypasses the challenge of establishing correspondences between estimated and ground truth lights. We demonstrate, via quantitative and qualitative evaluations, that our representation and training scheme lead to more accurate results compared to previous work, while allowing for more realistic 3D object compositing with spatially-varying lighting.
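The differentiable parameters-to-environment-map layer described above can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation: each light is assumed to carry a direction, an angular size, and an RGB color, and a Gaussian-like angular falloff (a hypothetical choice here) splats it onto an equirectangular map so a pixel-wise loss against a ground-truth map needs no light-to-light correspondences.

```python
import numpy as np

def render_env_map(lights, ambient, height=32, width=64):
    """Project parametric lights onto an equirectangular environment map.

    `lights` is a list of (direction, angular_size, rgb_color) tuples;
    `ambient` is a constant RGB term. The parameterization and falloff
    are illustrative assumptions, not the paper's exact formulation.
    """
    # Per-pixel unit direction vectors on the equirectangular grid.
    theta = (np.arange(height) + 0.5) / height * np.pi       # polar angle
    phi = (np.arange(width) + 0.5) / width * 2.0 * np.pi     # azimuth
    t, p = np.meshgrid(theta, phi, indexing="ij")
    dirs = np.stack([np.sin(t) * np.cos(p),
                     np.sin(t) * np.sin(p),
                     np.cos(t)], axis=-1)                    # (H, W, 3)

    env = np.tile(np.asarray(ambient, dtype=np.float64), (height, width, 1))
    for direction, size, color in lights:
        d = np.asarray(direction, dtype=np.float64)
        d /= np.linalg.norm(d)
        # Angular distance between each pixel's direction and the light.
        ang = np.arccos(np.clip(dirs @ d, -1.0, 1.0))
        # Smooth falloff: every operation here is differentiable, so the
        # same map could be built inside an autodiff framework and
        # compared to the ground truth with an ordinary pixel-wise loss.
        falloff = np.exp(-((ang / size) ** 2))[..., None]
        env = env + falloff * np.asarray(color, dtype=np.float64)
    return env

# One light overhead (+z), plus a dim ambient term.
env = render_env_map([((0.0, 0.0, 1.0), 0.2, (5.0, 5.0, 4.0))],
                     ambient=(0.1, 0.1, 0.1))
```

In a training loop, the same computation expressed in an autodiff framework would let gradients of the environment-map loss flow back into the predicted light parameters.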


