All-Weather Deep Outdoor Lighting Estimation

We present a neural network that predicts HDR outdoor illumination from a single LDR image. At the heart of our work is a method to accurately learn HDR lighting from LDR panoramas under any weather condition. We achieve this by training another CNN (on a combination of synthetic and real images) to take an LDR panorama as input and regress the parameters of the Lalonde-Matthews outdoor illumination model. This network is trained such that it (a) reconstructs the appearance of the sky, and (b) renders the appearance of objects lit by this illumination. We use this network to label a large-scale dataset of LDR panoramas with lighting parameters, and use these labels to train our single-image outdoor lighting estimation network. We demonstrate, via extensive experiments, that both our panorama and single-image networks outperform the state of the art and, unlike prior work, can handle weather conditions ranging from fully sunny to overcast skies.
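The abstract describes a network trained with two complementary terms: a sky-reconstruction loss and a render loss on objects lit by the predicted illumination. The toy sketch below illustrates only that weighted two-term objective on dummy per-pixel values; the helper names and the equal weighting are illustrative assumptions, not the authors' implementation.

```python
def mse(pred, target):
    """Mean squared error between two equal-length pixel sequences."""
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(target)

def combined_loss(pred_sky, target_sky, pred_render, target_render,
                  w_sky=0.5, w_render=0.5):
    """Total training loss: weighted sum of a sky-reconstruction term
    and a term comparing renders lit by the predicted illumination
    against renders lit by the ground-truth illumination."""
    return (w_sky * mse(pred_sky, target_sky)
            + w_render * mse(pred_render, target_render))
```

In the paper both terms are computed from the regressed Lalonde-Matthews parameters; here the predicted sky and render are simply passed in as toy arrays to keep the example self-contained.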


Paper

Jinsong Zhang, Kalyan Sunkavalli, Yannick Hold-Geoffroy, Sunil Hadap, Jonathan Eisenmann, and Jean-François Lalonde
All-Weather Deep Outdoor Lighting Estimation
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019.
[arXiv pre-print] [BibTeX]


Supplementary material

We provide additional results on this supplementary page.

Poster

Demo

Coming soon!

Data

The LM parameters for the SUN360 panoramas are available here.

The outdoor panorama dataset is available here. The LM parameters for this dataset can be found here.

The LM parameters are released under the MIT license.

Video

Acknowledgements

The authors gratefully acknowledge the following funding sources:

  • A generous donation from Adobe to Jean-François Lalonde
  • NVIDIA Corporation, for the donation of the Titan X GPU used in this research
  • NSERC Discovery Grant RGPIN-2014-05314
  • REPARTI Strategic Network
