GaSLight: Gaussian Splats for Spatially-Varying Lighting in HDR
Christophe Bolduc     Yannick Hold-Geoffroy     Zhixin Shu     Jean-François Lalonde    





Paper · Supplementary · Code · BibTeX


Accepted at the International Conference on Computer Vision (ICCV), 2025!


Abstract

We present GaSLight, a method that generates spatially-varying lighting from regular images. Our method uses HDR Gaussian Splats as the light source representation, marking the first time regular images can serve as light sources in a 3D renderer. Our two-stage process first plausibly and accurately enhances the dynamic range of images by leveraging the priors embedded in diffusion models. Next, we employ Gaussian Splats to model 3D lighting, achieving spatially-varying lighting. Our approach yields state-of-the-art results on HDR estimation and its applications in illuminating virtual objects and scenes. To facilitate the benchmarking of images as light sources, we introduce a novel dataset of calibrated and unsaturated HDR images. We assess our method on a combination of this novel dataset and an existing dataset from the literature.


HDR reconstruction

Training

Inference


3D lighting representation



Evaluation Datasets

  • SI-HDR: We share our reconstructions for the clip_95 images of the SI-HDR dataset, produced with our GaSLight method, here. We found some of the reference HDR images to be saturated.
  • BtP-HDR: We adapt the Theta Dataset from Beyond the Pixel to obtain an HDR dataset with reference HDR images, input LDR images produced directly by the camera, and reconstructions from publicly available methods (ExpandNet, HDRCNN, MaskHDR, SingleHDR) as well as our own GaSLight. The full dataset is available here.
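As a sketch of how such reference/reconstruction pairs might be compared, the snippet below computes a scale-invariant PSNR between an HDR reference and an estimate. The metric choice, the least-squares exposure alignment, and the synthetic arrays are illustrative assumptions, not the paper's evaluation protocol; a real evaluation would load the dataset's HDR files instead.

```python
import numpy as np

def si_psnr(reference: np.ndarray, estimate: np.ndarray) -> float:
    """Scale-invariant PSNR between an HDR reference and a reconstruction.

    A single exposure scale is fit to the estimate before computing PSNR,
    since single-image HDR methods typically recover radiance only up to
    an unknown global scale.
    """
    # Least-squares optimal scale aligning the estimate to the reference.
    scale = np.sum(reference * estimate) / np.maximum(np.sum(estimate ** 2), 1e-12)
    aligned = scale * estimate
    mse = np.mean((reference - aligned) ** 2)
    peak = reference.max()
    return 10.0 * np.log10(peak ** 2 / np.maximum(mse, 1e-12))

# Illustrative usage with synthetic radiance maps (stand-ins for the
# dataset's reference and reconstructed HDR images).
rng = np.random.default_rng(0)
ref = rng.uniform(0.0, 100.0, size=(64, 64, 3))
est = 0.5 * ref + rng.normal(0.0, 0.1, size=ref.shape)
print(f"si-PSNR: {si_psnr(ref, est):.1f} dB")
```

Because of the exposure alignment, the score is unchanged when the estimate is globally rescaled, which matches the ambiguity inherent to single-image dynamic-range expansion.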


Citation

@article{bolduc2025GaSLight,
  title={GaSLight: Gaussian Splats for Spatially-Varying Lighting in HDR},
  author={Bolduc, Christophe and Hold-Geoffroy, Yannick and Shu, Zhixin and Lalonde, Jean-Fran{\c{c}}ois},
  journal={ArXiv},
  year={2025}
}


Acknowledgements

This research was supported by Sentinel North, NSERC grant RGPIN 2020-04799, and the Digital Research Alliance Canada.