Guided Co-Modulated GAN for 360° Field of View Extrapolation
Mohammad Reza Karimi Dastjerdi     Yannick Hold-Geoffroy     Jonathan Eisenmann     Siavash Khodadadeh    
Jean-François Lalonde    




[Paper]
[Supplementary]
[Poster]
[Bibtex]


Accepted as an oral presentation at the International Conference on 3D Vision (3DV), 2022!
Accepted at the Sixth Workshop on Computer Vision for AR/VR (CV4ARVR), 2022!


This work was featured at Adobe Max Sneaks 2022!
Media Coverage:
  • Adobe Blog
  • Popular Science
  • PetaPixel
  • DigitalCameraWorld


    Abstract

    We propose a method to extrapolate a 360° field of view from a single image, allowing for user-controlled synthesis of the out-painted content. To do so, we propose improvements to an existing GAN-based in-painting architecture for out-painting the panoramic image representation. Our method obtains state-of-the-art results and outperforms previous methods on standard image quality metrics. To allow controlled synthesis of the out-painted content, we introduce a novel guided co-modulation framework, which drives the image generation process with a common pretrained discriminative model. Doing so maintains the high visual quality of the generated panoramas while enabling user-controlled semantic content in the extrapolated field of view. We demonstrate the state-of-the-art results of our method on field of view extrapolation both qualitatively and quantitatively, and provide a thorough analysis of our novel editing capabilities. Finally, we demonstrate that our approach benefits the photorealistic virtual insertion of highly glossy objects in photographs.
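
    To make the guided co-modulation idea above concrete, here is a minimal, hypothetical PyTorch sketch (names such as GuidedCoModulation, w_dim, and guide_dim are illustrative and are not the authors' implementation; see the paper for the actual architecture). The assumption shown is that one style vector is produced by jointly mapping the latent code, a global feature of the masked input panorama, and a guide embedding from a pretrained discriminative model, and that this style then modulates a StyleGAN2-style synthesis convolution:

    import torch
    import torch.nn as nn

    class GuidedCoModulation(nn.Module):
        """Map (latent code, encoder feature, guide embedding) to one style vector.

        w: mapped latent code (stochastic appearance)
        e: global feature from the encoder of the masked input panorama
        g: guide embedding from a pretrained discriminative model (user control)
        """
        def __init__(self, w_dim=512, enc_dim=512, guide_dim=512, style_dim=512):
            super().__init__()
            self.affine = nn.Linear(w_dim + enc_dim + guide_dim, style_dim)

        def forward(self, w, e, g):
            # Concatenate the three signals and map them to the style vector
            # that modulates one synthesis layer's convolution.
            return self.affine(torch.cat([w, e, g], dim=1))

    if __name__ == "__main__":
        batch, in_ch, out_ch, k = 2, 64, 64, 3
        comod = GuidedCoModulation(style_dim=in_ch)
        w = torch.randn(batch, 512)  # mapped latent code
        e = torch.randn(batch, 512)  # encoder feature of the masked input
        g = torch.randn(batch, 512)  # guide embedding (user-controlled content)
        style = comod(w, e, g)       # shape: (batch, in_ch)

        # StyleGAN2-style weight modulation: scale input channels per sample.
        conv_weight = torch.randn(out_ch, in_ch, k, k)
        modulated = conv_weight.unsqueeze(0) * style.view(batch, 1, in_ch, 1, 1)
        print(modulated.shape)  # torch.Size([2, 64, 64, 3, 3])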


    Paper and Supplementary Material

    Mohammad Reza Karimi Dastjerdi, Yannick Hold-Geoffroy, Jonathan Eisenmann, Siavash Khodadadeh, Jean-François Lalonde
    Guided Co-Modulated GAN for 360° Field of View Extrapolation
    (hosted on arXiv)


    [Bibtex]
    [Supplementary]
    [Poster]

    Video - Adobe Max Sneaks 2022



    Video - 3DV



    Citation

    @INPROCEEDINGS{10044439,
      author={Dastjerdi, Mohammad Reza Karimi and Hold-Geoffroy, Yannick and Eisenmann, Jonathan and Khodadadeh, Siavash and Lalonde, Jean-François},
      booktitle={2022 International Conference on 3D Vision (3DV)}, 
      title={Guided Co-Modulated GAN for 360° Field of View Extrapolation}, 
      year={2022},
      pages={475-485},
      doi={10.1109/3DV57658.2022.00059}
    }


    Data and results

    We provide results of this technique applied to 2,240 test images from the Laval Indoor HDR Dataset. Please follow this link for more details.



    Acknowledgements

    This work was partially supported by NSERC grant ALLRP 557208-20. We would like to thank Vova Kim, Sohrab Amirghodsi, Eli Shechtman, and Kuldeep Kulkarni for their helpful discussions and comments. In addition, thanks to everyone at the Laboratoire de Vision et Systèmes Numériques of Université Laval who helped with proofreading.