Peer reviewed · Open access
  • SkyGAN: Realistic Cloud Ima...
    Mirbauer, Martin; Rittig, Tobias; Iser, Tomáš; Křivánek, Jaroslav; Šikudová, Elena

    Computer Graphics Forum, February 2024, Volume: 43, Issue: 1
    Journal Article

    Achieving photorealism when rendering virtual scenes in movies or architecture visualizations often depends on providing realistic illumination and a realistic background. Typically, spherical environment maps serve both as a natural light source from the Sun and the sky, and as a background with clouds and a horizon. In practice, the input is either a static high‐resolution HDR photograph manually captured on location in real conditions, or an analytical clear‐sky model that is dynamic but cannot model clouds. Our approach bridges these two limited paradigms: a user can control the sun position and cloud coverage ratio, and generate a realistic‐looking environment map for those conditions. It is a hybrid data‐driven analytical model based on a modified state‐of‐the‐art GAN architecture, which is trained on matching pairs of physically‐accurate clear‐sky radiance and HDR fisheye photographs of clouds. We demonstrate our results on renders of outdoor scenes under varying times, dates and cloud covers. Our source code and a dataset of 39 000 HDR sky images are publicly available at https://github.com/CGGMFF/SkyGAN.

    SkyGAN generates cloudy sky images from a user‐chosen sun position that are readily usable as an environment map in any rendering system. We leverage an existing clear‐sky model to produce the input to our neural network, which enhances the sky with clouds, haze and horizons learned from real photographs.
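The abstract mentions two pieces of standard machinery a reader may want to picture concretely: a user-chosen sun position (azimuth/elevation) and an environment map consumed by a renderer. The sketch below is not SkyGAN code; it is a minimal, hedged illustration of the conventional equirectangular (lat-long) environment-map convention that sky HDRIs like these typically use. All function names are hypothetical, and the Y-up coordinate convention is an assumption.

```python
import numpy as np

def sun_direction(azimuth_deg, elevation_deg):
    """Unit direction vector for a sun position given as azimuth/elevation
    in degrees (the user controls described in the abstract). Assumes a
    Y-up world; renderers may differ."""
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    return np.array([np.cos(el) * np.sin(az),   # east
                     np.sin(el),                # up
                     -np.cos(el) * np.cos(az)]) # north (negated z)

def direction_to_equirect_uv(d):
    """Map a unit direction to (u, v) in [0, 1]^2 for an equirectangular
    (lat-long) environment map, the common layout for HDR sky images."""
    x, y, z = d
    u = np.arctan2(x, -z) / (2.0 * np.pi) + 0.5
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi
    return u, v

def sample_env(env, d):
    """Nearest-neighbour lookup of an H x W x 3 HDR environment map
    (e.g. one generated sky image) in direction d."""
    h, w, _ = env.shape
    u, v = direction_to_equirect_uv(d)
    i = min(int(v * h), h - 1)
    j = min(int(u * w), w - 1)
    return env[i, j]
```

For example, `sample_env(sky, sun_direction(90.0, 45.0))` would return the HDR radiance stored toward a sun 45° above the eastern horizon; a production renderer would instead importance-sample the map, but the coordinate mapping is the same.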