Peer reviewed
Open access
Tajima, D.; Kanamori, Y.; Endo, Y.
Computer Graphics Forum, October 2021, Volume 40, Issue 7. Journal Article
Modern supervised approaches to human image relighting rely on training data generated from 3D human models. However, such datasets are often small (e.g., Light Stage data with a small number of individuals) or limited to diffuse materials (e.g., commercial 3D-scanned human models). As a result, human relighting techniques suffer from poor generalization and a synthetic-to-real domain gap. In this paper, we propose a two-stage method for single-image human relighting with domain adaptation. In the first stage, we train a neural network for diffuse-only relighting. In the second stage, we train another network to enhance non-diffuse reflection by learning the residuals between real photos and images reconstructed by the diffuse-only network. Thanks to the second stage, we achieve higher generalization across various cloth textures while reducing the domain gap. Furthermore, to handle input videos, we integrate an illumination-aware deep video prior that greatly reduces flickering artifacts, even in challenging settings under dynamic illumination.
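The two-stage composition described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `diffuse_relight` and `nondiffuse_residual` are hypothetical stand-ins for the paper's two trained networks, and the toy lighting model (a scalar gain) is an assumption made purely to show how the residual stage is added on top of the diffuse-only output.

```python
import numpy as np

def diffuse_relight(image: np.ndarray, light: float) -> np.ndarray:
    """Stage 1 stand-in: a 'network' that produces a diffuse-only relit
    image. Here lighting is modeled as a simple scalar gain (an assumption
    for illustration; the paper uses a learned network)."""
    return np.clip(image * light, 0.0, 1.0)

def nondiffuse_residual(image: np.ndarray,
                        diffuse: np.ndarray,
                        light: float) -> np.ndarray:
    """Stage 2 stand-in: the residual network that, in the paper, learns
    the difference between real photos and the diffuse-only reconstruction
    (i.e., specular and other non-diffuse reflection). Returns zeros here."""
    return np.zeros_like(diffuse)

def relight(image: np.ndarray, light: float) -> np.ndarray:
    """Compose stage 1 and stage 2: diffuse prediction plus learned residual."""
    diffuse = diffuse_relight(image, light)
    residual = nondiffuse_residual(image, diffuse, light)
    return np.clip(diffuse + residual, 0.0, 1.0)

# Toy 4x4 RGB "photo" with constant albedo 0.5, relit with gain 1.2.
img = np.full((4, 4, 3), 0.5)
out = relight(img, light=1.2)
print(out.shape)  # (4, 4, 3)
```

The key design point the abstract makes is that stage 2 is trained on *real* photos, so the residual corrects both the missing non-diffuse reflection and part of the synthetic-to-real domain gap at once.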