  • CNN-based burned area mapping...
    Belenguer-Plomer, Miguel A.; Tanase, Mihai A.; Chuvieco, Emilio; Bovolo, Francesca

    Remote Sensing of Environment, July 2021, Volume 260
    Journal Article

    In this paper, we present an in-depth analysis of the use of convolutional neural networks (CNN), a deep learning method widely applied in remote sensing studies in recent years, for burned area (BA) mapping combining radar and optical datasets acquired by the Sentinel-1 and Sentinel-2 on-board sensors, respectively. Combining active and passive datasets into a seamless, wall-to-wall, cloud-cover-independent mapping algorithm significantly improves existing methods based on either sensor type alone. Five areas were used to determine the optimum model settings and sensor integration, whereas five additional areas were utilised to validate the results. The optimum CNN dimension and data normalisation were conditioned by the observed land cover class and data type (i.e., optical or radar). Increasing network complexity (i.e., number of hidden layers) only increased computing time without any accuracy gain when mapping BA. The use of an optimally defined CNN within a joint active/passive data combination allowed for (i) BA mapping with accuracy similar to or slightly higher than that achieved in previous approaches based on Sentinel-1 (Dice coefficient, DC, of 0.57) or Sentinel-2 (DC of 0.7) only and (ii) wall-to-wall mapping by eliminating information gaps due to cloud cover, typically observed for optical-based algorithms.

    Highlights:
    • Convolutional neural networks were used to map burned areas in several ecosystems.
    • The optimum CNN configuration depended on land cover and input data.
    • Combining SAR and optical datasets improved accuracy over single-sensor datasets.
    • Combining SAR and optical images provided wall-to-wall burned area maps.
    • The most accurate maps were observed over forests, the least accurate over grasslands.
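
    As a rough illustration of the active/passive fusion idea described in the abstract (not the authors' implementation), the sketch below assumes a small patch-based CNN in PyTorch that stacks Sentinel-1 backscatter and Sentinel-2 reflectance bands as input channels and evaluates the result with the Dice coefficient reported as the accuracy metric. Band counts, patch size, and layer widths are hypothetical choices for the sketch only.

    # Hypothetical sketch: a minimal CNN fusing SAR and optical channels for
    # binary burned/unburned mapping, plus the Dice coefficient (DC) metric.
    import torch
    import torch.nn as nn

    class BurnedAreaCNN(nn.Module):
        def __init__(self, sar_bands: int = 2, optical_bands: int = 6, hidden: int = 32):
            super().__init__()
            in_channels = sar_bands + optical_bands  # stacked SAR + optical patch
            self.net = nn.Sequential(
                nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(hidden, 1, kernel_size=1),  # per-pixel burned-area logit
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # Burned probability per pixel
            return torch.sigmoid(self.net(x))

    def dice_coefficient(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7) -> float:
        """DC = 2*|P intersect T| / (|P| + |T|) for binary masks."""
        pred_bin = (pred > 0.5).float()
        intersection = (pred_bin * target).sum()
        return float((2 * intersection + eps) / (pred_bin.sum() + target.sum() + eps))

    if __name__ == "__main__":
        model = BurnedAreaCNN()
        patch = torch.rand(1, 8, 64, 64)                 # one 64x64 patch: 2 SAR + 6 optical bands
        truth = (torch.rand(1, 1, 64, 64) > 0.5).float() # synthetic reference mask
        prob = model(patch)
        print("Dice coefficient:", dice_coefficient(prob, truth))

    The single-channel sigmoid output mirrors the binary burned/unburned framing of the paper's validation; in practice the network would be trained per land cover or with appropriately normalised inputs, since the abstract notes that the optimum configuration and normalisation depend on land cover class and data type.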