kth.se – Publications KTH
Large-scale burn severity mapping in multispectral imagery using deep semantic segmentation models
KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, Geoinformatics. ORCID iD: 0000-0002-1077-2560
KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, Geoinformatics. ORCID iD: 0000-0001-9907-0989
KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, Geoinformatics. ORCID iD: 0000-0003-1369-3216
2023 (English). In: ISPRS journal of photogrammetry and remote sensing (Print), ISSN 0924-2716, E-ISSN 1872-8235, Vol. 196, p. 228-240. Article in journal (Refereed). Published.
Abstract [en]

Nowadays, Earth observation satellites provide forest fire authorities and resource managers with comprehensive spatial information for fire stabilization and recovery. Burn severity mapping is typically performed by classifying bi-temporal indices (e.g., dNBR and RdNBR) using thresholds derived from parametric models incorporating field-based measurements. Analysts currently expend considerable manual effort, relying on prior knowledge and visual inspection, to determine burn severity thresholds. In this study, we aim to employ highly automated approaches to provide spatially explicit damage level estimates. We first reorganize a large-scale Landsat-based bi-temporal burn severity assessment dataset (Landsat-BSA) by visual data cleaning based on annotated MTBS data (approximately 1000 major fire events in the United States). We then apply state-of-the-art deep learning (DL) based methods to map burn severity using the Landsat-BSA dataset. Experimental results emphasize that multi-class semantic segmentation algorithms can approximate the threshold-based techniques used extensively for burn severity classification. UNet-like models outperform other region-based CNN and Transformer-based models and achieve accurate pixel-wise classification results. Combined with the online hard example mining algorithm to mitigate the class imbalance issue, Attention UNet achieves the highest mIoU (0.78) and the highest Kappa coefficient, close to 0.90. Bi-temporal inputs with ancillary spectral indices work much better than uni-temporal multispectral inputs. The restructured dataset will be publicly available and create opportunities for further advances in the remote sensing and wildfire communities.
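The threshold-based baseline the abstract refers to can be sketched briefly. The Normalized Burn Ratio and its bi-temporal difference dNBR are standard definitions; the class thresholds below are purely illustrative (real thresholds are scene-specific and calibrated against field data, which is exactly the manual effort the paper seeks to automate). This is a minimal NumPy sketch, not the authors' pipeline:

```python
import numpy as np

def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    nir = nir.astype(np.float64)
    swir2 = swir2.astype(np.float64)
    return (nir - swir2) / (nir + swir2 + 1e-10)  # epsilon guards against division by zero

def dnbr(pre_nir, pre_swir2, post_nir, post_swir2):
    """Differenced NBR: pre-fire NBR minus post-fire NBR (larger values = more severe burn)."""
    return nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)

def classify_dnbr(d):
    """Map dNBR to ordinal severity classes with ILLUSTRATIVE thresholds
    (0 = unburned, 1 = low, 2 = moderate, 3 = high severity)."""
    classes = np.zeros_like(d, dtype=np.uint8)
    classes[d >= 0.10] = 1
    classes[d >= 0.27] = 2
    classes[d >= 0.66] = 3
    return classes
```

A semantic segmentation model, as studied in the paper, replaces the hand-tuned `classify_dnbr` step by learning the pixel-to-class mapping directly from the bi-temporal imagery.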

Place, publisher, year, edition, pages
Elsevier BV, 2023. Vol. 196, p. 228-240.
Keywords [en]
Landsat data, Burn severity dataset, Deep learning, Semantic segmentation, Burn severity assessment
National Category
Earth Observation
Identifiers
URN: urn:nbn:se:kth:diva-326865
DOI: 10.1016/j.isprsjprs.2022.12.026
ISI: 000974595100001
Scopus ID: 2-s2.0-85146056049
OAI: oai:DiVA.org:kth-326865
DiVA id: diva2:1756893
Note

QC 20230515

Available from: 2023-05-15. Created: 2023-05-15. Last updated: 2025-02-10. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | Scopus

Authority records

Hu, Xikun; Zhang, Puzhao; Ban, Yifang
