Large-Scale Multi-Source Satellite Data for Wildfire Detection and Assessment Using Deep Learning
KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, Geoinformatics. ORCID iD: 0000-0002-1077-2560
2022 (English) Doctoral thesis, comprehensive summary (Other academic)
Alternative title
Storskalig satellitdata med flera källor för upptäckt och bedömning av skogsbränder med hjälp av djupinlärning (Swedish)
Abstract [en]

Earth Observation (EO) satellites have great potential for wildfire detection and assessment at fine spatial, temporal, and spectral resolutions. Satellite data have long been employed to systematically monitor wildfire dynamics and assess wildfire impacts, including (i) detecting the location of actively burning spots, (ii) mapping the spatial extent of burn scars, and (iii) assessing wildfire damage levels. Active fire detection plays an important role in wildfire early warning systems. Accurate and timely burned area mapping is critical for delineating the fire perimeter and enables analysis of fire suppression efforts and potential drivers of fire spread. Burn severity assessment, in turn, aims to infer the degree of environmental change caused by fire. Recent advances in deep learning (DL) enable the automatic interpretation of huge amounts of remote sensing data. The objective of the thesis is to employ large-scale, publicly available multi-source satellite data, e.g., Landsat, Sentinel-1, and Sentinel-2, to detect active fires, spatially delineate burned areas, and analyze fire impacts using DL-based approaches.

A biome-based multi-criteria approach is developed to extract unambiguous active fire pixels using the reflectance of Sentinel-2 MultiSpectral Instrument (MSI) data at 20-m resolution. The adaptive thresholds are statistically determined from 11 million observation samples acquired over summertime across broad geographic regions and fire regimes. The primary criterion exploits the significant increase in fire reflectance in Sentinel-2 band 12 (2.20 μm) relative to band 4 (0.66 μm) within each representative biome, and proves effective for detecting cool smoldering fires. Multiple conditional constraints that threshold the reflectance of band 11 (1.61 μm) and band 12 reduce the commission errors caused by extremely bright flames around the hot cores. The overall commission and omission errors can be kept at a relatively low level, around 0.14 and 0.04, respectively. The proposed algorithm is suitable for rapid active fire detection based on uni-temporal imagery, without requiring time series.
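The multi-criteria screening described above can be sketched as a per-pixel test. This is a minimal, hedged sketch: the function name and the threshold values are illustrative placeholders only, not the biome-specific, statistically derived thresholds of the thesis.

```python
# Illustrative sketch of biome-dependent multi-criteria active fire screening
# on Sentinel-2 reflectances: B4 (0.66 um, red), B11 (1.61 um, SWIR),
# B12 (2.20 um, SWIR). All threshold values below are hypothetical.

def is_active_fire(b4: float, b11: float, b12: float,
                   ratio_thresh: float = 2.0,
                   b12_min: float = 0.15,
                   b11_max: float = 1.0) -> bool:
    """Flag a pixel as an unambiguous active fire candidate.

    Primary criterion: strong B12 (SWIR) response relative to B4 (red),
    which separates hot and smoldering targets from most land covers.
    Secondary constraints on B11/B12 suppress commissions from very
    bright returns around the hot cores.
    """
    primary = b12 / max(b4, 1e-6) > ratio_thresh   # SWIR-to-red contrast
    hot_enough = b12 > b12_min                     # reject dark, cool pixels
    not_saturated = b11 < b11_max                  # reject anomalously bright returns
    return primary and hot_enough and not_saturated
```

Intersecting several criteria in this way is what keeps commission errors low: a pixel must pass every constraint, not just the primary contrast test.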

Burned area mapping algorithms have been developed based on the Landsat series, Sentinel-2 MSI, and Sentinel-1 SAR data. On the one hand, the thesis expounds on the capability of DL-based methods to automatically map burn scars from uni-temporal (post-fire) Sentinel-2 imagery. The validation results demonstrate that deep semantic segmentation algorithms outperform traditional machine learning (ML) methods (e.g., random forest) and threshold-based methods (empirical and automatic) in detecting compact burn scars. When directly transferred to corresponding Landsat-8 test data, HRNet preserves high accuracy. On the other hand, a large-scale annotated dataset for wildfire analysis (SAR-OPT-Wildfire) is proposed, which includes bi-temporal Sentinel-1 SAR imagery, Sentinel-2 MSI imagery, and rasterized fire perimeters for over 300 large wildfire events in Canada. These multi-source data are used for burned area mapping with three UNet-based change detection architectures, i.e., Early Fusion (EF) and two Siamese (Siam) variants. UNet-EF achieves the highest IoU score on Sentinel-1 data, while UNet-Siam-Difference, with weight-sharing encoders, performs best on Sentinel-2 data. Bi-temporal scenes significantly boost the IoU score, to 0.86 and 0.80 for Sentinel-2 and Sentinel-1, respectively. Fusing bi-temporal Sentinel-1 backscatter with Sentinel-2 data yields no improvement over the standalone optical-based results. This multi-source integration may nevertheless provide new opportunities for near real-time wildfire progression mapping and could reduce the impacts of cloud cover.
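The IoU scores used to compare these architectures have a simple definition: the overlap between predicted and reference burned masks divided by their union. A minimal sketch for binary masks given as flat 0/1 label lists; purely illustrative, not the thesis evaluation code.

```python
# Intersection-over-union for binary burned/unburned masks, flattened to
# sequences of 0/1 labels. Illustrative sketch only.

def iou(pred, target):
    """IoU = |pred AND target| / |pred OR target| for binary masks."""
    inter = sum(1 for p, t in zip(pred, target) if p == 1 and t == 1)
    union = sum(1 for p, t in zip(pred, target) if p == 1 or t == 1)
    return inter / union if union else 1.0  # two empty masks agree perfectly
```

For example, a prediction that hits one of two burned pixels and adds one false alarm scores 1/3, which is why IoU penalizes both omissions and commissions at once.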

Mapping burn severity with multispectral satellite data is typically performed by classifying bi-temporal indices (e.g., dNBR and RdNBR) using thresholds derived from parametric models incorporating field measurements. The thesis re-organizes a large-scale Landsat-based bi-temporal burn severity dataset (Landsat-BSA) through visual data cleaning based on annotated MTBS data (around 1000 large fire events across the United States). The study emphasizes that multi-class semantic segmentation architectures can approximate the thresholding techniques used extensively for burn severity assessment. Specifically, UNet-like models substantially outperform other region-based CNN and Transformer-based models. Combined with the online hard example mining algorithm, Attention UNet achieves the highest mIoU (0.7832) and a Kappa coefficient close to 0.9. Bi-temporal inputs with multispectral bands and ancillary spectral indices perform much better than uni-temporal inputs. When transferred to Sentinel-2 data, Attention UNet maintains a Kappa value over 0.815 with high overall accuracy after the scaling operation.
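The bi-temporal indices named above follow standard definitions: NBR = (NIR − SWIR) / (NIR + SWIR), dNBR = NBR_pre − NBR_post, and RdNBR relativizes dNBR by the square root of |NBR_pre|. A minimal sketch of the index-then-threshold approach; the severity-class breakpoints below are illustrative, not the MTBS or thesis thresholds.

```python
import math

# Standard burn severity indices. NBR combines near-infrared (NIR) and
# shortwave-infrared (SWIR) reflectances; healthy vegetation has high NBR,
# burned surfaces have low or negative NBR, so dNBR (pre minus post) is
# positive over burns. Severity breakpoints below are placeholders.

def nbr(nir: float, swir: float) -> float:
    """Normalized Burn Ratio: (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir)

def dnbr(nbr_pre: float, nbr_post: float) -> float:
    """Differenced NBR: pre-fire NBR minus post-fire NBR."""
    return nbr_pre - nbr_post

def rdnbr(nbr_pre: float, nbr_post: float) -> float:
    """Relativized dNBR: dNBR scaled by sqrt(|pre-fire NBR|)."""
    return dnbr(nbr_pre, nbr_post) / math.sqrt(abs(nbr_pre))

def severity_class(d: float) -> str:
    """Map a dNBR value to a coarse severity label (illustrative breakpoints)."""
    if d < 0.1:
        return "unburned/low"
    if d < 0.44:
        return "moderate"
    return "high"
```

A semantic segmentation model learns an equivalent of `severity_class` directly from imagery, which is what lets it approximate, and locally refine, these fixed thresholds.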

Considering that SAR can effectively penetrate clouds and image under all weather conditions, day and night, the complementary use of optical and SAR data is investigated for precise interpretation of fire-affected sites. Nevertheless, the widely used burn-sensitive spectral indices cannot be applied to SAR data because of the inherent difference in physical imaging mechanisms between optical and SAR sensors. The thesis proposes a new wildfire mapping framework that transforms SAR and optical data into a common domain using Generative Adversarial Networks. Several experiments are conducted on paired Sentinel-1 and Sentinel-2 images (SAR-OPT-Wildfire) using the ResNet-based Pix2Pix model. Optical images translated from SAR images preserve similar spectral characteristics, and the corresponding generated spectral indices (i.e., dNBR, RdNBR, and Relativized Burn Ratio) also show good agreement with real optical ones. For burned area detection using the generated indices, the median values of the area under the receiver operating characteristic curve exceed 0.85, which is competitive with the truly optical-based indices and significantly outperforms SAR-based ones. Furthermore, burn severity maps derived from multi-source data reach high accuracy (Kappa coefficient: 0.77). This study validates the feasibility and effectiveness of SAR-to-optical translation for wildfire impact assessment, which may have the potential to promote multi-source fusion of optical and SAR data.
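The area under the ROC curve reported here can be computed with the rank (Mann-Whitney) formulation: the probability that a burned pixel receives a higher index value than an unburned one. A minimal, non-vectorized sketch, purely for illustration; real evaluations over millions of pixels would use a vectorized implementation.

```python
# ROC AUC for burned-area detection from a continuous index (e.g. a
# generated dNBR), via the pairwise rank formulation. Ties count as half.

def roc_auc(scores, labels):
    """AUC = P(burned pixel scores higher than an unburned one)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]  # burned pixels
    neg = [s for s, y in zip(scores, labels) if y == 0]  # unburned pixels
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.85 thus means a randomly chosen burned pixel outranks a randomly chosen unburned pixel 85% of the time, independent of any particular detection threshold.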

This thesis contributes to the development of approaches for detecting, mapping, and assessing wildfires using large-scale, publicly available EO data across fire-prone regions around the world. The research output compiled in this thesis demonstrates that open-access medium-resolution EO data provide a convenient and efficient means of monitoring wildfires and assessing fire damage. The frameworks developed in this thesis can be easily adapted to other SAR or optical data. The thesis demonstrates that DL models can make full use of contextual information and capture spatial details at multiple scales from fire-sensitive spectral bands to map burned areas or burn severity. Combining multi-source data will substantially increase the temporal observation frequency. Future work will focus on improving the generalization capability of DL models in wildfire applications by exploiting more diverse and complex study areas and by using multi-frequency SAR (Sentinel-1 in C band, PALSAR-2 in L band, and the future Biomass mission in P band) and multispectral data (Landsat-8, Landsat-9, and Sentinel-2).

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2022, p. 95
Series
TRITA-ABE-DLT ; 2212
Keywords [en]
Wildfire, Remote Sensing, Active Fire Detection, Burned Area Mapping, Burn Severity Assessment, Sentinel-2, Landsat, Sentinel-1, Deep Learning, Semantic Segmentation, Image Translation
National Category
Earth Observation; Earth and Related Environmental Sciences
Research subject
Geodesy and Geoinformatics, Geoinformatics
Identifiers
URN: urn:nbn:se:kth:diva-312283
ISBN: 978-91-8040-192-0 (print)
OAI: oai:DiVA.org:kth-312283
DiVA, id: diva2:1658529
Public defence
2022-06-02, Kollegiesalen, Brinellvägen 6, KTH Campus, video link https://kth-se.zoom.us/j/64389993343, Stockholm, 09:00 (English)
Note

QC220517

Available from: 2022-05-17 Created: 2022-05-16 Last updated: 2025-02-10 Bibliographically approved
List of papers
1. Sentinel-2 MSI data for active fire detection in major fire-prone biomes: A multi-criteria approach
2021 (English) In: International Journal of Applied Earth Observation and Geoinformation, ISSN 1569-8432, E-ISSN 1872-826X, Vol. 101, article id 102347. Article in journal (Refereed) Published
Abstract [en]

Sentinel-2 MultiSpectral Instrument (MSI) data offer enhanced spatial and temporal coverage for monitoring biomass burning and could complement other, coarser active fire detection products. This paper investigates the use of reflective-wavelength Sentinel-2 data to separate unambiguous active fire areas from inactive areas at 20 m spatial resolution. A multi-criteria approach based on the reflectance of several bands (i.e., B4, B11, and B12) is proposed to set the boundary constraints in several representative biomes. It is a fully automatic algorithm based on adaptive thresholds statistically determined from 11 million Sentinel-2 observations acquired over the corresponding summertime (June 2019 to September 2019) across 14 regions or countries. Biome-based parameterization avoids the high omission errors (OE) caused by small and cool fires in different landscapes. It also takes advantage of multiple criteria whose intersection reduces the potential commission errors (CE) due to soil-dominated pixels or highly reflective building rooftops. Active fire detection performance was mainly evaluated through visual inspection of eight illustrative subsets because ground truth was unavailable. The detection results revealed that CE and OE could be kept at a low level (0.14 and 0.04, respectively), an acceptable trade-off. The proposed algorithm can be employed for rapid active fire detection as soon as an image is obtained, without requiring multi-temporal imagery, and could even be adapted to onboard processing in the future.

Place, publisher, year, edition, pages
Elsevier BV, 2021
Keywords
MultiSpectral instrument, Active fire detection, Biome, Multi-criteria, Sentinel-2
National Category
Earth Observation
Identifiers
urn:nbn:se:kth:diva-298002 (URN)
10.1016/j.jag.2021.102347 (DOI)
000658898600002 ()
2-s2.0-85114877598 (Scopus ID)
Note

QC 20210628

Available from: 2021-06-28 Created: 2021-06-28 Last updated: 2025-02-10 Bibliographically approved
2. Uni-Temporal Multispectral Imagery for Burned Area Mapping with Deep Learning
2021 (English) In: Remote Sensing, E-ISSN 2072-4292, Vol. 13, no. 8, article id 1509. Article in journal (Refereed) Published
Abstract [en]

Accurate burned area information is needed to assess the impacts of wildfires on people, communities, and natural ecosystems. Various burned area detection methods have been developed using satellite remote sensing measurements with wide coverage and frequent revisits. Our study aims to expound on the capability of deep learning (DL) models for automatically mapping burned areas from uni-temporal multispectral imagery. Specifically, several semantic segmentation network architectures, i.e., U-Net, HRNet, Fast-SCNN, and DeepLabv3+, and machine learning (ML) algorithms were applied to Sentinel-2 and Landsat-8 imagery at three wildfire sites in two different local climate zones. The validation results show that the DL algorithms outperform the ML methods in two of the three cases with compact burn scars, while ML methods seem more suitable for mapping dispersed burns in boreal forests. Using Sentinel-2 images, U-Net and HRNet exhibit nearly identical performance, with higher kappa (around 0.9) at a heterogeneous Mediterranean fire site in Greece; Fast-SCNN outperforms the others, with kappa over 0.79, at a compact boreal forest fire with varied burn severity in Sweden. Furthermore, when the trained models are directly transferred to corresponding Landsat-8 data, HRNet performs best among the DL models at the three test sites and preserves high accuracy. The results demonstrate that DL models can make full use of contextual information and capture spatial details at multiple scales from fire-sensitive spectral bands to map burned areas. Using only a post-fire image, the DL methods not only provide an automatic, accurate, and unbiased large-scale mapping option with cross-sensor applicability, but also have the potential to be used for onboard processing on future Earth observation satellites.
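The kappa values quoted above refer to Cohen's kappa, which corrects raw pixel agreement for the agreement expected by chance from the class frequencies. A minimal sketch over flat label sequences; the function name is hypothetical and not from the paper's code.

```python
from collections import Counter

# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance
# agreement), computed from predicted and reference label sequences.

def cohens_kappa(pred, ref):
    n = len(ref)
    po = sum(1 for p, r in zip(pred, ref) if p == r) / n   # observed agreement
    cp, cr = Counter(pred), Counter(ref)
    pe = sum(cp[k] * cr[k] for k in cp) / (n * n)          # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa matters for burned area maps because the unburned class usually dominates, so plain overall accuracy can look high even for a map that misses most burns.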

Place, publisher, year, edition, pages
MDPI AG, 2021
Keywords
Sentinel-2; Landsat-8; burned area mapping; deep learning; semantic segmentation; machine learning
National Category
Earth Observation
Identifiers
urn:nbn:se:kth:diva-295653 (URN)
10.3390/rs13081509 (DOI)
000644685300001 ()
2-s2.0-85105761915 (Scopus ID)
Note

QC 20210526

Available from: 2021-05-24 Created: 2021-05-24 Last updated: 2025-02-10 Bibliographically approved
3. Large-Scale Burn Severity Mapping in Multispectral Imagery Using Deep Semantic Segmentation Models
(English)Manuscript (preprint) (Other academic)
Abstract [en]

Nowadays, Earth observation satellites provide valuable information for wildfire agencies and resource managers in post-fire stabilization and restoration. Mapping burn severity with multispectral satellite data is typically performed by classifying bi-temporal indices (e.g., dNBR and RdNBR) using thresholds derived from parametric models incorporating field-based measurements. Determining burn severity thresholds currently requires considerable manual effort, prior knowledge, and visual inspection. In a practical context, a highly automated approach is needed to provide a refined, spatially explicit estimation of damage levels. In this study, we re-organize a large-scale Landsat-based bi-temporal burn severity dataset (Landsat-BSA) through visual data cleaning based on annotated MTBS data (around 1000 large fire events in the United States). We adopt state-of-the-art deep learning (DL) based methods for mapping burn severity on the Landsat-BSA dataset. Experimental results emphasize that multi-class semantic segmentation algorithms can approximate the threshold-based techniques used extensively for burn severity estimation. Specifically, UNet-like models substantially outperform other region-based CNN and Transformer-based models at fine pixel-wise classification. Combined with the online hard example mining algorithm to alleviate class imbalance, Attention UNet achieves the highest mIoU (0.7832) and a Kappa coefficient close to 0.9. Bi-temporal inputs with ancillary spectral indices perform much better than uni-temporal multispectral inputs. When transferred to Sentinel-2 data, Attention UNet still maintains a Kappa value over 0.81 with high overall accuracy. The re-structured dataset, source code, and trained models are publicly available, creating opportunities for further advances in the field.

Keywords
Landsat Data; Burn Severity Dataset; Deep Learning; Semantic Segmentation; Burn Severity Assessment
National Category
Earth Observation
Research subject
Geodesy and Geoinformatics, Geoinformatics
Identifiers
urn:nbn:se:kth:diva-312243 (URN)
Note

QC 20220523

Available from: 2022-05-16 Created: 2022-05-16 Last updated: 2025-02-10 Bibliographically approved
4. GAN-Based SAR and Optical Image Translation for Wildfire Impacts Assessment Using Multi-Source Remote Sensing Data
(English)Manuscript (preprint) (Other academic)
Abstract [en]

Despite their popularity and success in detecting and assessing fire-disturbed areas, multispectral satellite images are often affected by poor atmospheric conditions, especially at high latitudes. Considering that SAR can effectively penetrate clouds and image under all weather conditions, day and night, the complementary use of optical and SAR data is investigated for precise interpretation of fire-affected sites. Nevertheless, the widely used burn-sensitive spectral indices cannot be applied to SAR data because of the inherent difference in physical imaging mechanisms between optical and SAR sensors. In this study, we aim to exploit multi-source data for wildfire mapping and assessment by transforming SAR and optical data into a common domain using Generative Adversarial Networks. The experiments were conducted on paired Sentinel-1 and Sentinel-2 images using the ResNet-based Pix2Pix model, which was trained on 281 large wildfire events and validated on the remaining 23 events that occurred in Canada from 2017 to 2018. The optical images translated from SAR images preserved similar spectral characteristics, and the corresponding generated spectral indices (i.e., dNBR, RdNBR, and Relativized Burn Ratio) also showed good agreement with real optical ones. For burned area detection using the generated indices, the median values of the area under the receiver operating characteristic curve were over 0.85, competitive with the truly optical-based indices and significantly better than SAR-based ones. Furthermore, the burn severity maps derived from multi-source data can reach high accuracy (Kappa coefficient: 0.77). The study validates the feasibility and effectiveness of SAR-to-optical translation for wildfire impact assessment, which may have the potential to promote multi-source fusion of optical and SAR data.

Keywords
Wildfire; Burned Area; Burn Severity; Deep Learning; Image Translation; GAN; Sentinel-1; Sentinel-2
National Category
Earth Observation
Research subject
Geodesy and Geoinformatics, Geoinformatics
Identifiers
urn:nbn:se:kth:diva-312054 (URN)
Note

QC 20220523

Available from: 2022-05-10 Created: 2022-05-10 Last updated: 2025-02-10 Bibliographically approved

Open Access in DiVA

Full text: FULLTEXT01.pdf (60759 kB)

Authority records

Hu, Xikun
