Potential of multisensor SAR for land use/land cover mapping in Sweden
KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, Geoinformatics (closed 20110301).
2009 (English). In: Proceedings of the 1st International Postgraduate Conference on Infrastructure and Environment, IPCIE 2009, 2009, 312-319 p. Conference paper (Refereed)
Abstract [en]

The idea of multisensor data application has introduced a new dimension in the field of remote sensing in recent years. Very few attempts, however, have been made to use a multisensor approach, particularly multisensor Synthetic Aperture Radar (SAR) data, for land use/land cover mapping. This research investigates the capability of spaceborne multisensor SAR data, including RADARSAT fine-beam, RADARSAT standard-beam, ERS-2 and JERS-1, for extracting land use/land cover information in Sweden, considering different sensor combinations (single-sensor, double-sensor, triple-sensor and multisensor combinations), different image processing techniques (raw images, texture measures, filtered images, and a combination of texture and filtered measures) and the performance of various classification algorithms (MLC, ANN, k-NN, sequential masking and the object-based classifier eCognition). The results demonstrate that, despite the potential of multi-temporal single-sensor SAR, the double-, triple- and multisensor SAR combinations have greater potential for land use/land cover mapping. The potential of multisensor SAR for land use/land cover mapping depends, however, on the characteristics of the combined sensors themselves as well as on the number of images. Among the different combinations, the best results were achieved using the triple-sensor combination (RADARSAT fine-beam, ERS-2 and JERS-1) because of its capability to provide the best complementary information, while the second- and third-best results were obtained from the multisensor combination (RADARSAT fine-beam, ERS-2, JERS-1 and RADARSAT standard-beam) and the double-sensor combination (ERS-2 and JERS-1), respectively. It was also revealed that the raw images produced very poor results with all combinations and all classifiers due to speckle noise, while the mean texture measures produced the best results with almost all combinations and classifiers. Considerable variation was found in the performance of the different classifiers with respect to the different sensor combinations and image processing techniques, but ANN was clearly superior to the other classifiers across all combinations and image processing techniques as a result of its non-parametric nature and strong mathematical basis. The results indicate that the pixel-based classifier, namely ANN, is more accurate (around 90% overall accuracy and 0.90 Kappa coefficient) than object-based classification for extracting land use and land cover information from multiple-sensor SAR. Overall, it was found that the best performance (more than 90% overall accuracy and more than 0.90 Kappa coefficient) can be achieved using a sequential masking approach because of its step-by-step classification technique.
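The record contains no code or data; the sketch below is only a rough illustration of the kind of pixel-based workflow the abstract describes (mean-texture features computed from co-registered multisensor SAR bands, an ANN classifier, and evaluation by overall accuracy and the Kappa coefficient). All array names, window sizes, class labels and data values are invented placeholders, not taken from the study.

# Illustrative sketch only: pixel-based multisensor SAR classification in the
# spirit of the abstract (mean-texture features + ANN, scored with overall
# accuracy and Cohen's Kappa). The rasters and labels are synthetic.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

rng = np.random.default_rng(0)

# Co-registered intensity images from several (hypothetical) SAR sensors,
# e.g. RADARSAT fine-beam, ERS-2, JERS-1 -- here just random 100x100 rasters.
bands = [rng.gamma(shape=4.0, scale=50.0, size=(100, 100)) for _ in range(3)]

# 7x7 moving average ("mean texture") of each band: a simple stand-in for the
# texture/filtering step that suppresses speckle before classification.
features = np.stack([uniform_filter(b, size=7) for b in bands], axis=-1)
X = features.reshape(-1, features.shape[-1])

# Synthetic reference map with 4 land-cover classes (placeholder for training
# and validation samples drawn from ground truth).
y = rng.integers(0, 4, size=X.shape[0])

# Simple pixel-wise train/test split, ANN training and accuracy assessment.
n_train = X.shape[0] // 2
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300, random_state=0)
clf.fit(X[:n_train], y[:n_train])
pred = clf.predict(X[n_train:])

print("Overall accuracy:", accuracy_score(y[n_train:], pred))
print("Kappa coefficient:", cohen_kappa_score(y[n_train:], pred))

Cohen's Kappa corrects the raw agreement for agreement expected by chance, which is why it is commonly reported alongside overall accuracy in land cover accuracy assessment.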

Place, publisher, year, edition, pages
2009. 312-319 p.
Keyword [en]
Multiple sensors SAR, Object-based, Pixel-based, Sequential masking and ANN
National Category
Civil Engineering
URN: urn:nbn:se:kth:diva-152000, ISI: 000268249600040, ScopusID: 2-s2.0-84886663485, ISBN: 978-988173112-8, OAI: diva2:749256
Conference
1st International Postgraduate Conference on Infrastructure and Environment, IPCIE 2009, 5 June 2009 through 6 June 2009, Hong Kong, China

QC 20140923

Available from: 2014-09-23 Created: 2014-09-23 Last updated: 2014-09-23. Bibliographically approved

Open Access in DiVA

No full text


By author/editor
Ban, Yifang
By organisation
Geoinformatics (closed 20110301)