Most existing studies on cross-modal content-based remote sensing image retrieval (CM-CBRSIR) focus on reducing/enlarging the Euclidean distances between cross-modal (CM) data with the same/different content in a common feature space. The main advantage of the Euclidean distance is its simplicity. However, Euclidean distances between CM data features are sensitive to outliers and can lead to non-robust retrieval performance, particularly for noisy, low-quality images. To address this issue, we propose a robust Hirschfeld-Gebelein-Rényi maximal correlation (HGRMC) augmented algorithm for CM-CBRSIR, termed HGRMC-augmented CM-CBRSIR (HAC). During the training phase of the retrieval model, HAC learns not only the projected features of CM data in the Euclidean distance space but also the HGRMC maximal correlation information, where HGRMC additionally captures the statistical dependency between CM data to enhance retrieval performance under strongly noisy input data. In the retrieval phase, we further develop a fusion scheme based on Dempster-Shafer (DS) evidence theory to combine the strengths of the Euclidean distance and HGRMC correlation criteria. Extensive experimental results demonstrate that the proposed HAC algorithm provides better and more robust retrieval performance than existing state-of-the-art CM-CBRSIR methods.
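The abstract does not specify how the HGR maximal correlation is computed in HAC. As a minimal illustration of the underlying quantity (not the paper's method), for two discrete variables the HGR maximal correlation equals the second-largest singular value of the normalized joint-distribution matrix B[i, j] = P(x, y) / sqrt(P(x) P(y)), whose largest singular value is always 1:

```python
import numpy as np

def hgr_maximal_correlation(joint):
    """HGR maximal correlation of two discrete variables from their joint pmf.

    joint: 2-D array with joint[i, j] = P(X = i, Y = j), entries summing to 1.
    Returns the second-largest singular value of the normalized matrix
    B[i, j] = P(x, y) / sqrt(P(x) P(y)); the largest singular value is 1.
    """
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal of X
    py = joint.sum(axis=0)  # marginal of Y
    B = joint / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(B, compute_uv=False)  # sorted in descending order
    return s[1]
```

For independent variables the result is 0, and for deterministically coupled variables it is 1; neural or kernel estimators of the same quantity are typically used in practice for high-dimensional image features.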
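The fusion step rests on Dempster's rule of combination, the core operation of DS evidence theory. A minimal sketch of the rule is shown below; the function name and the use of frozensets for focal elements are illustrative choices and do not reflect the paper's actual fusion scheme:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over focal elements by Dempster's rule.

    m1, m2: dicts mapping frozenset -> mass, each summing to 1.
    Masses of intersecting focal elements are multiplied and accumulated;
    mass assigned to empty intersections (conflict) is renormalized away.
    """
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    norm = 1.0 - conflict
    return {s: w / norm for s, w in combined.items()}
```

In a retrieval setting, one mass function would be derived from Euclidean-distance scores and the other from HGRMC correlation scores, so that the combined belief favors candidates supported by both criteria.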