Enhancing Visual Domain Robustness in Behaviour Cloning via Saliency-Guided Augmentation
2024 (English). In: Proceedings of Machine Learning Research, ML Research Press, 2024, Vol. 270, p. 4314-4331. Conference paper, Published paper (Refereed)
Abstract [en]
In vision-based behaviour cloning (BC), conventional image augmentations like Random Crop and Colour Jitter often fall short when addressing substantial visual domain shifts, such as variations in shadow, distractors and backgrounds. Superimposition-based augmentations, which blend in-domain and out-of-domain images, have shown promise for improving model generalisation in the computer vision community, but their suitability for BC remains uncertain due to the need to preserve task-critical semantics, spatial-temporal relationships, and agent-target interactions. To address this, we introduce RoboSaGA, a Saliency-Guided Augmentation method within the superimposition family, tailored for vision-based BC. RoboSaGA dynamically adjusts augmentation intensity per pixel based on policy-driven saliency, enabling aggressive augmentation in task-trivial areas while preserving task-critical information. Moreover, it integrates seamlessly into existing architectures without requiring structural changes or additional learning objectives. Empirical evaluations in both simulated and real-world settings show that RoboSaGA maintains in-domain performance while significantly enhancing robustness to visual domain shifts, including distractors and background variations, as well as handling lighting and shadow variations. Code available at: https://github.com/Zheyu-Zhuang/RoboSaGA.
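The core idea stated in the abstract, aggressive superimposition where saliency is low and little to none where it is high, can be sketched as a per-pixel blend. The following is a minimal illustrative NumPy sketch, not the authors' implementation (see the linked repository for that); the function name, the `alpha_max` parameter, and the linear blending rule are assumptions:

```python
import numpy as np

def saliency_guided_superimpose(obs, ood_img, saliency, alpha_max=0.5):
    """Blend an out-of-domain image into an observation, scaling the
    per-pixel blend weight down wherever policy saliency is high.

    obs, ood_img: float arrays in [0, 1], shape (H, W, C)
    saliency:     float array in [0, 1], shape (H, W); high = task-critical
    alpha_max:    maximum blend weight in task-trivial (zero-saliency) areas
    """
    # Augment aggressively where saliency is low, gently where it is high.
    alpha = alpha_max * (1.0 - saliency)[..., None]  # (H, W, 1), broadcasts over C
    return (1.0 - alpha) * obs + alpha * ood_img
```

Under this sketch, a fully salient pixel passes through unchanged, while a zero-saliency pixel receives the maximum superimposition strength.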
Place, publisher, year, edition, pages
ML Research Press, 2024. Vol. 270, p. 4314-4331
Keywords [en]
Behaviour Cloning, Data Augmentation, Visual Generalisation
National Category
Computer graphics and computer vision
Identifiers
URN: urn:nbn:se:kth:diva-361724
Scopus ID: 2-s2.0-86000793083
OAI: oai:DiVA.org:kth-361724
DiVA, id: diva2:1947991
Conference
8th Conference on Robot Learning, CoRL 2024, Munich, 6 November 2024
Note
QC 20250331
Available from: 2025-03-27. Created: 2025-03-27. Last updated: 2025-03-31. Bibliographically approved