DragTraffic: Interactive and Controllable Traffic Scene Generation for Autonomous Driving
2024 (English). In: 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024, Institute of Electrical and Electronics Engineers (IEEE), 2024, p. 14241-14247. Conference paper, Published paper (Refereed)
Abstract [en]
Evaluating and training autonomous driving systems requires diverse and scalable corner cases. However, most existing scene generation methods lack controllability, accuracy, and versatility, resulting in unsatisfactory generation results. Inspired by DragGAN in image generation, we propose DragTraffic, a generalized, interactive, and controllable traffic scene generation framework based on conditional diffusion. DragTraffic enables non-experts to generate a variety of realistic driving scenarios for different types of traffic agents through an adaptive mixture-of-experts architecture. We employ a regression model to provide a general initial solution and a refinement process based on the conditional diffusion model to ensure diversity. User-customized context is introduced through cross-attention to ensure high controllability. Experiments on a real-world driving dataset show that DragTraffic outperforms existing methods in terms of authenticity, diversity, and freedom. Demo videos and code are available at https://chantsss.github.io/Dragtraffic/.
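The two-stage pipeline described in the abstract (a regression model proposing an initial solution, then a conditional-diffusion-style refinement that injects user context via cross-attention) can be illustrated with a minimal sketch. This is not the authors' implementation; all function names, step counts, and the simplified single-head attention are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def regression_init(history):
    """Stage 1 (assumed): crude initial solution via constant-velocity
    extrapolation of the last observed step over 8 future steps."""
    v = history[-1] - history[-2]                       # last-step velocity (x, y)
    return history[-1] + v * np.arange(1, 9)[:, None]   # (8, 2) future trajectory

def cross_attention(query, context):
    """Toy single-head cross-attention: trajectory points (T, 2) attend
    to user-provided context points (K, 2)."""
    scores = query @ context.T / np.sqrt(2.0)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                   # softmax over context
    return w @ context                                  # context-informed target

def refine(traj, context, steps=10, lr=0.1):
    """Stage 2 (assumed): perturb the initial solution for diversity, then
    iteratively denoise it toward the user-customized context."""
    x = traj + rng.normal(scale=0.5, size=traj.shape)
    for _ in range(steps):
        x = x + lr * (cross_attention(x, context) - x)  # pull toward context
    return x

history = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])  # past agent positions
context = np.array([[10.0, 2.0]])                         # user-dragged goal point
init = regression_init(history)
refined = refine(init, context)
```

The sketch preserves the division of labor the abstract describes: the regression stage supplies a plausible default, while the stochastic refinement stage trades it off against the user's dragged constraint.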
Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024, p. 14241-14247
National Category
Computer graphics and computer vision; Computer Systems
Identifiers
URN: urn:nbn:se:kth:diva-359875
DOI: 10.1109/IROS58592.2024.10801623
ISI: 001433985300862
Scopus ID: 2-s2.0-85216467705
OAI: oai:DiVA.org:kth-359875
DiVA, id: diva2:1937184
Conference
2024 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2024, Abu Dhabi, United Arab Emirates, Oct 14 2024 - Oct 18 2024
Note
Part of ISBN 9798350377705
QC 20250213
Available from: 2025-02-12 Created: 2025-02-12 Last updated: 2025-06-12 Bibliographically approved