AttentionGuard: Transformer-based Misbehavior Detection for Secure Vehicular Platoons
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer Systems, SCS.
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer Systems, SCS. ORCID iD: 0000-0002-4656-2565
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer Systems, SCS.
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering; Computer Science, Communication Systems, CoS; Computer Science, Software and Computer Systems, SCS. ORCID iD: 0000-0002-3267-5374
2025 (English). In: Proceedings of the 2025 ACM Workshop on Wireless Security and Machine Learning (WiSeML 2025), Association for Computing Machinery (ACM), 2025, pp. 8-13. Conference paper, published paper (refereed).
Abstract [en]

Vehicle platooning, with vehicles traveling in close formation coordinated through Vehicle-to-Everything (V2X) communications, offers significant benefits in fuel efficiency and road utilization. However, it is vulnerable to sophisticated falsification attacks by authenticated insiders that can destabilize the formation and potentially cause catastrophic collisions. This paper addresses this challenge: misbehavior detection in vehicle platooning systems. We present AttentionGuard, a transformer-based framework for misbehavior detection that leverages the self-attention mechanism to identify anomalous patterns in mobility data. Our proposal employs a multi-head transformer-encoder to process sequential kinematic information, enabling effective differentiation between normal mobility patterns and falsification attacks across diverse platooning scenarios, including steady-state (no-maneuver) operation, join, and exit maneuvers. Our evaluation uses an extensive simulation dataset featuring various attack vectors (constant, gradual, and combined falsifications) and operational parameters (controller types, vehicle speeds, and attacker positions). Experimental results demonstrate that AttentionGuard achieves up to 0.95 F1-score in attack detection, with robust performance maintained during complex maneuvers. Notably, our system performs effectively with minimal latency (100ms decision intervals), making it suitable for real-time transportation safety applications. Comparative analysis reveals superior detection capabilities and establishes the transformer-encoder as a promising approach for securing Cooperative Intelligent Transport Systems (C-ITS) against sophisticated insider threats.
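The abstract's core mechanism, self-attention applied over sequential kinematic data from V2X beacons, can be sketched in miniature as follows. This is an illustrative NumPy sketch, not the authors' implementation: the single attention head, the feature choice (position, speed, acceleration), the mean pooling, and the logistic scoring head are all assumptions made for brevity; the paper describes a multi-head transformer encoder.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a kinematic sequence.

    X: (T, d) per-beacon features, e.g. [position, speed, acceleration].
    Returns (T, d_v) context vectors; each timestep attends to all others,
    which is what lets the model relate a reported value to the rest of
    the window and spot inconsistent (falsified) patterns.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])           # (T, T) pairwise relevance
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)    # row-wise softmax
    return weights @ V

def misbehavior_score(X, params):
    """Toy detector: attention encoding -> pooling -> logistic head in [0, 1]."""
    H = self_attention(X, params["Wq"], params["Wk"], params["Wv"])
    pooled = H.mean(axis=0)                          # summarize the window
    return 1.0 / (1.0 + np.exp(-(pooled @ params["w"] + params["b"])))

rng = np.random.default_rng(0)
d, dv = 3, 8                                         # hypothetical feature/embedding sizes
params = {"Wq": rng.normal(size=(d, dv)), "Wk": rng.normal(size=(d, dv)),
          "Wv": rng.normal(size=(d, dv)), "w": rng.normal(size=dv), "b": 0.0}

# A 10-beacon window of [position, speed, acceleration]; the paper reports
# decisions at 100 ms intervals, so a window like this covers ~1 s.
window = rng.normal(size=(10, d))
score = misbehavior_score(window, params)            # thresholded to flag an attack
```

In a trained detector the weights would come from supervised learning on labeled attack traces (constant, gradual, and combined falsifications, per the abstract), and the score would be compared against a threshold tuned for the reported F1 trade-off.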

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2025, pp. 8-13.
Keywords [en]
Transformer Encoder, Anomaly Detection, Vehicular Platoons, V2X, Maneuvering
HSV category
Identifiers
URN: urn:nbn:se:kth:diva-374032
DOI: 10.1145/3733965.3733966
ISI: 001539259600003
ISBN: 979-8-4007-1531-0 (print)
OAI: oai:DiVA.org:kth-374032
DiVA id: diva2:2022301
Conference
2025 Workshop on Wireless Security and Machine Learning (WiSeML), July 3, 2025, Arlington, VA
Note

QC 20251216

Available from: 2025-12-16. Created: 2025-12-16. Last updated: 2025-12-16. Bibliographically approved.

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text

Person

Kalogiannis, Konstantinos; Hussain, Ahmed Mohamed; Papadimitratos, Panos

Search in DiVA

By author/editor
Li, Hexu; Kalogiannis, Konstantinos; Hussain, Ahmed Mohamed; Papadimitratos, Panos