Brain-Focused Multimodal Approach for Studying Conversational Engagement in HRI
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent Systems, Speech, Music and Hearing, TMH.
ORCID iD: 0000-0001-5066-7186
2025 (English). In: HRI 2025 - Proceedings of the 2025 ACM/IEEE International Conference on Human-Robot Interaction, Institute of Electrical and Electronics Engineers (IEEE), 2025, p. 1894-1896. Conference paper, published paper (refereed).
Abstract [en]

My research adopts an interdisciplinary approach to studying conversational engagement in human-robot interaction, integrating cognitive neuroscience with multimodal behavioral measures and self-assessment to provide a more comprehensive and objective evaluation of user experience. Using brain imaging to analyze conversations, I aim to investigate how interactions with humans differ from interactions with robots, and to deepen our understanding of the cognitive mechanisms underlying communication. Beyond exploring variations in neural patterns across agents, my work applies multimodal machine learning to assess how brain imaging data, combined with modalities such as eye tracking, audio, and video, can improve engagement detection. The ultimate goal is to design robots that can effectively detect, evaluate, and respond to user engagement, thereby facilitating more effective communication.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025, p. 1894-1896
Keywords [en]
brain imaging, conversational engagement, human-robot interaction, multimodal
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:kth:diva-363765
DOI: 10.1109/HRI61500.2025.10973818
Scopus ID: 2-s2.0-105004872665
OAI: oai:DiVA.org:kth-363765
DiVA id: diva2:1959860
Conference
20th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI 2025), Melbourne, Australia, March 4-6, 2025
Note

Part of ISBN 9798350378931

QC 20250527

Available from: 2025-05-21. Created: 2025-05-21. Last updated: 2025-05-27. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Torubarova, Ekaterina