Publications (10 of 148)
Latupeirissa, A. B., Murdeshwar, A. & Bresin, R. (2025). Semiotic Analysis of Robot Sounds in Films: Implications for Sound Design in Social Robotics. International Journal of Social Robotics, 17(1), 39-58
Semiotic Analysis of Robot Sounds in Films: Implications for Sound Design in Social Robotics
2025 (English). In: International Journal of Social Robotics, ISSN 1875-4791, E-ISSN 1875-4805, Vol. 17, no 1, p. 39-58. Article in journal (Refereed). Published.
Abstract [en]

This paper investigates the sound design of robots in films and their potential influence on the field of social robotics. Cinematic robot portrayals have inspired researchers and practitioners in Human-Robot Interaction (HRI). While the non-verbal sounds of iconic film robots like R2-D2 and Wall-E have been explored, this study takes a more comprehensive approach. We explore a broader selection of 15 films featuring humanoid robots across decades through a semiotic analysis of their nonverbal communication sounds, including those related to movements and internal mechanisms. Informed by Bateman and Schmidt's multimodal film analysis framework, which draws on Saussure's organization of signs through paradigmatic and syntagmatic relations, our analysis interprets the paradigmatic axis as an examination of the sound and the syntagmatic axis as an exploration of the events surrounding the sound. The findings uncover two primary film robot sound materials: mechanical and synthetic. Furthermore, the analysis revealed several narrative themes, categorized into two groups based on the syntagmatic focus: sounds associated with the robots' visual appearances and sounds linked to the broader elements within the scene. The discussion section explores the implications of these findings for social robotics, including the importance of sound materials, the role of movement sounds in communication and emotional expression, and the significance of narrative and context in human-robot interaction. The paper also acknowledges the challenges in translating film sound design into practical applications in social robotics. This study provides valuable insights for HRI researchers, practitioners, and sound designers seeking to enhance non-verbal auditory expressions in social robots.

Place, publisher, year, edition, pages
Springer Nature, 2025
Keywords
Robot sound, Film sound design, Human-robot interaction, Semiotic analysis
National Category
Computer and Information Sciences; Other Engineering and Technologies; Robotics and automation
Research subject
Media Technology; Art, Technology and Design; Human-computer Interaction
Identifiers
urn:nbn:se:kth:diva-357658 (URN); 10.1007/s12369-024-01186-2 (DOI); 001367107200001 (); 2-s2.0-85210732194 (Scopus ID)
Funder
Swedish Research Council, 2017-03979; NordForsk, 86892
Note

QC 20241211

Available from: 2024-12-11. Created: 2024-12-11. Last updated: 2025-02-27. Bibliographically approved.
Hultman, A., Goina, M. & Bresin, R. (2024). Interactive sonification helps make sense of the negative environmental impact of vessel traffic in the Baltic Sea. In: Proceedings of the 19th international audio mostly conference: explorations in sonic cultures. Paper presented at AM '24: Audio Mostly 2024 - Explorations in Sonic Cultures, September 18-20, 2024, Milan, Italy (pp. 209-217). New York, NY, USA: Association for Computing Machinery (ACM)
Interactive sonification helps make sense of the negative environmental impact of vessel traffic in the Baltic Sea
2024 (English). In: Proceedings of the 19th international audio mostly conference: explorations in sonic cultures, New York, NY, USA: Association for Computing Machinery (ACM), 2024, p. 209-217. Conference paper, Published paper (Refereed).
Abstract [en]

The health of the Baltic Sea is under heavy human-induced pressures exacerbated by climate change. This study explores the use of interactive sonification for functionally raising awareness and communicating negative emotions about vessel emissions in the Baltic Sea and how it can contribute to making sense of that data. A prototype of an interactive Baltic Sea map incorporating interactive sonification of vessel data was created. The prototype was evaluated in exploratory prototype sessions followed by semi-structured interviews with 10 participants divided into two groups, one receiving both visual and sonic feedback and the other receiving only visual feedback. The two groups used the prototype to identify vessels’ data and shape their perception of the vessels’ emissions. Observed differences between the two groups in stated emotions, use of metaphoric language, and identification of positive and negative vessels on the map indicate that the interactive sonification elicited emotional reactions as part of a sense-making process.
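As a rough illustration of the parameter-mapping idea behind such interactive sonification (the prototype itself was built with SuperCollider, per the keywords below), the following Python sketch maps a vessel's emission value to the pitch, loudness, and roughness of a short tone. The mapping ranges, function names, and example emission values are assumptions for illustration, not the authors' design.

```python
# Minimal parameter-mapping sonification sketch (NOT the paper's implementation).
# Higher emissions -> lower, louder, rougher tone (a common negative-valence mapping).
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100

def sonify_emission(co2_tonnes, max_co2=500.0, duration=1.0):
    """Map one vessel's (assumed) emission value to a one-second tone."""
    level = min(co2_tonnes / max_co2, 1.0)                 # normalize to 0..1
    freq = 880.0 - 660.0 * level                           # 880 Hz (clean) down to 220 Hz
    amp = 0.2 + 0.6 * level                                # louder with more emissions
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    tone = np.sin(2 * np.pi * freq * t)
    tone += level * 0.5 * np.sin(2 * np.pi * freq * 2.02 * t)  # mistuned partial adds roughness
    tone /= np.max(np.abs(tone))                           # avoid clipping
    return (amp * tone * 32767).astype(np.int16)

# Hypothetical vessels with increasing emission levels, concatenated into one file.
vessels = {"ferry": 40.0, "tanker": 220.0, "cargo": 480.0}
audio = np.concatenate([sonify_emission(v) for v in vessels.values()])
wavfile.write("vessel_sonification.wav", SAMPLE_RATE, audio)
```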

Place, publisher, year, edition, pages
New York, NY, USA: Association for Computing Machinery (ACM), 2024
Keywords
Affective Computing, Baltic Sea, Emissions, Environment, Global Warming, Interactive Sonification, Sense-making, Sonification, SuperCollider
National Category
Computer and Information Sciences; Other Engineering and Technologies; Human Computer Interaction
Research subject
Media Technology; Human-computer Interaction; Information and Communication Technology
Identifiers
urn:nbn:se:kth:diva-354002 (URN); 10.1145/3678299.3678320 (DOI); 001321703300013 (); 2-s2.0-85204945380 (Scopus ID)
Conference
AM '24: Audio Mostly 2024 - Explorations in Sonic Cultures, September 18-20, 2024, Milan, Italy
Projects
SOUNDINVR (2023-04496)
Funder
Swedish Research Council, 2023-04496
Note

Part of ISBN: 979-8-4007-0968-5

QC 20241111

Available from: 2024-09-26. Created: 2024-09-26. Last updated: 2025-02-18. Bibliographically approved.
Favero, F., Bresin, R., Mancini, M., Lowden, A. & Avola, D. (2024). Light and Motion: Effects of Light Conditions and mEDI on Activity and Motion Area under a Sky-Lighting Machine. LEUKOS The Journal of the Illuminating Engineering Society of North America, 1-23
Light and Motion: Effects of Light Conditions and mEDI on Activity and Motion Area under a Sky-Lighting Machine
2024 (English). In: LEUKOS The Journal of the Illuminating Engineering Society of North America, ISSN 1550-2724, E-ISSN 1550-2716, p. 1-23. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

We investigated whether differences in light levels and spectral properties have effects on motion. Twenty-two participants, divided into groups of two, experienced the same room in two diffused light conditions (daylight [DL] or static artificial light [AL]) in a repeated-measures design, controlled for order. Both light conditions offered a stimulus of at least 250 melanopic equivalent daylight illuminance (mEDI) lux, without a view. Participants were observed during an individual reading session and a collaborative construction game session. We measured the connectivity of the built structures and activity by actigraphy, and we automatically extracted motion area and quantity of motion from video analysis. We found a correlation between mEDI values in the two light conditions (DL or AL) and activity, and a correlation between light condition and motion area. Diffuse daylight conditions were correlated with lower activity and less extended motion than a diffuse static condition at levels recommended for office lighting and to ensure alerting responses. Indeed, static AL was found to be related to increased spatial exploration, which might indicate restlessness, and high mEDI to a more composed motion. Actigraphy measurements correlate with quantity of motion values; therefore, the two methods provided comparable results. Results also showed a high correlation between all photometric values in the daylight condition. These findings offer arguments for favoring DL conditions in the design of places where it is desirable to avoid fidgetiness, like educational institutions, and to support composed motion, like medical institutions.
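The motion features named in the abstract (motion area and quantity of motion from video analysis) can be illustrated with a simple frame-differencing sketch in Python. This is only an approximation under assumed parameters (threshold, file name), not the authors' analysis pipeline.

```python
# Illustrative frame-differencing motion features; NOT the authors' pipeline.
import cv2
import numpy as np

def motion_features(video_path, diff_threshold=25):
    """Per-frame quantity of motion (sum of changed pixel intensities) and
    motion area (fraction of the frame that changed)."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"Cannot read {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    qom, area = [], []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        moving = diff > diff_threshold          # boolean mask of changed pixels
        qom.append(int(diff[moving].sum()))     # quantity of motion
        area.append(float(moving.mean()))       # motion area as a fraction, 0..1
        prev_gray = gray
    cap.release()
    return np.array(qom), np.array(area)

# Example call with an assumed file name:
# quantity_of_motion, motion_area = motion_features("session_recording.mp4")
```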

Place, publisher, year, edition, pages
Informa UK Limited, 2024
Keywords
Lighting design; daylight; variability; interdisciplinary; automatic motion features analysis
National Category
Design; Computer and Information Sciences; Architecture; Other Computer and Information Science
Research subject
Media Technology; Architecture; Art, Technology and Design; Human-computer Interaction
Identifiers
urn:nbn:se:kth:diva-352040 (URN); 10.1080/15502724.2024.2379279 (DOI); 001290753400001 (); 2-s2.0-85201222778 (Scopus ID)
Note

QC 20240830

Available from: 2024-08-19. Created: 2024-08-19. Last updated: 2025-02-24. Bibliographically approved.
Telang, S., Marques, M., Latupeirissa, A. B. & Bresin, R. (2023). Emotional Feedback of Robots: Comparing the perceived emotional feedback by an audience between masculine and feminine voices in robots in popular media. In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction. Paper presented at 11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023 (pp. 434-436). Association for Computing Machinery (ACM)
Emotional Feedback of Robots: Comparing the perceived emotional feedback by an audience between masculine and feminine voices in robots in popular media
2023 (English). In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction, Association for Computing Machinery (ACM), 2023, p. 434-436. Conference paper, Published paper (Refereed).
Abstract [en]

The sound design of different fantastical aspects can tell the audience much about characters and things. Robots are one of the common fantastical characters that need to be sonified to indicate different aspects of their character. Often, one or more of these traits are an indication of gender and behavior. We investigated these traits in a survey where we asked both quantitative and qualitative questions about the participants' perceptions. We found that participants indicated a bias towards certain robots depending on perceived femininity and masculinity.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
Keywords
Gender, Gender stereotypes, Human-robot interaction, Perception, Robots, Science Fiction, Social Robots
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-341673 (URN); 10.1145/3623809.3623953 (DOI); 001148034200071 (); 2-s2.0-85180130385 (Scopus ID)
Conference
11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023
Note

Part of ISBN 9798400708244

QC 20231229

Available from: 2023-12-29. Created: 2023-12-29. Last updated: 2024-03-05. Bibliographically approved.
Zhang, B. J., Orthmann, B., Torre, I., Bresin, R., Fick, J., Leite, I. & Fitter, N. T. (2023). Hearing it Out: Guiding Robot Sound Design through Design Thinking. In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN. Paper presented at 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea (pp. 2064-2071). Institute of Electrical and Electronics Engineers (IEEE)
Hearing it Out: Guiding Robot Sound Design through Design Thinking
2023 (English). In: 2023 32nd IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 2064-2071. Conference paper, Published paper (Refereed).
Abstract [en]

Sound can benefit human-robot interaction, but little work has explored questions on the design of nonverbal sound for robots. The unique confluence of sound design and robotics expertise complicates these questions, as most roboticists do not have sound design expertise, necessitating collaborations with sound designers. We sought to understand how roboticists and sound designers approach the problem of robot sound design through two qualitative studies. The first study followed discussions by robotics researchers in focus groups, where these experts described motivations to add robot sound for various purposes. The second study guided music technology students through a generative activity for robot sound design; these sound designers in-training demonstrated high variability in design intent, processes, and inspiration. To unify the two perspectives, we structured recommendations through the design thinking framework, a popular design process. The insights provided in this work may aid roboticists in implementing helpful sounds in their robots, encourage sound designers to enter into collaborations on robot sound, and give key tips and warnings to both.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Series
IEEE RO-MAN, ISSN 1944-9445
National Category
Design
Identifiers
urn:nbn:se:kth:diva-342045 (URN); 10.1109/RO-MAN57019.2023.10309489 (DOI); 001108678600269 (); 2-s2.0-85186967284 (Scopus ID)
Conference
32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Aug 28-31, 2023, Busan, South Korea
Note

Part of proceedings ISBN 979-8-3503-3670-2

QC 20240110

Available from: 2024-01-10. Created: 2024-01-10. Last updated: 2025-02-24. Bibliographically approved.
Ziemer, T., Lenzi, S., Rönnberg, N., Hermann, T. & Bresin, R. (2023). Introduction to the special issue on design and perception of interactive sonification. Journal on Multimodal User Interfaces, 17(4), 213-214
Introduction to the special issue on design and perception of interactive sonification
2023 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 17, no 4, p. 213-214. Article in journal, Editorial material (Other academic). Published.
Place, publisher, year, edition, pages
Springer Nature, 2023
National Category
Other Engineering and Technologies
Identifiers
urn:nbn:se:kth:diva-348214 (URN); 10.1007/s12193-023-00425-6 (DOI); 001098004600001 (); 2-s2.0-85175546156 (Scopus ID)
Note

QC 20240624

Available from: 2024-06-24. Created: 2024-06-24. Last updated: 2025-02-18. Bibliographically approved.
Rafi, A. K., Murdeshwar, A., Latupeirissa, A. B. & Bresin, R. (2023). Investigating the Role of Robot Voices and Sounds in Shaping Perceived Intentions. In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction. Paper presented at 11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023 (pp. 425-427). Association for Computing Machinery (ACM)
Investigating the Role of Robot Voices and Sounds in Shaping Perceived Intentions
2023 (English). In: HAI 2023 - Proceedings of the 11th Conference on Human-Agent Interaction, Association for Computing Machinery (ACM), 2023, p. 425-427. Conference paper, Published paper (Refereed).
Abstract [en]

This study explores if, and how, the choices made regarding a robot's speaking voice and characteristic body sounds influence viewers' perceptions of its intent, i.e., whether the robot's intention is positive or negative. The analysis focuses on robot representations and sounds in three films: "Robots" (2005) [1], "NextGen" (2018) [2], and "Love, Death, and Robots - Three Robots" (2019) [3]. In eight qualitative interviews, five parameters (tonality, intonation, volume, pitch, and speed) were used to understand robot sounds and the participants' perceptions of a robot's attitude and intentions. The study culminates in a set of recommendations and considerations for human-robot interaction designers regarding sound coding for body, physiology, and movement.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
Keywords
Human Perception, Movies, Qualitative Study, Robot sounds, Sound Design
National Category
Robotics and automation; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-341677 (URN); 10.1145/3623809.3623949 (DOI); 001148034200068 (); 2-s2.0-85180124967 (Scopus ID)
Conference
11th Conference on Human-Agent Interaction, HAI 2023, Gothenburg, Sweden, Dec 4 2023 - Dec 11 2023
Note

Part of ISBN 9798400708244

QC 20231229

Available from: 2023-12-29. Created: 2023-12-29. Last updated: 2025-02-05. Bibliographically approved.
Goina, M., Bresin, R. & Rodela, R. (2023). Our Sound Space (OSS): An installation for participatory and interactive exploration of soundscapes. In: Bresin R., Falkenberg K. (Ed.), SMC 2023: Proceedings of the Sound and Music Computing Conference 2023. Paper presented at 20th Sound and Music Computing Conference, SMC 2023, Hybrid, Stockholm, Sweden, Jun 15 2023 - Jun 17 2023 (pp. 255-260). Sound and Music Computing Network
Our Sound Space (OSS): An installation for participatory and interactive exploration of soundscapes
2023 (English). In: SMC 2023: Proceedings of the Sound and Music Computing Conference 2023 / [ed] Bresin R., Falkenberg K., Sound and Music Computing Network, 2023, p. 255-260. Conference paper, Published paper (Refereed).
Abstract [en]

This paper describes the development of an interactive tool that allows playing different soundscapes by mixing diverse environmental sounds on demand. This tool is titled Our Sound Space (OSS) and has been developed as part of an ongoing project in which we test methods and tools for the participation of young people in spatial planning. As such, OSS is meant to offer new opportunities to engage youth in talks about planning, placemaking and more sustainable living environments. In this paper, we describe an implementation of OSS that we are using as an interactive soundscape installation sited in a public place visited daily by people from a diversity of entities (e.g., a university, a gymnasium, a restaurant, start-ups). The OSS installation is designed to allow simultaneous activation of several prerecorded sounds broadcast through four loudspeakers. The installation is interactive, meaning that it can be activated and operated by anyone via a smartphone, and it is designed to allow interaction among multiple people at the same time and in the same space.
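As a minimal sketch of the kind of shared, smartphone-operated soundscape state described above: the Python snippet below exposes an HTTP endpoint that lets any visitor toggle prerecorded sound layers, which an audio backend (not shown) would mix to the loudspeakers. The use of Flask, the endpoint paths, and the layer names are illustrative assumptions, not the OSS implementation.

```python
# Shared soundscape-layer state toggled by multiple phones; NOT the OSS code.
from flask import Flask, jsonify
from threading import Lock

app = Flask(__name__)
lock = Lock()

# Prerecorded environmental sound layers that can be mixed on demand (assumed names).
layers = {"birds": False, "traffic": False, "rain": False, "voices": False}

@app.route("/toggle/<name>", methods=["POST"])
def toggle(name):
    with lock:
        if name not in layers:
            return jsonify(error="unknown layer"), 404
        layers[name] = not layers[name]
        # An audio backend would poll /state and play every active layer.
        return jsonify(layers)

@app.route("/state")
def state():
    return jsonify(layers)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # visitors connect from their smartphones
```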

Place, publisher, year, edition, pages
Sound and Music Computing Network, 2023
Series
Proceedings of the Sound and Music Computing Conferences, ISSN 2518-3672; 20
National Category
Human Computer Interaction; Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-337824 (URN); 10.5281/zenodo.8399025 (DOI); 2-s2.0-85171747211 (Scopus ID)
Conference
20th Sound and Music Computing Conference, SMC 2023, Hybrid, Stockholm, Sweden, Jun 15 2023 - Jun 17 2023
Projects
PwY
Note

Part of ISBN 9789152773727

QC 20231009

Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2025-02-18. Bibliographically approved.
Zojaji, S., Latupeirissa, A. B., Leite, I., Bresin, R. & Peters, C. (2023). Persuasive polite robots in free-standing conversational groups. In: Proceedings IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023). Paper presented at 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023) (pp. 1-8). Institute of Electrical and Electronics Engineers (IEEE)
Persuasive polite robots in free-standing conversational groups
2023 (English). In: Proceedings IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023), Institute of Electrical and Electronics Engineers (IEEE), 2023, p. 1-8. Conference paper, Published paper (Refereed).
Abstract [en]

Politeness is at the core of the common set of behavioral norms that regulate human communication and is therefore of significant interest in the design of Human-Robot Interactions. In this paper, we investigate how the politeness behaviors of a humanoid robot impact human decisions about where to join a group of two robots. We also evaluate the resulting impact on the perception of the robot's politeness. In a study involving 59 participants, the main (Pepper) robot in the group invited participants to join using six politeness behaviors derived from Brown and Levinson's politeness theory. It requested participants to join the group at the furthest side, which involves more effort to reach than a closer side that is also available to the participant but would ignore the robot's request. We evaluated the robot's effectiveness in terms of persuasiveness, politeness, and clarity. We found that more direct and explicit politeness strategies derived from the theory have a higher level of success in persuading participants to join at the furthest side of the group. We also evaluated participants' adherence to social norms, i.e., not walking through the center, or o-space, of the group when joining it. Our results showed that participants tended to adhere to social norms when joining at the furthest side by not walking through the center of the group of robots, even though they were informed that the robots were fully automated.

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2023
Keywords
Social robotics, Politeness, Persuasiveness, Social norms, Human-Robot interaction, free-standing conversational groups
National Category
Engineering and Technology
Identifiers
urn:nbn:se:kth:diva-338180 (URN); 10.1109/IROS55552.2023.10341830 (DOI); 001133658803003 (); 2-s2.0-85182524342 (Scopus ID)
Conference
2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)
Note

Part of proceedings ISBN 978-1-6654-9190-7

QC 20231016

Available from: 2023-10-16. Created: 2023-10-16. Last updated: 2024-03-04. Bibliographically approved.
Latupeirissa, A. B., Panariello, C. & Bresin, R. (2023). Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification. ACM Transactions on Human-Robot Interaction
Probing Aesthetics Strategies for Robot Sound: Complexity and Materiality in Movement Sonification
2023 (English). In: ACM Transactions on Human-Robot Interaction, E-ISSN 2573-9522. Article in journal (Refereed). Published.
Abstract [en]

This paper presents three studies where we probe aesthetics strategies of sound produced by movement sonification of a Pepper robot by mapping its movements to sound models.

We developed two sets of sound models. The first set consisted of two sound models, one sawtooth-based and one based on feedback chains, for investigating how the perception of synthesized robot sounds would depend on their design complexity. We implemented the second set of sound models for probing the "materiality" of sound made by a robot in motion. This set consisted of an engine-like sound synthesis highlighting the robot's internal mechanisms, a metallic sound synthesis highlighting the robot's typical appearance, and a whoosh sound synthesis highlighting the movement.

We conducted three studies. The first study explores how the first set of sound models can influence the perception of expressive gestures of a Pepper robot through an online survey. In the second study, we carried out an experiment in a museum installation with a Pepper robot presented in two scenarios: (1) while welcoming patrons into a restaurant and (2) while providing information to visitors in a shopping center. Finally, in the third study, we conducted an online survey with stimuli similar to those used in the second study.

Our findings suggest that participants preferred more complex sound models for the sonification of robot movements. Concerning materiality, participants preferred subtle sounds that blend well with the ambient sound (i.e., are less distracting) and soundscapes in which sound sources can be identified. Also, sound preferences varied depending on the context in which participants experienced the robot-generated sounds (e.g., as a live museum installation vs. an online display).
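As a minimal illustration of the general movement-to-sound mapping idea (a velocity profile driving the loudness of a sawtooth source), a short Python sketch follows. It does not reproduce the paper's sawtooth- or feedback-chain sound models; the gesture profile, base frequency, and mapping ranges are assumptions for illustration only.

```python
# Movement sonification sketch: assumed joint-velocity envelope controls the
# loudness of a sawtooth oscillator. NOT the paper's sound models.
import numpy as np
from scipy.io import wavfile
from scipy.signal import sawtooth

SAMPLE_RATE = 44100
DURATION = 4.0
t = np.linspace(0.0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# Assumed gesture: joint velocity rises and falls (e.g., an arm raise and return).
velocity = np.abs(np.sin(2 * np.pi * 0.5 * t))        # normalized 0..1

carrier = sawtooth(2 * np.pi * 110.0 * t)             # simple 110 Hz sawtooth source
loudness = 0.05 + 0.6 * velocity                      # faster movement -> louder sound
audio = loudness * carrier

wavfile.write("movement_sonification.wav", SAMPLE_RATE,
              (audio * 32767).astype(np.int16))
```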

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2023
Keywords
SONAO
National Category
Human Computer Interaction; Robotics and automation
Identifiers
urn:nbn:se:kth:diva-324962 (URN); 10.1145/3585277 (DOI); 001153514400008 (); 2-s2.0-85170233153 (Scopus ID)
Note

QC 20230328

Available from: 2023-03-21. Created: 2023-03-21. Last updated: 2025-02-05. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0002-3086-0322
