Publications (10 of 16)
Lungaro, P., Tollmar, K., Saeik, F., Mateu Gisbert, C. & Dubus, G. (2018). Demonstration of a low-cost hyper-realistic testbed for designing future onboard experiences. In: Adjunct Proceedings - 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018. Paper presented at 10th ACM International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, 23 September 2018 through 25 September 2018 (pp. 235-238). Association for Computing Machinery, Inc
2018 (English). In: Adjunct Proceedings - 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, Association for Computing Machinery, Inc, 2018, p. 235-238. Conference paper, Published paper (Refereed)
Abstract [en]

This demo presents DriverSense, a novel experimental platform for designing and validating onboard user interfaces for self-driving and remotely controlled vehicles. Most currently existing vehicular testbeds and simulators are designed to reproduce with high fidelity the ergonomic aspects of the driving experience. However, with the increasing deployment of self-driving and remotely controlled or monitored vehicles, the digital components of the driving experience are expected to become more relevant, because users will be less engaged in the actual driving task and more involved in oversight activities. In this respect, high visual testbed fidelity becomes an important prerequisite for supporting the design and evaluation of future interfaces. DriverSense, which is based on the hyper-realistic video game GTA V, has been developed to satisfy this need. To showcase its experimental flexibility, a set of self-driving interfaces has been implemented, including Heads-Up Displays (HUDs), Augmented Reality (AR) and directional audio.

Place, publisher, year, edition, pages
Association for Computing Machinery, Inc, 2018
Keywords
AR, Experimental assessment, HUD, Self-driving, Testbed, Argon, Augmented reality, Costs, Testbeds, Design and evaluations, Digital components, Driving experiences, Experimental platform, Heads-up display, Pre-requisites, Self drivings, User interfaces
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-252258 (URN); 10.1145/3239092.3267850 (DOI); 2-s2.0-85063139649 (Scopus ID); 9781450359474 (ISBN)
Conference
10th ACM International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, 23 September 2018 through 25 September 2018
Note

QC20190614

Available from: 2019-06-14 Created: 2019-06-14 Last updated: 2019-06-14. Bibliographically approved
Lungaro, P., Tollmar, K., Saeik, F., Mateu Gisbert, C. & Dubus, G. (2018). DriverSense: A hyper-realistic testbed for the design and evaluation of novel user interfaces in self-driving vehicles. In: Adjunct Proceedings - 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018. Paper presented at 10th ACM International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, 23 September 2018 through 25 September 2018 (pp. 127-131). Association for Computing Machinery, Inc
2018 (English). In: Adjunct Proceedings - 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, Association for Computing Machinery, Inc, 2018, p. 127-131. Conference paper, Published paper (Refereed)
Abstract [en]

This paper presents DriverSense, a novel experimental platform for designing and validating onboard user interfaces for self-driving and remotely controlled vehicles. Most currently existing academic and industrial testbeds and vehicular simulators are designed to reproduce with high fidelity the ergonomic aspects of the driving experience. However, with the increasing deployment of self-driving and remotely controlled vehicles, the digital components of the driving experience are expected to become more and more relevant, because users will be less engaged in the actual driving task and more involved in oversight activities. In this respect, high visual testbed fidelity becomes an important prerequisite for supporting the design and evaluation of future onboard interfaces. DriverSense, which is based on the hyper-realistic video game GTA V, has been developed to satisfy this need. To showcase its experimental flexibility, a set of selected case studies is presented, including Heads-Up Displays (HUDs), Augmented Reality (AR) and directional audio solutions.

Place, publisher, year, edition, pages
Association for Computing Machinery, Inc, 2018
Keywords
AR, Autonomous vehicular systems, Trust, Argon, Augmented reality, Remote control, Testbeds, Design and evaluations, Digital components, Driving experiences, Driving tasks, Experimental platform, Pre-requisites, Vehicular systems, User interfaces
National Category
Communication Systems
Identifiers
urn:nbn:se:kth:diva-252259 (URN); 10.1145/3239092.3265955 (DOI); 2-s2.0-85063134845 (Scopus ID); 9781450359474 (ISBN)
Conference
10th ACM International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, 23 September 2018 through 25 September 2018
Note

QC20190607

Available from: 2019-06-07 Created: 2019-06-07 Last updated: 2019-06-07. Bibliographically approved
Dubus, G. & Bresin, R. (2015). Exploration and evaluation of a system for interactive sonification of elite rowing. Sports Engineering, 18(1), 29-41
2015 (English). In: Sports Engineering, ISSN 1369-7072, E-ISSN 1460-2687, Vol. 18, no 1, p. 29-41. Article in journal (Refereed), Published
Abstract [en]

In recent years, many solutions based on interactive sonification have been introduced for enhancing sport training. Few of them have been assessed in terms of efficiency or design. In a previous study, we performed a quantitative evaluation of four models for the sonification of elite rowing in a non-interactive context. For the present article, we conducted on-water experiments to investigate the effects of some of these models on two kinematic quantities: stroke rate and fluctuations in boat velocity. To this end, elite rowers interacted with discrete and continuous auditory displays in two experiments. A method for computing an average rowing cycle is introduced, together with a measure of velocity fluctuations. Participants completed questionnaires and interviews to assess the degree of acceptance of the different models and to reveal common trends and individual preferences. No significant effect of sonification could be determined in either of the two experiments. The measure of velocity fluctuations was found to depend linearly on stroke rate. Participants provided feedback about their aesthetic preferences and functional needs during the interviews, allowing us to improve the models for future experiments to be conducted over longer periods.
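The measure of velocity fluctuations mentioned above can be sketched as follows. This is a minimal illustration, assuming stroke boundary indices are available from the kinematic data; the function name and the normalization are illustrative, not the paper's exact definition:

```python
import numpy as np

def velocity_fluctuation(velocity, stroke_starts):
    """Mean over strokes of the relative RMS deviation of boat velocity
    from its per-stroke average (illustrative definition only)."""
    per_stroke = []
    for start, end in zip(stroke_starts[:-1], stroke_starts[1:]):
        v = np.asarray(velocity[start:end], dtype=float)
        per_stroke.append(np.std(v) / np.mean(v))
    return float(np.mean(per_stroke))

# A perfectly constant velocity has zero fluctuation.
print(velocity_fluctuation(np.full(100, 4.5), [0, 50, 100]))  # 0.0
```

Normalizing by the per-stroke mean makes the measure dimensionless, so boats rowed at different average speeds remain comparable.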

Place, publisher, year, edition, pages
Springer London, 2015
Keywords
Auditory display, Evaluation, Interactive, Rowing, Sonic interaction, Sonification, Sport
National Category
Computer Sciences; Media and Communication Technology; Human Computer Interaction
Research subject
Computer Science; Media Technology; Human-computer Interaction
Identifiers
urn:nbn:se:kth:diva-158168 (URN); 10.1007/s12283-014-0164-0 (DOI); 000366675700004 (); 2-s2.0-84923250080 (Scopus ID)
Note

QC 20160114

Available from: 2014-12-30 Created: 2014-12-30 Last updated: 2018-01-11. Bibliographically approved
Dubus, G. & Bresin, R. (2013). A Systematic Review of Mapping Strategies for the Sonification of Physical Quantities. PLoS ONE, 8(12), e82491
2013 (English). In: PLoS ONE, ISSN 1932-6203, E-ISSN 1932-6203, Vol. 8, no 12, p. e82491. Article in journal (Refereed), Published
Abstract [en]

The field of sonification has progressed greatly over the past twenty years and currently constitutes an established area of research. This article aims at exploiting and organizing the knowledge accumulated in previous experimental studies to build a foundation for future sonification works. A systematic review of these studies may reveal trends in sonification design, and therefore support the development of design guidelines. To this end, we have reviewed and analyzed 179 scientific publications related to sonification of physical quantities. Using a bottom-up approach, we set up a list of conceptual dimensions belonging to both physical and auditory domains. Mappings used in the reviewed works were identified, forming a database of 495 entries. Frequency of use was analyzed among these conceptual dimensions as well as higher-level categories. Results confirm two hypotheses formulated in a preliminary study: pitch is by far the most used auditory dimension in sonification applications, and spatial auditory dimensions are almost exclusively used to sonify kinematic quantities. To detect successful as well as unsuccessful sonification strategies, assessment of mapping efficiency conducted in the reviewed works was considered. Results show that a proper evaluation of sonification mappings is performed only in a marginal proportion of publications. Additional aspects of the publication database were investigated: historical distribution of sonification works is presented, projects are classified according to their primary function, and the sonic material used in the auditory display is discussed. Finally, a mapping-based approach for characterizing sonification is proposed.
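The frequency analysis at the core of the review can be illustrated on a toy subset of such a mapping database. The entries below are invented for illustration only; the real database holds 495 entries:

```python
from collections import Counter

# Toy mapping database: (physical dimension, auditory dimension) pairs,
# one entry per mapping identified in a reviewed publication.
mappings = [
    ("velocity", "pitch"),
    ("position", "spatial location"),
    ("force", "loudness"),
    ("velocity", "pitch"),
    ("temperature", "pitch"),
    ("position", "spatial location"),
]

# Frequency of use per auditory dimension, as in the review's analysis.
auditory_counts = Counter(aud for _, aud in mappings)
print(auditory_counts.most_common(2))
# [('pitch', 3), ('spatial location', 2)]
```

On the full database this kind of tally is what reveals that pitch dominates and that spatial auditory dimensions cluster around kinematic quantities.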

Keywords
sonification, theory, mapping, meta-analysis, review, physical, auditory, dimension
National Category
Computer Sciences; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-127939 (URN); 10.1371/journal.pone.0082491 (DOI); 000328737700019 (); 2-s2.0-84893070216 (Scopus ID)
Projects
SOM - The sound of motion: Providing sound feedback to human movements
Funder
Swedish Research Council, 2010-4654
Note

QC 20140123. Updated from manuscript to article in journal.

Available from: 2013-09-09 Created: 2013-09-09 Last updated: 2018-01-11. Bibliographically approved
Dubus, G. (2013). Interactive sonification of motion: Design, implementation and control of expressive auditory feedback with mobile devices. (Doctoral dissertation). Stockholm: KTH Royal Institute of Technology
2013 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Sound and motion are intrinsically related, by their physical nature and through the link between auditory perception and motor control. If sound provides information about the characteristics of a movement, a movement can also be influenced or triggered by a sound pattern. This thesis investigates how this link can be reinforced by means of interactive sonification. Sonification, the use of sound to communicate, perceptualize and interpret data, can be used in many different contexts. It is particularly well suited for time-related tasks such as monitoring and synchronization, and is therefore an ideal candidate to support the design of applications related to physical training. Our objectives are to develop and investigate computational models for the sonification of motion data with a particular focus on expressive movement and gesture, and for the sonification of elite athletes' movements. We chose to develop our applications on a mobile platform in order to make use of advanced interaction modes using an easily accessible technology. In addition, networking capabilities of modern smartphones potentially allow for adding a social dimension to our sonification applications by extending them to several collaborating users. The sport of rowing was chosen to illustrate the assistance that an interactive sonification system can provide to elite athletes. Bringing into play complex interactions between various kinematic and kinetic quantities, studies on rowing kinematics provide guidelines to optimize rowing efficiency, e.g. by minimizing velocity fluctuations around average velocity. However, rowers can only rely on sparse cues to get information relative to boat velocity, such as the sound made by the water splashing on the hull. We believe that an interactive augmented feedback communicating the dynamic evolution of some kinematic quantities could represent a promising way of enhancing the training of elite rowers.
Since only limited space is available on a rowing boat, the use of mobile phones appears appropriate for handling streams of incoming data from various sensors and generating an auditory feedback simultaneously. The development of sonification models for rowing and their design evaluation in offline conditions are presented in Paper I. In Paper II, three different models for sonifying the synchronization of the movements of two users holding a mobile phone are explored. Sonification of expressive gestures by means of expressive music performance is tackled in Paper III. In Paper IV, we introduce a database of mobile applications related to sound and music computing. An overview of the field of sonification is presented in Paper V, along with a systematic review of mapping strategies for sonifying physical quantities. Physical and auditory dimensions were both classified into generic conceptual dimensions, and proportion of use was analyzed in order to identify the most popular mappings. Finally, Paper VI summarizes experiments conducted with the Swedish national rowing team in order to assess sonification models in an interactive context.

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2013. p. xiii, 33
Series
Trita-CSC-A, ISSN 1653-5723 ; 2013:09
National Category
Computer Sciences; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-127944 (URN); 978-91-7501-858-4 (ISBN)
Public defence
2013-09-27, F3, Lindstedtsvägen 26, KTH, Stockholm, 13:00 (English)
Note

QC 20130910

Available from: 2013-09-10 Created: 2013-09-09 Last updated: 2018-01-11. Bibliographically approved
Dubus, G., Hansen, K. F. & Bresin, R. (2012). An overview of sound and music applications for Android available on the market. In: Serafin, Stefania (Ed.), Proceedings of the 9th Sound and Music Computing Conference, SMC 2012. Paper presented at 9th Sound and Music Computing Conference, SMC 2012, Copenhagen, Denmark, 11 July 2012 through 14 July 2012 (pp. 541-546). Sound and music Computing network
2012 (English). In: Proceedings of the 9th Sound and Music Computing Conference, SMC 2012 / [ed] Serafin, Stefania, Sound and music Computing network, 2012, p. 541-546. Conference paper, Published paper (Refereed)
Abstract [en]

This paper introduces a database of sound-based applications running on the Android mobile platform. The long-term objective is to provide a state-of-the-art overview of mobile applications dealing with sound and music interaction. After describing the method used to build up and maintain the database using a non-hierarchical structure based on tags, we present a classification according to various categories of applications, and we conduct a preliminary analysis of the distribution of these categories, reflecting the current state of the database.
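The non-hierarchical, tag-based structure described above can be sketched like this. App names and tags below are invented for illustration, not entries from the actual database:

```python
# Tag-based (non-hierarchical) database sketch: each app simply carries
# a set of tags, and categories are recovered by filtering on tags
# rather than by placing apps in a fixed tree of categories.
apps = {
    "SynthPad": {"synthesis", "multitouch"},
    "DrumLoop": {"sequencer", "percussion"},
    "TunerPro": {"analysis", "education"},
    "LoopStation": {"sequencer", "synthesis"},
}

def apps_with_tag(tag):
    """Return all app names carrying the given tag, sorted by name."""
    return sorted(name for name, tags in apps.items() if tag in tags)

print(apps_with_tag("sequencer"))  # ['DrumLoop', 'LoopStation']
```

Because an app can carry any number of tags, the same entry can appear under several categories, which a strict hierarchy would not allow.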

Place, publisher, year, edition, pages
Sound and music Computing network, 2012
Keywords
Android (operating system), Mobile applications, Mobile platform, Music applications, Music interaction, Preliminary analysis, Structure-based
National Category
Computer Sciences; Language Technology (Computational Linguistics)
Identifiers
urn:nbn:se:kth:diva-109381 (URN); 2-s2.0-84905197265 (Scopus ID); 978-383253180-5 (ISBN)
Conference
9th Sound and Music Computing Conference, SMC 2012, Copenhagen, Denmark, 11 July 2012 through 14 July 2012
Note

QC 20150507. QC 20160115

Available from: 2013-01-02 Created: 2013-01-02 Last updated: 2018-01-11. Bibliographically approved
Dubus, G. (2012). Evaluation of four models for the sonification of elite rowing. Journal on Multimodal User Interfaces, 5(3-4), 143-156
2012 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 5, no 3-4, p. 143-156. Article in journal (Refereed), Published
Abstract [en]

Many aspects of sonification represent potential benefits for the practice of sports. Taking advantage of the characteristics of auditory perception, interactive sonification offers promising opportunities for enhancing the training of athletes. The efficient learning and memorizing abilities pertaining to the sense of hearing, together with the strong coupling between auditory and sensorimotor systems, make the use of sound a natural field of investigation in the quest for efficiency optimization in individual sports at a high level. This study presents an application of sonification to elite rowing, introducing and evaluating four sonification models. The rapid development of mobile technology capable of efficiently handling numerical information offers new possibilities for interactive auditory display. Thus, these models have been developed under the specific constraints of a mobile platform, from data acquisition to the generation of a meaningful sound feedback. In order to evaluate the models, two listening experiments were then carried out with elite rowers. Results show a good ability of the participants to efficiently extract basic characteristics of the sonified data, even in a non-interactive context. Qualitative assessment of the models highlights the need for a balance between function and aesthetics in interactive sonification design. Consequently, particular attention to usability is required for future displays to become widespread.
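As one concrete example of the kind of model evaluated, a basic parameter-mapping sonification maps a kinematic quantity such as boat velocity to pitch. This is a generic sketch, not one of the paper's four models; the sample rate, frame duration and frequency range are assumptions:

```python
import numpy as np

def sonify_velocity(velocity, fs=8000, frame_dur=0.05,
                    f_lo=220.0, f_hi=880.0):
    """Render one short sine frame per velocity sample, with frequency
    rising linearly with normalized velocity; the oscillator phase is
    carried across frames to avoid audible clicks."""
    v = np.asarray(velocity, dtype=float)
    v_norm = (v - v.min()) / (v.max() - v.min() + 1e-12)
    freqs = f_lo + v_norm * (f_hi - f_lo)
    n = int(fs * frame_dur)
    t = np.arange(n)
    frames, phase = [], 0.0
    for f in freqs:
        frames.append(np.sin(phase + 2 * np.pi * f * t / fs))
        phase = (phase + 2 * np.pi * f * n / fs) % (2 * np.pi)
    return np.concatenate(frames)
```

The returned array can be written to a WAV file or streamed to an audio output; on a phone the loop would run incrementally on incoming sensor frames instead of a full array.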

Place, publisher, year, edition, pages
Springer, 2012
Keywords
Sonification, Rowing, Sculler, Sports, Accelerometer
National Category
Computer Sciences; Language Technology (Computational Linguistics)
Identifiers
urn:nbn:se:kth:diva-52253 (URN); 10.1007/s12193-011-0085-1 (DOI); 000309998300007 (); 2-s2.0-84860990213 (Scopus ID)
Funder
Swedish Research Council; EU, FP7, Seventh Framework Programme, FP7-ICT-STREP-215749
Note

QC 20120628

Available from: 2012-04-23 Created: 2011-12-14 Last updated: 2018-01-12. Bibliographically approved
Fabiani, M., Bresin, R. & Dubus, G. (2012). Interactive sonification of expressive hand gestures on a handheld device. Journal on Multimodal User Interfaces, 6(1-2), 49-57
2012 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 6, no 1-2, p. 49-57. Article in journal (Refereed), Published
Abstract [en]

We present here a mobile phone application called MoodifierLive, which aims at using expressive music performances for the sonification of expressive gestures through the mapping of the phone's accelerometer data to the performance parameters (i.e., tempo, sound level, and articulation). The application, and in particular the sonification principle, is described in detail. An experiment was carried out to evaluate the perceived matching between the gesture and the music performance that it produced, using two distinct mappings between gestures and performance. The results show that the application produces consistent performances, and that the mapping based on data collected from real gestures works better than one defined a priori by the authors.
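A mapping of this general kind can be sketched as follows. The linear form and all value ranges are assumptions for illustration; they are not the mappings evaluated in the paper:

```python
def map_gesture(accel_energy, lo=0.0, hi=20.0):
    """Map a scalar gesture energy (e.g. mean accelerometer magnitude)
    to tempo (BPM), sound level (dB) and articulation (1.0 = legato,
    lower = more staccato). Ranges are illustrative assumptions."""
    # Normalize to [0, 1], clamping out-of-range readings.
    x = min(max((accel_energy - lo) / (hi - lo), 0.0), 1.0)
    tempo = 60.0 + 120.0 * x          # calmer gesture -> slower tempo
    sound_level = -30.0 + 30.0 * x    # more energy -> louder
    articulation = 1.0 - 0.7 * x      # more energy -> more staccato
    return tempo, sound_level, articulation

print(map_gesture(0.0))  # (60.0, -30.0, 1.0)
```

The paper's finding that a mapping fitted to real gesture data outperforms an a priori one suggests the linear coefficients above would in practice be estimated from recorded gestures rather than hand-picked.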

Keywords
Automatic music performance, Emotional hand gestures, Mobile phone, Sonification
National Category
Computer and Information Sciences; Human Computer Interaction; Psychology
Identifiers
urn:nbn:se:kth:diva-34084 (URN); 10.1007/s12193-011-0076-2 (DOI); 000309998800006 (); 2-s2.0-84863200318 (Scopus ID)
Projects
SAME
Funder
Swedish Research Council, 2010-4654; EU, FP7, Seventh Framework Programme, FP7-ICT-STREP-215749
Note

QC 20120809. Updated from submitted to published.

Available from: 2011-05-25 Created: 2011-05-25 Last updated: 2018-01-12. Bibliographically approved
Varni, G., Dubus, G., Oksanen, S., Volpe, G., Fabiani, M., Bresin, R., . . . Camurri, A. (2012). Interactive sonification of synchronisation of motoric behaviour in social active listening to music with mobile devices. Journal on Multimodal User Interfaces, 5(3-4), 157-173
2012 (English). In: Journal on Multimodal User Interfaces, ISSN 1783-7677, E-ISSN 1783-8738, Vol. 5, no 3-4, p. 157-173. Article in journal (Refereed), Published
Abstract [en]

This paper evaluates three different interactive sonifications of dyadic coordinated human rhythmic activity. An index of phase synchronisation of gestures was chosen as the coordination metric. The sonifications are implemented as three prototype applications exploiting mobile devices: Sync’n’Moog, Sync’n’Move, and Sync’n’Mood. Sync’n’Moog sonifies the phase synchronisation index by acting directly on the audio signal and applying a nonlinear time-varying filtering technique. Sync’n’Move intervenes on the multi-track music content by making the single instruments emerge and hide. Sync’n’Mood manipulates the affective features of the music performance. The three sonifications were also tested against a condition without sonification.
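One standard way to compute such a phase-synchronisation index is the mean resultant length of the instantaneous phase difference between the two gesture signals. The sketch below uses this common construction from the synchronisation literature; the paper's exact metric may differ:

```python
import numpy as np

def _inst_phase(sig):
    """Instantaneous phase via an FFT-based Hilbert transform
    (equivalent to the analytic-signal construction)."""
    n = len(sig)
    spec = np.fft.fft(sig)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.angle(np.fft.ifft(spec * h))

def sync_index(x, y):
    """Mean resultant length of the phase difference: 1.0 for perfectly
    phase-locked signals, near 0 for unrelated ones."""
    dphi = _inst_phase(x) - _inst_phase(y)
    return float(np.abs(np.mean(np.exp(1j * dphi))))
```

In an interactive setting this index would be computed over a sliding window of the two users' accelerometer streams and fed to the sonification as its control parameter.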

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012
Keywords
Interactive sonification, Interactive systems, Audio systems, Sound and music computing, Active music listening, Synchronisation
National Category
Computer Sciences; Human Computer Interaction; Psychology; Media and Communication Technology
Identifiers
urn:nbn:se:kth:diva-52200 (URN); 10.1007/s12193-011-0079-z (DOI); 000309998300008 (); 2-s2.0-84861014654 (Scopus ID)
Projects
SAME
Funder
EU, FP7, Seventh Framework Programme, 215749 SAME; Swedish Research Council, 2010-4654
Note

QC 20150623

Available from: 2011-12-14 Created: 2011-12-14 Last updated: 2018-01-12. Bibliographically approved
Hansen, K. F., Dubus, G. & Bresin, R. (2012). Using modern smartphones to create interactive listening experiences for hearing impaired. TMH-QPSR special issue: Proceedings of SMC Sweden 2012 Sound and Music Computing, Understanding and Practicing in Sweden, 52(1), 42
2012 (English). In: TMH-QPSR special issue: Proceedings of SMC Sweden 2012 Sound and Music Computing, Understanding and Practicing in Sweden, ISSN 1104-5787, Vol. 52, no 1, p. 42. Article in journal (Refereed), Published
Place, publisher, year, edition, pages
TMH KTH, 2012
National Category
Computer Sciences; Language Technology (Computational Linguistics)
Identifiers
urn:nbn:se:kth:diva-109377 (URN)
Note

QC 20130103. QC 20160115

Available from: 2013-01-02 Created: 2013-01-02 Last updated: 2018-01-11. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-8830-963X
