Design and Evaluation of 3D Multimodal Virtual Environments for Visually Impaired People
KTH, School of Computer Science and Communication (CSC), Human-Computer Interaction, MDI.
2010 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Spatial information presented visually is not easily accessible to visually impaired users. Current technologies, such as screen readers, cannot intuitively convey spatial layout or structure. This lack of overview is an obstacle for a visually impaired user, both when using the computer individually and when collaborating with other users. With the development of haptic and audio technologies, it is possible to give visually impaired users access to three-dimensional (3D) Virtual Reality (VR) environments through the senses of touch and hearing.

The work presented in this thesis comprises investigations of haptic and audio interaction for visually impaired computer users in two stages.

The first stage of my research focused on collaboration between sighted and blindfolded computer users in a shared virtual environment. One aspect I considered is how different modalities affect one's awareness of the other's actions, as well as of one's own actions, during the work process. The second aspect I investigated is common ground, i.e. how visually impaired people obtain a common understanding of the elements of their workspace through different modalities. A third aspect I looked at was how different modalities affect perceived social presence, i.e. the users' ability to perceive the other person's intentions and emotions. Finally, I attempted to understand how human behavior and efficiency in task performance are affected when different modalities are used in collaborative situations.

The second stage of my research focused on how visually impaired users access 3D multimodal virtual environments individually. I conducted two studies, based on two different haptic and audio prototypes, to understand the effect of haptic-audio modalities on navigation and interface design. One prototype I created was a haptic and audio game, a labyrinth. The other is a virtual simulation environment based on the real-world environment of Kulturhuset in Stockholm. One aspect I investigated in this individual interaction is how users can access the spatial layout through a multimodal virtual environment. The second aspect I investigated is usability: how the haptic and audio cues help visually impaired people understand the spatial layout. The third aspect concerns navigation and cognitive mapping in a multimodal virtual environment.

This thesis contributes to the field of human-computer interaction for the visually impaired with a set of studies of multimodal interactive systems, and brings new perspectives to the enhancement of understanding of real environments for visually impaired users through haptic and audio virtual computer environments.

Place, publisher, year, edition, pages
Stockholm: KTH, 2010. 82 p.
Series
Trita-CSC-A, ISSN 1653-5723; 2010:09
Keyword [en]
Multimodal interaction, 3D worlds, Haptics, Audio, Visually impaired users, Collaboration, Navigation
National Category
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-13140
ISBN: 978-91-7415-683-6 (print)
OAI: oai:DiVA.org:kth-13140
DiVA: diva2:321168
Public defence
2010-06-10, F3, Lindstedtsvägen 26, KTH, Stockholm, 10:15 (English)
Note
QC 20100701. Available from: 2010-05-31. Created: 2010-05-29. Last updated: 2010-07-01. Bibliographically approved.
List of papers
1. Integrating Audio and Haptic Feedback in a Collaborative Virtual Environment
2007 (English). In: Proceedings of HCI International Conference, 2007. Conference paper, Published paper (Other academic)
Abstract [en]

An ongoing study is presented here. The purpose is to design and evaluate an experiment comparing an audio/haptic/visual and a haptic/visual VR environment supporting collaborative work among sighted and blindfolded people. We want to investigate how haptic and audio functions could improve collaboration in a shared workspace. We used a 3D VR environment that supports learning of spatial geometry. The scene is a room containing objects which you can pick up and move around by means of a touch feedback pointing device called the Phantom. An experiment was performed with group work in the VR environment, comparing an audio/haptic/visual interface with a haptic/visual interface of the application in a laboratory. We investigate whether adding audio cues improves awareness, common ground, social presence, perceived performance and work efficiency. The aim is also to conduct a quantitative and qualitative analysis of the video-recorded collaboration in order to obtain information about whether and how the added audio information changes the work process in the groups.

Keyword
HCI, Haptic, Audio, Multimodal interface, Awareness, Common ground
National Category
Computer Engineering
Identifiers
urn:nbn:se:kth:diva-13879 (URN)
Note
QC 20100701. Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2010-07-01. Bibliographically approved.
2. Auditory Feedback in Haptic Collaborative Interfaces
2012 (English). In: International Journal of Human-Computer Studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 70, no. 4, pp. 257-270. Article in journal (Refereed), Published
Abstract [en]

The combined effect of haptic and auditory feedback in shared interfaces on the cooperation between visually impaired and sighted persons is under-investigated. A central challenge for cooperating group members lies in obtaining a common understanding of the elements of the workspace and maintaining awareness of the other members', as well as one's own, actions during the work process. The aim of the experimental study presented here was to investigate whether adding audio cues to a haptic and visual interface makes collaboration between a sighted and a blindfolded person more efficient. Results showed that task performance was significantly faster in the audio, haptic and visual feedback condition compared to the haptic and visual feedback condition. A special focus was also to study how participants utilized the auditory and haptic force feedback in order to obtain a common understanding of the workspace and to maintain an awareness of the group members' actions. Results from a qualitative analysis showed that the auditory and haptic feedback was used in a number of important ways for the group members' action awareness and in the participants' grounding process.

Keyword
Awareness, Collaboration, Common ground, Force feedback, Haptic, Multimodal interface, Virtual environments
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-13887 (URN)
10.1016/j.ijhcs.2011.11.006 (DOI)
000301313100001 ()
2-s2.0-84855266298 (Scopus ID)
Funder
Swedish e-Science Research Center
StandUp
Note

QC 20100701. Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2017-12-12. Bibliographically approved.
3. Audio makes a difference in haptic collaborative virtual environments
2010 (English). In: Interacting with Computers, ISSN 0953-5438, E-ISSN 1873-7951, Vol. 22, no. 6, pp. 544-555. Article in journal (Refereed), Published
Abstract [en]

In this paper a study is presented that aimed at exploring the effects of audio feedback in a haptic and visual interface supporting collaboration between sighted people and people who cannot see. A between-group design was used and the participants worked in pairs, with one sighted and one blindfolded person in each. The application used was a haptic 3D environment in which participants could build composed objects out of building blocks. The building blocks could be picked up and moved around by means of a touch feedback pointing device. In one version of the application, sound cues could be used to tell the other person where you were, and to get feedback on your own and the other person's actions. Results showed that sound cues together with haptic feedback made a difference in the interaction between the collaborators regarding their shared understanding of the workspace and the work process. In particular, sound cues played an important role in maintaining awareness of ongoing work: you knew what was going on, and you got a response to your own actions.
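The paper does not give implementation details, but the mechanism it describes, sound cues that reveal where the other person is and give feedback on both users' actions, can be illustrated with a short event-driven sketch. The Python code below is a hypothetical illustration only, not the study's implementation; all names (AwarenessAudio, Cue, play_sound and the cue table) are invented for this example.

```python
# Hypothetical sketch of event-driven sound cues for action awareness in a
# shared haptic workspace. Names, cue table and the play_sound interface are
# invented for illustration; the paper does not specify an implementation.

from dataclasses import dataclass

@dataclass
class Cue:
    sample: str    # identifier of a short sound sample
    volume: float  # 0.0 - 1.0

def pan_from_position(x, workspace_width):
    """Map a horizontal workspace position to a stereo pan in [-1, 1]."""
    return max(-1.0, min(1.0, 2.0 * x / workspace_width - 1.0))

class AwarenessAudio:
    """Plays a distinct cue for each collaborator's grab/drop/move events."""

    def __init__(self, play_sound):
        # play_sound(sample, volume, pan) is assumed to be supplied by the
        # application's audio backend.
        self.play_sound = play_sound
        self.cues = {
            ("grab", "self"):  Cue("click_low", 0.6),
            ("grab", "other"): Cue("click_high", 0.8),
            ("drop", "self"):  Cue("thud_low", 0.6),
            ("drop", "other"): Cue("thud_high", 0.8),
            ("move", "other"): Cue("tick", 0.4),   # periodic "where are you" cue
        }

    def on_action(self, action, who, x, workspace_width):
        cue = self.cues.get((action, who))
        if cue is None:
            return
        # Spatialise the partner's actions so the listener can hear roughly
        # where in the shared workspace they happened; own actions stay centred.
        pan = pan_from_position(x, workspace_width) if who == "other" else 0.0
        self.play_sound(cue.sample, cue.volume, pan)

# Example: the partner grabs a block on the far right of a 1.0-wide workspace.
audio = AwarenessAudio(lambda sample, volume, pan: print(sample, volume, pan))
audio.on_action("grab", "other", 0.9, 1.0)
```

The design point reflected in this sketch is the one the study reports on: the partner's actions get distinct, spatialised cues, so a non-sighted collaborator can follow ongoing work without visual feedback.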

Keyword
Haptic, Audio, Multimodal Interfaces, Collaboration, Problem solving
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-13889 (URN)
10.1016/j.intcom.2010.06.001 (DOI)
000285176600010 ()
2-s2.0-78149469381 (Scopus ID)
Funder
Swedish Research Council
Note
QC 20100701. Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2017-12-12. Bibliographically approved.
4. Exploration of interface usability in a haptic 3D virtual labyrinth for visually impaired users
2009 (English). In: Proceedings of IADIS Interfaces and Human Computer Interaction, 2009. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, an experimental study is presented on navigation in a 3D virtual environment by blind and visually impaired people using haptic and audio interaction. A simple 3D labyrinth was developed with haptic and audio interfaces to allow blind and visually impaired persons to access a three-dimensional Virtual Reality scene through the senses of touch and hearing. The user starts inside the labyrinth and must find an exit. Objects of different shapes, surrounded by walls, can be found inside the labyrinth. Different navigation tools were designed to assist the users' spatial orientation and mobility with haptic and audio cues. The result of the experimental study, with qualitative analysis, is an investigation of the accessibility and usability, for blind and visually impaired people, of playing online 3D virtual reality games such as Second Life in a haptic and audio virtual environment. This description can later serve as input to innovative presentations on cognitive mapping, orientation and navigation of spatial structures using haptic and auditory displays.
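The paper does not describe its haptic rendering. A common way to make labyrinth walls "feelable" with a force-feedback device such as the Phantom is a penetration-based spring force, optionally paired with a contact sound. The sketch below illustrates that general technique under my own assumptions; it is not the study's code, and the wall representation, stiffness value and audio callback are invented.

```python
# Generic sketch of penetration-based force feedback for a haptic labyrinth
# wall. A wall is modelled as an axis-aligned box; when the device proxy is
# inside it, a spring force pushes the proxy out through the nearest face and
# a short contact sound can be triggered. Not the study's implementation.

import numpy as np

STIFFNESS = 500.0  # N/m, a plausible stiffness for a Phantom-class device

def wall_force(pos, box_min, box_max, play_contact_sound=None):
    """Return the reaction force (3-vector, N) for a proxy point inside an AABB wall."""
    pos, box_min, box_max = map(np.asarray, (pos, box_min, box_max))
    if not (np.all(pos > box_min) and np.all(pos < box_max)):
        return np.zeros(3)  # no contact, no force
    # Distance from the point to each of the six faces, in a fixed order.
    dist_to_faces = np.concatenate([pos - box_min, box_max - pos])
    normals = np.array([[-1, 0, 0], [0, -1, 0], [0, 0, -1],
                        [ 1, 0, 0], [0,  1, 0], [0, 0,  1]], dtype=float)
    i = int(np.argmin(dist_to_faces))        # nearest face = likely entry face
    penetration = dist_to_faces[i]
    if play_contact_sound is not None:
        play_contact_sound()                 # audio cue while touching the wall
    return STIFFNESS * penetration * normals[i]

# Example: proxy 2 mm inside the near face of a wall segment.
print(wall_force([0.102, 0.5, 0.5], [0.1, 0.0, 0.0], [0.2, 1.0, 1.0]))
```

In a real haptic application a function like this would run in the device's high-rate servo loop (on the order of 1 kHz) for every wall segment near the proxy.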

National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-13907 (URN)
Note
QC 20100701. Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2010-07-01. Bibliographically approved.
5. Exploration in 3D Virtual Worlds with Haptic-Audio Support for Nonvisual Spatial Recognition
Open this publication in new window or tab >>Exploration in 3D Virtual Worlds with Haptic-Audio Support for Nonvisual Spatial Recognition
2010 (English). In: Human Computer Interaction Symposium, World Computer Congress, 2010. Conference paper, Published paper (Refereed)
Abstract [en]

Mobility indoors and outdoors is a problem for visually impaired people because they lack the visual channel. Appropriate tools for learning spatial information could be used as preparation for navigation before going to the real place. Current accessibility technologies, such as screen readers, do not convey spatial layout or structure well. In this paper, a study is presented on accessing non-visual spatial information and on support for efficient navigation and orientation with haptic and audio cues. A 3D virtual simulation prototype of a real-world environment was created for this purpose. Different navigation tools with haptic and audio cues were designed in the prototype. The main results of the study reported are: (a) the development of a virtual 3D environment that helps visually impaired users learn about real spaces in which they need to navigate (e.g. schools, workplaces, public buildings); (b) an assessment of whether the spatial information needed to establish a mental map of the space can be acquired through compensatory sensory channels (e.g. touch and hearing) as an alternative to the visual channel; and (c) an account of how haptic and audio cues can enable efficient navigation, mobility and orientation in such 3D virtual environments. Results from a qualitative analysis of the learning process and actual performance in the 3D virtual world are presented.
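The paper mentions navigation tools built from haptic and audio cues without detailing them. One widely used non-visual navigation aid is an audio beacon whose repetition rate encodes distance to a target and whose stereo panning encodes direction. The Python sketch below illustrates that general idea only; it is not the Kulturhuset prototype's design, and the function name, parameters and mappings are all invented.

```python
# Hypothetical audio-beacon sketch for non-visual navigation: the closer
# the user is to the target, the faster the beacon repeats, and stereo
# panning indicates whether the target lies to the left or right of the
# user's heading. Invented for illustration; not the thesis prototype.

import math

def beacon_parameters(user_pos, heading_rad, target_pos,
                      min_period=0.2, max_period=2.0, max_dist=30.0):
    """Return (repeat_period_s, pan) for a navigation beacon.

    pan is in [-1, 1]: -1 = fully left, +1 = fully right.
    """
    dx = target_pos[0] - user_pos[0]
    dy = target_pos[1] - user_pos[1]
    dist = math.hypot(dx, dy)

    # Repetition period grows linearly with distance (clamped at max_dist),
    # so a nearby target produces rapid beeps.
    period = min_period + (min(dist, max_dist) / max_dist) * (max_period - min_period)

    # Bearing of the target relative to the user's heading, wrapped to [-pi, pi].
    bearing = math.atan2(dy, dx) - heading_rad
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    # A negative bearing means the target is clockwise from the heading,
    # i.e. to the user's right, so it maps to a positive (right) pan.
    pan = max(-1.0, min(1.0, -bearing / (math.pi / 2)))

    return period, pan

# Example: target about 10 m ahead and slightly to the right of a user facing +y.
print(beacon_parameters((0.0, 0.0), math.pi / 2, (2.0, 10.0)))
```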

Keyword
Haptic, Audio, Virtual Reality, Navigation, Cognitive Mapping
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-13905 (URN)
Note
QC 20100701. Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2010-07-01. Bibliographically approved.

Open Access in DiVA

fulltext (810 kB)
File information
File name: FULLTEXT01.pdf
File size: 810 kB
Checksum (SHA-512): 56eeb1714ad7d673b7575d0980f00c1c113c0874ce6ab28f99575f0dcae8a1d5ca05f089fd117f4666f86c325815420798cd8ae2441271d8f3b7c83cb23989e0
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Huang, Ying Ying
By organisation
Human-Computer Interaction, MDI
