Publications (10 of 36)
Back, J., Heeffer, C., Paget, S., Rau, A., Sallnäs Pysander, E. L. & Waern, A. (2016). Designing children's digital-physical play in natural outdoors settings. In: Conference on Human Factors in Computing Systems - Proceedings: . Paper presented at 34th Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2016, 7 May 2016 through 12 May 2016 (pp. 1359-1366). Association for Computing Machinery
2016 (English). In: Conference on Human Factors in Computing Systems - Proceedings, Association for Computing Machinery, 2016, p. 1359-1366. Conference paper, Published paper (Refereed).
Abstract [en]

Children's outdoor play is fluent and fluctuating, shaped by environmental features and conditions. The article reports on a project where interaction designers and landscape architects work together to develop solutions for integrating interactive play in outdoor environments. Here we report on a schoolyard trial, where interactive play technology was installed as an integral part of the schoolyard environment, and discuss the interplay between technology and the environment. We highlight in particular how the interactive technology contributed to the versatility of play activities, but also how the nature setting and the availability of natural materials contributed to the play activities around the interactive artefacts. 

Place, publisher, year, edition, pages
Association for Computing Machinery, 2016
Keywords
Interactive play, Landscape architecture, Playscape
National Category
Media and Communication Technology
Identifiers
urn:nbn:se:kth:diva-207504 (URN); 10.1145/2851581.2892416 (DOI); 2-s2.0-85014663422 (Scopus ID)
Conference
34th Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2016, 7 May 2016 through 12 May 2016
Note

Conference code: 121620; Export Date: 22 May 2017; Conference Paper. QC 20170607

Available from: 2017-06-07. Created: 2017-06-07. Last updated: 2018-01-13. Bibliographically approved.
Waern, A., Back, J., Sallnäs Pysander, E. L., Heefer, C. J. H., Rau, A., Paget, S. & Petterson, L. (2015). DigiFys: The interactive play landscape. In: 12TH ADVANCES IN COMPUTER ENTERTAINMENT TECHNOLOGY CONFERENCE (ACE15): . Paper presented at 12th Advances in Computer Entertainment Technology Conference (ACE), NOV 16-19, 2015, Iskandar, MALAYSIA. Association for Computing Machinery (ACM)
2015 (English). In: 12TH ADVANCES IN COMPUTER ENTERTAINMENT TECHNOLOGY CONFERENCE (ACE15), Association for Computing Machinery (ACM), 2015. Conference paper, Published paper (Refereed).
Abstract [en]

The DigiFys project explores the design of interactive landscapes for children's outdoor play. The project combines landscape architecture with the design of interactive technology, working towards designs that support children in their everyday play activity, close to home. In the creative lab session, we want to co-design the play landscape together with local children. The focus is on acquiring a perspective on similarities and differences between children's play culture in Sweden, where the project originates, and in Malaysia.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2015
Keywords
Playscape, Landscape architecture, Interactive play technology, Play
National Category
Interaction Technologies; Design; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-193876 (URN); 10.1145/2832932.2832961 (DOI); 000382173300046 (); 2-s2.0-84979771571 (Scopus ID); 978-145033852-3 (ISBN)
Conference
12th Advances in Computer Entertainment Technology Conference (ACE), NOV 16-19, 2015, Iskandar, MALAYSIA
Note

QC 20161017

Available from: 2016-10-17. Created: 2016-10-11. Last updated: 2018-01-14. Bibliographically approved.
Forsslund, J., Yip, M. & Sallnäs, E.-L. (2015). WoodenHaptics: A Starting Kit for Crafting Force-Reflecting Spatial Haptic Devices. In: Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction: . Paper presented at Tangible, Embedded, and Embodied Interaction TEI 2015,Stanford University, January 15-19 2015 (pp. 133-140). Stanford: ACM Digital Library
2015 (English). In: Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction, Stanford: ACM Digital Library, 2015, p. 133-140. Conference paper, Published paper (Refereed).
Abstract [en]

Spatial haptic interfaces have been around for 20 years. Yet, few affordable devices have been produced, and the design space, in terms of physical workspace and haptic fidelity, of the devices that have been produced is limited and discrete. In this paper, an open-source, open-hardware module-based kit is presented that allows an interaction designer with little electro-mechanical experience to manufacture and assemble a fully working spatial haptic interface. It also allows for modification in shape and size as well as tuning of parameters to fit a particular task or application. Results from an evaluation showed that the haptic quality of the WoodenHaptics device was on par with a Phantom Desktop and that a novice could assemble it with guidance in a normal office space. This open-source starting kit, uploaded free-to-download online, affords sketching in hardware; it "unsticks" the hardware from being a highly specialized and esoteric craft to being an accessible and user-friendly technology, while maintaining the feel of high-fidelity haptics.

Place, publisher, year, edition, pages
Stanford: ACM Digital Library, 2015
Keywords
Guides, do-it-yourself, open-source, open-hardware, spatial haptics, force-feedback, haptic device, hardware sketching, interaction design
National Category
Human Computer Interaction
Research subject
Human-computer Interaction
Identifiers
urn:nbn:se:kth:diva-164285 (URN); 10.1145/2677199.2680595 (DOI); 2-s2.0-84924057539 (Scopus ID); 978-1-4503-3305-4 (ISBN)
Conference
Tangible, Embedded, and Embodied Interaction TEI 2015,Stanford University, January 15-19 2015
Note

QC 20150420

Available from: 2015-04-15. Created: 2015-04-15. Last updated: 2018-01-11. Bibliographically approved.
Rosen, A., Eliassi, S., Fors, U., Sallnäs Pysander, E. L., Forsslund, J., Sejersen, R. & Lund, B. (2014). A computerised third molar surgery simulator - results of supervision by different professionals. European journal of dental education, 18(2), 86-90
2014 (English). In: European journal of dental education, ISSN 1396-5883, E-ISSN 1600-0579, Vol. 18, no 2, p. 86-90. Article in journal (Refereed), Published.
Abstract [en]

The purpose of the study was to investigate which supervisory approach afforded the most efficient learning method for undergraduate students in oral and maxillofacial surgery (OMS) using a computerised third molar surgery simulator. Fifth-year dental students participated voluntarily in a randomised experimental study using the simulator. The amount of time required and the number of trials used by each student were evaluated as a measure of skills development. Students had the opportunity to practise the procedure until no further visible improvements were achieved. The study assessed four different types of supervision to guide the students: the first group was supported by a teacher/specialist in OMS, the second by a teaching assistant, the third practised without any supervision, and the fourth received help from a simulator technician/engineer. A protocol describing assessment criteria was designed for this purpose, and a questionnaire was completed by all participating students after the study. The average number of attempts required to virtually remove a third molar tooth in the simulator was 1.44 for the group supervised by an OMS teacher, 1.5 for those supervised by a teaching assistant, 2.8 for those who had no supervision, and 3.6 when support was provided only by a simulator technician. The results showed that students learned most efficiently when they were helped by an OMS teacher or a teaching assistant. From a time- and cost-effectiveness perspective, supervision by a teaching assistant for a third molar surgery simulator would be the optimal choice.

Keywords
third molar surgery simulator, supervision, learning, teaching methods
National Category
Surgery; Educational Sciences
Identifiers
urn:nbn:se:kth:diva-155499 (URN); 10.1111/eje.12060 (DOI); 000334607100008 (); 2-s2.0-84899095438 (Scopus ID)
Note

QC 20141110

Available from: 2014-11-10. Created: 2014-11-06. Last updated: 2017-12-05. Bibliographically approved.
Frid, E., Bresin, R., Moll, J. & Sallnäs Pysander, E.-L. (2014). Sonification of haptic interaction in a virtual scene. In: Roberto Bresin (Ed.), Sound and Music Computing Sweden 2014, Stockholm, December 4-5, 2014: . Paper presented at Sound and Music Computing Sweden December 4-5 2014 (pp. 14-16).
2014 (English). In: Sound and Music Computing Sweden 2014, Stockholm, December 4-5, 2014 / [ed] Roberto Bresin, 2014, p. 14-16. Conference paper, Published paper (Refereed).
Abstract [en]

This paper presents a brief overview of work in progress for a study on correlations between visual and haptic spatial attention in a multimodal single-user application comparing different modalities. The aim is to gain insight into how auditory and haptic versus visual representations of temporal events may affect task performance and spatial attention. For this purpose, a 3D application involving one haptic model and two different sound models for interactive sonification is developed.

Keywords
Interactive sonification, haptic feedback, spatial attention
National Category
Media and Communication Technology; Human Computer Interaction
Research subject
Media Technology; Human-computer Interaction
Identifiers
urn:nbn:se:kth:diva-162115 (URN)
Conference
Sound and Music Computing Sweden December 4-5 2014
Note

QC 20150323

Available from: 2015-03-21. Created: 2015-03-21. Last updated: 2020-01-28. Bibliographically approved.
Moll, J., Sallnäs Pysander, E.-L., Severinsson Eklundh, K. & Hellström, S.-O. (2014). The Effects of Audio and Haptic Feedback on Collaborative Scanning and Placing. Interacting with computers, 26(3), 177-195
2014 (English). In: Interacting with computers, ISSN 0953-5438, E-ISSN 1873-7951, Vol. 26, no 3, p. 177-195. Article in journal (Refereed), Published.
Abstract [en]

This paper presents a study aimed at exploring the effects of different modality combinations on collaborative task performance and employed joint task-solving strategies in a shared interface. The modality combinations visual/haptic, visual/audio and visual/haptic/audio were compared in an experiment in which users solved a task together, working in pairs in adjacent rooms. The application used contained a flat surface in a 3D interface on which piles of cubes were randomly placed in a grid. The task involved scanning for empty cells and placing continuously falling cubes until all empty cells were filled. The cubes and the flat surface were designed in such a way that they could be felt and heard and thus could be recognized by different kinds of haptic and audio feedback cues. This made it possible to scan the environment and read both absolute and relative positions in the grid. A quantitative analysis of task performance and a qualitative analysis of video recordings and interview data were performed. Results showed that task completion times were significantly faster in the visual/haptic/audio condition compared with the other conditions and that there were also significantly fewer errors, result checks of one's own actions and double checks of the partner's actions in the visual/haptic/audio condition than in the other conditions. Qualitative results show that participants work simultaneously to a larger extent in the visual/haptic/audio condition and that less communication occurred in the visual/haptic/audio condition compared with the other conditions. We argue that more modalities improved the awareness of the environment, resulting in the participants feeling more confident with their interaction in the environment in the visual/haptic/audio condition. This resulted in improved task performance. The visual/audio feedback was better suited for solving the task than the visual/haptic feedback, even though haptic feedback gave a significant added value in the visual/haptic/audio condition.

Keywords
collaborative interaction, haptic devices, auditory feedback, pointing, laboratory experiments
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-145573 (URN); 10.1093/iwc/iwt031 (DOI); 000334363300001 (); 2-s2.0-84898410500 (Scopus ID)
Funder
Swedish Research Council, 60467801
Note

QC 20140611

Available from: 2014-06-11. Created: 2014-05-23. Last updated: 2020-01-28. Bibliographically approved.
Moll, J. & Sallnäs Pysander, E.-L. (2013). A haptic tool for group work on geometrical concepts engaging blind and sighted pupils. ACM Transactions on Accessible Computing, 4(4)
2013 (English). In: ACM Transactions on Accessible Computing, ISSN 1936-7228, Vol. 4, no 4. Article in journal (Refereed), Published.
Abstract [en]

In the study presented here, two haptic and visual applications for learning geometrical concepts in group work in primary school have been designed and evaluated. The aim was to support collaborative learning among sighted and visually impaired pupils. The first application is a static flattened 3D environment that supports learning to distinguish between angles by means of a 3D haptic device providing touch feedback. The second application is a dynamic 3D environment that supports learning of spatial geometry. The scene is a room with a box containing geometrical objects, which pupils can pick up and move around. The applications were evaluated in four schools with groups of two sighted pupils and one visually impaired pupil. The results showed that the support for the visually impaired pupil and for the collaboration was satisfying. A shared understanding of the workspace could be achieved as long as the virtual environment did not contain movable objects. Verbal communication was crucial for the work process, but haptic guiding to some extent substituted for communication about direction. When it comes to joint action between visually impaired and sighted pupils, a number of interesting problems were identified when the dynamic and static virtual environments were compared; these problems require further investigation. The study extends prior work in the areas of assistive technology and multimodal communication by evaluating functions for joint haptic manipulation in the unique setting of group work in primary school.

Keywords
Assistive technology, Collaborative learning, Geometrical objects, Multimodal communications, Shared understanding, Verbal communications, Visual applications, Visually impaired, Communication, Human computer interaction, Three dimensional, Virtual reality, Geometry
National Category
Interaction Technologies
Identifiers
urn:nbn:se:kth:diva-140033 (URN); 10.1145/2493171.2493172 (DOI); 2-s2.0-84885628555 (Scopus ID)
Note

QC 20140121

Available from: 2014-01-21. Created: 2014-01-16. Last updated: 2020-01-28. Bibliographically approved.
Huang, Y. Y., Moll, J., Sallnäs Pysander, E.-L. & Sundblad, Y. (2012). Auditory Feedback in Haptic Collaborative Interfaces. International journal of human-computer studies, 70(4), 257-270
2012 (English). In: International journal of human-computer studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 70, no 4, p. 257-270. Article in journal (Refereed), Published.
Abstract [en]

The combined effect of haptic and auditory feedback in shared interfaces on the cooperation between visually impaired and sighted persons is under-investigated. A central challenge for cooperating group members lies in obtaining a common understanding of the elements of the workspace and maintaining awareness of the other members', as well as one's own, actions during the work process. The aim of the experimental study presented here was to investigate whether adding audio cues to a haptic and visual interface makes collaboration between a sighted and a blindfolded person more efficient. Results showed that task performance was significantly faster in the audio, haptic and visual feedback condition compared to the haptic and visual feedback condition. One special focus was also to study how participants utilize the auditory and haptic force feedback in order to obtain a common understanding of the workspace and to maintain an awareness of the group members' actions. Results from a qualitative analysis showed that the auditory and haptic feedback was used in a number of important ways for the group members' action awareness and in the participants' grounding process.

Keywords
Awareness, Collaboration, Common ground, Force feedback, Haptic, Multimodal interface, Virtual environments
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-13887 (URN); 10.1016/j.ijhcs.2011.11.006 (DOI); 000301313100001 (); 2-s2.0-84855266298 (Scopus ID)
Funder
Swedish e-Science Research Center; StandUp
Note

QC 20100701

Available from: 2010-07-01. Created: 2010-07-01. Last updated: 2020-01-28. Bibliographically approved.
Forsslund, J., Sallnäs Pysander, E.-L. & Lundin Palmerius, K. (2011). Design of Perceptualization Applications in Medicine. In: : . Paper presented at First workshop on Engineering Interactive Computing Systems for Medicine and Health Care (EICS4Med). Pisa, Italy - June 13, 2011 (pp. 42-47).
2011 (English). Conference paper, Published paper (Refereed).
Abstract [en]

In this position paper, we present experiences from three medical application projects. A user-centred design methodology has been applied in order to ground the design in requirements gathered from field studies of professional medical environments. The methods used were interviews, user observations in the work context and cooperative evaluations of prototypes. With a particular focus on haptic (touch) feedback, we explore how novel medical applications can benefit from feedback to more senses than vision and how needs can be revealed and transformed into effective design.

Keywords
User centered design, perceptualization, haptics, medical
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-59547 (URN); 2-s2.0-84890705975 (Scopus ID)
Conference
First workshop on Engineering Interactive Computing Systems for Medicine and Health Care (EICS4Med). Pisa, Italy - June 13, 2011
Note

QC 20130916

Available from: 2012-01-11. Created: 2012-01-11. Last updated: 2018-01-12. Bibliographically approved.
Jansson, J., Ioakeimidou, F., Ericson, F., Spühler, J., Hoffman, J., Olwal, A., . . . Forsslund, J. (2011). Gestural 3D Interaction with a Beating Heart: Simulation Visualization and Interaction. In: Thomas Larsson, Lars Kjelldahl & Kai-Mikael Jää-Aro (Ed.), Proceedings of SIGRAD 2011: Evaluations of Graphics and Visualization — Efficiency, Usefulness, Accessibility, Usability. Paper presented at SIGRAD 2011. Linköping University Electronic Press
2011 (English). In: Proceedings of SIGRAD 2011: Evaluations of Graphics and Visualization — Efficiency, Usefulness, Accessibility, Usability / [ed] Thomas Larsson, Lars Kjelldahl & Kai-Mikael Jää-Aro, Linköping University Electronic Press, 2011. Conference paper, Published paper (Refereed).
Abstract [en]

The KTH School of Computer Science and Communication (CSC) established a strategic platform in Simulation-Visualization-Interaction (SimVisInt) in 2009, focused on the high potential in bringing together CSC core competences in simulation technology, visualization and interaction. The main part of the platform takes the form of a set of new trans-disciplinary projects across established CSC research groups, within the theme of Computational Human Modeling and Visualization: (i) interactive virtual biomedicine (HEART), (ii) simulation of human motion (MOTION), and (iii) virtual prototyping of human hand prostheses (HAND). In this paper, we present recent results from the HEART project, which focused on gestural and haptic interaction with a heart simulation.

Place, publisher, year, edition, pages
Linköping University Electronic Press, 2011
National Category
Computational Mathematics; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-52832 (URN); 978-91-7393-008-6 (ISBN)
Conference
SIGRAD 2011
Projects
Simulation Visualization Interaction (SimVisInt)
Note

QC 20120202

Available from: 2011-12-20. Created: 2011-12-20. Last updated: 2018-01-12. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-3743-100X