Publications (10 of 29)
Leithinger, D., Follmer, S., Olwal, A. & Ishii, H. (2015). Shape Displays: Spatial Interaction with Dynamic Physical Form. IEEE Computer Graphics and Applications, 35(5), 5-11
Shape Displays: Spatial Interaction with Dynamic Physical Form
2015 (English). In: IEEE Computer Graphics and Applications, ISSN 0272-1716, E-ISSN 1558-1756, Vol. 35, no. 5, p. 5-11. Article in journal (Refereed). Published.
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-175931 (URN)
10.1109/MCG.2015.111 (DOI)
000361969200002 (ISI)
26416359 (PubMedID)
2-s2.0-84942746011 (Scopus ID)
Note

QC 20151103

Available from: 2015-11-03. Created: 2015-10-26. Last updated: 2018-01-10. Bibliographically approved.
Leithinger, D., Follmer, S., Olwal, A. & Ishii, H. (2014). Physical Telepresence: Shape capture and display for embodied, computer-mediated remote collaboration. In: UIST 2014 - Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology. Paper presented at 27th Annual ACM Symposium on User Interface Software and Technology, UIST 2014, 5 October 2014 through 8 October 2014, United States (pp. 461-470). Association for Computing Machinery (ACM)
Physical Telepresence: Shape capture and display for embodied, computer-mediated remote collaboration
2014 (English). In: UIST 2014 - Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Association for Computing Machinery (ACM), 2014, p. 461-470. Conference paper, Published paper (Refereed).
Abstract [en]

We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.
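The shape-transmission idea lends itself to a compact illustration. Below is a minimal sketch, assuming a hypothetical pin-based shape display driven by a block-averaged depth image; the grid size, travel range, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative parameters: a 32x32 pin grid with 100 mm of travel.
PIN_GRID = (32, 32)
PIN_MAX_MM = 100.0

def depth_to_pin_heights(depth_mm, near_mm=500.0, far_mm=1500.0):
    """Downsample a captured depth image to the pin grid and map depth
    to pin height, so nearer surfaces (e.g. a remote hand) rise higher."""
    h, w = depth_mm.shape
    gh, gw = PIN_GRID
    # Crop to a multiple of the grid, then block-average one cell per pin.
    cells = depth_mm[:h - h % gh, :w - w % gw].reshape(
        gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    # Normalize: near -> fully raised, far -> fully lowered.
    t = np.clip((far_mm - cells) / (far_mm - near_mm), 0.0, 1.0)
    return t * PIN_MAX_MM

# A synthetic depth frame stands in for a real sensor capture.
frame = np.random.uniform(500.0, 1500.0, size=(480, 640))
heights = depth_to_pin_heights(frame)  # 32x32 pin heights in mm, ready to send
```

Manipulation techniques like those the abstract describes would then operate on the height field before it reaches the remote display, for instance scaling or relocating the rendering of a user's hand.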

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2014
Keyword
Actuated tangible interfaces, Physical telepresence, Shape displays, Shape-changing user interfaces, Teleoperation
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-157897 (URN)
10.1145/2642918.2647377 (DOI)
2-s2.0-84912045315 (Scopus ID)
978-145033069-5 (ISBN)
Conference
27th Annual ACM Symposium on User Interface Software and Technology, UIST 2014, 5 October 2014 through 8 October 2014, United States
Funder
Swedish Research Council
Note

QC 20141217

Available from: 2014-12-17. Created: 2014-12-17. Last updated: 2018-01-11. Bibliographically approved.
Lakatos, D., Blackshaw, M., Olwal, A., Barryte, Z., Perlin, K. & Ishii, H. (2014). T(ether): Spatially-aware handhelds, gestures and proprioception for multi-user 3D modeling and animation. In: SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction. Paper presented at 2nd ACM Symposium on Spatial User Interaction, SUI 2014, 4 October 2014 through 5 October 2014, Honolulu, United States (pp. 90-93). Association for Computing Machinery (ACM)
T(ether): Spatially-aware handhelds, gestures and proprioception for multi-user 3D modeling and animation
2014 (English). In: SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction, Association for Computing Machinery (ACM), 2014, p. 90-93. Conference paper, Published paper (Refereed).
Abstract [en]

T(ether) is a spatially-aware display system for multi-user, collaborative manipulation and animation of virtual 3D objects. The handheld display acts as a window into virtual reality, providing users with a perspective view of 3D data. T(ether) tracks users' heads, hands, fingers and pinching, in addition to a handheld touch screen, to enable rich interaction with the virtual scene. We introduce gestural interaction techniques that exploit proprioception to adapt the UI based on the hand's position above, behind or on the surface of the display. These spatial interactions use a tangible frame of reference to help users manipulate and animate the model in addition to controlling environment properties. We report on initial user observations from an experiment for 3D modeling, which indicate T(ether)'s potential for embodied viewport control and 3D modeling interactions.
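As a rough illustration of how a UI might adapt to the hand's position relative to the display, here is a sketch using a signed distance to the display plane; the zone names, threshold, and tracking inputs are assumptions for illustration, not T(ether)'s actual implementation:

```python
import numpy as np

def hand_zone(hand_pos, display_origin, display_normal, on_eps=0.02):
    """Classify a tracked hand as above, behind, or on the handheld's
    display plane (positions in meters; display_normal is a unit vector
    pointing out of the screen toward the viewer)."""
    d = float(np.dot(hand_pos - display_origin, display_normal))
    if abs(d) <= on_eps:
        return "on"                        # e.g. touch-screen editing tools
    return "above" if d > 0 else "behind"  # e.g. behind: grab inside the scene

# Example with hypothetical tracker readings (display at origin, facing +z):
print(hand_zone(np.array([0.1, 0.0, -0.15]),
                np.array([0.0, 0.0, 0.0]),
                np.array([0.0, 0.0, 1.0])))  # -> "behind"
```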

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2014
Keyword
3D modeling, 3D user interfaces, Collaborative, Gestural interaction, Multi-user, Spatially-aware displays, VR
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-157950 (URN)
10.1145/2659766.2659785 (DOI)
2-s2.0-84910604125 (Scopus ID)
978-145032820-3 (ISBN)
Conference
2nd ACM Symposium on Spatial User Interaction, SUI 2014, 4 October 2014 through 5 October 2014, Honolulu, United States
Note

QC 20141219

Available from: 2014-12-19. Created: 2014-12-18. Last updated: 2018-01-11. Bibliographically approved.
Ioakeimidou, F., Olwal, A., Nordberg, A. & von Holst, H. (2011). 3D Visualization and Interaction with Spatiotemporal X-ray Data to Minimize Radiation in Image-guided Surgery. In: Olive, M. & Solomonides, T. (Eds.), 2011 24th International Symposium on Computer-Based Medical Systems (CBMS). Paper presented at 24th International Symposium on Computer-Based Medical Systems (CBMS), University of the West of England, Bristol, England, June 27-30, 2011. New York, NY: IEEE
3D Visualization and Interaction with Spatiotemporal X-ray Data to Minimize Radiation in Image-guided Surgery
2011 (English). In: 2011 24th International Symposium on Computer-Based Medical Systems (CBMS) / [ed] Olive, M. & Solomonides, T., New York, NY: IEEE, 2011. Conference paper, Published paper (Refereed).
Abstract [en]

Image-guided surgery (IGS) often depends on X-ray imaging, since pre-operative MRI, CT and PET scans do not provide an up-to-date internal patient view during the operation. X-rays introduce hazardous radiation, but long exposures for monitoring are often necessary to increase accuracy in critical situations. Surgeons often also take multiple X-rays from different angles, as X-rays only provide a distorted 2D perspective from the current viewpoint. We introduce a prototype IGS system that augments 2D X-ray images with spatiotemporal information using a motion tracking system, such that the use of X-rays can be reduced. In addition, an interactive visualization allows exploring 2D X-rays in timeline views and 3D clouds where they are arranged according to the viewpoint at the time of acquisition. The system could be deployed and used without time-consuming calibration, and has the potential to improve surgeons' spatial awareness, while increasing efficiency and patient safety.
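A minimal sketch of the underlying idea: tag each exposure with its tracked acquisition pose and time, so it can later be recalled by viewpoint instead of re-irradiating the patient. The record fields and matching rule below are illustrative assumptions, not the paper's implementation:

```python
import time
from dataclasses import dataclass, field

import numpy as np

@dataclass
class XRayShot:
    """One X-ray exposure plus when and from where it was acquired."""
    image: np.ndarray      # 2D pixel data
    position: np.ndarray   # tracked source position (x, y, z)
    direction: np.ndarray  # unit viewing direction at acquisition
    timestamp: float = field(default_factory=time.time)

def nearest_view(shots, current_dir):
    """Recall the stored shot whose acquisition direction best matches
    the current viewpoint, rather than taking a new exposure."""
    return max(shots, key=lambda s: float(np.dot(s.direction, current_dir)))

def timeline(shots):
    """Chronological ordering for a timeline view."""
    return sorted(shots, key=lambda s: s.timestamp)
```

The 3D cloud view described above then amounts to placing each stored shot in the scene at its recorded position, oriented along its acquisition direction.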

Place, publisher, year, edition, pages
New York, NY: IEEE, 2011
Series
IEEE International Symposium on Computer-Based Medical Systems, ISSN 1063-7125
Keyword
data visualisation, diagnostic radiography, medical image processing, motion estimation, surgery
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-46837 (URN)
10.1109/CBMS.2011.5999129 (DOI)
000295472700046 (ISI)
2-s2.0-80052973896 (Scopus ID)
978-1-4577-1190-9 (ISBN)
Conference
24th International Symposium on Computer-Based Medical Systems (CBMS), University of the West of England, Bristol, England, June 27-30, 2011
Note
QC 20111107. Available from: 2011-11-07. Created: 2011-11-07. Last updated: 2018-01-12. Bibliographically approved.
Olwal, A., Frykholm, O., Groth, K. & Moll, J. (2011). Design and Evaluation of Interaction Technology for Medical Team Meetings. In: 13th IFIP TC 13 International Conference, Lisbon, Portugal, September 5-9, 2011, Proceedings, Part I. Paper presented at INTERACT 2011, 13th IFIP TC13 Conference on Human-Computer Interaction, Lisbon, Portugal, September 5-9, 2011 (pp. 505-522). Springer
Design and Evaluation of Interaction Technology for Medical Team Meetings
2011 (English). In: 13th IFIP TC 13 International Conference, Lisbon, Portugal, September 5-9, 2011, Proceedings, Part I, Springer, 2011, p. 505-522. Conference paper, Published paper (Refereed).
Abstract [en]

Multi-disciplinary team meetings (MDTMs) are essential in healthcare, where medical specialists discuss diagnosis and treatment of patients. We introduce a prototype multi-display groupware system, intended to augment the discussion of medical imagery through a range of input mechanisms, multi-user interfaces and interaction techniques on multi-touch devices and pen-based technologies. Observations of MDTMs, as well as interviews and observations of surgeons and radiologists, serve as a foundation for guidelines and a set of implemented techniques. We present a detailed analysis of a study where the techniques' potential was explored with radiologists and surgeons of different specialties and varying expertise. The results show that the implemented technologies have the potential to bring numerous benefits to the team meetings with minimal modification to the current workflow. We discuss how they can augment the expressiveness and communication between meeting participants, facilitate understanding for novices, and improve remote collaboration.

Place, publisher, year, edition, pages
Springer, 2011
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 6946
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-55292 (URN)
10.1007/978-3-642-23774-4_42 (DOI)
2-s2.0-80052815101 (Scopus ID)
Conference
INTERACT 2011, 13th IFIP TC13 Conference on Human-Computer Interaction, Lisbon, Portugal, September 5-9, 2011
Note

QC 20120102. QC 20130326

Available from: 2012-01-02. Created: 2012-01-02. Last updated: 2018-01-12. Bibliographically approved.
Ericson, F. & Olwal, A. (2011). Interaction and rendering techniques for handheld phantograms. In: Conference on Human Factors in Computing Systems - Proceedings. Paper presented at 29th Annual CHI Conference on Human Factors in Computing Systems, CHI 2011, 7 May 2011 through 12 May 2011, Vancouver, BC (pp. 1339-1344).
Interaction and rendering techniques for handheld phantograms
2011 (English). In: Conference on Human Factors in Computing Systems - Proceedings, 2011, p. 1339-1344. Conference paper, Published paper (Refereed).
Abstract [en]

We present a number of rendering and interaction techniques that exploit the user's viewpoint for improved realism and immersion in 3D applications on handheld devices. Unlike 3D graphics on stationary screens, graphics on handheld devices are seldom regarded from a fixed perspective. This is particularly true for recent mobile platforms, where it is increasingly popular to use device orientation for interaction. We describe a set of techniques for improved perception of rendered 3D content. Viewpoint-correct anamorphosis and stereoscopy are discussed along with ways to approximate the spatial relationship between the user and the device. We present the design and implementation of a prototype phantogram viewer that was used to explore these methods for interaction with real-time photorealistic 3D models on commercially available mobile devices.
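The viewpoint-correct rendering mentioned above boils down to an asymmetric (off-axis) perspective frustum computed from the estimated eye position. Here is a sketch of that standard construction, assuming a screen centered at the origin in the z = 0 plane and facing +z; the coordinates, units, and function name are illustrative:

```python
import numpy as np

def off_axis_projection(eye, half_w, half_h, near=0.01, far=100.0):
    """Asymmetric perspective frustum for a viewer at `eye` (meters) and a
    screen of size 2*half_w x 2*half_h centered at the origin in the z = 0
    plane, facing +z. Returns a 4x4 column-vector projection matrix in the
    OpenGL glFrustum convention."""
    ex, ey, ez = eye  # requires ez > 0: eye in front of the screen
    # Directions to the screen edges, scaled onto the near plane.
    left   = (-half_w - ex) * near / ez
    right  = ( half_w - ex) * near / ez
    bottom = (-half_h - ey) * near / ez
    top    = ( half_h - ey) * near / ez
    m = np.zeros((4, 4))
    m[0, 0] = 2.0 * near / (right - left)
    m[1, 1] = 2.0 * near / (top - bottom)
    m[0, 2] = (right + left) / (right - left)
    m[1, 2] = (top + bottom) / (top - bottom)
    m[2, 2] = -(far + near) / (far - near)
    m[2, 3] = -2.0 * far * near / (far - near)
    m[3, 2] = -1.0
    return m
```

The eye position itself would come from whatever approximation of the user-device relationship is available, such as front-camera face tracking or the device-orientation heuristics the abstract alludes to; the scene must also be translated by -eye before this projection is applied.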

Series
Conference on Human Factors in Computing Systems - Proceedings
Keyword
Immersion, Interaction, Mobile, Phantogram, Rendering, User interface, Virtual reality, Hand held computers, Human computer interaction, Human engineering, Mobile devices, User interfaces, Three dimensional
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-151209 (URN)
10.1145/1979742.1979771 (DOI)
2-s2.0-79957934762 (Scopus ID)
9781450302289 (ISBN)
Conference
29th Annual CHI Conference on Human Factors in Computing Systems, CHI 2011, 7 May 2011 through 12 May 2011, Vancouver, BC
Note

QC 20140919

Available from: 2014-09-19. Created: 2014-09-15. Last updated: 2018-01-11. Bibliographically approved.
Olwal, A., Lachanas, D. & Zacharouli, E. (2011). OldGen: Mobile phone personalization for older adults. In: Conference on Human Factors in Computing Systems. Paper presented at 29th Annual CHI Conference on Human Factors in Computing Systems, CHI 2011, 7 May 2011 through 12 May 2011, Vancouver, BC (pp. 3393-3396).
OldGen: Mobile phone personalization for older adults
2011 (English). In: Conference on Human Factors in Computing Systems, 2011, p. 3393-3396. Conference paper, Published paper (Refereed).
Abstract [en]

Mobile devices are currently difficult to customize for the usability needs of elderly users. The elderly are instead referred to specially designed "senior phones" or software add-ons. These tend to compromise on functionality, as they attempt to address many different impairments in a single solution. We present OldGen, a prototype framework in which a novel concept enables accessibility features on generic mobile devices by decoupling the software user interface from the phone's physical form factor. This enables better customization of the user interface, its functionality and behavior, and makes it possible to adapt it to the specific needs of each individual. OldGen makes the user interface portable, such that it can be moved between different phone hardware, regardless of model and brand. Preliminary observations and evaluations with elderly users indicate that this concept could address individual user interface related accessibility issues on general-purpose devices.
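A toy sketch of the decoupling idea, with a user's interface settings expressed as data that can be exported from one handset and imported on another; the fields, format, and names are hypothetical, not OldGen's actual design:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UIProfile:
    """Per-user interface description that travels with the user rather
    than with the handset."""
    font_scale: float = 1.0
    high_contrast: bool = False
    menu_items: tuple = ("Call", "Messages", "Contacts")

def export_profile(profile):
    """Serialize a profile so it can be moved to a different phone."""
    return json.dumps(asdict(profile))

def import_profile(data):
    """Recreate the profile on another device, regardless of model."""
    fields = json.loads(data)
    fields["menu_items"] = tuple(fields["menu_items"])
    return UIProfile(**fields)

# Moving a simplified, large-text UI between two hypothetical handsets:
blob = export_profile(UIProfile(font_scale=1.6, high_contrast=True,
                                menu_items=("Call", "Messages")))
restored = import_profile(blob)
```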

Series
Conference on Human Factors in Computing Systems - Proceedings
Keyword
Accessibility, Elderly, Mobile, Older users, User interfaces, Form factors, Novel concept, Older adults, Personalizations, Software user interface, Approximation theory, Human computer interaction, Human engineering, Mobile devices, Portable equipment, Telephone sets, Cellular telephone systems
National Category
Human Computer Interaction, Communication Systems
Identifiers
urn:nbn:se:kth:diva-151204 (URN)
10.1145/1978942.1979447 (DOI)
2-s2.0-79958163326 (Scopus ID)
9781450302289 (ISBN)
Conference
29th Annual CHI Conference on Human Factors in Computing Systems, CHI 2011, 7 May 2011 through 12 May 2011, Vancouver, BC
Note

QC 20140919

Available from: 2014-09-19. Created: 2014-09-15. Last updated: 2018-01-11. Bibliographically approved.
Olwal, A. (2009). Augmenting Surface Interaction through Context-Sensitive Mobile Devices. In: Gross, T., Gulliksen, J., Kotze, P., Oestreicher, L., Palanque, P., Prates, R.O. & Winckler, M. (Eds.), Human-Computer Interaction - INTERACT 2009, Part II, Proceedings. Paper presented at 12th IFIP International Conference on Human-Computer Interaction, Uppsala, Sweden, Aug 24-28, 2009 (pp. 336-339), Vol. 5727
Augmenting Surface Interaction through Context-Sensitive Mobile Devices
2009 (English). In: Human-Computer Interaction - INTERACT 2009, Part II, Proceedings / [ed] Gross, T., Gulliksen, J., Kotze, P., Oestreicher, L., Palanque, P., Prates, R.O. & Winckler, M., 2009, Vol. 5727, p. 336-339. Conference paper, Published paper (Refereed).
Abstract [en]

We discuss the benefits of using a mobile device to expand and improve the interactions on a large touch-sensitive surface. The mobile device's denser arrangement of pixels and touch-sensor elements, and its rich set of mechanical on-board input controls, can be leveraged for increased expressiveness, visual feedback and more precise direct manipulation. We also show how these devices can support unique input from multiple simultaneous users in collaborative scenarios. Handheld mobile devices and large interactive surfaces can be mutually beneficial in numerous ways, while their complementary nature allows them to preserve the behavior of the original user interface.
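One way to picture the precision gain: a touch on the handheld's dense screen is mapped into a small neighborhood of the coarse contact point on the large surface. A minimal sketch under assumed resolutions (all names and numbers are illustrative):

```python
def refine_point(coarse_xy, mobile_xy,
                 mobile_px=(1080, 1920), region_px=(120, 213)):
    """Map a touch at mobile_xy (handheld pixels) into a small region of
    the large display centered on coarse_xy, so the handheld's denser
    touch sensing yields sub-element precision on the surface."""
    (cx, cy), (mx, my) = coarse_xy, mobile_xy
    (mw, mh), (rw, rh) = mobile_px, region_px
    return (cx + (mx / mw - 0.5) * rw,
            cy + (my / mh - 0.5) * rh)

# Centering the handheld touch leaves the surface point unchanged:
assert refine_point((500, 300), (540, 960)) == (500.0, 300.0)
```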

Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; 5727
Keyword
Touch, mobile, interaction techniques, interactive surface
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-30440 (URN)
10.1007/978-3-642-03658-3_39 (DOI)
000270204900039 (ISI)
2-s2.0-70349573081 (Scopus ID)
978-3-642-03657-6 (ISBN)
Conference
12th IFIP International Conference on Human-Computer Interaction, Uppsala, Sweden, Aug 24-28, 2009
Note
QC 20110224. Available from: 2011-02-24. Created: 2011-02-24. Last updated: 2018-01-12. Bibliographically approved.
Rakkolainen, I., Höllerer, T., DiVerdi, S. & Olwal, A. (2009). Mid-air display experiments to create novel user interfaces. Multimedia Tools and Applications, 44(3), 389-405
Mid-air display experiments to create novel user interfaces
2009 (English). In: Multimedia Tools and Applications, ISSN 1380-7501, E-ISSN 1573-7721, Vol. 44, no. 3, p. 389-405. Article in journal (Refereed). Published.
Abstract [en]

Displays are the most visible part of most computer applications. Novel display technologies strongly influence and inspire new forms of computer use and interaction. We are particularly interested in the interplay of novel displays and interaction for ubiquitous computing or ambient media environments, as emerging display technologies may become game-changers in how we define and use computers, possibly changing the context of computing fundamentally. We present some of our experiments and lessons learnt with a new category of displays, the "immaterial" FogScreen. It can be described as a novel media platform, exhibiting some fundamental differences to, and advantages over, other displays. It also enables novel kinds of user interfaces and experiences. In this paper, we give insights into the special properties and strengths of the FogScreen by looking at a set of successfully demonstrated interfaces and applications. We also discuss its future potential for user interface design.

Keyword
FogScreen, Display, Interaction, Ambient media
National Category
Computer and Information Sciences
Identifiers
urn:nbn:se:kth:diva-30747 (URN)
10.1007/s11042-009-0280-1 (DOI)
000268313900004 (ISI)
2-s2.0-70349568763 (Scopus ID)
Note
QC 20110304. 1st ACM International Workshop on Semantic Ambient Media Experience, Vancouver, Canada, Oct 31, 2008. Available from: 2011-03-04. Created: 2011-03-04. Last updated: 2018-01-12. Bibliographically approved.
Olwal, A. & Feiner, S. (2009). Spatially Aware Handhelds for High-Precision Tangible Interaction with Large Displays. In: TEI 2009: International Conference on Tangible and Embedded Interaction (pp. 181-188).
Spatially Aware Handhelds for High-Precision Tangible Interaction with Large Displays
2009 (English). In: TEI 2009: International Conference on Tangible and Embedded Interaction, 2009, p. 181-188. Conference paper, Published paper (Refereed).
Abstract [en]

While touch-screen displays are becoming increasingly popular, many factors affect user experience and performance. Surface quality, parallax, input resolution, and robustness, for instance, can vary with sensing technology, hardware configurations, and environmental conditions.

We have developed a framework for exploring how we could overcome some of these dependencies, by leveraging the higher visual and input resolution of small, coarsely tracked mobile devices for direct, precise, and rapid interaction on large digital displays.

The results from a formal user study show no significant differences in performance when comparing four techniques we developed for a tracked mobile device against two existing touch-screen techniques that served as baselines. The mobile techniques, however, had more consistent performance and smaller variations among participants, and an overall higher user preference in our setup. Our results show the potential of spatially aware handhelds as an interesting complement or substitute for direct touch interaction on large displays.

Keyword
Interaction technique, LightSense, Mobile, MobileButtons, MobileDrag, MobileGesture, MobileRub, Spatially aware, Tangible, Touch, Touch-screen
National Category
Computer Sciences
Identifiers
urn:nbn:se:kth:diva-10437 (URN)
10.1145/1517664.1517705 (DOI)
2-s2.0-70349092988 (Scopus ID)
Note
QC 20100805. Available from: 2009-05-15. Created: 2009-05-14. Last updated: 2018-01-13. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-2578-3403
