Rubbing and Tapping for Precise and Rapid Selection on Touch-Screen Displays
Olwal, Alex
KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA. ORCID iD: 0000-0003-2578-3403
Heyman, Susanna
KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
2008 (English). In: CHI: SIGCHI Conference on Human Factors in Computing Systems, 2008, 295-304 p. Conference paper, Published paper (Refereed)
Abstract [en]

We introduce two families of techniques, rubbing and tapping, that use zooming to make precise interaction on passive touch screens possible. Rub-Pointing uses a diagonal rubbing gesture to integrate pointing and zooming in a single-handed technique. In contrast, Zoom-Tapping is a two-handed technique in which the dominant hand points, while the non-dominant hand taps to zoom, simulating multi-touch functionality on a single-touch display. Rub-Tapping is a hybrid technique that integrates rubbing with the dominant hand to point and zoom, and tapping with the non-dominant hand to confirm selection. We describe the results of a formal user study comparing these techniques with each other and with the well-known Take-Off and Zoom-Pointing selection techniques. Rub-Pointing and Zoom-Tapping had significantly fewer errors than Take-Off for small targets, and were significantly faster than Take-Off and Zoom-Pointing. We show how the techniques can be used for fluid interaction in an image viewer and in existing applications, such as Google Maps.
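Illustrative sketch: the Python snippet below is a minimal, hypothetical sketch of a Rub-Pointing-style interaction loop, not the authors' implementation. It scans single-touch samples for diagonal back-and-forth "rub" strokes and zooms the view in around the touch position each time a stroke completes. The Viewport model, the stroke-length threshold, and the zoom factor are assumptions made only for illustration.

from dataclasses import dataclass

@dataclass
class Viewport:
    """A simple pan/zoom view model (assumed for illustration)."""
    cx: float = 0.0      # world x shown at screen x = 0
    cy: float = 0.0      # world y shown at screen y = 0
    scale: float = 1.0   # screen pixels per world unit

    def zoom_at(self, sx: float, sy: float, factor: float) -> None:
        """Zoom by `factor`, keeping screen point (sx, sy) over the same world point."""
        wx = self.cx + sx / self.scale
        wy = self.cy + sy / self.scale
        self.scale *= factor
        self.cx = wx - sx / self.scale
        self.cy = wy - sy / self.scale


class RubDetector:
    """Detects diagonal back-and-forth strokes in a single-touch drag."""
    MIN_STROKE = 12.0   # assumed: diagonal travel (px) that counts as one stroke

    def __init__(self) -> None:
        self.last = None       # previous touch sample (x, y)
        self.direction = 0     # +1 = toward lower-right, -1 = toward upper-left
        self.travel = 0.0      # diagonal distance covered in the current stroke

    def update(self, x: float, y: float) -> bool:
        """Feed one touch sample; return True when a rub stroke completes."""
        if self.last is None:
            self.last = (x, y)
            return False
        dx, dy = x - self.last[0], y - self.last[1]
        self.last = (x, y)
        diag = (dx + dy) / 2.0             # motion projected onto the main diagonal
        new_dir = 1 if diag > 0 else -1
        if new_dir != self.direction:      # direction reversal starts a new stroke
            self.direction, self.travel = new_dir, 0.0
        self.travel += abs(diag)
        if self.travel >= self.MIN_STROKE:
            self.travel = 0.0
            return True
        return False


if __name__ == "__main__":
    view = Viewport()
    rub = RubDetector()
    # Simulated diagonal rub near screen point (100, 100): three short strokes.
    samples = [(100 + d, 100 + d) for d in (0, 5, 10, 15, 10, 5, 0, 5, 10, 15)]
    for x, y in samples:
        if rub.update(x, y):
            view.zoom_at(x, y, factor=1.5)   # each completed stroke zooms in a step
    print(f"scale={view.scale:.2f}, view origin=({view.cx:.1f}, {view.cy:.1f})")

Running the example feeds three simulated rub strokes, zooming the view to about 3.4x around the rub location. A real implementation would also need to distinguish a rub from an ordinary drag; this sketch omits that.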

Place, publisher, year, edition, pages
2008. 295-304 p.
Keyword [en]
Interaction techniques; Pointing; Rub-Pointing; Rub-Tapping; Rubbing; Tapping; Touch screens; Zoom-Tapping
National Category
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-10465
DOI: 10.1145/1357054.1357105
ISI: 000268586100039
Scopus ID: 2-s2.0-57649178619
OAI: oai:DiVA.org:kth-10465
DiVA: diva2:217818
Conference
26th Annual CHI Conference on Human Factors in Computing Systems, CHI 2008; Florence; 5 April 2008 through 10 April 2008
Note
QC 20100804
Available from: 2009-05-15 Created: 2009-05-15 Last updated: 2010-12-06 Bibliographically approved
In thesis
1. Unobtrusive Augmentation of Physical Environments: Interaction Techniques, Spatial Displays and Ubiquitous Sensing
2009 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

The fundamental idea of Augmented Reality (AR) is to improve and enhance our perception of the surroundings, through the use of sensing, computing and display systems that make it possible to augment the physical environment with virtual computer graphics. AR is, however, often associated with user-worn equipment, whose current complexity and lack of comfort limit its applicability in many scenarios.

The goal of this work has been to develop systems and techniques for uncomplicated AR experiences that support sporadic and spontaneous interaction with minimal preparation on the user’s part.

This dissertation defines a new concept, Unobtrusive AR, which emphasizes an optically direct view of a visually unaltered physical environment, the avoidance of user-worn technology, and the preference for unencumbering techniques.

The first part of the work focuses on the design and development of two new AR display systems. They illustrate how AR experiences can be achieved through transparent see-through displays that are positioned in front of the physical environment to be augmented. The second part presents two novel sensing techniques for AR, which employ an instrumented surface for unobtrusive tracking of active and passive objects. These techniques have no visible sensing technology or markers, and are suitable for deployment in scenarios where it is important to maintain the visual qualities of the real environment. The third part of the work discusses a set of new interaction techniques for spatially aware handheld displays, public 3D displays, touch screens, and immaterial displays (which are not constrained by solid surfaces or enclosures). Many of the techniques are also applicable to human-computer interaction in general, as indicated by the accompanying qualitative and quantitative insights from user evaluations.

The thesis contributes a set of novel display systems, sensing technologies, and interaction techniques to the field of human-computer interaction, and brings new perspectives to the enhancement of real environments through computer graphics.

Place, publisher, year, edition, pages
Stockholm: KTH, 2009. xii, 72 p.
Series
TRITA-CSC-A, ISSN 1653-5723 ; 2009:09
National Category
Computer Science
Identifiers
urn:nbn:se:kth:diva-10439 (URN)
978-91-7415-339-2 (ISBN)
Public defence
2009-06-05, E1, Lindstedtsvägen 3, KTH, 13:00 (English)
Note

QC 20100805

Available from: 2009-05-26 Created: 2009-05-14 Last updated: 2015-01-30 Bibliographically approved

Open Access in DiVA

No full text

Other links

Publisher's full text | Scopus
