Hand-free motion interaction on Google Glass
KTH, School of Computer Science and Communication (CSC), Media Technology and Interaction Design, MID.ORCID iD: 0000-0003-3779-5647
2014 (English). In: SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SA 2014. Conference paper (Refereed)
Abstract [en]

There is increasing interest in creating wearable-device interaction technologies. Novel emerging user-interface technologies (e.g., eye tracking, speech recognition, gesture recognition, ECG, EEG, and fusions of them) have the potential to significantly affect market share in PCs, smartphones, tablets, and the latest wearable devices such as Google Glass. Deploying these technologies in devices such as smartphones and wearables is therefore a key challenge. Google Glass offers many impressive human-glass interface (HGI) technologies (e.g., voice actions, head wake-up, wink detection). Google Glass also avoids 'the occlusion problem' and 'the fat finger problem', the classic drawbacks of direct-touch finger input on touch screens. However, Google Glass provides only a touchpad with haptics supporting simple 'tap and slide' finger gestures, which is in effect a one-dimensional interaction, rather than the traditional two-dimensional interaction of a full smartphone touch screen. This one-dimensional 'swipe the touchpad' interaction with a row of 'Cards', which replaces the traditional two-dimensional icon menu, limits the intuitiveness and flexibility of HGI. There is therefore growing interest in implementing 3D gesture-recognition vision systems, in which optical sensors capture real-time video of the user and algorithms are then used to determine the user's gestures, without the user having to hold any device. We will demonstrate a hand-free motion interaction application based on computer vision technology on Google Glass. The presented application allows the user to perform touch-less interaction with hand or foot gestures in front of the Google Glass camera. Based on the same core gesture-recognition algorithm used in this demonstration, a hybrid wearable smartphone system combining hardware and software was presented in our previous work [Lv 2013][Lu et al. 2013][Lv et al. 2013], which supports either hand or foot interaction with today's smartphones.
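The abstract does not disclose the recognition algorithm itself, but the general idea it describes (a camera captures real-time video and software infers a gesture from the motion, with no handheld device) can be illustrated with a toy sketch. The frame-differencing approach, the `detect_swipe` function, and all thresholds below are illustrative assumptions, not the authors' method:

```python
# Toy sketch of camera-based, device-free gesture detection: compare two
# consecutive grayscale frames and guess a left/right "swipe" from where
# the inter-frame motion concentrates. Illustrative only; thresholds and
# function names are assumptions, not the paper's algorithm.

def detect_swipe(prev_frame, curr_frame, threshold=30):
    """Compare two grayscale frames (2D lists of 0-255 ints) and return
    'left', 'right', or None depending on where motion concentrates."""
    height = len(curr_frame)
    width = len(curr_frame[0])
    left_motion = right_motion = 0
    for y in range(height):
        for x in range(width):
            # A pixel "moved" if its brightness changed noticeably.
            if abs(curr_frame[y][x] - prev_frame[y][x]) > threshold:
                if x < width // 2:
                    left_motion += 1
                else:
                    right_motion += 1
    total = left_motion + right_motion
    if total < 5:  # too little change: no gesture detected
        return None
    return "left" if left_motion > right_motion else "right"

# Synthetic demo: a bright blob appears on the right half of an 8x8 frame.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(2, 6):
    for x in range(5, 8):
        curr[y][x] = 255
print(detect_swipe(prev, curr))  # prints "right"
```

A real system would of course operate on live camera frames and use far more robust recognition (segmentation, tracking, classification), but the pipeline shape (capture, per-frame comparison, gesture decision) is the same.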

National Category
Media and Communications
URN: urn:nbn:se:kth:diva-167895
DOI: 10.1145/2669062.2669066
Scopus ID: 2-s2.0-84919333354
ISBN: 9781450318914
OAI: diva2:820153
SIGGRAPH Asia 2014 Mobile Graphics and Interactive Applications, SA 2014, 3–6 December 2014


Available from: 2015-06-11. Created: 2015-05-22. Last updated: 2015-06-11. Bibliographically approved.


Search in DiVA

By author/editor
Li, Haibo
By organisation
Media Technology and Interaction Design, MID