Research article
DOI: 10.1145/2642918.2647397

Gaze-touch: combining gaze with multi-touch for interaction on the same surface

Published: 05 October 2014

Abstract

Gaze has the potential to complement multi-touch for interaction on the same surface. We present gaze-touch, a technique that combines the two modalities based on the principle of 'gaze selects, touch manipulates'. Gaze is used to select a target, and coupled with multi-touch gestures that the user can perform anywhere on the surface. Gaze-touch enables users to manipulate any target from the same touch position, for whole-surface reachability and rapid context switching. Conversely, gaze-touch enables manipulation of the same target from any touch position on the surface, for example to avoid occlusion. Gaze-touch is designed to complement direct-touch as the default interaction on multi-touch surfaces. We provide a design space analysis of the properties of gaze-touch versus direct-touch, and present four applications that explore how gaze-touch can be used alongside direct-touch. The applications demonstrate use cases for interchangeable, complementary and alternative use of the two modes of interaction, and introduce novel techniques arising from the combination of gaze-touch and conventional multi-touch.
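
The abstract's division of labor can be pictured as a thin input-routing layer: at touch-down, the object currently under the user's gaze is bound to that touch, and every subsequent drag event is forwarded to that object no matter where on the surface the finger actually rests. Below is a minimal TypeScript sketch of that routing logic; it illustrates the 'gaze selects, touch manipulates' principle only, and all names (GazeTracker, Target, hitTest, the event handlers) are assumptions for this sketch, not the paper's implementation.

```typescript
// Hypothetical sketch of "gaze selects, touch manipulates" input routing.
// All types and names here are illustrative, not taken from the paper.

interface Point { x: number; y: number; }

interface Target {
  id: string;
  moveBy(dx: number, dy: number): void; // manipulation applied to the target
}

interface GazeTracker {
  currentGaze(): Point; // latest (filtered) gaze estimate on the surface
}

class GazeTouchRouter {
  // Each active touch is bound, at touch-down, to the gaze-selected target.
  private bindings = new Map<number, Target>();

  constructor(
    private gaze: GazeTracker,
    private hitTest: (p: Point) => Target | null, // scene lookup at a point
  ) {}

  onTouchDown(touchId: number): void {
    // "Gaze selects": resolve the target under the eyes, not under the finger.
    const target = this.hitTest(this.gaze.currentGaze());
    if (target !== null) this.bindings.set(touchId, target);
  }

  onTouchMove(touchId: number, dx: number, dy: number): void {
    // "Touch manipulates": the drag applies to the gaze-selected target,
    // wherever on the surface the finger happens to be.
    this.bindings.get(touchId)?.moveBy(dx, dy);
  }

  onTouchUp(touchId: number): void {
    this.bindings.delete(touchId);
  }
}
```

Binding the target once at touch-down, rather than following gaze continuously, would let the eyes wander once a manipulation has begun, and it makes the two properties claimed in the abstract fall out naturally: any target is reachable from one touch position (look elsewhere, touch again), and one target is manipulable from any touch position (look at it, touch anywhere).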

Supplementary Material

Supplemental video: uistf3464-file3.mp4




    Published In

UIST '14: Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology
October 2014
722 pages
ISBN: 9781450330695
DOI: 10.1145/2642918

    Publisher

Association for Computing Machinery, New York, NY, United States



    Author Tags

    1. gaze input
    2. interactive surface
    3. multi-touch
4. multimodal UI

    Qualifiers

    • Research-article

    Conference

    UIST '14

    Acceptance Rates

UIST '14 paper acceptance rate: 74 of 333 submissions (22%)
Overall acceptance rate: 561 of 2,567 submissions (22%)


