Hui-Shyong Yeo and Aaron Quigley
Personal and intimate mobile and wearable devices such as head-mounted displays, smartwatches, smartphones and tablets are seeing increasing use. Within SACHI we are interested in wearable devices and displays, in wearable interaction in mobile settings, in second-screen scenarios, and in coordinated multi-display environments spanning the desktop, the second screen and gigapixel display walls.
The screen of a smartwatch provides limited space for expressive multi-touch input, resulting in a markedly difficult and constrained experience. In SACHI we are exploring ways to enhance and extend the wearable experience. One of our projects is WatchMI, short for Watch Movement Input, which requires no additional hardware or modification of the smartwatch. WatchMI enhances touch interaction on a smartwatch to support continuous pressure touch, twist and pan gestures, and their combinations. Along with studying these primary WatchMI interactions, we have demonstrated a wide range of applications they support, including map navigation, an alarm clock, a music player, gesture recognition, text entry, a file explorer, and control of remote devices or a game character. Our software analyzes, in real time, the data from the built-in Inertial Measurement Unit (IMU) in order to determine pressure, twist and pan input with great accuracy and at different levels of granularity. On this page you can find links to WatchMI videos, the WatchMI paper presented at MobileHCI 2016 and some of the media coverage of this work. This work has been undertaken with colleagues at KAIST, South Korea.
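The core idea can be sketched in a few lines: a touch on the watch face slightly tilts or rotates the device, and those small IMU orientation deviations can be mapped to pressure, twist and pan. The sketch below is illustrative only; the function name and the threshold values are hypothetical, not those used in the published WatchMI system.

```python
import math

# Hypothetical thresholds (radians); the real system uses calibrated values.
PRESSURE_THRESHOLD = 0.15   # tilt magnitude treated as a pressure touch
TWIST_THRESHOLD = 0.10      # yaw change treated as a twist

def classify_touch(tilt_x, tilt_y, yaw_delta):
    """Map small IMU orientation deviations during a touch to a gesture.

    tilt_x, tilt_y: deviation (radians) of the watch face from its resting
                    orientation while the finger is down.
    yaw_delta:      rotation (radians) around the axis normal to the screen
                    since touch-down.
    Returns one of: 'twist', 'pan', 'pressure', 'plain_touch'.
    """
    tilt_magnitude = math.hypot(tilt_x, tilt_y)
    if abs(yaw_delta) > TWIST_THRESHOLD:
        return "twist"
    if tilt_magnitude > PRESSURE_THRESHOLD:
        # A tilt dominated by one axis reads as a directional pan;
        # a roughly even tilt reads as a straight-down press.
        if abs(tilt_x) > 2 * abs(tilt_y) or abs(tilt_y) > 2 * abs(tilt_x):
            return "pan"
        return "pressure"
    return "plain_touch"
```

For example, a touch that tilts the watch mostly along one axis (`classify_touch(0.3, 0.02, 0.0)`) would be read as a pan, while a rotation of the case (`classify_touch(0.0, 0.0, 0.2)`) would be read as a twist.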
Following WatchMI is Sidetap & Slingshot Gestures on Unmodified Smartwatches. We present a technique for detecting gestures on the edge of an unmodified smartwatch. We demonstrate two exemplary gestures: i) Sidetap, tapping on any side, and ii) Slingshot, pressing on the edge and then releasing quickly. Our technique is lightweight, as it relies only on measuring data from the internal Inertial Measurement Unit (IMU). With these two gestures we expand the input expressiveness of a smartwatch, allowing users to perform intuitive gestures with natural tactile feedback, e.g. rapidly navigating a long list of items with a tap, or triggering shortcut commands to launch applications. The technique also allows for eyes-free or subtle interaction when visual attention is not available.
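In IMU terms, the two gestures have quite different signatures: a Sidetap appears as a single sharp spike in angular velocity, while a Slingshot shows a sustained deflection while the edge is pressed, ending in a fast release spike. A minimal sketch of that distinction, with illustrative thresholds rather than the published ones:

```python
def classify_edge_gesture(gyro_samples, spike=3.0, press=0.8, window=5):
    """Classify a window of gyroscope readings (rad/s) from the watch.

    A Sidetap is a lone sharp spike; a Slingshot is a sustained moderate
    deflection (the press) followed by a fast release spike. All names
    and thresholds here are hypothetical, for illustration only.
    """
    peak = max(abs(s) for s in gyro_samples)
    if peak < spike:
        return "none"
    # Find the release/impact spike, then count how long the watch was
    # held in a moderate deflection before it.
    peak_i = max(range(len(gyro_samples)), key=lambda i: abs(gyro_samples[i]))
    held = sum(1 for s in gyro_samples[:peak_i] if press < abs(s) < spike)
    return "slingshot" if held >= window else "sidetap"
```

A brief burst like `[0.1, 0.2, 4.0, 0.3]` classifies as a Sidetap, whereas a long moderate deflection ending in a spike, such as `[1.0] * 6 + [4.0]`, classifies as a Slingshot.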
Coordinated on-body (i.e. wearable) and around-body devices
Displays on and around the body, such as smartwatches, head-mounted displays or tablets, enable users to interact on the go. However, the diverging input and output fidelities of these devices can lead to interaction seams that inhibit efficient mobile interaction when users employ multiple devices at once. Our work on MultiFi, Multi-Fidelity Interaction with Displays On and Around the Body, was presented at CHI 2015. MultiFi combines the strengths of multiple displays (a smartwatch and a head-mounted display, or a mobile device and a head-mounted display).
Display coordination in MultiFi rests on three powerful concepts: (1) body-aligned mode, where the devices share a common information space that is spatially registered to the user’s body; (2) device-aligned mode, where the information space is spatially registered to the touchscreen device and moves with it; and (3) side-by-side mode, where interaction is redirected from one device to the other without requiring a spatial relationship among the devices. Intuitively, in device-aligned mode, displayed content is glued to a single reference display, e.g. a high-resolution map shown on a smartwatch, with the low-resolution “rest of the map” shown on the head-mounted display around the periphery of the smartwatch. When the watch moves, the content of the head-mounted display follows! On this page you can find links to MultiFi videos and talks, the MultiFi paper presented at CHI 2015 and some follow-up journal and workshop papers. This work has been undertaken with colleagues at the University of Passau, Germany and Graz University of Technology, Graz, Austria.
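The device-aligned map example above can be sketched as a simple geometric split: both displays render windows of the same map coordinate space, centred on the watch's tracked position, so moving the watch recentres both windows. The function and coordinate conventions below are illustrative assumptions, not MultiFi's actual rendering code.

```python
def split_map_view(watch_center, watch_size, hmd_size):
    """Return the map regions (x, y, w, h) each display shows in
    device-aligned mode.

    The watch shows a small high-resolution window of the map; the
    head-mounted display shows a larger low-resolution region with the
    same centre, so its content stays glued to the watch's position.

    watch_center: (x, y) of the watch in the shared map space.
    watch_size, hmd_size: (w, h) extent of each display's map window.
    """
    wx, wy = watch_center
    ww, wh = watch_size
    hw, hh = hmd_size
    watch_region = (wx - ww / 2, wy - wh / 2, ww, wh)
    hmd_region = (wx - hw / 2, wy - hh / 2, hw, hh)  # larger, same centre
    return watch_region, hmd_region
```

Calling this each frame with the watch's current tracked position yields the "content follows the watch" behaviour: shifting `watch_center` shifts both the watch window and the surrounding HMD periphery together.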
In ItchyNose, we explored subtle and discreet on-body interaction: detecting finger movements on the nose using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglasses wearers can use their fingers to make different movements on the nose, such as flicking, pushing or rubbing. These subtle gestures can be used to control a wearable computer without drawing attention to the user in public.
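At a high level, the three gestures differ in how the EOG signal evolves over a short window: a flick is a brief deflection, a push is a sustained one, and a rub makes the signal oscillate as the finger moves back and forth. A toy sketch of that distinction, with entirely hypothetical thresholds and signal scales (the published system uses a trained classifier, not fixed rules):

```python
def classify_nose_gesture(eog, high=50.0, long_n=10):
    """Toy rule-based classifier over a window of EOG samples
    (arbitrary units, baseline-corrected to 0).

    flick: one brief deflection; push: a sustained deflection;
    rub: repeated zero crossings as the finger moves back and forth.
    """
    active = [abs(v) > high for v in eog]
    if not any(active):
        return "none"
    # Count sign changes; a rub oscillates around the baseline.
    crossings = sum(1 for a, b in zip(eog, eog[1:]) if a * b < 0)
    if crossings >= 4:
        return "rub"
    return "push" if sum(active) >= long_n else "flick"
```

For instance, an alternating signal such as `[60, -60] * 5` would be read as a rub, a long one-sided deflection `[80] * 12` as a push, and a single short deflection as a flick.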
Watch the full MultiFi talk from CHI 2015 below.
- Best Paper Honorable Mention Award. Yeo, HS, Lee, J, Bianchi, A & Quigley, AJ 2016, WatchMI: pressure touch, twist and pan gesture input on unmodified smartwatches. in Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services: MobileHCI ’16 . ACM Press – Association for Computing Machinery, 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, Florence, Italy, 6-9 September. DOI: 10.1145/2935334.2935375
- Yeo, HS, Lee, J, Bianchi, A & Quigley, AJ 2016, WatchMI: applications of watch movement input on unmodified smartwatches. in Adjunct Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services MobileHCI ’16 . ACM Press – Association for Computing Machinery, pp. 594-598, 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, Florence, Italy, 6-9 September. DOI: 10.1145/2957265.2961825
- Best Poster Award. Yeo, HS, Lee, J, Bianchi, A & Quigley, AJ 2016, Sidetap & Slingshot Gestures on Unmodified Smartwatches. in Adjunct Proceedings of UIST 2016, Tokyo, Japan. Open Access.
- Juyoung Lee, Hui-Shyong Yeo, Murtaza Dhuliawala, Jedidiah Akano, Junichi Shimizu, Thad Starner, Aaron Quigley, Woontack Woo, and Kai Kunze 2017, Itchy nose: discreet gesture interaction using EOG sensors in smart eyewear. in Proceedings of the 2017 ACM International Symposium on Wearable Computers ISWC’17. ACM, New York, NY, USA, 94-97. DOI: https://doi.org/10.1145/3123021.3123060
- Grubert, J, Kranz, M & Quigley, AJ 2016, ‘Challenges in mobile multi-device ecosystems‘ mUX: The Journal of Mobile User Experience, vol 5, 5. DOI: 10.1186/s13678-016-0007-y
- Quigley, AJ & Grubert, J 2015, Perceptual and social challenges in body proximate display ecosystems. in MobileHCI ’15 Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. ACM Press – Association for Computing Machinery, New York, pp. 1168-1174, Mobile Collocated Interactions with Wearables: Workshop at MobileHCI 2015, Copenhagen, Denmark, 24 August. DOI: 10.1145/2786567.2794349
- Grubert, J, Kranz, M & Quigley, AJ 2015, Design and technology challenges for body proximate display ecosystems. in MobileHCI ’15 Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. ACM Press – Association for Computing Machinery, New York, pp. 951-954, From Mobile to Wearable: Using Wearable Devices to Enrich Mobile Interaction, Copenhagen, Denmark, 24 August. DOI: 10.1145/2786567.2794310
- Grubert, J, Heinisch, M, Quigley, AJ & Schmalstieg, D 2015, MultiFi: multi-fidelity interaction with displays on and around the body. in CHI ’15 Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM Press – Association for Computing Machinery, New York, pp. 3933-3942, 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea, Republic of, 18-23 April. DOI: 10.1145/2702123.2702331
Some Media Coverage
- Android Headlines
- Wareable: WatchMI wants to bring new gesture controls to existing smartwatches
- Android Police: Researchers in UK develop amazing new way to interact with Android Wear devices
- Engadget (in German): WatchMI: Touchscreen interactions on smartwatches that are fun
- Silicon India
- India Today
- ACM TechNews – SIGCHI Edition