St Andrews HCI Research Group

19 May 2014

Aaron Quigley and Daniel Rough, Practice talks for AVI 2014


Speaker: Aaron Quigley and Daniel Rough, University of St Andrews
Date/Time: 12-1pm May 20, 2014
Location: Jack Cole 1.33a
Title: AwToolkit: Attention-Aware User Interface Widgets
Authors: Juan-Enrique Garrido, Victor M. R. Penichet, Maria-Dolores Lozano, Aaron Quigley, Per Ola Kristensson.
Abstract: Increasing screen real-estate allows for the development of applications where a single user can manage a large amount of data and related tasks through a distributed user interface. However, such users can easily become overloaded and unaware of display changes as they alternate their attention towards different displays. We propose AwToolkit, a novel widget set for developers that supports users in maintaining awareness in multi-display systems. The AwToolkit widgets automatically determine which display a user is looking at and provide users with notifications with different levels of subtlety to make the user aware of any unattended display changes. The toolkit uses four notification levels (unnoticeable, subtle, intrusive and disruptive), ranging from an almost imperceptible visual change to a clear and visually salient change. We describe AwToolkit's six widgets, which have been designed for C# developers, and the design of a user study with an application oriented towards healthcare environments. The evaluation results reveal a marked increase in user awareness in comparison to the same application implemented without AwToolkit.
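The abstract itself contains no code, but a minimal sketch can illustrate the core idea of escalating notification salience when a change happens on a display the user is not looking at. Everything below is hypothetical: the enum values mirror the four levels named in the abstract, but the `AttentionAwareLabel` class and its members are illustrative names, not the actual AwToolkit API.

```csharp
using System;

// Hypothetical sketch (not the real AwToolkit API): the four notification
// levels from the abstract, ordered from least to most salient.
public enum NotificationLevel
{
    Unnoticeable, // almost imperceptible visual change
    Subtle,
    Intrusive,
    Disruptive    // clear and visually salient change
}

// A minimal attention-aware widget: it tracks whether its display is
// currently attended (e.g. as reported by an eye tracker) and escalates
// the notification level for changes on an unattended display.
public class AttentionAwareLabel
{
    public bool DisplayIsAttended { get; set; }
    public NotificationLevel UnattendedLevel { get; set; } = NotificationLevel.Subtle;

    private string text = "";

    public void SetText(string newText)
    {
        text = newText;
        // An attended display needs no extra signal; an unattended one
        // gets the configured, more salient level.
        var level = DisplayIsAttended ? NotificationLevel.Unnoticeable
                                      : UnattendedLevel;
        Render(level);
    }

    private void Render(NotificationLevel level)
    {
        Console.WriteLine($"[{level}] label updated to: {text}");
    }

    public static void Main()
    {
        var label = new AttentionAwareLabel { DisplayIsAttended = false };
        label.SetText("Patient vitals updated"); // fires a subtle notification
    }
}
```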
Title: An Evaluation of Dasher with a High-Performance Language Model as a Gaze Communication Method
Authors: Daniel Rough, Keith Vertanen, Per Ola Kristensson
Abstract: Dasher is a promising fast assistive gaze communication method. However, previous evaluations of Dasher have been inconclusive. Either the studies have been too short, involved too few participants, suffered from sampling bias, lacked a control condition, used an inappropriate language model, or a combination of the above. To rectify this, we report results from two new evaluations of Dasher carried out using a Tobii P10 assistive eye-tracker machine. We also present a method of modifying Dasher so that it can use a state-of-the-art long-span statistical language model. Our experimental results show that compared to a baseline eye-typing method, Dasher resulted in significantly faster entry rates (12.6 wpm versus 6.0 wpm in Experiment 1, and 14.2 wpm versus 7.0 wpm in Experiment 2). These faster entry rates were possible while maintaining error rates comparable to the baseline eye-typing method. Participants' perceived physical demand, mental demand, effort and frustration were all significantly lower for Dasher. Finally, participants rated Dasher as significantly more likeable, requiring less concentration and being more fun.
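For readers unfamiliar with the metric, the entry rates quoted above are words per minute under the standard text-entry convention that one word equals five characters. A quick sketch of that calculation (my own helper for illustration, not code from the paper):

```csharp
using System;

public static class EntryRate
{
    // Conventional text-entry metric: one "word" = five characters,
    // so wpm = (characters / 5) / elapsed minutes.
    public static double WordsPerMinute(int charactersEntered, TimeSpan elapsed)
        => (charactersEntered / 5.0) / elapsed.TotalMinutes;

    public static void Main()
    {
        // E.g. 63 characters entered in one minute = 12.6 wpm,
        // the Dasher rate reported for Experiment 1.
        Console.WriteLine(WordsPerMinute(63, TimeSpan.FromMinutes(1)));
    }
}
```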
This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.