Touchscreens are ever-present in today's technologies. These large, featureless sensors are rapidly replacing physical keys and buttons on a wide array of digital devices, most commonly the mobile device. Gaining popularity across all demographics and endorsed for their flexible soft-interface design and rich gestural interactions, touchscreens currently play a pivotal role in digital technologies. However, just as touchscreens have enabled many people to engage with digital technologies, their barriers to access are excluding many others with visual and motor impairments. Contemporary techniques to address these accessibility issues fail to consider the variable nature of abilities between people and the ever-changing characteristics of an individual's impairment. User models for personalisation are often constructed from stereotypical generalisations about the similarities of people with disabilities, neglecting the unique characteristics of the individuals themselves. Existing strategies for measuring abilities and performance require users to complete exhaustive training exercises that disrupt the intended interactions, and produce descriptions of a user's performance that are valid only for that particular instance.
This research aimed to develop novel techniques to support the continuous measurement of individual users' needs and abilities through natural touchscreen device interactions. The goal was to create detailed interaction models for individual users in order to understand the short- and long-term variances in their abilities and characteristics, and thereby to develop interface adaptations that better support the interaction needs of people with visual and motor impairments.
Kyle Montague is a PhD student based within the School of Computing at the University of Dundee. Kyle works as part of the Social Inclusion through the Digital Economy (SiDE) research hub. He is investigating the application of shared user models and adaptive interfaces to improve the accessibility of digital touchscreen technologies for people with vision and motor impairments.
His doctoral studies explore novel methods of collecting and aggregating user interaction data from multiple applications and devices, creating domain-independent user models to better inform adaptive systems of individuals' needs and abilities.
Alongside his research, Kyle works for iGiveADamn, a small digital design company he set up with fellow graduate Jamie Shek. Past projects include the iGiveADamn Connect platform for charities, Scottish Universities Sports, and the Blood Donation mobile apps. Prior to this, he completed an undergraduate degree in Applied Computing at the University of Dundee.