St Andrews HCI Research Group

News

Seminar: Harnessing Usability, UX and Dependability for Interactions in Safety Critical Contexts


Event Details

  • When: Monday 3 February 2020, 11:00 – 12:00
  • Where: JCB:1.33A – Teaching Laboratory

Abstract: Innovation and creativity are the research drivers of the Human-Computer Interaction (HCI) community, which is currently investing a vast amount of resources in the design and evaluation of “new” user interfaces and interaction techniques, leaving the correct functioning of these interfaces at the discretion of the helpless developers.  In the area of formal methods and dependable systems the emphasis is usually put on the correct functioning of the system, leaving its usability as a secondary-level concern (if addressed at all).  However, designing interactive systems requires blending knowledge from these domains in order to provide operators with enjoyable, usable and dependable systems.  The talk will present possible research directions and their benefits for combining several complementary approaches to engineer interactive critical systems.  Due to their specificities, addressing this problem requires the definition of methods, notations, processes and tools to go from early informal requirements to deployed and maintained operational interactive systems.  The presentation will highlight the benefits of (and the need for) an integrated framework for the iterative design of operators’ procedures and tasks, training material and the interactive system itself.  The emphasis will be on interaction technique specification and validation, as their design is usually the main concern of HCI conferences.  A specific focus will be on automation, which is widely integrated in interactive systems both at the interaction-technique level and at the application level.  Examples will be taken from interactive cockpits on large civil commercial aircraft (such as the A380), satellite ground segment applications and Air Traffic Control workstations.

Bio: Dr. Philippe Palanque is Professor in Computer Science at the University Toulouse 3 “Paul Sabatier” and is head of the Interactive Critical Systems group at the Institut de Recherche en Informatique de Toulouse (IRIT) in France. Since the late 80s he has been working on the development and application of formal description techniques for interactive systems. He has worked for more than 10 years on research projects to improve interactive Ground Segment Systems at the Centre National d’Etudes Spatiales (CNES) and is also involved in the development of software architectures and user interface modeling for interactive cockpits in large civil aircraft (funded by Airbus). He was involved in the research network HALA! (Higher Automation Levels in Aviation), funded by the SESAR programme, which aims at building the future European air traffic management system. The main driver of Philippe’s research over the last 20 years has been to address Usability, Safety and Dependability in an even-handed way in order to build trustable safety-critical interactive systems. He is the secretary of the IFIP Working Group 13.5 on Resilience, Reliability, Safety and Human Error in System Development, was steering committee chair of the CHI conference series at ACM SIGCHI and chair of the IFIP Technical Committee 13 on Human-Computer Interaction.

 


Seminar: Toward magnetic force based haptic rendering and friction based tactile rendering


Event Details

  • When: Thursday 14 November 2019, 2-3pm
  • Where: JCB:1.33B – Teaching Laboratory

Title: Toward magnetic force based haptic rendering and friction based tactile rendering

Abstract: Among all senses, the haptic system provides a unique and bidirectional communication channel between humans and the real world around them.  Extending the frontier of traditional visual rendering and auditory rendering, haptic rendering enables human operators to actively feel, touch and manipulate virtual (or remote) objects through force and tactile feedback, which further increases the quality of Human-Computer Interaction.  It has been effectively used for a number of applications including surgical simulation and training, virtual prototyping, data visualization, nano-manipulation, education and other interactive applications.  My talk will explore the design and construction of our magnetic haptic interface for force feedback and our surface-friction-based tactile rendering system, which combines the electrovibration effect and the squeeze film effect.

Bio: Dr Xiong Lu is an Associate Professor in the College of Control Engineering at Nanjing University of Aeronautics and Astronautics and an academic visitor in the St Andrews HCI research group in the School of Computer Science at the University of St Andrews.  He received his Ph.D. degree in Measuring and Testing Technologies and Instruments from Southeast University, China.  His main research interests are Human-Computer Interaction, Haptic Rendering and Tactile Rendering.

 

DLS: Multimodal human-computer interaction: past, present and future


Event details

  • When: 8th October 2019 09:30 – 15:15
  • Where: Byre Theatre
  • Series: Distinguished Lectures Series
  • Format: Distinguished lecture

Speaker: Stephen Brewster (University of Glasgow)

Timetable:

9:30 Lecture 1: The past: what is multimodal interaction?
10:30 Coffee break
11:15 Lecture 2: The present: does it work in practice?
12:15 Lunch (not provided)
14:15 Lecture 3: The future: where next for multimodal interaction?

Speaker Bio:

Professor Brewster is a Professor of Human-Computer Interaction in the Department of Computing Science at the University of Glasgow, UK. His main research interests are in Multimodal Human-Computer Interaction: sound, haptics and gestures. He has carried out extensive research into Earcons, a particular form of non-speech sound.

He did his degree in Computer Science at the University of Hertfordshire in the UK. After a period in industry he did his PhD in the Human-Computer Interaction Group at the University of York in the UK with Dr Alistair Edwards. The title of his thesis was “Providing a structured method for integrating non-speech audio into human-computer interfaces”. That is where he developed his interests in Earcons and non-speech sound.

After finishing his PhD he worked as a research fellow for the European Union as part of the European Research Consortium for Informatics and Mathematics (ERCIM). From September, 1994 – March, 1995 he worked at VTT Information Technology in Helsinki, Finland. He then worked at SINTEF DELAB in Trondheim, Norway.

 

Seminar: Brain-based HCI – What brain data can tell us about HCI


Event details

  • When: Friday 25 October 2019, 2-3pm
  • Where: JCB:1.33B – Teaching Laboratory

Abstract: This talk will describe a range of our projects utilising functional Near Infrared Spectroscopy (fNIRS) in HCI.  A portable alternative that is more tolerant of motion artefacts than EEG, fNIRS measures the amount of oxygen in the brain, which rises as, for example, mental workload creates demand.  As opposed to BCI (trying to control systems with our brain), we focus on brain-based HCI, asking what brain data can tell us about our software, our work, our habits, and ourselves.  In particular, we are driven by the idea that brain data can become personal data in the future.

Bio: Dr Max L. Wilson is an Associate Professor in the Mixed Reality Lab in Computer Science at the University of Nottingham.  His research focus is on evaluating Mental Workload in HCI contexts – as real-world as possible – primarily using functional Near Infrared Spectroscopy (fNIRS).  As a highly tolerant form of brain sensor, fNIRS is suitable for use in HCI research into user interface design, work tasks, and everyday experiences.  This work emerged from his prior research into the design and evaluation of complex user interfaces for information interfaces. Across these two research areas, Max has over 120 publications, including an Honourable Mention CHI 2019 paper on a Brain-Controlled Movie – The MOMENT.

Seminar: Maps, Space and the 2D Plane from the Data and User Interface Perspective


Event details

  • When: Tuesday 15 October 2019, 3-4pm
  • Where: School VI, United Colleges

Title: “Maps, Space and the 2D Plane from the Data and User Interface Perspective”

Abstract: The 2D plane underpins most displays of information and therefore most of the ways in which interface designers and data analysts can dynamically represent information. As a user interface and information visualization designer/researcher I encounter the 2D plane often as a necessity and sometimes as an opportunity to enhance human cognitive processes.

Maps, which are the original example of using the 2D plane to represent information, often serve as inspiration. In this talk, I will discuss some of my most exciting encounters with the 2D plane and maps, and reflect on their deeper affordances to support thinking and understanding. I hope also to engage in conversation with you in the audience about what maps and the 2D plane mean for you and how you use them.


Rachel Menzies: Unlocking Accessible Escape Rooms: Is Technology the Key?


Event details

  • When: 2nd April 2019 14:00 – 15:00
  • Where: Cole 1.33a
  • Series: School Seminar Series
  • Format: Seminar

Abstract:

Escape rooms are popular recreational activities whereby players are locked in a room and must solve a series of puzzles in order to ‘escape’. Recent years have seen a large expansion in the use of technology in these rooms in order to provide ever-changing and increasingly immersive experiences. This technology could be used to minimise accessibility issues for users, e.g. those with hearing or visual impairments, so that they can engage in the same way as their peers without disabilities. Escape room designers and players completed an online questionnaire exploring the use of technology and the accessibility of escape rooms. Results show that accessibility remains a key challenge in the design and implementation of escape rooms, despite the inclusion of technology that could be used to improve the experience of users with disabilities. This presentation will explore the lack of accessibility within escape rooms and the potential for technology to bridge this gap.

Speaker Bio:

Dr Rachel Menzies is the Head of Undergraduate Studies for Computing at the University of Dundee and is the current SICSA Director of Education (https://www.sicsa.ac.uk/education/). She co-directs the UX’d research group (https://www.ux-d.co.uk/) and her research interests include user-centred design with marginalised user groups, such as users with disabilities, as well as exploring novel interfaces, data visualisation and CS education. Her most recent work focusses on accessibility in escape rooms, in particular how users with varied disabilities can access and enjoy the experience alongside typical users.


Keynote at KAIST, Korea



Aaron with two of his co-authors, Juyoung Lee and Hyung-il Kim, at KAIST.


On February 1st, Professor Quigley delivered an invited talk as part of the ACM Distinguished Speaker Program during the HCI@KAIST International Workshop at KAIST, Daejeon, South Korea.

CUTE Centre Seminar, Singapore


Professor Aaron Quigley is currently in Singapore on sabbatical with the CUTE Centre. His welcome seminar was on the topic of Discreet Computing and showcased a number of SACHI projects.

SACHI Seminar: Jason Alexander (Lancaster University) – What would you do if you could touch your data?



Title:  What would you do if you could touch your data?
Abstract: Data Physicalizations are physical artefacts whose geometry or material properties encode data. They bring digital datasets previously locked behind 2D computer screens out into the physical world, enabling exploration, manipulation, and understanding using our rich tactile senses. My work explores the design and construction of dynamic data physicalizations, where users can interact with physical datasets that dynamically update. I will describe our data physicalization vision and show our progress on designing, building, and evaluating physicalizations and discuss the many exciting challenges faced by this emerging field.
Speaker biography:  Jason is a Senior Lecturer in the School of Computing and Communications at Lancaster University. He has a BSc(Hons) and PhD in Computer Science from the University of Canterbury in New Zealand and was previously a post-doctoral researcher at the University of Bristol. His research is broadly in Human-Computer Interaction, with a particular interest in developing novel interactive systems to bridge the physical-digital divide. His recent work focuses on the development of shape-changing interfaces—surfaces that can dynamically change their geometry based on digital content—and their application to data physicalization. He also has interests in digital fabrication and novel haptic interaction techniques.

Event details

  • When: 29th November 2018 14:00 - 15:00
  • Where: Cole 1.33a

SACHI Seminar – Professor Anirudha Joshi: The story of Swarachakra – Cracking the puzzle of text input in Indian languages



Title: The story of Swarachakra – Cracking the puzzle of text input in Indian languages
Abstract: There was a time when text input in Indian languages was called a ‘puzzle’. People found it so difficult that it became a barrier preventing them from using most other technology products, and from doing common tasks such as searching the web or saving a contact. As a result, Indians typed very little in their own languages. The Roman script (in which we write English) is an alphabet. In contrast, a large majority of Indian scripts are Abugidas – a different type of script. In our lab, we were convinced that we needed different solutions – what works for alphabets may not work for Abugidas. Over the years we explored several designs. Our early solutions were for desktop computers. Later we developed concepts for feature phones. We tried several creative ideas and made prototypes. We got interesting results in the lab. We published papers and case studies. But beyond that, we could not reach out and make a difference to the end-users. Then smartphones arrived, and quickly became popular. It became relatively easier to develop and deploy keyboards. Again, we tried several ideas. One solution stood out in comparison with others. We called it “Swarachakra”. Today, Swarachakra is available for 12 Indian languages and has been downloaded by about 4 million users. What was the problem, and how was it solved? And what challenges remain? Come to the talk to find out.
Speaker biography: Anirudha Joshi is professor in the interaction design stream of the IDC School of Design, IIT Bombay, India, though he is currently on sabbatical, visiting universities in the UK. He specialises in the design of interactive products for emergent users in developing economies. He has worked in diverse domains including healthcare, literacy, Indian language text input, banking, education, industrial equipment, and FMCG packaging. Anirudha also works in the area of integrating HCI activities with software engineering processes. He has developed process models, tools, and metrics to help HCI practitioners deliver a better user experience. Anirudha is active in HCI communities in India and beyond. He has chaired in various roles at several conferences including India HCI, INTERACT and CHI. Since 2007, he has represented India on IFIP TC13. He has been the founding director of the HCI Professionals Association of India since 2013. Since 2015 he has been the Liaison for India of the ACM SIGCHI Asian Development Committee. Since 2016, he has been the VP Finance of the ACM SIGCHI Executive Committee. Anirudha has a diverse background: a BTech (1989) in Electrical Engineering, an MDes (1992) in Visual Communication Design, and a PhD (2011) in Computer Science and Engineering, all from IIT Bombay.

Event details

  • When: 29th October 2018 15:00 - 16:00
  • Where: Cole 1.33a