St Andrews HCI Research Group

News

Congratulations to Xu, Pireh, and Abd.


We are delighted to see Xu, Pireh, and Abd graduate this week! Congratulations!

Emory – St Andrews Collaborative Research Grant


Dr Jason Jacques (St Andrews) and Dr Kristin Williams (Emory) have been awarded the Emory – St Andrews Collaborative Research Grant for their project ‘Considering Household Division of Labor when Engaging Civic Participation in Environmental Stewardship’.

Climate change and environmental policy research often focuses on planetary-scale changes that are challenging for individuals to contextualise in their own environment. The project focuses on exploring and understanding how individuals engage with environmental stewardship at a hyper-local level: how do individual households understand their environment, and are they able to draw connections between their actions, the activities in their community, and the environment around them?

In answering these questions, the project will connect households and communities, combining this network of citizen scientists and environmental sensors to enhance their understanding through bespoke visualisations customised for these communities.

“It’s great that St Andrews is willing to take the initiative to support international collaboration on big questions, including environmental policy and sustainability. Connecting with Kristin, and others at Emory, for this project has offered unique opportunities to build capabilities and contrast the unique needs of communities on both sides of the Atlantic.”

See more information about the Global Partnerships at St Andrews.

User Troubles during “Shoot St Andrews to Green”!


A map showing missing images from OpenStreetMap for St Andrews

Many photos of St Andrews are missing from open-access maps. WikiShootMe allows anyone to add an image to places on Wikimedia and Wikipedia that don’t already have one. So we took the initiative to photograph St Andrews’ historic buildings and upload them to Wikimedia Commons using WikiShootMe. However, WikiShootMe is currently a desktop-only website and is difficult to use when out and about. Many usability challenges emerged, leading us to turn this into a User-Centred Interaction Design project.


Measuring heart rate and blood oxygen remotely in the home


Pireh Pirzada has developed and validated the first remote photoplethysmography (rPPG) system (Automated Remote Pulse Oximetry System, or ARPOS) that measures both heart rate and blood oxygenation levels remotely within participants’ home environments (real-life scenarios).

The research shares the first dataset collected in real-life scenarios, covering factors such as skin pigmentation, illumination, beards, makeup, and glasses. It also shares the experiment protocol and source code used to collect and analyse the data.

 

Abstract:

Current methods of measuring heart rate (HR) and oxygen levels (SPO2) require physical contact, are individualised, and for accurate oxygen levels may also require a blood test. No-touch or non-invasive technologies are not currently commercially available for use in healthcare settings. To date, there has been no assessment of a system that measures HR and SPO2 using commercial off-the-shelf camera technology that utilises R, G, B, and IR data. Moreover, no formal remote photoplethysmography studies have been performed in real-life scenarios with participants at home with different demographic characteristics. This novel study addresses all these objectives by developing, optimising, and evaluating a system that measures the HR and SPO2 of 40 participants. HR and SPO2 are determined by measuring the frequencies from different wavelength band regions using FFT and radiometric measurements after pre-processing face regions of interest (forehead, lips, and cheeks) from colour, IR, and depth data. Detrending, interpolating, hamming, and normalising the signal with FastICA produced the lowest RMSE of 7.8 for HR with the r-correlation value of 0.85 and RMSE 2.3 for SPO2. This novel system could be used in several critical care settings, including in care homes and in hospitals and prompt clinical intervention as required.

Keywords: remote health monitoring; heart rate measurement; blood oxygenation level measurement; rPPG system
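For readers curious how such a pipeline fits together, below is a minimal Python sketch of the general rPPG approach described in the abstract (detrending, normalising, unmixing with FastICA, Hamming windowing, and FFT peak picking). It is a sketch under stated assumptions, not the ARPOS implementation: the frame rate, heart-rate frequency band, and component-selection heuristic are illustrative choices, and the authors’ actual code is linked below.

# Minimal rPPG heart-rate sketch from the mean colour of a face region of interest.
# NOT the ARPOS code (see the repository link below); frame rate, band limits,
# and the component-selection heuristic are illustrative assumptions.
import numpy as np
from scipy.signal import detrend, get_window
from sklearn.decomposition import FastICA

def estimate_heart_rate(rgb_trace, fps=30.0, band=(0.7, 3.0)):
    """rgb_trace: array of shape (n_frames, 3) holding mean R, G, B per frame."""
    x = detrend(rgb_trace, axis=0)                     # remove slow drift
    x = (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-9)  # normalise each channel
    sources = FastICA(n_components=3, random_state=0).fit_transform(x)

    window = get_window("hamming", len(sources))
    freqs = np.fft.rfftfreq(len(sources), d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])  # roughly 42-180 bpm

    best_bpm, best_power = 0.0, -np.inf
    for source in sources.T:                           # keep the most periodic component
        spectrum = np.abs(np.fft.rfft(source * window))
        peak = int(np.argmax(np.where(in_band, spectrum, -np.inf)))
        if spectrum[peak] > best_power:
            best_power, best_bpm = spectrum[peak], freqs[peak] * 60.0
    return best_bpm

SPO2 estimation, as the abstract notes, additionally relies on radiometric comparisons across wavelength bands (including IR), which this heart-rate-only sketch omits.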

 

The research outputs also include:

Dataset: https://doi.org/10.5281/zenodo.6522389

Experiment protocol: dx.doi.org/10.17504/protocols.io.n2bvj6zkxlk5/v1

Code: https://github.com/PirehP/ARPOSpublic

 

Researchers:

Pireh Pirzada

HCI Staff Position at SACHI


Come and join our group! We are currently advertising for a new staff member to join our HCI group at the School of Computer Science.


Supporting the expansion and development of the SACHI group, topics of interest include but are not limited to: tangible computing, digital fabrication, ubiquitous computing, information visualization, human-centered artificial intelligence, augmented reality, novel software and hardware interactions, and critical HCI. Expertise in the field of HCI and technical expertise in the creation of hardware and/or software interactions is of particular interest.


For more details: https://www.jobs.ac.uk/job/CRS296/lecturer-senior-lecturer-reader-in-human-computer-interaction-ac7180gb


Closing Date: 17th August 2022


Please share far and wide

HCI meets Constraint Programming


Understanding How People Approach Constraint Modelling and Solving – University of St Andrews and University of Victoria

Ruth Hoffmann will be presenting the paper “Understanding How People Approach Constraint Modelling and Solving” at the 28th International Conference on Principles and Practice of Constraint Programming (CP 2022), taking place from July 31 to August 5, 2022, in Haifa, Israel.

This paper is a joint collaboration between Human Computer Interaction (SACHI) and Constraint Programming researchers at the University of St Andrews, Scotland, and the University of Victoria, BC, Canada.

Abstract

Research in constraint programming typically focuses on problem solving efficiency. However, the way users conceptualise problems and communicate with constraint programming tools is often sidelined. How humans think about constraint problems can be important for the development of efficient tools that are useful to a broader audience. For example, a system incorporating knowledge on how people think about constraint problems can provide explanations to users and improve the communication between the human and the solver.
We present an initial step towards a better understanding of the human side of the constraint solving process. To our knowledge, this is the first human-centred study addressing how people approach constraint modelling and solving. We observed three sets of ten users each (constraint programmers, computer scientists and non-computer scientists) and analysed how they find solutions for well-known constraint problems. We found regularities offering clues about how to design systems that are more intelligible to humans.


The paper can be found at: https://doi.org/10.4230/LIPIcs.CP.2022.28

Conference

Ruth will be presenting the paper at the main conference and giving an invited talk at ModRef 2022 to raise awareness of the benefits of understanding how people represent, model, and solve constraint problems.

CP 2022 Conference link: https://easychair.org/smart-program/FLoC2022/CP-2022-08-03.html#talk:197219

ModRef 2022 link: https://easychair.org/smart-program/FLoC2022/ModRef-2022-07-31.html#talk:197355

More ModRef info: https://modref.github.io/ModRef2022.html#invtalks

SACHI @ IEEE VIS in Vancouver


Uta Hinrichs, Fearn Bishop and Xu Zhu are representing SACHI this year at the IEEE VIS’19 conference, held in Vancouver, BC, Canada.

Fearn will present her research on exploring the free-form visualization processes of children. Xu will present his work on how people visually represent discrete constraint problems. Uta has been involved in research that introduces design by immersion as a novel transdisciplinary approach to problem-driven visualization. She is also co-chairing the VIS Doctoral Colloquium this year and co-organizing the 4th workshop on Visualization for the Digital Humanities (VIS4DH’19).

 

Design by Immersion: A Transdisciplinary Approach to Problem-driven Visualizations [preprint]
Kyle Wm. Hall, Adam Bradley, Uta Hinrichs, Samuel Huron, Jo Wood, Christopher Collins and Sheelagh Carpendale.

Tuesday, Oct. 22 – 2:35-3:50 PM  [preview video]
Provocations; Ballroom A

 

Construct-A-Vis: Exploring the Free-form Visualization Processes of Children [preprint]
Fearn Bishop, Johannes Zagermann, Ulrike Pfeil, Gemma Sanderson, Harald Reiterer and Uta Hinrichs.

Wednesday, Oct. 23 – 2:20-3:50 PM
(De)Construction; Ballroom A

How People Visually Represent Discrete Constraint Problems [TVCG paper; PDF]
Xu Zhu, Miguel Nacenta, Özgür Akgün and Peter W. Nightingale

Thursday, Oct. 24 – 9:00-10:30 AM [preview video]
Vis for Software and Systems; Ballroom B

2019 Recruitment: Lecturer and Associate Lecturer


Our School of Computer Science is looking to recruit two people to join us in this unique and captivating place. Seven centuries of history link the students with the town, making this an ancient yet modern institution where you will be at the forefront of topics in CS, e.g. Human Computer Interaction. https://www.st-andrews.ac.uk

  1. Lecturer advertisement  
  2. Associate Lecturer advertisement 

As noted in the School’s blog post on this, the school is particularly interested in recruiting someone with an interest in HCI into one of these posts.

The closing date is 25 October 2019.

[1] The School of Computer Science is looking to recruit a lecturer as part of a large ongoing expansion of our academic staff. We are especially, but not exclusively, interested in those working in Human Computer Interaction.

We wish to appoint a Lecturer to join our vibrant teaching and research community, which is ranked amongst the best in the world for Computer Science education and research. The successful candidate will be expected to have a range of interests, to be active in research publication that strengthens or complements that of the School, and to be capable of teaching the subject to undergraduate and taught postgraduate students who come to us with a wide range of backgrounds.

Candidates should hold a PhD in a cognate discipline. Excellent teaching skills and an interest in promoting knowledge exchange are essential. You should also have some familiarity with grant-seeking processes in relation to research councils and other sources. The Lectureship comes with an academic promotion track to Senior Lecturer, Reader, and Professor.


CHI 2021 Yokohama


Professor Aaron Quigley from SACHI and Professor Yoshifumi Kitamura (Tohoku University, Japan) are the general co-chairs for the ACM CHI Conference on Human Factors in Computing Systems in Yokohama in 2021. CHI is hosted by ACM SIGCHI, the Special Interest Group on Computer-Human Interaction.

The ACM CHI Conference on Human Factors in Computing Systems is the premier international conference for the field of Human-Computer Interaction (HCI). This flagship conference is generally considered the most prestigious in the field of HCI and attracts thousands of international attendees annually.

 

CHI provides a place where researchers and practitioners from across the world can gather to discuss the latest HCI topics. It has been held since 1982, and this is only the second time CHI will be held in Asia.

Best paper award at TVX 2019


Congratulations to Guilherme Carneiro, Miguel Nacenta, Alice Toniolo, Gonzalo Méndez and Aaron Quigley, who won the Best Student Paper award for their paper “Deb8: A Tool for Collaborative Analysis of Video” at TVX 2019.

Deb8 is a tool that allows collaborative analysis of video-based TV debates. It provides a novel UI designed to enable and capture rich synchronous collaborative discussion of videos, based on argumentation graphs that link quotes from the video, opinions, questions, and external evidence.
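As a rough illustration of what such an argumentation graph might look like as a data structure, here is a minimal Python sketch. This is not Deb8’s actual data model; the node kinds, fields, and relation labels are assumptions made purely for illustration.

# Illustrative argumentation graph linking video quotes, opinions, questions, and
# external evidence. NOT Deb8's data model; node kinds, fields, and relation
# labels here are assumptions for illustration only.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    kind: str                        # "quote" | "opinion" | "question" | "evidence"
    text: str
    timestamp: float | None = None   # video time in seconds for quotes, else None

@dataclass
class ArgumentGraph:
    nodes: dict[str, Node] = field(default_factory=dict)
    edges: list[tuple[str, str, str]] = field(default_factory=list)  # (src, dst, relation)

    def add(self, node: Node) -> None:
        self.nodes[node.node_id] = node

    def link(self, src: str, dst: str, relation: str) -> None:
        # relation could be e.g. "supports", "disputes", or "answers"
        self.edges.append((src, dst, relation))

# Example: an opinion disputing a quote made 42.5 seconds into the debate.
graph = ArgumentGraph()
graph.add(Node("q1", "quote", "We have already met the emissions target.", timestamp=42.5))
graph.add(Node("o1", "opinion", "The figures quoted exclude aviation."))
graph.link("o1", "q1", "disputes")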
