St Andrews HCI Research Group

News

Laurel Riek, Facing Healthcare's Future: Designing Facial Expressivity for Robotic Patient Mannequins


Speaker: Laurel Riek, Computer Science and Engineering, University of Notre Dame
Date/Time: 1-2pm, September 4, 2012
Location: 1.33a Jack Cole, University of St Andrews
Abstract:
In the United States, an estimated 98,000 people are killed and $17.1 billion is lost each year due to medical errors. One way to prevent these errors is to have clinical students engage in simulation-based medical education, to help move the learning curve away from the patient. This training often takes place on human-sized android robots, called high-fidelity patient simulators (HFPS), which are capable of conveying human-like physiological cues (e.g., respiration, heart rate). Training with them can include anything from diagnostic skills (e.g., recognizing sepsis, a failure that recently killed 12-year-old Rory Staunton) to procedural skills (e.g., IV insertion) to communication skills (e.g., breaking bad news). HFPS systems give students a chance to safely make mistakes within a simulation context without harming real patients, with the goal that these skills will ultimately transfer to real patients.
While simulator use is a step in the right direction toward safer healthcare, one major challenge and critical technology gap is that none of the commercially available HFPS systems exhibit facial expressions, gaze, or realistic mouth movements, despite the vital importance of these cues in helping providers assess and treat patients. This is a critical omission, because almost all areas of health care involve face-to-face interaction, and there is overwhelming evidence that providers who are skilled at decoding communication cues are better healthcare providers – they have improved outcomes, higher compliance, greater safety, higher satisfaction, and they experience fewer malpractice lawsuits. In fact, communication errors are the leading cause of avoidable patient harm in the US: they are the root cause of 70% of sentinel events, 75% of which lead to a patient dying.
In the Robotics, Health, and Communication (RHC) Lab at the University of Notre Dame, we are addressing this problem by leveraging our expertise in android robotics and social signal processing to design and build a new, facially expressive, interactive HFPS system. In this talk, I will discuss our efforts to date, including: in situ observational studies exploring how individuals, teams, and operators interact with existing HFPS technology; design-focused interviews with simulation center directors and educators in which future HFPS systems are envisioned; and initial software prototyping efforts incorporating novel facial expression synthesis techniques.
About Laurel:
Dr. Laurel Riek is the Clare Boothe Luce Assistant Professor of Computer Science and Engineering at the University of Notre Dame. She directs the RHC Lab, and leads research on human-robot interaction, social signal processing, facial expression synthesis, and clinical communication. She received her PhD at the University of Cambridge Computer Laboratory, and prior to that worked for eight years as a Senior Artificial Intelligence Engineer and Roboticist at MITRE.

New study recruiting participants – Gesture Memorability.


Participants wanted for an experiment on gesture user interfaces – £20 in Amazon vouchers.
See the study page for more details!

PhD Studentship on Perceptual Gaze-contingent Displays at St Andrews


The SACHI group (Human-Computer Interaction) at the University of St Andrews, Scotland’s first university, is offering a full scholarship to join the School of Computer Science as a doctoral researcher for 3.5 years. The scholarship covers tuition fees and provides a living-expenses stipend.
The work will focus on the creation of new forms of visualization with gaze-contingent displays (electronic displays that have access to the location of the person’s gaze), their evaluation through laboratory studies, and the implementation of new visualization and interaction techniques. The student will work closely with Dr. Miguel Nacenta and within the SACHI group.
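To give a flavour of what "gaze-contingent" means in practice, here is a minimal illustrative sketch of a rendering loop that concentrates detail around the current gaze point. The eye-tracker function, renderer callbacks and foveal radius below are assumptions chosen for illustration, not part of the advertised project.

    # Illustrative sketch only: one frame of a gaze-contingent rendering loop.
    # get_gaze_position, render_high_detail and render_low_detail are hypothetical
    # placeholders for an eye-tracker API and a renderer.

    FOVEAL_RADIUS_PX = 150  # assumed size of the high-detail region around the gaze point

    def render_frame(scene, get_gaze_position, render_high_detail, render_low_detail):
        """Render one frame, spending detail only where the viewer is looking."""
        gaze_x, gaze_y = get_gaze_position()      # current gaze location on the display
        for element in scene:
            dx, dy = element.x - gaze_x, element.y - gaze_y
            if (dx * dx + dy * dy) ** 0.5 <= FOVEAL_RADIUS_PX:
                render_high_detail(element)       # full detail near the gaze point
            else:
                render_low_detail(element)        # reduced detail in the periphery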
Please visit Dr. Nacenta's site for more details.

New SICSA role for Aaron



As part of his work in the School of Computer Science, Aaron is joining the Scottish Informatics and Computer Science Alliance (SICSA) executive from the start of August 2012 as the deputy director for knowledge exchange, a two-year role. As a result, he is stepping down as theme leader for Multimodal Interaction. Aaron has enjoyed his time working with Professor Stephen Brewster and is looking forward to joining the executive next month.

Luke Hutton, Virtual Walls: Studying the effectiveness of the privacy metaphor in the real world


Speaker: Luke Hutton, SACHI
Date/Time: 1-2pm, July 10, 2012
Location: 1.33a Jack Cole, University of St Andrews
Abstract:
The virtual wall is a simple privacy metaphor for ubiquitous computing environments. By expressing the transparency of a wall and the people to which the wall applies, a user can easily manage privacy policies for sharing their sensed data in a ubiquitous computing system.
While previous research shows that users understand the wall metaphor in a lab setting, the metaphor has not been studied for its practicality in the real world. This talk will describe a smartphone-based experience sampling method study (N=20) to demonstrate that the metaphor is sufficiently expressive to be usable in real-world scenarios. Furthermore, while people’s preferences for location sharing are well understood, our study provides insight into sharing preferences for a multitude of contexts. We find that whom data are shared with is the most important factor for users, reinforcing the walls approach of supporting apply-sets and abstracting away further granularity to provide improved usability.
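As a rough illustration of the metaphor, a wall can be thought of as a transparency level plus the set of people it applies to. The sketch below is illustrative only: the three-level transparency scale, field names and check logic are assumptions made for exposition, not the implementation studied in the talk.

    # Illustrative sketch only: a virtual wall as a transparency level plus an apply-set.
    from dataclasses import dataclass

    TRANSPARENT, TRANSLUCENT, OPAQUE = "transparent", "translucent", "opaque"

    @dataclass
    class VirtualWall:
        transparency: str   # how much of the sensed data passes through the wall
        apply_set: set      # the people this wall applies to

        def allows(self, requester, detail_level):
            """Return True if `requester` may see sensed data at `detail_level`."""
            if requester not in self.apply_set:
                return False
            if self.transparency == TRANSPARENT:
                return True                        # full detail visible
            if self.transparency == TRANSLUCENT:
                return detail_level == "summary"   # only coarse information visible
            return False                           # opaque: nothing shared

    # Example: share only summary data with two colleagues.
    office_wall = VirtualWall(TRANSLUCENT, {"alice", "bob"})
    office_wall.allows("alice", "summary")  # True
    office_wall.allows("carol", "summary")  # False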
About Luke:
Luke’s bio on the SACHI website.

Masterclass in mobile user studies


Dr Apu Kapadia is a Distinguished SICSA Visitor in August 2012. As part of his visit, we are organising a pair of masterclasses in running mobile user studies. These masterclasses are open to all SICSA PhD students. Students will need to be available to attend both masterclasses:

  • Thursday 2 August, University of Glasgow
  • Thursday 9 August, University of St Andrews

The classes will cover how to design and run a mobile user study using smartphones, and in particular the use of the experience sampling method (ESM), a currently popular methodology for collecting rich data from participants in the real world. In the first class, attendees will learn about the methodology and be given a smartphone. Attendees will then carry the smartphone and participate in a small study, and we will cover data analysis in the second class in St Andrews. The organisers have experience in running ESM studies looking at mobility, social networking, security and privacy, but the methodology should be of interest to PhD students in both the NGI and MMI themes.
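For those new to the method, the sketch below shows one common way ESM prompts are scheduled: a handful of randomised prompts per day during waking hours. The function names and parameters are hypothetical and not tied to the study software used in the masterclasses.

    # Illustrative sketch only: randomised scheduling of experience sampling (ESM) prompts.
    # send_prompt and the parameters below are hypothetical placeholders.
    import random

    PROMPTS_PER_DAY = 6        # assumed sampling rate
    WAKING_WINDOW = (9, 21)    # assumed hours between which prompts may fire

    def schedule_daily_prompts():
        """Return a sorted list of prompt times (in hours) for one day."""
        start, end = WAKING_WINDOW
        return sorted(random.uniform(start, end) for _ in range(PROMPTS_PER_DAY))

    def deliver(send_prompt):
        """Ask the participant a short question at each scheduled time."""
        for t in schedule_daily_prompts():
            send_prompt(at_hour=t, question="What are you doing right now, and with whom?")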

If you have any questions or would like to attend, please e-mail Tristan Henderson (tnhh@st-andrews.ac.uk) before the 16th of July.

Biography of Dr Apu Kapadia:

Apu Kapadia is an Assistant Professor of Computer Science and Informatics at the School of Informatics and Computing, Indiana University. He received his Ph.D. in Computer Science from the University of Illinois at Urbana-Champaign in October 2005.

Dr Kapadia has published over thirty peer-reviewed conference papers and journal articles focused on privacy, with several of these at top-tier venues such as ACM TISSEC, IEEE TDSC, PMC, CCS, NDSS, Pervasive, and SOUPS. For his work on accountable anonymity, two of his papers were named as "Runners-up for PET Award 2009: Outstanding Research in Privacy Enhancing Technologies", a prestigious award in the privacy community. His work on usable metaphors for controlling privacy was given the "Honorable Mention Award (Runner-up for Best Paper)" at Pervasive. Dr Kapadia's recent work on smartphone "sensory" malware that makes use of onboard sensors was published at NDSS and received widespread media coverage. His work on analyzing privacy leaks on Twitter also received media attention, being named one of the "7 Must-Read Twitter Studies from 2011" and one of "The 10 Most Interesting Social Media Studies of 2011".

Dr Kapadia is interested in topics related to systems security and privacy. He is particularly interested in security and privacy issues related to mobile sensing, privacy-enhancing technologies to facilitate anonymous access to services with some degree of accountability, usable mechanisms to improve security and privacy, and security in decentralized and mobile environments.

Ubiquitous User Modeling – U2M'2012


This week Aaron has been attending a research workshop of the Israel Science Foundation on Ubiquitous User Modeling (U2M'2012) – State of the art and current challenges, in Haifa, Israel. Aaron's talk at this event was entitled Eyes, Gaze, Displays: User Interface Personalisation "You Lookin' at me?". In it he covered work with Mike Bennett, Umar Rashid, Jakub Dostal, Miguel A. Nacenta and Per Ola Kristensson from SACHI. The talk was a good way to show the interlocking and related research going on in SACHI.
His talk included references to a number of recent papers from the group. The alternative yet related viewpoints in this work made for a stimulating presentation and fruitful discussion with the international audience.

Lindsay MacDonald, A Very Delicate Agreement: A Process of Collaboration Between Disciplines


Speaker: Lindsay MacDonald, University of Calgary, Canada
Date/Time: 1-2pm, July 3, 2012
Location: 1.33a Jack Cole, University of St Andrews
Abstract:
In contrast to the romantic image of an artist working alone in a studio, large-scale media art pieces are often developed and built by interdisciplinary teams. Lindsay MacDonald will describe the process of creating and developing one of these pieces, A Delicate Agreement, within such a team, and offer personal insight into the impact that this has had on her artistic practice.
A Delicate Agreement is a gaze-triggered interactive installation that explores the potentially awkward act of riding in an elevator with another person. It is a set of elevator doors with a peephole in each door that entices viewers to peer inside and observe an animation of the passengers. Each elevator passenger, or character, has a programmed personality that enables them to act and react to the other characters’ behaviour and the viewers’ gaze. The result is the emergence of a rich interactive narrative made up of encounters in the liminal time and space of an elevator ride.
A Delicate Agreement is currently part of the New Alberta Contemporaries exhibition at the Esker Foundation in Calgary, Canada. For more information about the piece, please visit http://www.lindsaymacdonald.net/portfolio/a-delicate-agreement/.
About Lindsay:
Lindsay MacDonald is a Ph.D. student, artist, designer and interdisciplinary researcher from the Interactions Lab (iLab) at the University of Calgary in Canada. Lindsay's approach to research and creative production combines methodologies from both computer science and art, and she divides her time between the iLab and her studio in the Department of Art. Her research interests include interaction design, coded behaviour and performance, and building interactive art installations.

Pervasive 2012


This week Aaron Quigley and Tristan Henderson attended Pervasive 2012, the Tenth International Conference on Pervasive Computing, at Newcastle University.
On Monday Aaron attended Pervasive Intelligibility, the Second Workshop on Intelligibility and Control in Pervasive Computing. Here he presented a paper entitled Designing Mobile Computer Vision Applications for the Wild: Implications on Design and Intelligibility (PDF) by Per Ola Kristensson, Jakub Dostal and Aaron Quigley. Later, he was a panellist with Judy Kay and Simone Stumpf, where they discussed the research challenges of intelligibility in pervasive computing along with all participants.
On Tuesday Tristan attended the First Workshop on Recent Advances in Behavior Prediction and Pro-active Pervasive Computing, where he presented the paper Predicting location-sharing privacy preferences in social network applications by Greg Bigwood, Fehmi Ben Abdesslem and Tristan Henderson.
On Tuesday Aaron also chaired the Doctoral Consortium with Elaine Huang from the University of Zurich, with five panellists and nine students. The panellists included Adrian Friday (Lancaster University, UK), Jin Nakazawa (Keio University, Japan) and AJ Brush (Microsoft Research, Seattle, USA).
The Pervasive 2012 doctoral consortium provided a collegial and supportive forum in which PhD students could present and defend their doctoral research-in-progress with constructive feedback and discussion. The consortium was guided by a panel of experienced researchers and practitioners with both academic and industrial experience. It offered students a valuable opportunity to receive high-quality feedback and fresh perspectives from recognised international experts in the field, and to engage with other senior doctoral students.
The day ended with a career Q&A session with an extended panel including Tristan Henderson from SACHI and Rene Mayrhofer. Following this, the panellists and students had dinner together to continue the research and career discussions.
Along with his work as a member of the joint steering committee of Pervasive and UbiComp, Aaron was a session chair for the HCI session on Thursday.

Two papers presented at the Pervasive Displays symposium in Porto.



Last week (4-5 June, 2012) two papers from SACHI were presented at the Pervasive Displays International Symposium. The symposium took place in Porto, Portugal, and it was a great showcase of work on pervasive display systems from research groups in Europe, Japan and North America.
We presented two papers:

  • Factors Influencing Visual Attention Switch in Multi-Display User Interfaces: A Survey, which is part of the dissertation work of Umar Rashid, and
  • The LunchTable: A Multi-User, Multi-Display System for Information Sharing in Casual Group Interactions, which is part of a collaboration with the iLab at the University of Calgary.

Both papers were well received and will soon be available through the ACM Digital Library, but the most interesting part was sharing the energy of the pervasive displays community and the amazing discussions. We are hoping for a new edition next year, which will be held in California.