News

SACHI at DIS 2014


Jakub Dostal from SACHI and external co-authors published a paper at this year’s DIS conference. The paper was awarded the Best Paper award, given to the top 1% of papers at the conference. This paper is one of the outcomes of a Dagstuhl seminar on Proxemics in Human-Computer Interaction, which Aaron Quigley helped organise and which was attended by several SACHI members.

Dark Patterns in Proxemic Interactions: A Critical Perspective.

Saul Greenberg, Interactions Lab, Department of Computer Science, University of Calgary
Sebastian Boring, Department of Computer Science, University of Copenhagen
Jo Vermeulen, Expertise Centre for Digital Media, Hasselt University
Jakub Dostal, School of Computer Science, University of St Andrews

Abstract:
Proxemics theory explains people’s use of interpersonal distances to mediate their social interactions with others. Within Ubicomp, proxemic interaction researchers argue that people have a similar social understanding of their spatial relations with nearby digital devices, which can be exploited to better facilitate seamless and natural interactions. To do so, both people and devices are tracked to determine their spatial relationships. While interest in proxemic interactions has increased over the last few years, it also has a dark side: knowledge of proxemics may (and likely will) be easily exploited to the detriment of the user. In this paper, we offer a critical perspective on proxemic interactions in the form of dark patterns: ways proxemic interactions can be misused. We discuss a series of these patterns and describe how they apply to these types of interactions. In addition, we identify several root problems that underlie these patterns and discuss potential solutions that could lower their harmfulness.

More details about the paper can be found in the ACM Digital Library.

SACHI at CHI 2014: What to see


Members of SACHI are presenting a number of papers and other works at this year’s CHI in Toronto, Canada. The schedule below will allow you to see a sample of the Human-Computer Interaction research at the University of St Andrews.

Paper (Honourable Mention): Depth Perception with Gaze-contingent Depth of Field
Session: The Third Dimension
When: Monday 11:40-12:00
Where: 718AB
Teaser Video

Interactivity: Text Blaster: A Multi-Player Touchscreen Typing Game
When: Monday 17:30-19:30
Where: Exhibit Hall E
Teaser Video

Paper (Best Paper): RetroDepth: 3D Silhouette Sensing for High-Precision Input On and Above Physical Surfaces
Session: On and Above the Surface
When: Tuesday 11:00-11:20
Where: Exhibit Hall G
Teaser Video

AltCHI Paper: None of a CHInd: Relationship Counselling for HCI and Speech Technology
Session: Limits and Futures
When: Tuesday 11:20-11:40
Where: 717AB
Teaser Video

Paper: Modeling the Perception of User Performance
Session: User Models and Prediction
When: Tuesday 14:00-14:20
Where: 801A
Teaser Video

TOCHI Paper: Complementing Text Entry Evaluations with a Composition Task
Session: Text Entry and Evaluation
When: Wednesday 9:40-10:00
Where: Exhibit Hall G
Teaser Video

Paper: Uncertain Text Entry on Mobile Devices
Session: Text Entry and Evaluation
When: Wednesday 10:00-10:20
Where: Exhibit Hall G
Teaser Video

SIG: The Usability of Text Entry Systems Now and in the Future
When: Wednesday 11:00-12:20
Where: 715A

Paper: Quantitative Measurement of Virtual vs. Physical Object Embodiment through Kinesthetic Figural After Effects
Session: Multitouch Interaction
When: Wednesday 14:40-15:00
Where: 718AB
Teaser Video

Aaron Quigley was an associate chair for the Interaction Using Specific Capabilities or Modalities sub-committee, and Per Ola Kristensson was an associate chair for the Interaction Techniques and Devices sub-committee. Aaron Quigley also served as a session chair, while Jakub Dostal and Michael Mauderer were Student Volunteers throughout the conference.


Presenting SpiderEyes at IUI 2014


At the end of February, Jakub Dostal and Per Ola Kristensson will be attending IUI 2014 in Haifa, Israel.

Jakub will be presenting the full paper SpiderEyes: Designing Attention and Proximity-Aware Collaborative Interfaces for Wall-Sized Displays by Jakub Dostal, Uta Hinrichs, Per Ola Kristensson and Aaron Quigley. This paper introduces the concept of collaborative proxemics: enabling groups of people to collaboratively use attention- and proximity-aware applications. To help designers create such applications we have developed SpiderEyes: a system and toolkit for designing attention- and proximity-aware collaborative interfaces for wall-sized displays. SpiderEyes is based on low-cost technology and allows accurate markerless attention-aware tracking of multiple people interacting in front of a display in real-time. The paper discusses how this toolkit can be applied to design attention- and proximity-aware collaborative scenarios around large wall-sized displays, and how the information visualisation pipeline can be extended to incorporate proxemic interactions.
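For readers wondering what attention- and proximity-aware behaviour looks like in practice, the short Python sketch below maps a tracked viewer’s distance and attention state to a rendering detail level. It is purely illustrative: the names and thresholds are assumptions for this example, not the SpiderEyes API described in the paper.

    from dataclasses import dataclass

    @dataclass
    class Viewer:
        """One tracked person in front of the wall display (hypothetical structure)."""
        distance_m: float   # estimated distance from the display, in metres
        attending: bool     # whether the viewer's attention is on the display

    def detail_level(viewer: Viewer) -> str:
        """Map a viewer's proximity and attention to a rendering detail level.

        Thresholds are illustrative; a real system would calibrate them to the
        display size and the accuracy of the tracking hardware.
        """
        if not viewer.attending:
            return "ambient"       # not looking: show only glanceable content
        if viewer.distance_m < 1.5:
            return "full"          # close and attending: fine-grained detail
        if viewer.distance_m < 3.0:
            return "overview"      # mid-range: aggregated view
        return "ambient"

    # Two collaborators tracked in front of the same display
    viewers = [Viewer(1.2, True), Viewer(4.0, False)]
    print([detail_level(v) for v in viewers])   # ['full', 'ambient']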

You will soon be able to read more about this work on its dedicated project page: SpiderEyes.

Jakub is also a Student Volunteer for the conference. Per Ola is a member of the Senior Programme Committee for IUI 2014.

Jakub Dostal, Subtle Gaze-Dependent Techniques for Visualising Display Changes in Multi-Display Environments


Abstract:
Modern computer workstation setups regularly include multiple displays in various configurations. With such multi-monitor or multi-display setups we have reached a stage where we have more display real-estate available than we are able to comfortably attend to. This talk will present the results of an exploration of techniques for visualising display changes in multi-display environments. Apart from four subtle gaze-dependent techniques for visualising change on unattended displays, it will cover the technology used to enable quick and cost-effective deployment to workstations. An evaluation of both the technology and the techniques themselves will also be presented. The talk will conclude with a brief discussion on the challenges in evaluating subtle interaction techniques.
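As a rough illustration of the underlying idea, the hypothetical Python sketch below buffers changes that happen on displays the user is not looking at and flags them when gaze returns. The class and method names are assumptions for this example and do not correspond to the system presented in the talk.

    class DisplayChangeTracker:
        """Buffers changes on unattended displays until gaze returns (illustrative only)."""

        def __init__(self, display_ids):
            self.pending = {d: [] for d in display_ids}   # unseen changes per display
            self.attended = None                          # display currently gazed at

        def on_gaze(self, display_id):
            """Called by the eye tracker when gaze lands on a display."""
            self.attended = display_id
            missed = self.pending[display_id]
            self.pending[display_id] = []
            if missed:
                # A subtle cue (e.g. a brief highlight) could replay what changed.
                print(f"{display_id}: visualise {len(missed)} missed change(s)")

        def on_change(self, display_id, change):
            """Called whenever content changes on any display."""
            if display_id != self.attended:
                self.pending[display_id].append(change)   # the user did not see it

    tracker = DisplayChangeTracker(["left", "centre", "right"])
    tracker.on_gaze("centre")
    tracker.on_change("left", "new email notification")
    tracker.on_gaze("left")   # prints: left: visualise 1 missed change(s)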

About Jakub:
Jakub’s bio on the SACHI website.

St Andrews Algorithmic Programming Competition


When: Wednesday 12th of September, 9:30am – 5pm (with a one-hour break for lunch)
Where: Sub-honours lab in Jack Cole building (0.35)

As part of this competition, you may be offered an opportunity to participate in a Human-Computer Interaction study on subtle interaction. Participation in this study is completely voluntary.

There will be two competitive categories:
HCI study participants:
1st prize: 7” Samsung Galaxy Tab 2
2nd prize: £50 Amazon voucher
3rd prize: £20 Amazon voucher
Everyone:
1st prize: £50 Amazon voucher
2nd prize: £20 Amazon voucher
3rd prize: £10 Amazon voucher

We will try to include as many programming languages as is reasonable, so if you have any special requests, let us know.
If you have one, bring a laptop in case we run out of lab computers!
If you have any questions, please email Jakub at jd67@st-andrews.ac.uk.

Laurel Riek, Facing Healthcare’s Future: Designing Facial Expressivity for Robotic Patient Mannequins


Abstract:
In the United States, an estimated 98,000 people are killed and $17.1 billion is lost each year due to medical errors. One way to prevent these errors is to have clinical students engage in simulation-based medical education, to help move the learning curve away from the patient. This training often takes place on human-sized android robots, called high-fidelity patient simulators (HFPS), which are capable of conveying human-like physiological cues (e.g., respiration, heart rate). Training with them can include anything from diagnostic skills (e.g., recognizing sepsis, a failure that recently killed 12-year-old Rory Staunton) to procedural skills (e.g., IV insertion) to communication skills (e.g., breaking bad news). HFPS systems allow students a chance to safely make mistakes within a simulation context without harming real patients, with the goal that these skills will ultimately be transferable to real patients.

While simulator use is a step in the right direction toward safer healthcare, one major challenge and critical technology gap is that none of the commercially available HFPS systems exhibit facial expressions, gaze, or realistic mouth movements, despite the vital importance of these cues in helping providers assess and treat patients. This is a critical omission, because almost all areas of health care involve face-to-face interaction, and there is overwhelming evidence that providers who are skilled at decoding communication cues are better healthcare providers – they have improved outcomes, higher compliance, greater safety, higher satisfaction, and they experience fewer malpractice lawsuits. In fact, communication errors are the leading cause of avoidable patient harm in the US: they are the root cause of 70% of sentinel events, 75% of which lead to a patient dying.

In the Robotics, Health, and Communication (RHC) Lab at the University of Notre Dame, we are addressing this problem by leveraging our expertise in android robotics and social signal processing to design and build a new, facially expressive, interactive HFPS system. In this talk, I will discuss our efforts to date, including: in situ observational studies exploring how individuals, teams, and operators interact with existing HFPS technology; design-focused interviews with simulation center directors and educators in which future HFPS systems are envisioned; and initial software prototyping efforts incorporating novel facial expression synthesis techniques.

About Laurel:
Dr. Laurel Riek is the Clare Boothe Luce Assistant Professor of Computer Science and Engineering at the University of Notre Dame. She directs the RHC Lab, and leads research on human-robot interaction, social signal processing, facial expression synthesis, and clinical communication. She received her PhD at the University of Cambridge Computer Laboratory, and prior to that worked for eight years as a Senior Artificial Intelligence Engineer and Roboticist at MITRE.

Luke Hutton, Virtual Walls: Studying the effectiveness of the privacy metaphor in the real world


Abstract:

The virtual wall is a simple privacy metaphor for ubiquitous computing environments. By expressing the transparency of a wall and the people to which the wall applies, a user can easily manage privacy policies for sharing their sensed data in a ubiquitous computing system.
While previous research shows that users understand the wall metaphor in a lab setting, the metaphor has not been studied for its practicality in the real world. This talk will describe a smartphone-based experience sampling method study (N=20) to demonstrate that the metaphor is sufficiently expressive to be usable in real-world scenarios. Furthermore, while people’s preferences for location sharing are well understood, our study provides insight into sharing preferences for a multitude of contexts. We find that whom data are shared with is the most important factor for users, reinforcing the walls approach of supporting apply-sets and abstracting away further granularity to provide improved usability.
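To make the metaphor concrete, here is a minimal Python sketch of how a wall’s transparency and apply-set might be expressed as a policy check. The names, levels, and sharing rules are assumptions for this illustration, not the implementation evaluated in the study.

    TRANSPARENT, TRANSLUCENT, OPAQUE = "transparent", "translucent", "opaque"

    class VirtualWall:
        """A single wall: a transparency level plus the set of people it applies to."""

        def __init__(self, transparency, applies_to):
            self.transparency = transparency    # how much detail passes through
            self.applies_to = set(applies_to)   # the apply-set

        def visible_detail(self, requester):
            """Level of sensed data the requester may see through this wall."""
            if requester not in self.applies_to:
                return "none"        # the wall grants them nothing
            if self.transparency == TRANSPARENT:
                return "full"        # precise sensed data
            if self.transparency == TRANSLUCENT:
                return "coarse"      # e.g. presence, but not exact activity
            return "none"            # opaque: nothing is shared

    # Example policy: colleagues see coarse data, family sees everything
    work_wall = VirtualWall(TRANSLUCENT, {"colleagues"})
    home_wall = VirtualWall(TRANSPARENT, {"family"})
    print(work_wall.visible_detail("colleagues"))   # coarse
    print(home_wall.visible_detail("family"))       # full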

About Luke:
Luke’s bio on the SACHI website.

Lindsay MacDonald, A Very Delicate Agreement: A Process of Collaboration Between Disciplines


Abstract:

In contrast to the romantic image of an artist working alone in a studio, large-scale media art pieces are often developed and built by interdisciplinary teams. Lindsay MacDonald will describe the process of creating and developing one of these pieces, A Delicate Agreement, within such a team, and offer personal insight on the impact that this has had on her artistic practice.
A Delicate Agreement is a gaze-triggered interactive installation that explores the potentially awkward act of riding in an elevator with another person. It is a set of elevator doors with a peephole in each door that entices viewers to peer inside and observe an animation of the passengers. Each elevator passenger, or character, has a programmed personality that enables them to act and react to the other characters’ behaviour and the viewers’ gaze. The result is the emergence of a rich interactive narrative made up of encounters in the liminal time and space of an elevator ride.
A Delicate Agreement is currently part of the New Alberta Contemporaries exhibition at the Esker Foundation in Calgary, Canada. For more information about the piece, please visit http://www.lindsaymacdonald.net/portfolio/a-delicate-agreement/.

About Lindsay:
Lindsay MacDonald is a PhD student, artist, designer and interdisciplinary researcher from the Interactions Lab (iLab) at the University of Calgary in Canada. Lindsay’s approach to research and creative production combines methodologies from both computer science and art, and she divides her time between the iLab and her studio in the Department of Art. Her research interests include interaction design, coded behaviour and performance, and building interactive art installations.

Carman Neustaedter, Connecting Families over Distance


Abstract:
Families often have a real need and desire to stay connected with their remote family members and close friends. For example, grandparents want to see their grandchildren grow up, empty-nest parents want to know about the well-being of their adult children, and parents want to be involved in their children’s daily routines and happenings while away from them. Video conferencing is one technology that is increasingly being used by families to support this type of need. In this talk, I will give an overview of the research that my students and I have done in this space. This includes studies of the unique ways in which families with children, long-distance couples, and teenagers make use of existing video chat systems to support ‘presence’ and ‘connection’ over distance. I will also show several systems we have designed to support always-on video connections that move beyond ‘talking heads’ to ‘shared experiences’.

About Carman:
Dr. Carman Neustaedter is an Assistant Professor in the School of Interactive Arts and Technology at Simon Fraser University, Canada. Dr. Neustaedter specializes in the areas of human-computer interaction, domestic computing, and computer-supported collaboration. He is the director of the Connections Lab, an interdisciplinary research group focused on the design and use of technologies for connecting people through space and time. This includes design for families and friends, support for workplace collaboration, and bringing people together through pervasive games. For more information, see:
Connections Lab
Carman Neustaedter

Jim Young, Leveraging People’s Everyday Skill Sets for Interaction with Robots


Abstract:
Human-Robot Interaction (HRI), broadly, is the study of how people and robots can work together. This includes core interaction design problems of creating interfaces for effective robot control and communication with people, and sociological and psychological studies of how people and robots can share spaces or work together. In this talk I will introduce several of my past HRI projects, ranging from novel control schemes for collocated or remote control, to programming robotic style by demonstration, to developing foundations for evaluating human-robot interaction, and will briefly discuss my current work in robotic authority and gender studies of human-robot interaction. In addition, I will introduce the JST ERATO Igarashi Design Interface Project, a large research project directed by Dr. Takeo Igarashi, in which I have been closely involved over the last several years.

About Jim:
James (Jim) Young is an Assistant Professor at the University of Manitoba, Canada, where he founded the Human-Robot Interaction lab, and is involved with the Human-Computer Interaction lab with Dr. Pourang Irani and Dr. Andrea Bunt. He received his BSc from Vancouver Island University in 2005, and completed his PhD in Social Human-Robot Interaction at the University of Calgary in 2010 with Dr. Ehud Sharlin, co-supervised by Takeo Igarashi at the University of Tokyo. His background is rooted strongly in the intersection of sociology and human-robot interaction, and in developing robotic interfaces which leverage people’s existing skills rather than making them learn new ones.