Some highlights of 2014 to date

New Lectureship | RSE/Makdougall Brisbane Medal | AHRC funding for Palimpsest Project
General Chair MobileHCI'14 | Program Chair ITS'14 | Program Chair PerDis'14 | New SICSA theme
Best paper and honorable mention at CHI 2014 | Best paper at AVI 2014 | Best paper at DIS 2014
JISC-funded Trading Consequences Launch | 9 papers and other works at CHI 2014.

Our newsfeed has details of all these activities and research.

Feb 23 / Aaron Quigley

Helsinki Seminar: March 6th 2015

Prof. Aaron Quigley from the University of St Andrews is giving a talk on Friday 6th March. The talk will take place in Otaniemi in the TUAS building at 14:00.
TIME
Friday 6th March, 14:00-15:00

PLACE
TUAS

TITLE
Public-displays to the left of me, head-mounted displays to the right, Here I am, stuck with the mobile phone that is you!

ABSTRACT
Displays are all around us, on and around our body, fixed and mobile, bleeding into the very fabric of our day-to-day lives. Displays come in many forms: smart watches, head-mounted displays, tablets, and fixed, mobile, ambient and public displays. However, we know more about the displays connected to our devices than they know about us. Displays, and the devices they are connected to, are largely ignorant of the context in which they sit, including their physiological, environmental and computational state. They don’t know about the physiological differences between people, the environments they are being used in, or whether they are being used by one person or several.

In this talk we review a number of aspects of displays in terms of how we can model, measure, predict and adapt how people use displays in a myriad of settings. With modeling we seek to represent the physiological differences between people and use these models to adapt and personalize designs and user interfaces. With measurement and prediction we seek to employ various computer vision and depth-sensing techniques to better understand how displays are used. And with adaptation we aim to explore subtle techniques and means to support the diverging input and output fidelities of display devices. The talk draws on a number of studies from recent UMAP, IUI, AVI and CHI papers.

Our ubicomp user interface is complex and constantly changing, affording us an ever-changing computational and contextual edifice. As part of this, display elements need to be understood as an adaptive display ecosystem rather than simply as pixels.

SHORT BIO
Professor Aaron Quigley is the Chair of Human Computer Interaction in the School of Computer Science at the University of St Andrews, UK. Aaron’s research interests include surface and multi-display computing, human computer interaction, pervasive and ubiquitous computing and information visualisation. He has published over 135 internationally peer-reviewed publications including edited volumes, journal papers, book chapters, conference and workshop papers and holds 3 patents. In addition he has served on over 80 program committees and has been involved in chairing roles of over 20 international conferences and workshops including UIST, ITS, CHI, Pervasive, UbiComp, Tabletop, LoCA, UM, I-HCI, BCS HCI and MobileHCI.

Feb 23 / Daniel John Rough

March 10th, seminar by Nick Taylor: Sustaining Civic Engagement in Communities

Speaker: Nick Taylor, University of Dundee
Date/Time: 2-3pm March 10, 2015
Location: CS1.33a, University of St Andrews

Abstract:
Engagement with local issues is typically very low, despite digital technologies opening up more channels for citizens to access information and get involved than ever before. This talk will present research around the use of simple physical interfaces in public spaces to lower barriers to participation and engage a wider audience in local issues. It will also explore the potential for moving beyond top-down interventions to support sustainable grassroots innovation, in which citizens can develop their own solutions to local issues.

Bio:
Nick Taylor is a Lecturer and Dundee Fellow in the Duncan of Jordanstone College of Art and Design at the University of Dundee. His research interests involve the use of novel technologies in social contexts, particularly in communities and public spaces. This has involved the exploration of technologies to support civic engagement in local democracy, public displays supporting community awareness and heritage, as well as methods of engaging communities in design.

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Dec 5 / uta

Now on Sale: FatFonts World Population Maps


Looking for a gift for a visualization aficionado? We are happy to announce that the first ever FatFonts World Population Map is now available in the Axis Maps store. All proceeds from the maps will be used to fund more FatFont-related research.

The map shows how the population of the world is distributed. It uses a typographic visualization technique, FatFonts, which allows you to read the exact number of people living in a particular area to a precision of 100,000 people. Each number on the world map corresponds to the population of an area of approx. 40,000 km².


FatFonts, first conceived and designed by Miguel Nacenta and Uta Hinrichs, are digits that can be read as numbers but also encode their value visually in the amount of ink each digit uses. For example, the digit 8 uses eight times as much ink as the digit 1, the digit 7 seven times as much, and so on. This technique turns a table of numbers into a graphical representation where darker areas (with thicker numbers) represent higher population density. Stepping away from the map gives you an overview of which areas are heavily populated; coming closer lets you read the exact values.

To represent population densities ranging from tens of millions of people in a square (e.g., New York City or Istanbul) down to hundreds of thousands, we use two layers: FatFont numbers with orange backgrounds represent tens of millions of people. For example, the square that contains Buenos Aires shows that fourteen million people live in that square of the world (the smaller 4 nested within the larger 1 represents the smaller order of magnitude). Tiles without an orange background represent populations between 100,000 and 9.9 million people (one order of magnitude lower).
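The two-layer encoding described above amounts to a simple base-10 decomposition. As a minimal sketch (the helper name is ours, not part of any FatFonts tooling), a tile's population can be split into its tens-of-millions digit and its millions digit:

```python
def fatfont_layers(population):
    """Split a tile's population into the two layers described above:
    a large digit counting tens of millions (drawn on an orange
    background when non-zero) and a smaller nested digit counting
    millions. Population is a plain integer count of people."""
    tens_of_millions = population // 10_000_000
    millions = (population % 10_000_000) // 1_000_000
    return tens_of_millions, millions

# The Buenos Aires tile: ~14 million people -> a large 1 with a small 4.
print(fatfont_layers(14_000_000))  # (1, 4)

# A tile with 3.2 million people gets no orange layer.
print(fatfont_layers(3_200_000))   # (0, 3)
```

Reading the layers back (large × 10,000,000 + small × 1,000,000) recovers the population to the nearest million; the printed map reads to finer precision than this two-digit sketch.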

This is an effective way to represent several orders of magnitude. The effect is quite mesmerising, and it gives you a good idea of where people actually live. Although it is possible to represent the same data with colours (i.e., colour scales), seeing the number itself is something different. With the number you can easily make comparisons, calculate proportions, and relate what you see to knowledge you already have.

After a few minutes of looking at the map it starts to really sink in how empty some areas of the planet are (Australia!), and how the real population centroid of the world is clearly in South East Asia. The map uses an equal-area projection; the numbers you read are therefore also population densities. The representation is derived from the 2005-based estimates for 2015 in the GPWFE dataset made available by SEDAC, Columbia University. Fifteen insets highlight interesting areas of high and low population in more detail, such as Northern China, Mexico City, Egypt, Western Japan, Bangladesh and Africa’s Great Lakes region.


Nov 27 / Gonzalo Mendez

Exciting Collaboration with Wacom to Investigate Pen+Touch Interaction


Manipulation of visual information on the Wacom Cintiq 24HD touch display.

As part of a joint initiative to better understand pen+touch interaction on multi-touch devices, the SACHI lab has started a collaborative research endeavour with Wacom Co., Ltd. As a result, we recently welcomed some new arrivals to our lab: a Cintiq 24HD touch display and a Cintiq Companion Hybrid tablet.

Both devices combine an ergonomic design and a high-resolution screen with multi-touch and pen capabilities. We intend to use them to explore new interaction possibilities and provide insights that can be incorporated into the design process of new multi-touch devices. Specifically, we will study user interaction in the creative space of complex graphic manipulations, and with children in the context of handwriting.

SACHI looks forward to keeping you up to date with our discoveries.


SACHI researchers collaborating with Wacom devices.

 

Nov 20 / Gonzalo Mendez

December 2nd, seminar by Eve Hoggan: Augmenting and Evaluating Communication with Multimodal Flexible Interfaces

Speaker: Dr. Eve Hoggan, Aalto Science Institute and the Helsinki Institute for Information Technology

Date/Time: December 2, 2014 / 2-3pm

Location: Jack Cole Building 1.33a, School of Computer Science

Title:  Augmenting and Evaluating Communication with Multimodal Flexible Interfaces

Abstract:

This talk will detail an exploratory study of remote interpersonal communication using the ForcePhone prototype. This research focuses on the types of information that can be expressed between two people using the haptic modality, and the impact of different feedback designs. Based on the results of this study and other current work, the potential of deformable interfaces and multimodal interaction techniques to enrich communication for users with impairments will be discussed. This talk will also present an introduction to neurophysiological measurements of such interfaces.

Bio:

Eve Hoggan is a Research Fellow at the Aalto Science Institute and the Helsinki Institute for Information Technology HIIT in Finland, where she is vice-leader of the Ubiquitous Interaction research group. Her current research focuses on the creation of novel interaction techniques, interpersonal communication and non-visual multimodal feedback.  The aim of her research is to use multimodal interaction and varying form factors to create more natural and effortless methods of interaction between humans and technology regardless of any situational or physical impairment.

More information can be found at www.evehoggan.com

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Nov 4 / Gonzalo Mendez

November 11th, seminar by Jason Alexander: Supporting the Design of Shape-Changing Interfaces

Speaker: Dr. Jason Alexander, School of Computing and Communications, Lancaster University

Date/Time: November 11, 2014 / 2-3pm

Location: Jack Cole Building 1.33a, School of Computer Science

Title:  Supporting the Design of Shape-Changing Interfaces

Abstract:

Shape-changing interfaces physically mutate their visual display surface to better represent on-screen content, provide an additional information channel, and facilitate tangible interaction with digital content. The HCI community has recently shown increasing interest in this area, with their physical dynamicity fundamentally changing how we think about displays. This talk will describe our current work supporting the design and prototyping of shape-changing displays: understanding shape-changing application areas through public engagement brainstorming, characterising fundamental touch input actions, creating tools to support design, and demonstrating example implementations. It will end with a look at future challenges and directions for research.

Bio:

Jason is a lecturer in the School of Computing and Communications at Lancaster University. His primary research area is Human-Computer Interaction, with a particular interest in bridging the physical-digital divide using novel physical interaction devices and techniques. He was previously a post-doctoral researcher in the Bristol Interaction and Graphics (BIG) group at the University of Bristol. Before that he was a Ph.D. student in the HCI and Multimedia Lab at the University of Canterbury, New Zealand. More information can be found at http://www.scc.lancs.ac.uk/~jason/

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Nov 4 / Michael Mauderer

Dr Miguel Nacenta Appointed Co-Leader for SICSA Research Theme: Human Computer Interaction

– photo by Callum Hyland

We are happy to announce that Dr Miguel Nacenta has been appointed as co-leader of the SICSA Human-Computer Interaction research theme.

In his new position as SICSA HCI theme co-leader Miguel will, together with Professor Stephen Brewster of the University of Glasgow, take responsibility for the academic coordination of the theme and organize events such as the pre-CHI day, the All Hands Meeting and the SICSA HCI Doctoral Consortium.

If you are interested in the theme and its related events you can have a look at the SICSA HCI website, join the theme mailing list, or contact Miguel or Stephen for further information.

Oct 21 / Daniel John Rough

October 28th, seminar by Neal Lathia: Emotion Sense: From Design to Deployment

Speaker: Neal Lathia, Cambridge University
Date/Time: 2-3pm October 28, 2014
Location: Maths Lecture Theatre D, University of St Andrews

Abstract:
In the UK, more than 70% of mobile users now own a smartphone. These increasingly powerful, sensor-rich, and personal devices present an immense opportunity to monitor health-related behaviours and deliver digital behaviour-change interventions at unprecedented scale.

However, designing and building systems to measure and intervene on health behaviours presents a number of challenges: balancing energy efficiency against data granularity, translating between behavioural theory and design, making long psychological assessments usable for end users, and making sense of the sensor and survey data these apps collect in a multidisciplinary setting.

Approximately 18 months ago, we launched Emotion Sense, a mood-tracking app for Android in which we tried to address some of these challenges. To date, the app has been downloaded over 35,000 times and has an active user base of about 2,000 people. In this talk, I will describe how we designed, trialled, and launched Emotion Sense, and the insights about diurnal patterns of activity and happiness we are finding by mining the 100 million+ accelerometer samples the app has collected to date. I’ll close with future directions for this technology, including a novel smoking cessation intervention (Q Sense) and a generic platform (Easy M) that we have developed to allow researchers to conduct their own studies.

http://emotionsense.org/
http://www.qsense.phpc.cam.ac.uk/
http://www.cl.cam.ac.uk/~nkl25/easym/

Bio:
Neal is a Senior Research Associate in Cambridge University’s Computer Laboratory. His research falls at the intersection of data mining, mobile systems, ubiquitous/pervasive systems, and personalisation/recommender systems, applied to a variety of contexts where human behaviour is measured through digital footprints. He has a PhD in Computer Science from University College London. More info/contact: http://www.cl.cam.ac.uk/~nkl25/

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Oct 20 / admin

Winter Augmented Reality Meeting 2015 Keynote Speaker Aaron Quigley

 

Professor Aaron Quigley

Aaron Quigley has been invited to the Winter Augmented Reality Meeting 2015 as a keynote speaker. WARM is an interdisciplinary meeting of experts in AR and related domains, now in its tenth installment. WARM2015 continues the success of previous WARM events (WARM’05, WARM’07, WARM’08, WARM’09, WARM’10, WARM’11, WARM’12, WARM’13, WARM’14).

The organisers of WARM’15 note that the fields of Computer Graphics, Augmented Reality, Computer Vision and Ubiquitous Computing are synergistic. However, the overlapping and interleaving contributions of each area have yet to be fully expressed and understood. A domain expert, focused on excelling in his or her own field of research, may be unable to see the connections. This meeting is fertile ground for connecting ideas, and therefore seeks a variety of topics revolving around Augmented Reality and Ubiquitous Computing.

Aaron is currently on sabbatical in Japan conducting research and working on a book. Elements from both of these will form the basis for his keynote lecture in February 2015 at Graz University of Technology, Institute for Computer Graphics and Vision, Austria.

The title of his talk will be “Constructing Reality: Digital-Physical Scaffolding”.

Abstract:
Is the relationship between human and computer akin to a dance, where each moves effortlessly in response to the movements of the other? Or are computers just agents who do our bidding, either undertaking direct actions on our behalf or proactively determining the services, information and supports we may need on a moment-to-moment basis? Or should computers best be thought of as simple devices to which we turn over work, as Vannevar Bush said, or as thinking assistants performing the routinizable work, as Licklider suggested, while we focus on creative thought and decisions? Neither the beautiful dance, the agent, nor the simple device seems to capture our current experience of human-computer interaction. Technology underpins the human experience, and digital technologies in the form of devices, computers and communications are weaving themselves into the fabric of existence. The nature of this weaving is far from uniform, distributed or even fair. For some, the impact of digital technologies is far removed from their day-to-day life, serving only to support some of the infrastructure of where they live, if at all. For others, digital technologies form part of the substrate of their existence, and living without their mobile phone, social media apps and streaming music service seems unimaginable. Between these extremes are broad swathes of the global population who rely on digital technologies for the infrastructure in their areas and the services delivered to their homes. Of course, our use of, and indeed reliance on, technology is not new. It is one of the defining characteristics of humans and society: our fashioning of tools, instruments and technologies to help shape our world and lives. In this talk I will discuss how we have already used technology to fashion and construct our present reality, and explore ways we might create new scaffolds for the future, such as enhancing our senses for a myriad of reasons, from correction to replacement and enhancement.

 

 

Oct 14 / Aaron Quigley

MobileHCI 2014, MobileHCI conference series, UIST 2014 and UIST 2015

 

MobileHCI 2014 General Co-Chairs

In late September 2014 a number of members of SACHI were involved with MobileHCI 2014 in Toronto, Canada. Aaron Quigley was the general co-chair for the conference and Daniel Rough was the registration chair. Per Ola Kristensson, an external member of SACHI, presented a paper and chaired a session during the conference. MobileHCI brings together people from diverse areas, providing a multidisciplinary forum for academics, hardware and software developers, designers and practitioners to discuss the challenges and potential solutions for effective interaction with and through mobile devices, applications, and services. This year MobileHCI had a single track for the entire program, which allowed everyone to see all the papers, posters, demos, design contest, panels, etc. without having to change sessions. Some images from this conference can be found here. Aaron is now the chair of the MobileHCI conference series steering committee until August 2015.

In early October a number of SACHI members were again involved with, or attended, UIST 2014, the ACM Symposium on User Interface Software and Technology. We organised last year’s edition, UIST 2013, here in St Andrews. In 2014, Per Ola Kristensson was the demo co-chair and Jakub Dostal was the registration co-chair. Per Ola was also awarded a Lasting Impact Award during UIST 2014. Aaron Quigley will be the keynote chair for UIST 2015 in Charlotte, NC, November 8-11, 2015. UIST is the premier forum for innovations in human-computer interfaces, bringing together people from diverse areas including graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW.

You can read Aaron’s full blog post about the papers he noted to SACHI here.