Some highlights of 2014 to date

New Lectureship | RSE/Makdougall Brisbane Medal | AHRC funding for Palimpsest Project
General Chair MobileHCI'14 | Program Chair ITS'14 | Program Chair PerDis'14 | New SICSA theme
Best paper and honorable mention at CHI 2014 | Best paper at AVI 2014 | Best paper at DIS 2014
JISC-funded Trading Consequences launch | 9 papers and other works at CHI 2014

Our newsfeed has details of all these activities and research.

Mar 30 / David Harris-Birtill

LitLong Launch

The Palimpsest project, a collaboration between the University of St Andrews’ SACHI team and the University of Edinburgh’s English literature and text-mining groups, launches the end product of its work, LitLong: Edinburgh, today (30th March 2015).

Lit Long: Edinburgh features a range of maps and accessible visualisations, which enable users to interact with Edinburgh’s literature in a variety of ways, exploring the spatial relations of the literary city at particular times in its history, in the works of particular authors, or across different eras, genres and writers. Lit Long: Edinburgh makes a major contribution to our knowledge of the Edinburgh literary cityscape, with potential to shape the experience and understanding of critics and editors, residents and visitors, readers and writers.

Give the web visualisation a try here.

SACHI’s Dr Uta Hinrichs created the web visualisation and Dr David Harris-Birtill created the mobile app.

This work is featured on the Guardian’s website and mentioned in the University of Edinburgh’s news.

Mar 30 / David Harris-Birtill

SICSA Medical Imaging and Sensing Theme Events Success

Three exciting events from the theme have already taken place: the theme launch at the Royal Society of Edinburgh, a Masterclass in Medical Imaging and Sensing at the University of St Andrews, and a Workshop on Medical Image Analysis at the University of Dundee.

Dr David Harris-Birtill Launching SICSA Medical Imaging and Sensing Theme

Over fifty researchers from industry and academia across Scotland came to speak, listen and network at the SICSA Medical Imaging and Sensing Theme Launch, hosted by Toshiba Medical at the Royal Society of Edinburgh on the 18th February 2015. Speakers included: Dr Diana Morgan (Censis), Dr Jano van Hemert (Optos), Dr Ian Poole (Toshiba Medical), Dr Tom MacGillivray (CRIC Edinburgh), Prof Stephen McKenna (University of Dundee), Dr Tom Kelsey (University of St Andrews), Prof Lynne Baillie (Glasgow Caledonian University), Dr Bobby Davey (Toshiba Medical) and Dr David Harris-Birtill (University of St Andrews).

At the SICSA Masterclass in Medical Imaging and Sensing, hosted by the University of St Andrews on the 4th March 2015, thirty early-career researchers came to learn how they can apply their skills in this exciting field. Speakers included: Prof Manuel Trucco (University of Dundee), Dr Bobby Davey (Toshiba Medical), Dr David Harris-Birtill (University of St Andrews) and Dr Neil Clancy (Imperial College London).

The Workshop on Medical Image Analysis, held at the University of Dundee on the 27th March 2015, was a huge success with excellent oral and poster presentations. Over forty researchers came to hear the invited talks from Prof Giovanni Montana (King’s College London) and Prof David Wyper (SINAPSE), as well as the other excellent research talks.

We look forward to future events within the theme. If you are interested in hosting one, please email the theme leader, Dr David Harris-Birtill (dcchb@st-andrews.ac.uk), with a short proposal.

Mar 23 / Daniel John Rough

April 13th, seminar by Nicolai Marquardt: Towards Ad-hoc Collaboration Spaces with Cross-Device Interaction Techniques

Speaker: Nicolai Marquardt, University College London
Date/Time: 1-2pm April 13, 2015
Location: CS1.33a, University of St Andrews

Abstract:
Despite the ongoing proliferation of devices and form-factors such as tablets and electronic whiteboards, technology often hinders (rather than helps) informal small-group interactions. Whereas natural human conversation is fluid and dynamic, discussions that rely on digital content—slides, documents, clippings—often remain hindered due to the awkwardness of manipulating, sharing, and displaying information on and across multiple devices. Addressing these shortcomings, in this talk I present our research towards fluid, ad-hoc, minimally disruptive techniques for co-located collaboration by leveraging the proxemics of people as well as the proxemics of devices. In particular, I will demonstrate a number of cross-device interaction techniques—situated within the research theme of proxemic interactions—that support nuanced gradations of sharing. I will also introduce different novel hybrid sensing approaches enabling these interaction techniques and discuss future research directions.

Bio:
Nicolai Marquardt is Lecturer in Physical Computing at University College London. At the UCL Interaction Centre he is working in the research areas of ubiquitous computing, physical user interfaces, proxemic interactions, and interactive surfaces. He is co-author of the books Proxemic Interactions: From Theory to Practice (Morgan & Claypool 2015) and Sketching User Experiences: The Workbook (Elsevier, Morgan Kaufmann 2012).

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Mar 17 / Aaron Quigley

Introduction to HCI related research and teaching in St Andrews

In this talk, academic staff members working on topics related to HCI will present an overview of their research and how it relates to the HCI MSc modules and programme.

Feb 23 / Aaron Quigley

Helsinki Seminar: March 6th 2015

Prof. Aaron Quigley from the University of St Andrews is giving a talk on Friday 6th March. The talk will take place in Otaniemi, in the TUAS building, at 14:00.
TIME
Friday 6th March, 14:00-15:00

PLACE
TUAS

TITLE
Public-displays to the left of me, head-mounted displays to the right, Here I am, stuck with the mobile phone that is you!

ABSTRACT
Displays are all around us, on and around our bodies, fixed and mobile, bleeding into the very fabric of our day-to-day lives. Displays come in many forms, such as smart watches, head-mounted displays or tablets, and fixed, mobile, ambient and public displays. However, we know more about the displays connected to our devices than they know about us. Displays, and the devices they are connected to, are largely ignorant of the context in which they sit, including physiological, environmental and computational state. They don’t know about the physiological differences between people, the environments they are being used in, or whether they are being used by one or more people.

In this talk we review a number of aspects of displays in terms of how we can model, measure, predict and adapt how people can use displays in a myriad of settings. With modeling we seek to represent the physiological differences between people and use the models to adapt and personalize designs and user interfaces. With measurement and prediction we seek to employ various computer vision and depth-sensing techniques to better understand how displays are used. And with adaptation we aim to explore subtle techniques and means to support diverging input and output fidelities of display devices. The talk draws on a number of studies from recent UMAP, IUI, AVI and CHI papers.

Our ubicomp user interface is complex and constantly changing, and affords us an ever-changing computational and contextual edifice. As part of this, the display elements need to be better understood as an adaptive display ecosystem rather than simply pixels.

SHORT BIO
Professor Aaron Quigley is the Chair of Human Computer Interaction in the School of Computer Science at the University of St Andrews, UK. Aaron’s research interests include surface and multi-display computing, human computer interaction, pervasive and ubiquitous computing and information visualisation. He has published over 135 internationally peer-reviewed publications including edited volumes, journal papers, book chapters, conference and workshop papers, and holds 3 patents. In addition, he has served on over 80 program committees and has been involved in chairing roles for over 20 international conferences and workshops including UIST, ITS, CHI, Pervasive, UbiComp, Tabletop, LoCA, UM, I-HCI, BCS HCI and MobileHCI.

Feb 23 / Daniel John Rough

March 10th, seminar by Nick Taylor: Sustaining Civic Engagement in Communities

Speaker: Nick Taylor, University of Dundee
Date/Time: 2-3pm March 10, 2015
Location: CS1.33a, University of St Andrews

Abstract:
Engagement with local issues is typically very low, despite digital technologies opening up more channels for citizens to access information and get involved than ever before. This talk will present research around the use of simple physical interfaces in public spaces to lower barriers to participation and engage a wider audience in local issues. It will also explore the potential for moving beyond top-down interventions to support sustainable grassroots innovation, in which citizens can develop their own solutions to local issues.

Bio:
Nick Taylor is a Lecturer and Dundee Fellow in the Duncan of Jordanstone College of Art and Design at the University of Dundee. His research interests involve the use of novel technologies in social contexts, particularly in communities and public spaces. This has involved the exploration of technologies to support civic engagement in local democracy, public displays supporting community awareness and heritage, as well as methods of engaging communities in design.

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Dec 5 / uta

Now on Sale: FatFonts World Population Maps

Looking for a gift for a visualization aficionado? We are happy to announce that the first ever FatFonts World Population Map is now available in the Axis Maps store. All proceeds from the maps will be used to fund more FatFont-related research.

The map shows how the population of the world is distributed. It uses a typographic visualization technique, FatFonts, which allows you to read the number of people living in a particular area to a precision of 100,000 people. Each number in the world map corresponds to the population in an area of approx. 40,000 km².

FatFonts, first conceived and designed by Miguel Nacenta and Uta Hinrichs, are digits that can be read as numbers but also encode their value visually in the amount of ink each digit uses. For example, the digit 8 uses eight times as much ink as the digit 1, the digit 7 seven times as much, and so on. This technique turns a table of numbers into a graphical representation where darker areas (with thicker numbers) represent higher population density. Stepping away from the map gives you an overview of which areas are heavily populated; coming closer lets you read the exact values.

To represent population densities ranging from tens of millions in a square (e.g., New York City or Istanbul) down to hundreds of thousands, we use two layers: FatFont numbers with orange backgrounds represent tens of millions of people. For example, the square that contains Buenos Aires shows that fourteen million people live in that square of the world (the smaller 4 nested within the larger 1 represents the lower order of magnitude). Tiles without an orange background represent populations between 100,000 and 9.9 million people (one order of magnitude lower).
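As a rough illustration of the encoding described above, the short Python sketch below splits a tile’s population into the outer tens-of-millions digit and the nested millions digit, with each digit’s ink budget proportional to its value. This is for explanation only, not the FatFonts authors’ code: the helper name fatfonts_layers and the returned structure are hypothetical.

# Sketch (hypothetical helper) of the two-layer FatFonts encoding described above.
def fatfonts_layers(population):
    """Split a tile's population into the outer (orange) and nested FatFonts digits."""
    outer = population // 10_000_000                   # tens of millions (orange layer)
    nested = (population % 10_000_000) // 1_000_000    # millions (nested, smaller digit)
    # Ink is proportional to the digit's value: an 8 uses eight times the ink of a 1.
    return {"outer_digit": outer, "nested_digit": nested,
            "outer_ink_units": outer, "nested_ink_units": nested}

# Example: the tile containing Buenos Aires, roughly 14 million people.
print(fatfonts_layers(14_000_000))
# {'outer_digit': 1, 'nested_digit': 4, 'outer_ink_units': 1, 'nested_ink_units': 4}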

This is an effective way to represent several orders of magnitude. The effect is quite mesmerising, and it gives you a good idea of where people actually live. Although it is possible to represent the same data with colours (i.e., colour scales), seeing the number itself is something different: with the number you can easily make comparisons, calculate proportions, and relate what you see to the knowledge you already have.

After a few minutes of looking at the map, it starts to sink in how empty some areas of the planet really are (Australia!), and how the population centroid of the world clearly lies in South East Asia. The map uses an equal-area projection, so every square covers the same land area and the numbers you read are therefore also population densities. The representation is derived from the GPWFE dataset’s 2005 estimates for 2015, made available by SEDAC at Columbia University. Fifteen insets highlight interesting areas of high and low population in more detail, such as Northern China, Mexico City, Egypt, Western Japan, Bangladesh and Africa’s Great Lakes region.

Nov 27 / Gonzalo Mendez

Exciting Collaboration with Wacom to Investigate Pen+Touch Interaction

Manipulation of visual information on the Wacom Cintiq 24HD touch display.

As part of a joint initiative to better understand pen+touch interaction on multi-touch devices, the SACHI lab has started a collaborative research endeavour with Wacom Co., Ltd. As a result, we recently welcomed some new arrivals to our lab: a Cintiq 24HD touch display and a Cintiq Companion Hybrid tablet.

Both devices have an ergonomic design and a high-resolution screen that combines multi-touch and pen capabilities. We intend to use them to explore new interaction possibilities and provide insights that can be incorporated into the design process of new multi-touch devices. Specifically, we will study user interaction in the creative space of complex graphic manipulations, and with children in the context of handwriting.

SACHI looks forward to keeping you up to date with our discoveries.

SACHI researchers working with the Wacom devices.

Nov 20 / Gonzalo Mendez

December 2nd, seminar by Eve Hoggan: Augmenting and Evaluating Communication with Multimodal Flexible Interfaces

Speaker: Dr. Eve Hoggan, Aalto Science Institute and the Helsinki Institute for Information Technology

Date/Time: December 2, 2014 / 2-3pm

Location: Jack Cole Building 1.33a, School of Computer Science

Title:  Augmenting and Evaluating Communication with Multimodal Flexible Interfaces

Abstract:

This talk will detail an exploratory study of remote interpersonal communication using the ForcePhone prototype. This research focuses on the types of information that can be expressed between two people using the haptic modality, and the impact of different feedback designs. Based on the results of this study and other current work, the potential of deformable interfaces and multimodal interaction techniques to enrich communication for users with impairments will be discussed. This talk will also present an introduction to neurophysiological measurements of such interfaces.

Bio:

Eve Hoggan is a Research Fellow at the Aalto Science Institute and the Helsinki Institute for Information Technology HIIT in Finland, where she is vice-leader of the Ubiquitous Interaction research group. Her current research focuses on the creation of novel interaction techniques, interpersonal communication and non-visual multimodal feedback. The aim of her research is to use multimodal interaction and varying form factors to create more natural and effortless methods of interaction between humans and technology, regardless of any situational or physical impairment.

More information can be found at www.evehoggan.com.

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.

Nov 4 / Gonzalo Mendez

November 11th, seminar by Jason Alexander: Supporting the Design of Shape-Changing Interfaces

Speaker: Dr. Jason Alexander, School of Computing and Communications, Lancaster University

Date/Time: November 11, 2014 / 2-3pm

Location: Jack Cole Building 1.33a, School of Computer Science

Title:  Supporting the Design of Shape-Changing Interfaces

Abstract:

Shape-changing interfaces physically mutate their visual display surface to better represent on-screen content, provide an additional information channel, and facilitate tangible interaction with digital content. The HCI community has recently shown increasing interest in this area, with their physical dynamicity fundamentally changing how we think about displays. This talk will describe our current work supporting the design and prototyping of shape-changing displays: understanding shape-changing application areas through public engagement brainstorming, characterising fundamental touch input actions, creating tools to support design, and demonstrating example implementations. It will end with a look at future challenges and directions for research.

Bio:

Jason is a lecturer in the School of Computing and Communications at Lancaster University. His primary research area is Human-Computer Interaction, with a particular interest in bridging the physical-digital divide using novel physical interaction devices and techniques. He was previously a post-doctoral researcher in the Bristol Interaction and Graphics (BIG) group at the University of Bristol. Before that he was a Ph.D. student in the HCI and Multimedia Lab at the University of Canterbury, New Zealand. More information can be found at http://www.scc.lancs.ac.uk/~jason/

This seminar is part of our ongoing series from researchers in HCI. See here for our current schedule.