St Andrews HCI Research Group

19 Oct 2016

RadarCat at UIST 2016 in Tokyo, Japan


RadarCat (Radar Categorization for Input & Interaction) was presented at UIST 2016 this week in Tokyo, Japan. RadarCat is a small, versatile, radar-based system for material and object classification that enables new forms of everyday proximate interaction with digital devices.

The RadarCat paper, which appears in the Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16), can be accessed via:


RadarCat was highlighted earlier in the University news, The Courier, and Gizmodo, and in a Google I/O ATAP 2016 session.
In this paper we demonstrate that we can train the system to classify different types of objects and materials, which it can then recognize in real time. Based on established research designs, we report the results of three studies: first with 26 materials (including complex composite objects), then with 16 transparent materials (with different thicknesses and varying dyes), and finally with 10 body parts from 6 participants.
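To make the train-then-recognize idea concrete, here is a minimal sketch of such a pipeline: a classifier is fitted on labelled feature vectors derived from the radar signal and then used to predict the material of a new reading. The feature dimensions, material labels, synthetic data, and choice of a random-forest classifier are illustrative assumptions for this sketch, not the authors' implementation.

```python
# Illustrative sketch only: RadarCat extracts features from the Google Soli radar
# sensor; here the features and labels are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical setup: each sample is a feature vector summarising a received
# radar signal; each label names a material or object.
N_SAMPLES, N_FEATURES = 600, 64
MATERIALS = ["steel", "glass", "wood", "plastic", "skin", "water"]

X = rng.normal(size=(N_SAMPLES, N_FEATURES))
y = rng.choice(MATERIALS, size=N_SAMPLES)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train a classifier on the labelled radar features.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# "Real-time" recognition of a newly sensed object: one feature vector in,
# one predicted material out.
new_reading = rng.normal(size=(1, N_FEATURES))
print("Predicted material:", clf.predict(new_reading)[0])
print("Held-out accuracy (synthetic data):", clf.score(X_test, y_test))
```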
SACHI’s contribution to Project Soli was featured in a previous blog post, SACHI contribute to Google’s Project Soli, in May. Read more about RadarCat for object recognition on the SACHI blog.