RadarCat (Radar Categorization for Input & Interaction) was presented at UIST 2016 this week in Tokyo, Japan. RadarCat is a small, versatile radar-based system for material and object classification that enables new forms of everyday proximate interaction with digital devices.
The RadarCat paper, which appears in the Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST ’16), can be accessed via:
- The ACM SIGCHI OpenTOC page for UIST 2016 (search for RadarCat), free until Oct 2017
- The ACM Digital Library page for RadarCat
- The University of St Andrews Research portal
RadarCat was highlighted earlier in the University news, the Courier and Gizmodo, and in a Google I/O ATAP 2016 session.
In this paper we demonstrate that RadarCat can be trained to classify different types of materials and objects, which it can then recognize in real time. Following established research designs, we report the results of three studies: the first with 26 materials (including complex composite objects), the second with 16 transparent materials (of differing thickness and with varying dyes), and the third with 10 body parts from 6 participants.
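To give a rough sense of what such a train-then-recognize pipeline looks like, here is a minimal sketch in Python: labelled radar frames are summarized as fixed-length feature vectors, a classifier is fitted on them, and incoming frames are then labelled in real time. The per-channel statistics, the random-forest classifier, and all function names are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch only: the feature extraction and classifier choice
# are assumptions, not RadarCat's published pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def extract_features(radar_frame: np.ndarray) -> np.ndarray:
    """Summarize one radar frame (channels x samples) as a fixed-length
    feature vector using simple per-channel statistics (hypothetical)."""
    return np.concatenate([
        radar_frame.mean(axis=1),
        radar_frame.std(axis=1),
        np.abs(radar_frame).max(axis=1),
    ])


def train(recordings):
    """Fit a classifier on (radar_frame, label) pairs recorded while
    each object rests on the sensor."""
    X = np.stack([extract_features(frame) for frame, _ in recordings])
    y = [label for _, label in recordings]
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, y)
    return clf


def classify(clf, radar_frame):
    """Label a single incoming frame, as would happen in real time."""
    return clf.predict(extract_features(radar_frame).reshape(1, -1))[0]
```

In practice the training set would contain many frames per object under varied placements, so the classifier learns signal variation rather than a single snapshot; the sketch above elides that data-collection step.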
SACHI’s contribution to Project Soli was featured in a previous blog post, “SACHI contribute to Google’s Project Soli”, in May. Read more about RadarCat for object recognition on the SACHI blog.