The small size of wearable devices limits the efficiency and scope of possible user interactions, as inputs are typically constrained to two dimensions: the touchscreen surface. We present SoundTrak, an active acoustic sensing technique that enables a user to interact with wearable devices in the surrounding 3D space by continuously tracking the finger position with high resolution. The user wears a ring with an embedded miniature speaker emitting an acoustic signal at a specific frequency (e.g., 11 kHz), which is captured by an array of miniature, inexpensive microphones on the target wearable device. A novel algorithm localizes the finger's position in 3D space by extracting phase information from the received acoustic signals. We evaluated SoundTrak in a volume of space (20 cm × 16 cm × 11 cm) around a smartwatch and show an average accuracy of 1.3 cm. We report results from a Fitts' Law experiment with 10 participants that evaluates the real-time prototype. We also present a set of applications supported by this 3D input technique, and discuss the practical challenges that need to be addressed before widespread use.
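The core idea of phase-based acoustic ranging can be illustrated with a minimal sketch (not the authors' implementation): recover the phase lag of the received tone by quadrature (I/Q) demodulation, then convert the phase shift into a path-length change within one wavelength (about 3.1 cm at 11 kHz). The sample rate and simulation values below are illustrative assumptions.

```python
import numpy as np

FS = 96_000   # sample rate in Hz (assumed)
F0 = 11_000   # carrier frequency in Hz, as in the paper
C = 343.0     # speed of sound in m/s

def tone_phase(signal, fs=FS, f0=F0):
    """Phase lag (radians) of a tone at f0, relative to cos(2*pi*f0*t)."""
    t = np.arange(len(signal)) / fs
    i = np.mean(signal * np.cos(2 * np.pi * f0 * t))  # in-phase component
    q = np.mean(signal * np.sin(2 * np.pi * f0 * t))  # quadrature component
    return np.arctan2(q, i)

def phase_to_distance(dphi, f0=F0, c=C):
    """Map a phase change to a path-length change (metres, modulo one wavelength)."""
    wavelength = c / f0
    return (dphi / (2 * np.pi)) * wavelength

# Simulated check: a tone that travelled an extra 1 cm arrives phase-shifted.
t = np.arange(4800) / FS
extra = 0.01                                    # 1 cm of extra travel
ref = np.cos(2 * np.pi * F0 * t)
delayed = np.cos(2 * np.pi * F0 * (t - extra / C))
dphi = (tone_phase(delayed) - tone_phase(ref)) % (2 * np.pi)
print(round(phase_to_distance(dphi), 4))        # → 0.01 (i.e., 1 cm recovered)
```

With one such range estimate per microphone, the finger's 3D position can in principle be solved by multilateration; the paper's algorithm additionally handles the phase-wrapping ambiguity inherent to a single-tone measurement, which this sketch does not.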
Paper: SoundTrak: Continuous 3D Tracking of a Finger Using Active Acoustics. ACM IMWUT 2017
Team: Cheng Zhang, Qiuyue Xue, Anandghan Waghmare, Sumeet Jain, Yiming Pu, Jordan Conant, Sinan Hersek.
Advisors: Kent Lyons, Kenneth Cunefare, Omer Inan, Gregory Abowd.