The EchoWrist wristband tracks hand movements and interactions using AI and inaudible sound waves, potentially changing how we control and interact with devices.
Researchers at Cornell University have created a wristband device that uses AI-powered, inaudible sound waves to continuously monitor hand positioning and the objects with which the hand interacts.
This technology has a variety of potential applications, including monitoring hand positions for virtual reality (VR) systems, enabling the control of smartphones and other devices through hand gestures, and understanding user activities. For instance, a cooking app could provide step-by-step narration as the user chops, measures, and stirs ingredients. The device is compact enough to be integrated into a commercial smartwatch and can operate on a standard smartwatch battery all day.
EchoWrist is part of the latest advancements in low-power, body pose-tracking technology developed by the Smart Computer Interfaces for Future Interactions (SciFi) Lab. Cheng Zhang, an assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, directs the lab.
The device utilizes two small speakers on the top and bottom of a wristband to emit inaudible sound waves that bounce off the hand and any objects it holds. Two adjacent microphones capture the echoes, which are then processed by a microcontroller. A battery smaller than a quarter supplies power to the device.
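The core signal-processing idea behind this kind of active acoustic sensing is to cross-correlate each microphone's recording with the known transmitted waveform, so that reflections off the hand show up as peaks at delays proportional to path length. The sketch below illustrates that step; the chirp band, sample rate, and signal lengths are illustrative assumptions, not the published hardware parameters.

```python
import numpy as np

FS = 50_000      # assumed sample rate in Hz (illustrative, not the device spec)
CHIRP_LEN = 600  # assumed samples per transmitted inaudible chirp

def make_chirp(f0=17_000.0, f1=21_000.0, n=CHIRP_LEN, fs=FS):
    """Linear chirp sweeping an inaudible band (assumed parameters)."""
    t = np.arange(n) / fs
    k = (f1 - f0) / (n / fs)  # sweep rate in Hz per second
    return np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

def echo_profile(received, chirp):
    """Cross-correlate the mic signal with the known chirp.

    Peaks in the result correspond to reflections arriving at
    different delays, i.e. different echo path lengths."""
    return np.abs(np.correlate(received, chirp, mode="valid"))

# Simulate one reflection arriving 40 samples after transmission.
chirp = make_chirp()
received = np.zeros(2000)
received[40:40 + CHIRP_LEN] += 0.5 * chirp
profile = echo_profile(received, chirp)
print(int(np.argmax(profile)))  # strongest echo at the simulated delay: 40
```

In a real pipeline this correlation would run continuously on the microcontroller's audio stream, with one profile per transmitted chirp per microphone.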
To interpret a user’s hand posture from the echoes, the team developed a neural network, an artificial intelligence model loosely inspired by the brain’s neurons. They trained it by pairing echo profiles with videos of users performing various gestures, and the network learned to reconstruct the positions of 20 hand joints from the sound signals alone.
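Structurally, such a network maps a flattened echo profile to the 3-D coordinates of 20 hand joints. The toy forward pass below shows only that input/output shape mapping; the layer sizes are invented and the random weights stand in for the trained model, which the paper describes in full.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes: a flattened echo profile in, 20 joints x 3 coords out.
IN_DIM, HIDDEN, JOINTS = 128, 64, 20

# Randomly initialised weights stand in for the trained network.
W1 = rng.normal(0.0, 0.1, (IN_DIM, HIDDEN))
W2 = rng.normal(0.0, 0.1, (HIDDEN, JOINTS * 3))

def predict_pose(echo_profile):
    """Forward pass: echo profile -> 3-D positions of 20 hand joints."""
    h = np.maximum(0.0, echo_profile @ W1)  # ReLU hidden layer
    return (h @ W2).reshape(JOINTS, 3)

pose = predict_pose(rng.normal(size=IN_DIM))
print(pose.shape)  # (20, 3)
```

Training would compare these predicted joint positions against camera-derived ground truth from the gesture videos and adjust the weights to minimize the error.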
The researchers enlisted 12 volunteers to evaluate how effectively EchoWrist identifies objects such as cups, chopsticks, water bottles, pots, pans, and kettles, and actions such as drinking, stirring, peeling, twisting, chopping, and pouring. The device demonstrated a 97.6% accuracy rate. This accuracy enables interactive recipes that monitor the cook’s progress and speak the next step aloud, letting cooks keep their hands off the screen.
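The reported 97.6% figure is a standard classification accuracy: the fraction of hand-object interaction instances whose predicted label matches the true label. A minimal sketch of that scoring, with made-up labels for illustration:

```python
def accuracy(predicted, actual):
    """Fraction of interaction labels predicted correctly."""
    correct = sum(p == a for p, a in zip(predicted, actual))
    return correct / len(actual)

# Hypothetical evaluation labels, not data from the study.
actual    = ["drinking", "stirring", "chopping", "pouring", "peeling"]
predicted = ["drinking", "stirring", "chopping", "pouring", "twisting"]
print(f"{accuracy(predicted, actual):.1%}")  # 80.0%
```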
Reference: Chi-Jung Lee et al., "EchoWrist: Continuous Hand Pose Tracking and Hand-Object Interaction Recognition Using Low-Power Active Acoustic Sensing on a Wristband," arXiv (2024). DOI: 10.48550/arXiv.2401.17409