Researchers at Cornell University have developed a wearable with batlike sonar that could improve upper-body tracking in virtual reality and other applications. The Cornell team fitted a generic pair of eyeglasses with a tiny sonar system, demonstrating how acoustic signals can be used instead of cameras to capture the body’s movement.
Not only would sonar be more efficient in terms of battery consumption, the team told the Cornell Chronicle, but it would also do away with the privacy risks that come with headsets’ externally facing cameras. The system, dubbed PoseSonic, uses two pairs of microphones and speakers to send and receive acoustic signals, according to a recently published paper. With help from their deep learning model, it can then estimate 3D poses at nine different points — the shoulders, elbows, wrists, hips and nose — as these signals bounce off the upper body.
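To give a feel for how echo-based sensing like this works, here is a minimal, self-contained sketch in Python. It is not the authors' code: the sample rate, chirp band, frame length, and delay values are all illustrative assumptions. It shows the core idea of transmitting a known ultrasonic sweep, cross-correlating the microphone input against it to build an "echo profile" whose peaks encode reflection distances, and notes where a learned model would take over.

```python
# Illustrative sketch (NOT PoseSonic's actual implementation): turning a
# received audio frame into an "echo profile" that a learned model could
# map to 3D keypoints. All parameters here are assumptions for the demo.
import numpy as np

FS = 50_000          # assumed sample rate, Hz
CHIRP_LEN = 600      # assumed chirp length in samples (~12 ms)

def make_chirp(f0=18_000.0, f1=21_500.0, n=CHIRP_LEN, fs=FS):
    """Linear (FMCW-style) ultrasonic sweep used as the transmitted signal."""
    t = np.arange(n) / fs
    k = (f1 - f0) / (n / fs)                  # sweep rate, Hz per second
    return np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

def echo_profile(received, chirp):
    """Cross-correlate the mic input with the known chirp. Correlation
    peaks mark reflections; peak position encodes round-trip delay,
    hence distance to the reflecting body part."""
    corr = np.correlate(received, chirp, mode="valid")
    return np.abs(corr) / np.max(np.abs(corr))

# Simulate one frame: an attenuated copy of the chirp returns after a
# 200-sample round trip, plus a little background noise.
chirp = make_chirp()
delay = 200
frame = np.zeros(2048)
frame[delay:delay + CHIRP_LEN] += 0.3 * chirp          # body reflection
frame += 0.01 * np.random.default_rng(0).standard_normal(frame.size)

profile = echo_profile(frame, chirp)
print("strongest echo at sample", int(np.argmax(profile)))
# In the real system, stacked profiles from both mic/speaker pairs would
# feed a deep learning model that regresses the nine tracked points
# (shoulders, elbows, wrists, hips, nose), i.e. a 9 x 3 coordinate array.
```

The printed peak index lands at (or within a sample or two of) the simulated 200-sample delay, which is the basic signal the paper's deep learning model consumes at scale.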
The team tested it both in the lab and “semi-in-the-wild,” and found it wasn’t negatively affected by environmental noise in any significant way. With this technique, “we use less instrumentation on the body, which is more practical, and battery performance is significantly better for everyday use,” senior author Cheng Zhang told the Cornell Chronicle.
In addition to its potential use in augmented and virtual reality, the researchers say sonar could make for better health tracking by capturing more detailed information on the body's movements. They've only got the upper body covered at the moment, though — VR legs continue to elude us.