Radar can replicate an optical sensor's ability to detect motion and capture gestures, and can go further, detecting micro-gestures such as finger movement.
The first obvious home for Infineon/XMOS' radar and MEMS microphone solution is most likely the smart speaker.
However, voice capture isn't the only space where Infineon is pushing its radar chips. Working with Google on Project Soli, Infineon expects much broader applications, ranging from wearables to smart speakers and AR/VR.
Google’s Soli project is designed to use radar to enable new types of touchless interactions.
As Google’s Advanced Technology and Projects (ATAP) group explains, the Soli sensor technology “works by emitting electromagnetic waves in a broad beam.”
Objects within the beam scatter this energy, reflecting some portion back toward the radar antenna. Properties of the reflected signal, such as energy, time delay and frequency shift, capture rich information about the object's characteristics and dynamics, including size, shape, orientation, material, distance and velocity.
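Two of those properties map directly onto physical quantities: the echo's time delay gives range, and its Doppler frequency shift gives radial velocity. Below is a minimal sketch of that arithmetic, assuming a 60 GHz carrier (the millimetre-wave band Soli-class radars operate in); the echo numbers are invented for illustration and this is not Soli's or Infineon's actual code.

```python
# Minimal sketch: recovering range and radial velocity from the two
# reflected-signal properties named above, time delay and Doppler shift.
# The 60 GHz carrier and echo values are assumptions for illustration.

C = 3.0e8  # speed of light, m/s

def target_range(round_trip_delay_s: float) -> float:
    """One-way distance to the reflecting object from the echo's round trip."""
    return C * round_trip_delay_s / 2.0

def radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Speed toward (+) or away from (-) the radar from the Doppler shift."""
    return C * doppler_shift_hz / (2.0 * carrier_hz)

print(f"range:    {target_range(6.7e-9):.2f} m")            # 6.7 ns echo -> ~1.0 m
print(f"velocity: {radial_velocity(200.0, 60e9):.2f} m/s")  # 200 Hz shift -> 0.5 m/s
```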
Urschitz said: “Radar scans your face and whole body–including size, position, speed and angle–in a fraction of a second.” In his opinion, compared with optical, ultrasound and infrared sensors, the company’s radar chip comfortably beats the competition thanks to real-time scanning with no latency. The [radar] technology “is super robust,” he said, while offering “more resolution than ultrasound.”
Picture a user walking around a room (several feet from a smart speaker) reducing the volume simply by turning an invisible knob with his or her fingers. The radar-enabled system is designed to recognise fine finger motions, Urschitz explained.
__Figure 1:__ *Andreas Urschitz, president of Infineon’s Power Management & Multimarket division, shows how Infineon’s radar chip can detect the micro-gestures of his fingers as he signals to reduce the volume. (Source: EE Times)*
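How such a gesture might drive the speaker is easy to sketch. The snippet below is purely illustrative glue logic: it assumes the radar gesture pipeline (not shown) already reports the angular rate of the finger “knob turn” each frame, and none of the names come from Infineon’s or Google’s software.

```python
# Hypothetical glue logic for the demo above: integrate the reported
# knob-turn rate of a finger micro-gesture into a volume setting.
# All names here are illustrative assumptions, not a vendor API.

def apply_knob_gesture(volume: float, angular_rate_deg_s: float,
                       frame_dt_s: float, sensitivity: float = 0.002) -> float:
    """Fold one frame's knob-turn rate into the volume, clamped to [0, 1].

    Positive rate = clockwise = volume up; negative = counter-clockwise = down.
    """
    volume += sensitivity * angular_rate_deg_s * frame_dt_s
    return max(0.0, min(1.0, volume))

# A slow counter-clockwise turn (-90 deg/s) held for ten 50 ms frames
# lowers the volume from 0.50 to 0.41.
volume = 0.5
for _ in range(10):
    volume = apply_knob_gesture(volume, angular_rate_deg_s=-90.0, frame_dt_s=0.05)
print(f"volume: {volume:.2f}")
```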
Presumably, radars can replicate the ability of an optical sensor to detect motion and capture gestures, and do more, detecting micro-gestures like finger movement.
Similarly, ultrasound technology is vying for the micro-gesture control market.
At MWC last week, Chirp introduced what it describes as “the first high-accuracy, ultra-low power ultrasonic sensing development platform for wearables.” The start-up claims that its MEMS-based time-of-flight sensor detects “tiny ‘micro-gestures’ with 1mm accuracy.”
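That 1mm figure implies very fine echo timing. A back-of-the-envelope check under standard time-of-flight assumptions (not Chirp’s implementation): sound in air travels roughly 343 m/s and the distance is half the round trip, so 1mm of accuracy demands resolving the echo to within a few microseconds.

```python
# Back-of-the-envelope check on the 1 mm claim, under standard ultrasonic
# time-of-flight assumptions; this is not Chirp's implementation.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC

def tof_distance(round_trip_s: float) -> float:
    """One-way distance from the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

def timing_resolution(accuracy_m: float) -> float:
    """Round-trip timing resolution needed to resolve a given distance step."""
    return 2.0 * accuracy_m / SPEED_OF_SOUND

print(f"1 ms echo -> distance: {tof_distance(1e-3) * 100:.1f} cm")        # ~17.2 cm
print(f"1 mm accuracy needs:   {timing_resolution(1e-3) * 1e6:.1f} us")   # ~5.8 us
```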
One of the most intuitive applications for such micro-gesture controls is in wearables, such as a smartwatch with a tiny screen or a smart band with no screen, explained Michelle Kiang, Chirp’s CEO. With MEMS-based ultrasonic transducers embedded inside a smartwatch, users can make gestures with a finger, fat or otherwise, to control watch functions without touching the screen.