The Soli radar system that powers hands-free gesture navigation on the Pixel 4 is a staple of the phone’s marketing, as it’s the first and so far only phone to use the tech. Just the other day, Google gave us a look at what the radar actually sees, and it’s fascinating.
Check out the three clips below. Google says the tech doesn’t capture a detailed image of the user or the user’s body, which addresses the privacy concerns around the Soli radar. The first clip shows a person getting closer to the phone, the second shows a user reaching for it, while the third shows the swipe gesture.
That coarse signal is all the radar captures, so Google’s developers had to create algorithms that recognize the different actions from it. They had to account for each person’s unique way of performing a given gesture; after all, not everyone moves their hands the same way, and the same gesture can look different depending on the user. Moreover, the developers also had to devise a way to filter out background movement so it doesn’t get detected as a gesture.
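As a rough illustration of the kind of classification problem involved (this is a toy sketch, not Google’s actual pipeline; the function names, labels, and thresholds are all invented for the example), a recognizer might first threshold away low-energy background motion, then look at the shape of the remaining motion-energy trace:

```python
# Toy sketch of gesture classification from a radar motion trace.
# Everything here (thresholds, labels, the "shape" heuristic) is
# hypothetical and only illustrates the general idea.

def energy(trace):
    """Total motion energy of a radar signal trace."""
    return sum(x * x for x in trace)

def classify(trace, noise_floor=0.5):
    """Label a trace as background noise, a reach, or a swipe."""
    if energy(trace) < noise_floor:
        # Background movement never clears the energy threshold.
        return "background"
    # A swipe crosses the sensor, so its peak sits mid-trace;
    # a reach builds steadily toward the phone, peaking at the end.
    peak = max(range(len(trace)), key=lambda i: abs(trace[i]))
    if 0.25 < peak / len(trace) < 0.75:
        return "swipe"
    return "reach"
```

A real system would learn these decision boundaries from many users’ recordings rather than hard-coding them, which is exactly the per-user variation problem described above.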
Interestingly, Google also faced the challenge of developing new signal-processing techniques to reduce interference from audio vibrations, since the phone’s own speakers shake the device during playback. With this issue out of the way, the company was able to make Motion Sense work with music playback.
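To get a feel for why this is solvable: hand gestures are slow compared to speaker vibrations, so a low-pass filter can suppress the fast, periodic interference while keeping the gesture motion. The moving average below is a minimal sketch of that idea, not the technique Google describes:

```python
# Hypothetical illustration: smoothing out fast speaker vibration
# while preserving the slow-moving gesture component of a signal.

def moving_average(signal, window=4):
    """Low-pass filter a sampled signal with a sliding window."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)          # clamp at the start
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

# A steady 0.5 "gesture" signal with alternating +/-0.3 vibration:
noisy = [0.5 + (0.3 if i % 2 == 0 else -0.3) for i in range(8)]
clean = moving_average(noisy)
# Once the window fills, the vibration cancels and 0.5 remains.
```

In practice a production system would use a properly designed digital filter rather than a plain moving average, but the separation-by-frequency principle is the same.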