Now that we know for certain that the Pixel 4 will come equipped with a Soli radar chip for recognizing hand movements, we’re curious how Google intends to improve it over time. In the company’s own words, the Motion Sense feature will “evolve” as it ages, so consider this post our attempt to nudge Google in the right direction on what we, the consumers, would like to see, yes?
As of right now, we know Motion Sense will be capable of skipping songs, snoozing alarms, and silencing phone calls. For us, that’s not nearly enough functionality to warrant that much real estate on the device’s front side.
We would also like to see the ability to swipe through photos in gallery applications (Photos, Gallery, etc.), scroll in Chrome and other applicable apps, switch camera modes or apply different filters in the Camera app, control the device’s volume, send incoming calls to Call Screen (where Assistant answers the phone for you), and possibly fire up Google Assistant with a dedicated gesture. Maybe we could even get 3rd-party support, letting app developers integrate the controls into their own work. For example, Snapchat could let you swipe through filters with a gesture, and Instagram or Twitter could let you scroll through feeds of content.
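To make the 3rd-party idea concrete, here is a minimal sketch of what a developer-facing gesture API could look like. To be clear, Google has announced no such API: the `SoliGestures` class, the `Gesture` enum, and the `onGesture`/`dispatch` names below are all invented for illustration, and the radar events are simulated rather than coming from real hardware.

```kotlin
// Hypothetical sketch only: none of these names are real Google APIs.
// We assume the system delivers recognized gestures as discrete events
// that an app can subscribe to with a listener.

enum class Gesture { SWIPE_LEFT, SWIPE_RIGHT, FLICK, WAVE }

// A minimal listener registry an app might use to react to radar gestures.
class SoliGestures {
    private val listeners = mutableListOf<(Gesture) -> Unit>()

    fun onGesture(listener: (Gesture) -> Unit) {
        listeners.add(listener)
    }

    // In a real system the radar stack would call this; here we fake it.
    fun dispatch(gesture: Gesture) {
        listeners.forEach { it(gesture) }
    }
}

fun main() {
    val soli = SoliGestures()
    val actions = mutableListOf<String>()

    // e.g. Snapchat-style filter switching or feed scrolling
    soli.onGesture { g ->
        when (g) {
            Gesture.SWIPE_LEFT -> actions.add("previous filter")
            Gesture.SWIPE_RIGHT -> actions.add("next filter")
            Gesture.FLICK -> actions.add("scroll feed")
            Gesture.WAVE -> actions.add("dismiss")
        }
    }

    // Simulate two gestures arriving from the radar.
    soli.dispatch(Gesture.SWIPE_RIGHT)
    soli.dispatch(Gesture.FLICK)
    println(actions) // [next filter, scroll feed]
}
```

An event-listener shape like this mirrors how Android already exposes touch and sensor input, which is presumably why it would be the natural fit if Google ever opens Motion Sense up to developers.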
When Google first detailed Project Soli at Google I/O back in 2015, the company imagined Soli being worked into nearly everything — smartwatches, alarm clocks, and even clothing. Essentially, any physical control that requires specific hand movements could be replaced by a gesture that Soli recognizes, making it a very useful piece of technology. As another suggestion, in at least one or two demos we saw during I/O, Google showed off a flicking motion, which could perhaps be used to switch apps or snap back to your most recent one. That could be sweet.
We’ve given you a few examples of what we’d like Soli to be able to do, but what would you like to see? Is there an action that Soli could help you perform more easily instead of physically touching your finger to the display?