project soli Stories October 31
One of the signature features of the Google Pixel 4 is Motion Sense, which allows you to perform certain (very specific) tasks on your phone with a wave of your hand. Unfortunately, Google intends for the Pixel 4’s Motion Sense gestures to stay limited for the near future, with no plans to open an API for developers.
project soli Stories September 4
The Google Pixel 4 is shaping up to be one of the most exciting flagships of 2019, and one of its most interesting features is the Soli radar chip. Google’s reveal of the Pixel 4’s “Motion Sense” gestures confirmed that the feature would only be available in some countries, and today, the launch list of those countries has been discovered.
project soli Stories January 1
The Advanced Technology and Projects (ATAP) group is responsible for experimental initiatives inside Google. With Project Soli, the company set out to create a radar-based input method that captures fine finger movements, recreating the feeling of tactile interaction. The FCC on Monday approved a waiver for Soli’s higher-than-normal operating frequencies, permitting certification and allowing Google to continue developing Soli as intended.
project soli Stories November 10, 2016
Google first introduced us to Project Soli last year as miniature radar hardware that allows gesture control of devices. Earlier this year, it somehow managed to squeeze the tech into a smartwatch. A research team from Scotland has now expanded Soli’s smarts, allowing the radar to identify objects as well as gestures, and built the tech into a device it calls RadarCat.
We have used the Soli sensor, along with our recognition software, to train and classify different materials and objects, in real time, with very high accuracy […] Our studies include everyday objects and materials, transparent materials and different body parts.
While this work was previewed at Google I/O earlier this year, the team has now made the full paper available, together with a longer video, below …
project soli Stories May 21, 2016
At Google I/O 2016, the Mountain View company once again decided to spread its announcements across three days. The keynote showed off Google’s vision for the future: virtual reality, its new AI and machine learning initiatives, Google Home hardware to take advantage of them, and a few sprinkles of Android Wear 2.0 goodness. The second day saw the announcement of the Play Store coming to Chrome OS.
But the third day was ATAP day, admittedly my favorite day of Google I/O. Last year, the Advanced Technology and Projects group at Google showed off Project Jacquard, Project Soli, some more details on Project Ara, and more. And then the company went silent. For pretty much an entire year.
Maybe that’s a good thing, as Google tends to show off its projects and technologies a little early in general. It’s not exactly out of character for Google to show a product or service, say it’s coming in six months, have it not arrive for 12 or 18 months, and then have the final product bear hardly any resemblance to what was originally announced. Admittedly that’s happening with some of ATAP’s projects either way (I’m looking at you, Ara), but at least it’s not a constant barrage of teases and false hope.
Anyway, Google ATAP finally came out of hiding on the third day of I/O yesterday, bringing updates on Project Jacquard, Project Soli, Project Ara, and Spotlight Stories. Jacquard brought the announcement of the first retail product based on the tech; Ara brought a progress update, including the latest prototype device with new module connectors (and the promise of a dev kit coming soon); and the Spotlight Stories mention came with some progress in VR storytelling. All cool stuff.
But Soli is what makes my jaw drop.
project soli Stories May 20, 2016