Over the past month, we’ve been slowly learning more about what the Project Soli powered “Motion Sense” on Google Pixel 4 can and can’t do. Now, we’ve found evidence of how Google would open Motion Sense on the Pixel 4 to app and game developers, including a Unity game engine plugin.
In our digging into how Motion Sense will work on the Google Pixel 4, we’ve uncovered a lot of details explaining precisely where and when you can use it. Admittedly, these initial details were pretty disappointing, considering only 23 media apps in 53 regions have been confirmed. Thankfully, however, there may be more to the story.
Last week, our Dylan Roussel discovered the existence of a game that may be bundled with Google Pixel 4 phones, called Pokemon Wave Hello, which demos Motion Sense using cute Pokemon. Given that we found no reference to this Pokemon mini-game within the Motion Sense app code, our APK Insight team set out to figure out how Pokemon Wave Hello works.
“Oslo” Unity plugin
The game is built upon the Unity game engine, which makes a great deal of sense, considering how many times Google has partnered with Unity in the past. Digging a little deeper, we noticed the inclusion of an “OsloUnityPlugin.”
“Oslo,” we’ve learned, is one of the two codenames for Motion Sense on the Google Pixel 4 — the other being “Aware.” Thus, the package name “com.google.OsloUnityPlugin” makes it pretty clear that Google has developed a dedicated Motion Sense plugin for use with Unity.
Motion Sense Bridge
Reading through the plugin’s code, we find that it connects to a separate “Motion Sense Bridge” application that will be pre-installed on the Pixel 4. Using this bridge, app developers will have access to the same four core gestures of Motion Sense — flick, presence, reach, and swipe.
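Google hasn’t published the plugin’s API, but conceptually, a client of the bridge would listen for events of those four gesture types. Here’s a minimal sketch of that idea — every class, method, and name below is invented for illustration, not taken from the actual plugin:

```java
// Hypothetical sketch only: Google has not published the OsloUnityPlugin or
// Motion Sense Bridge API, so every name below is invented for illustration.
import java.util.function.Consumer;

public class MotionSenseSketch {

    // The four core gestures the bridge reportedly exposes.
    enum Gesture { FLICK, PRESENCE, REACH, SWIPE }

    // Stand-in for the bridge's callback mechanism: deliver one gesture
    // event to a registered handler.
    static void dispatch(Gesture gesture, Consumer<Gesture> handler) {
        handler.accept(gesture);
    }

    public static void main(String[] args) {
        dispatch(Gesture.SWIPE, g -> System.out.println("Got gesture: " + g));
    }
}
```

In practice, since the bridge is a separate pre-installed app, the real mechanism is presumably some form of inter-app communication (such as a bound service) rather than an in-process callback like this.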
Beyond this, we’ve learned that the bridge should also provide developers with the following bits of in-depth information about each Motion Sense gesture.
Uses in games and apps
As we learned last month, the base Motion Sense app uses fairly basic presets, such as almost any “swipe” gesture being good enough to dismiss a timer. By contrast, the Motion Sense Bridge clearly offers developers more in-depth options, like being able to act differently based on the “intensity” of a swipe.
This extra information could be useful, for example, in a physics-based game, where a more “intense” swipe could launch an object farther. Or an app could respond differently based on the angle at which you’re “reaching” for your phone, labeled as the azimuth, above.
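To make that physics-game idea concrete, here’s a toy sketch that maps a swipe’s “intensity” onto a launch speed. It assumes, purely for illustration, that intensity arrives as a value between 0 and 1 — the bridge’s actual scale and units are unknown:

```java
// Hypothetical sketch: maps a swipe "intensity" reading to a launch speed in
// a physics game. The 0.0-1.0 intensity range and the speed constant are
// assumptions for illustration, not documented Motion Sense Bridge behavior.
public class SwipeLaunch {

    static final double MAX_LAUNCH_SPEED = 20.0; // game units per second (made up)

    // Clamp the intensity into [0, 1], then scale it linearly into a speed,
    // so a more "intense" swipe launches an object farther.
    static double launchSpeed(double swipeIntensity) {
        double clamped = Math.max(0.0, Math.min(1.0, swipeIntensity));
        return clamped * MAX_LAUNCH_SPEED;
    }

    public static void main(String[] args) {
        System.out.println("Gentle swipe:  " + launchSpeed(0.25));
        System.out.println("Intense swipe: " + launchSpeed(0.9));
    }
}
```

A real game would feed this speed into its physics engine; the point is simply that gradated gesture data enables analog responses instead of the all-or-nothing triggers the base Motion Sense app uses.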
Admittedly, though, this is still a far cry from what Google originally promised for Project Soli when it debuted four years ago. Hopefully Google will expand on the core gestures of Motion Sense, or give developers more direct access to the underlying Soli sensor.
When can developers get started?
So does this mean that, from Day 1, any third-party developer can start working with Motion Sense on the Pixel 4? Unfortunately, the answer to that is still no. We obtained the Motion Sense Bridge app, and inside we found what appears to be a whitelist, meaning only apps on that list have access. For now, only five apps are whitelisted to use the Motion Sense Bridge, and two of them are Pixel 4 “retail demo” apps.
It’s clear that this Motion Sense Bridge is intended to give app and game developers more direct access to the Motion Sense gestures. Whether Google intends to open Motion Sense to all third-party developers in the future isn’t clear. For the time being, it appears Google will need to give each application express permission, meaning the company will be partnering directly with Android app and game developers.
Dylan Roussel contributed to this article.