Google has announced a ton of new stuff this week at I/O 2018, but some of the most interesting announcements have come outside of the main keynote. Yesterday, the company revealed on its blog that it is making some big updates to ARCore.
As you might have seen earlier today, Google’s ARCore is picking up a new collaborative aspect. As Google says in its post, some things are “better when you do them with other people,” and a demo of the company’s “Just a Line” app proves that.
With the addition of “Cloud Anchors,” Google is making it possible for ARCore developers to deliver augmented reality experiences that multiple users can share at the same time across different devices. The company explains:
Many things in life are better when you do them with other people. That’s true of AR too, which is why we’re introducing a capability called Cloud Anchors that will enable new types of collaborative AR experiences, like redecorating your home, playing games and painting a community mural—all together with your friends. You’ll be able to do this across Android and iOS.
Google’s “Just a Line” app will pick up this functionality “in the coming weeks” on both Android and iOS.
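For developers, Cloud Anchors work by having one device host an anchor to Google's servers and other devices resolve it from a shared ID, putting everyone in the same coordinate space. Here's a rough sketch of what that looks like with the ARCore Java API; `session`, `localAnchor`, and `sharedId` are placeholders for objects your app would already have:

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Enable Cloud Anchors in the session config (assumes `session` is an
// already-created ARCore Session).
Config config = session.getConfig();
config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
session.configure(config);

// Host a local anchor. ARCore uploads visual data around the anchor
// and returns a hosted Anchor whose state can be polled each frame.
Anchor cloudAnchor = session.hostCloudAnchor(localAnchor);
// Once cloudAnchor.getCloudAnchorState() reports SUCCESS, share
// cloudAnchor.getCloudAnchorId() with other users via your own backend.

// On a second device, resolve the same anchor from the shared ID so
// both users see AR content in the same real-world spot.
Anchor resolved = session.resolveCloudAnchor(sharedId);
```

Note that ARCore only handles the anchor itself; shuttling the anchor ID between users is left to the app's own networking.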
Along with that, Google is improving ARCore with the addition of vertical plane detection. Essentially, this makes it possible to use the walls around you within an ARCore experience. A separate capability called “Augmented Images” lets your phone recognize specific pictures and attach AR content to them, which enables some pretty cool things, like you can see in the video below.
ARCore now features Vertical Plane Detection which means you can place AR objects on more surfaces, like textured walls. This opens up new experiences like viewing artwork above your mantlepiece before buying it. And thanks to a capability called Augmented Images, you’ll be able to bring images to life just by pointing your phone at them—like seeing what’s inside a box without opening it.
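In API terms, both features are opt-in session settings. A minimal sketch, assuming an existing ARCore `session` and a `Bitmap` of the image you want to track (the name "mural" is just an illustrative label):

```java
import android.graphics.Bitmap;
import com.google.ar.core.AugmentedImageDatabase;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

Config config = session.getConfig();

// Detect vertical surfaces (walls) as well as horizontal ones.
config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL);

// Register a reference image; ARCore will report an AugmentedImage
// trackable when the camera sees it, so AR content can be pinned to it.
AugmentedImageDatabase imageDb = new AugmentedImageDatabase(session);
imageDb.addImage("mural", bitmap);
config.setAugmentedImageDatabase(imageDb);

session.configure(config);
```

After configuring, detected walls show up as `Plane` trackables with a vertical orientation, and recognized pictures show up as `AugmentedImage` trackables in each frame's update.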
Finally, Google is also trying to make AR development faster with “Sceneform.” The company explains:
With Sceneform, Java developers can now build immersive, 3D apps without having to learn complicated APIs like OpenGL. They can use it to build AR apps from scratch as well as add AR features to existing ones. And it’s highly optimized for mobile.
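To give a feel for what "without OpenGL" means in practice, here is a hedged sketch of placing a 3D model with Sceneform. The model name `R.raw.chair` is a placeholder, and `arFragment` and `anchor` are assumed to come from the hosting activity:

```java
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.TransformableNode;

// Asynchronously load a 3D asset; Sceneform handles all rendering,
// so no shader or OpenGL code is needed.
ModelRenderable.builder()
    .setSource(context, R.raw.chair)  // placeholder asset name
    .build()
    .thenAccept(renderable -> {
        // Attach the model to an ARCore anchor in the scene graph.
        AnchorNode anchorNode = new AnchorNode(anchor);
        anchorNode.setParent(arFragment.getArSceneView().getScene());

        // TransformableNode adds built-in pinch/drag gestures.
        TransformableNode model =
            new TransformableNode(arFragment.getTransformationSystem());
        model.setParent(anchorNode);
        model.setRenderable(renderable);
    });
```

The scene-graph style (nodes, parents, renderables) is what replaces the vertex buffers and shaders a raw OpenGL app would need, which is the "complicated APIs" point Google is making.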