One of the more impressive developer announcements at I/O 2017 concerned standalone virtual reality. With Seurat, mobile VR headsets can handle the high-fidelity graphics typically associated with full desktop rigs. Ahead of this year’s I/O, Google is open sourcing the technology.
Seurat goes hand-in-hand with the full positional tracking found on the Lenovo Mirage Solo thanks to Google’s WorldSense technology. Essentially, the tool — named after the French post-Impressionist painter — simplifies very complex 3D scenes “into a representation that renders efficiently on mobile hardware.”
Seurat works by taking advantage of the fact that VR scenes are typically viewed from within a limited viewing region, and leverages this to optimize the geometry and textures in your scene. It takes RGBD images (color and depth) as input and generates a textured mesh, targeting a configurable triangle count, texture size, and fill rate, simplifying scenes beyond what traditional methods can achieve.
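To make the inputs concrete, here is a minimal sketch of the kind of capture manifest such a pipeline consumes: a set of RGBD views sampled from inside the limited viewing region, plus the configurable output budgets. The field names and file layout here are illustrative assumptions, not Seurat's actual manifest schema.

```python
import json

def build_capture_manifest(num_views, texture_size=4096, max_triangles=72_000):
    """Assemble a hypothetical capture manifest: RGBD views plus output budgets.

    Field names are assumptions for illustration only.
    """
    views = [
        {
            "color_image": f"view_{i:03d}_color.exr",  # RGB radiance capture
            "depth_image": f"view_{i:03d}_depth.exr",  # per-pixel depth capture
        }
        for i in range(num_views)
    ]
    return {
        "view_groups": views,
        "output_budget": {
            "max_triangles": max_triangles,  # configurable triangle target
            "texture_size": texture_size,    # configurable texture budget
        },
    }

# A scene captured from 16 sample positions inside the viewing region.
manifest = build_capture_manifest(num_views=16)
print(json.dumps(manifest["output_budget"], sort_keys=True))
```

The key idea the sketch captures is that quality is traded off explicitly: the tool is told up front how many triangles and how much texture memory the mobile GPU can afford, and it simplifies to fit.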
ILMxLAB used it on a Rogue One: A Star Wars Story experience, as did the Blade Runner: Revelations game that launched with standalone Daydream.
To create the look and feel for Revelations, Seismic used Seurat to bring a scene of 46.6 million triangles down to only 307,000, improving performance by more than 100x with almost no loss in visual quality.
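The geometry reduction alone works out to roughly 150x, which is consistent with the "more than 100x" performance figure (the two numbers measure different things, so they need not match exactly):

```python
# Quick check of the quoted reduction: 46.6 million triangles down to 307,000.
before, after = 46_600_000, 307_000
ratio = before / after
print(round(ratio, 1))  # roughly 151.8x fewer triangles
```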
To my untrained eye, the most notable difference is that the Seurat-processed image (right) tones down shadows and makes them more generic.
With it now open sourced, developers can use Seurat “to bring visually stunning scenes to [their] own VR applications and have the flexibility to customize the tool for [their] own workflows.”