At I/O 2018, Google detailed a major update to Google Lens, a year after first announcing the feature. In addition to a new design and expanded availability in camera apps, Lens is slated to gain real-time lookup and smart text selection. That revamp is now slowly beginning to launch.
Google Lens now features an intro prompt that details its capabilities. Instead of a dialog bubble like in Assistant, Lens shows a rounded sheet at the bottom of the screen noting that users can “Tap on objects and text.” Sliding that sheet up reveals what Lens can identify: Text, Products, Books & Media, Places, and Barcodes.
A microphone is accessible from the bottom-right corner, while the suggestion chips to “Remember this,” “Import to Keep,” and “Share” are now gone.
The new augmented reality features announced at I/O are already live. This includes the real-time search capability where users can simply point their camera at the world and have Lens surface proactive information. Recognized objects now feature anchored dots that immediately load information when tapped.
Google notes that this is made possible by on-device machine learning and the use of Cloud TPUs “to identify billions of words, phrases, places, and things in a split second.”
Meanwhile, other features now operational include smart text selection and the ability to search for similar items, which is useful for fashion and product searches.
We’ve only spotted the new Google Lens on one device, with availability due to a server-side update rather than any APK Insight enabling on our part. It appeared after installing the Android P Beta, and again later following a downgrade to Android Oreo. A OnePlus 6 was used on both occasions, but given the server-side nature of the rollout, the specific device most likely makes no difference. We’ve checked several Pixel and non-Pixel devices, and the update is not yet available on them. At I/O, Google noted that these features would roll out in the coming weeks.
Dylan contributed to this article