Google Lens has a ton of potential, and the demos Google showed off at I/O this May and GDD in September are incredible. However, basically none of that is live right now, and functionality is incredibly limited within the Photos app. Over time it's going to improve, though, and a couple of Googlers have hit Twitter to give us more information.
At the moment, Google Lens is in a preview form on the Pixel 2 and Pixel 2 XL, and as of the other day, on the original Pixels too. Built into the Photos app, Lens can recognize things like addresses, books, and lots of objects, but it’s nowhere near as powerful as it will be. One of the biggest problems with it right now is that it’s not exactly easy to access.
Currently, accessing Lens requires users to open the camera, take a photo, wait for the photo to process, view it in the Photos app, and then press the Lens button to have it analyze the shot. That’s significantly slower than the Assistant integration we’ve all seen and want. Thankfully, that’s coming soon.
Two Google employees revealed in tweets earlier this week that Lens is coming to Assistant while still in its preview form, but it's going to be a little while. Both say that it will arrive "in a few weeks," with one explaining that Google is still polishing the UI and feature set.