Much like how Google Assistant was announced at I/O 2016 and initially premiered on the Pixel last year, Google Lens is following a similar release schedule. This "set of vision-based computing capabilities" for performing tasks like visual search is launching first on the Pixel 2 and Pixel 2 XL.
Google Lens is launching first on the Assistant and Google Photos. In the former product, the panel that pops up after tapping and holding the home button now has a Google Lens icon in the bottom right corner. A similar button is available for individual pictures in Photos.
Exclusive to the Pixel, there is an "AR Stickers" feature with stickers made by Google and third parties. Stickers are animated, moving and reacting to the environment. Multiple stickers or characters can be added to a scene, which can then be captured as a photo or video.
In Assistant, Lens can be used to recognize flowers, identify restaurants on the street, and translate text. It also integrates with Android to automatically fill in the Wi-Fi network name and password printed on router labels. Similarly, it can add events you see printed in the real world directly to your calendar.
Meanwhile, in Photos, Lens can identify artwork and media, like the covers of books, music, movies, and video games. It can also recognize places in the real world, such as buildings and landmarks, as well as text. Items like phone numbers, dates, and addresses, whether in screenshots or in an image, can be tapped to launch the appropriate app or easily copied.
Google Lens is launching in Assistant on the original Pixel and Pixel 2 this year. It will be coming to more devices soon.