Announced at I/O 2019, the new Google Lens with filters is beginning to roll out. This redesign adds five modes to the bottom of the screen and is already available via Google Assistant for some users. The Google Lens revamp is also appearing in Google Photos.
The new Google Lens lets users enter one of five modes to specify which tools should analyze an image. This may seem like added complexity, but the extra granularity makes results more precise and allows more advanced features to be surfaced.
Tapping the center button acts as a shutter to begin image recognition. The main carousel disappears as the results sheet slides up from the bottom. Filters are tucked into the left corner, while a new crop tool on the right lets users focus Lens on part of the frame.
- Translate: Point at text to translate
- Text: Point at text to copy
- Auto: Point at objects for details
- Shopping: Point at items or barcodes
- Dining: Point at dishes on a menu
Dining was announced at I/O 2019 to help users see reviews and images of food from Google Maps by just pointing their phone at a menu. Taking a picture of a receipt in this mode also opens a tool to split the check.
Translate features language auto-detection and can overlay your preferred language directly on the live image to retain context, while Text performs OCR to quickly copy what's in an image. Shopping is meant to find similar pieces of clothing, furniture, or artwork, as well as read barcodes. Auto is the default mode and works like Lens did before.
This Google Lens revamp is live on two Pixel devices we checked running the Google app 9.91 beta that rolled out last night. However, the new Lens is not yet widely available. For users who have it enabled, the new Google Lens is also live in Google Photos.