With the first Routines launching yesterday, Assistant is now adding another feature that Google announced last month: location-based reminders. Meanwhile, text selection in Google Lens for Assistant is now widely available.
Google Home users can now set location-based reminders that will be triggered on their phones when appropriate. This lets alerts that were previously restricted to the phone-based Assistant, like “pick up more coffee at the grocery store,” be set from smart speakers.
Meanwhile, Google Lens in Assistant (as noted by Android Police) also picks up the useful ability to select any text in the current frame. Previously limited to Lens in Photos, this Now on Tap-like feature is quite powerful and might increase usage of Google’s visual search functionality for some.
After focusing on an image, Lens takes a moment to analyze the frame and highlight all recognizable text. Tapping any of it brings up Android’s standard text selection sliders, where users can adjust the selection, search, or copy. When appropriate, Lens will also suggest Search results at the bottom of the screen, but users can return to selecting text by swiping the card down.
Text is grouped together, and users can effectively ‘select all’ by dragging the sliders across it. In brief testing, we’ve found that it works quite well, even when simply pointing the camera at text on a screen.
Google announced last month that the Lens preview is expanding to “flagship” Android devices in the coming weeks. Lens in Photos, meanwhile, is already widely rolled out for all English-language users and is coming soon to iOS.