
Google Lens revamp in development with ‘filters’ for Dining, Shopping, & Translate

At I/O 2018, Google revamped Lens with real-time recognition, smart text selection, and native camera app integration. A year later, another redesign for Google Lens that adds specific recognition “filters” is in development.

About APK Insight: In this ‘APK Insight’ post, we’ve decompiled the latest version of an application that Google uploaded to the Play Store. When we decompile these files (called APKs, in the case of Android apps), we’re able to see various lines of code within that hint at possible future features. Keep in mind that Google may or may not ever ship these features, and our interpretation of what they are may be imperfect. We’ll try to enable those that are closer to being finished, however, to show you how they’ll look in the case that they do ship. With that in mind, read on.
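For context on how these strings surface, a teardown like this can be reproduced with common tools: Apktool decodes an APK’s resources, and the resulting strings.xml can then be searched for feature names. A minimal sketch, assuming a locally downloaded APK (the file name below is hypothetical):

apktool d google-app-9.72.apk -o google-app-9.72

grep "filter_name" google-app-9.72/res/values/strings.xml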

For the past several beta releases, the Google app has been adding new features for Lens. Version 9.61 from earlier this month included strings for various Lens filters: Translate, Dining, and Shopping.

<string name="translate_filter_name">Translate</string>

<string name="dining_filter_name">Dining</string>

<string name="shopping_filter_name">Shopping</string>

With Google app 9.72 this evening, we have enabled the new Google Lens interface. Still in development (and currently lacking any branding), this revamp places five filters at the bottom of the screen. Each filter presumably puts Lens in a precise mode or accesses a specific visual tool.

Google Lens filters revamp

In the middle is a magnifying glass icon that’s likely for general search. To the left is the aforementioned “Translate” filter, which will presumably focus Lens on translating foreign text into your preferred language. Judging from past strings, this translation feature is more advanced than the current implementation, which just recognizes text and offers to open the Google Translate app.

The new Lens could “Auto-detect” a language, with users having the option to “Change languages” from the “Source” text to the “Target.”

<string name="lens_translate_apply_button">Apply</string>

<string name="lens_translate_filter_params_auto_language">Auto-detect</string>

<string name="lens_translate_filter_params_auto_language_long">Auto-detect language</string>

<string name="lens_translate_filter_params_label">%1$s → %2$s</string>

<string name="lens_translate_filter_params_title">Translate options</string>

<string name="lens_translate_source_label">Source</string>

<string name="lens_translate_target_label">Target</string>

The next tool could be optimized for copying text. Google Lens has long featured smart text selection for quick optical character recognition (OCR), like taking action on addresses and phone numbers, or copying large swathes of text.


To the right of the main visual search mode — which would still be leveraged for identifying monuments, plants, animals, etc. — are “Shopping” and “Dining” icons. The former presumably works to identify clothing and furniture, with results specifically tuned to product search. AR Shopping is another feature in development.

The Dining filter is more mysterious, as an equivalent feature does not currently exist in Lens. Strings suggest something more akin to Google Maps AR: users might be able to point their camera at the real world and see only restaurants highlighted in the viewfinder. One string in Google app 9.66 requests location access to “Search restaurant nearby,” possibly in conjunction with this mode.

<string name="search_nearby">Search restaurant nearby</string>

<string name="permission_not_granted">To see popular dishes, turn on location</string>

The idea of adding filters (or specific modes) to Google Lens might seem counterintuitive to the premise of just pointing a camera at the world and getting results. However, this added layer of complexity can provide more refined results: users would be able to choose what kind of help they receive rather than have Google attempt to figure it out automatically and possibly return an unrelated result.

Before Google Lens was announced, we spotted it in development as “Visual Search.” One of the final iterations before Lens officially launched had an interface similar to what we enabled today. However, that UI was likely for testing and helped refine the accuracy of Lens by requiring testers to manually specify the search category.

It’s not clear when this Google Lens filters revamp will launch, but I/O is a good bet given that the visual search feature debuted at the developer conference in 2017 and received a major revamp the year after. Be sure to check out our full APK Insight of Google app 9.72, where we also enabled a Sleep Timer for Google Podcasts and several other features.

Dylan contributed to this article
