One of the most interesting announcements at Google’s Search-focused keynote in September was a big upgrade to Lens that lets you take a photo and ask questions about it. “Multisearch” in Google Lens is now available to beta test on Android and iOS.
Multisearch is Google’s “entirely new way to search” that aims to address how you sometimes don’t “have all the words to describe what you were looking for.” To use it, take a picture (or import an existing one) with Google Lens, then swipe up on the results panel and tap the new “Add to your search” button at the top.
This will let you enter a “question about an object in front of you or refine your search by color, brand or a visual attribute.” Examples include:
- Screenshot a stylish orange dress and add the query “green” to find it in another color
- Snap a photo of your dining set and add the query “coffee table” to find a matching table
- Take a picture of your rosemary plant and add the query “care instructions”
Fashion and home decor use cases are prominently highlighted today, with Google noting that – currently – the “best results [are] for shopping searches.” That said, the last example above means you don’t have to first use Lens to identify a plant and then perform a separate text search for “care instructions” after identification.
Google credits the “latest advancements in artificial intelligence” with making Lens multisearch possible. However, it’s not yet using the Multitask Unified Model (MUM) and can’t handle complex queries. For example, MUM was responsible for the demo where you could take a picture of broken bicycle gears and get instructions on how to repair them.
More on Google Lens:
- Lens on Android updated to make browsing for images on your device easier
- Google tests adding Lens to desktop Search on the web
- Assistant and Lens had a quiet 2021 as foundational advancements remain in the wings