At I/O in May, Google announced the Multitask Unified Model (MUM) as a major improvement to Search. MUM is now set to come to Google Lens "early next year."

After taking a picture in Lens or analyzing an existing image or screenshot, swiping up will continue to show "Visual matches." A new "Add questions" button at the very top of the sheet will soon open a search field. From there, suggested "Related searches" or any query you enter will refine the visual search.

Google showed off two demos at Search On 2021 today. Both are intended to let you combine visual and text input.

The most impressive example involves taking a picture of a broken bike part and asking, "how do you fix this." Google correctly identifies the exact part that needs to be fixed and surfaces how to repair it through videos, forum results, and other help articles on the web.

As Google explains: "By combining images and text into a single query, we're making it easier to search visually and express your questions in more natural ways."

Another example involves capturing an image of a style, e.g. a pattern on a shirt, and asking for other pieces of clothing that share the same design ("socks with this pattern").

MUM in Google Lens will be available in the coming months. Google is conducting "rigorous testing and evaluation" of the AI model before launch.



About the Author

Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: