Announced at last year's I/O, Google Lens is gaining a number of new capabilities during this year's keynote. The visual search feature is getting a new look and coming directly to camera apps from various OEMs, while also adding three new features, the most notable being real-time answers.
First off, Lens adopts the same rounded look as many of Google's other apps and services. There is now a persistent bar at the bottom that slides up when results load, with a microphone icon to its right.
Smart text selection allows Google Lens to connect the words you see in the real world with relevant actions. For example, when scanning a page of text, Lens will surface relevant information and photos.
Say you're at a restaurant and see the name of a dish you don't recognize: Lens will show you a picture to give you a better idea. According to Google, this requires not just recognizing the shapes of letters, but also the meaning and context behind the words, which is where the company's years of language understanding in Search come in.
Style Match lets Lens find similar objects, born out of how users don't always want to find the exact item they're viewing. In addition to the usual results for a particular piece of home decor or clothing, Lens will now surface items in related styles.
Lastly, Lens is adding real-time results, so users no longer have to select an item and wait for results to load. In addition to this proactive information, results stay anchored to objects as you move your camera's viewfinder.
Google achieved this through both on-device machine learning and Cloud TPUs, with billions of words, phrases, and places identified “in a split second.”
Meanwhile, Google Lens is coming directly to native camera apps in the future. This includes the Google Pixel, as well as devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, and Asus.