In recent years, Google has worked to make Image Search more capable with Lens integration and Fact Check labels. Results in Google Images will now include Knowledge Graph facts to tell users about what they’re seeing.
Google wants to provide you with “quick facts” about the results that appear in Image Search. Underneath a picture, its (often generic) page title, and the “Visit” button, you’ll see around three relevant entries about the people, places, or things most closely related to that photo.
Once expanded, you’ll see a short description and a carousel of other topics to search. That information is sourced from Google’s Knowledge Graph of facts, bios, and relationships. This added context is meant to help you “better understand the image you’re viewing and whether the web page is relevant to your search.”
For example, let’s say you’re searching for beautiful state parks to visit nearby. You want to swim during your visit, so you tap on a picture of a park with a river. Beneath the photo you might see related topics, such as the name of the river, or which city the park is in.
Or perhaps you’re looking for information about a famous architect’s work to inspire a home renovation or art project. You might come across an article about the architect winning an award and be able to easily learn more about the woman who is the namesake of that prize.
Behind the scenes, Google selects the relevant Knowledge Graph entries by using deep learning to evaluate an “image’s visual and text signals” and then cross-referencing those signals with the text that appears on the original web page.
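Google hasn’t published the details of that pipeline, but the general idea of combining a visual signal with a text signal to rank candidate entities can be sketched roughly. Everything below (the function name, the candidate scores, and the weights) is an illustrative assumption, not Google’s actual system:

```python
# Hypothetical sketch: rank candidate Knowledge Graph entities for an image
# by blending a visual-signal score with a text-signal score from the page.
# Scores and weights are made up for illustration.

def rank_entities(candidates, visual_weight=0.6, text_weight=0.4, top_k=3):
    """candidates: list of dicts with 'entity', 'visual_score', 'text_score'.
    Returns the top_k entity names by combined weighted score."""
    scored = [
        (c["entity"], visual_weight * c["visual_score"] + text_weight * c["text_score"])
        for c in candidates
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [entity for entity, _ in scored[:top_k]]

# Example candidates a system might extract from a photo of a state park:
candidates = [
    {"entity": "Yosemite National Park", "visual_score": 0.9, "text_score": 0.8},
    {"entity": "Merced River",           "visual_score": 0.7, "text_score": 0.6},
    {"entity": "California",             "visual_score": 0.3, "text_score": 0.9},
    {"entity": "Granite",                "visual_score": 0.4, "text_score": 0.1},
]
print(rank_entities(candidates))
# → ['Yosemite National Park', 'Merced River', 'California']
```

The point of the weighted blend is that an entity strongly supported by both what’s visible in the image and what the page says outranks one supported by only a single signal.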
This feature is coming first to mobile in the US, initially for a subset of images of people, places, and things, before expanding to more languages and surfaces.