Google Maps Live View uses augmented reality to overlay arrows, directions, and now landmarks over the real world to make navigating easier. The feature is getting a number of big updates today.

Beyond walking directions, Live View today lets you see how far away a location is and in what direction. Google will now also overlay nearby landmarks to let you “quickly and easily orient yourself and understand your surroundings.”

Live View will show you how far away certain landmarks are from you and what direction you need to go to get there. These landmarks can include iconic places, like the Empire State Building in New York and the Pantheon in Rome, and easily recognizable places, like local parks and tourist attractions.

Landmarks in Live View is rolling out soon to Google Maps for Android and iOS in the following cities:

Amsterdam, Bangkok, Barcelona, Berlin, Budapest, Dubai, Florence, Istanbul, Kuala Lumpur, Kyoto, London, Los Angeles, Madrid, Milan, Munich, New York, Osaka, Paris, Prague, Rome, San Francisco, Sydney, Tokyo, Vienna 

Meanwhile, the Live View button will appear in more places throughout Google Maps. In addition to directions and place listings, the AR button will appear when navigating by transit. Google says this is “particularly useful when you exit a transit station and don’t know which way to go.”

Last month’s Android 11 Pixel Feature Drop introduced Live View Location Sharing, and the feature is now coming to all Android and iOS users worldwide. Essentially, instead of a static location, the other person becomes the destination. Along with Live View in Transit, this is rolling out over the “coming weeks.”

When a friend has chosen to share their location with you, you can easily tap on their icon and then on Live View to see where and how far away they are, with overlaid arrows and directions that help you know where to go.

Lastly, Google today touts improvements to the global localization technology that powers Live View in Maps. Pins dropped into the real world will now take elevation into account. This leverages Google’s understanding of topography and machine learning so that nearby places do not appear “far off into the distance.”

Below, you can see how Lombard Street—a steep, winding street in San Francisco—previously appeared far off into the distance. Now, you can quickly see that Lombard Street is much closer and the pin is aligned with where the street begins at the bottom of the hill.



About the Author

Abner Li

Editor-in-chief. Interested in the minutiae of Google and Alphabet. Tips/talk: