
Meta’s Ray-Ban smart glasses adding ‘Live AI’ that works like Google’s Project Astra

Meta has announced a new software update for its Ray-Ban smart glasses that will add “Live AI,” a feature that uses a video feed to gather context for questions, similar to Google’s Project Astra.

A new update, v11, is rolling out to Ray-Ban Meta smart glasses, bringing a few new options.

That includes Shazam integration, which lets users ask the glasses “Hey Meta, what is this song?” and have the result read aloud. This feature will be available in the US and Canada.

Beyond that, Meta is also introducing new AI features, and they look enticing. The first of these new features is “Live AI,” which allows Ray-Ban Meta glasses to capture video which is then used by the AI to offer “real-time, hands-free help” on the things you’re actively doing.

Meta says that, eventually, this data will be used to offer suggestions before you even have to ask.

The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying “Hey Meta,” reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask.

“Live translation,” meanwhile, will be able to translate speech in real-time, with the other person’s speech being output in English through the glasses (and also transcribed on your phone). This works for Spanish, French, and Italian.

Meta will only be rolling these features out through a waitlist, and only in the US and Canada for now.

Google is working on something just like this.

At Google I/O 2024 in May, the company showed off “Project Astra,” a new AI project that can use a video feed to gather context and then answer questions based on what it sees. Google teased the functionality on glasses, but has yet to roll anything out. With the announcement of Gemini 2.0 earlier this month, Google detailed updates to Astra that will let it converse in multiple languages, store up to 10 minutes of memory, improve latency, and more. It’s unclear how Meta’s “Live AI” will compare, but it’s certainly exciting to see this functionality arrive so soon, especially since Google’s version won’t be fully realized until sometime next year.






