Meta has announced a new software update for its Ray-Ban smart glasses that will add “Live AI,” a feature that can use a video feed to gather context for questions, similar to Google’s Project Astra.
A new update rolling out to Ray-Ban Meta smart glasses, v11, brings a few new options.
That includes Shazam integration, which will allow users to ask the glasses “Hey Meta, what is this song?” and have the result read aloud. The feature will be available in the US and Canada.
Beyond that, Meta is also introducing new AI features, and they look enticing. The first of these is “Live AI,” which allows Ray-Ban Meta glasses to capture video that the AI then uses to offer “real-time, hands-free help” with the things you’re actively doing.
Meta says that, eventually, this data will be used to offer suggestions before you even have to ask.
Meta describes the feature as follows: “The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying ‘Hey Meta,’ reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask.”
“Live translation,” meanwhile, will translate speech in real time, with the other person’s speech output in English through the glasses (and also transcribed on your phone). It works with Spanish, French, and Italian.
Meta will only be rolling these features out through a waitlist, and only in the US and Canada for now.
Google is working on something just like this.
At Google I/O 2024 in May, the company showed off “Project Astra,” a new AI project that can use a video feed to gather context and then answer questions based on what it sees. Google teased the functionality on glasses, but has yet to roll anything out. With the announcement of Gemini 2.0 earlier this month, Google detailed updates that will let Astra converse in multiple languages, store up to 10 minutes of memory, improve latency, and more. It’s unclear how Meta’s “Live AI” will compare, but it’s certainly exciting to see this functionality arriving so soon, especially as Google’s version won’t be fully realized until sometime next year.
More on Smart Glasses:
- One day… realized: Hands-on with Google’s Android XR glasses
- Meta shows off ‘Orion’ AR glasses
- Google announces Android XR, launching 2025 on Samsung headset