
Meta’s smart glasses will watch what you eat to track nutrition data

Meta has revealed a new update for its Ray-Ban and Oakley smart glasses that allows you to track what you're eating using the camera on your face.

Alongside last week's announcement of new prescription-focused smart glasses styles, Meta also announced some new features coming to its smart glasses. These include new capabilities on Meta's display-equipped glasses, such as widgets, as well as some new options for the audio-only glasses that have been popular on the market.

Meta says that its AI glasses will now be able to “extract key nutrition details” using a “simple voice prompt or quick photo.” A food log will be added to the Meta AI app, with Meta saying that this data will be used to personalize responses around food recommendations such as “What should I eat to increase my energy?” and other queries.

The company further teases that, in the future, Meta AI on smart glasses will be able to "automatically log your food," presumably using the camera to do so. That would come with "ongoing software updates," but the starting point with a food log and image recognition is set to arrive in the US "soon" for users over the age of 18. As Gizmodo points out, there are certainly privacy concerns with Meta's plans for future automatic recognition, so an age limit is probably for the best.


Meta explains:

First up, we're making nutrition tracking on our AI glasses easier. With a simple voice prompt or quick photo, you can log what you eat hands-free, and Meta AI will extract key nutrition details and add them to your food log in the Meta AI app. Over time, your food log powers increasingly personalized insights that get more useful, helping you make healthier, more informed choices. When you want guidance in the moment, you can ask Meta AI questions about what to eat next — like "What should I eat to increase my energy?" — with answers that take your food log and goals into account. This will be available to users aged 18+ in the U.S. on Ray-Ban Meta and Oakley Meta glasses soon, and Meta Ray-Ban Display glasses later this summer.

As we shared this past Connect, with ongoing software updates, Meta AI on glasses will transition from something you have to prompt with a question each time, to a more continuous, in-the-moment assistant that can help throughout the day. For example, in the future, we’ll get to a point where your AI glasses can understand what you’re eating and automatically log your food. So you can get even richer, more personalized nutrition insights without having to remember to log every meal.

Meta’s Ray-Ban Display glasses won’t get support until “this summer.”






Author

Ben Schoon

Ben is a Senior Editor for 9to5Google.
