Meta has revealed a new update for its smart glasses – the Ray-Ban and Oakley models – that allows you to track what you’re eating using the camera on your face.
Alongside last week’s announcement of new prescription-focused smart glasses styles, Meta also announced several new features coming to its smart glasses. These include widgets and other new capabilities on Meta’s display-equipped glasses, as well as new options for the audio-only models that have been popular on the market.
Meta says that its AI glasses will now be able to “extract key nutrition details” using a “simple voice prompt or quick photo.” A food log will be added to the Meta AI app, with Meta saying that this data will be used to personalize responses around food recommendations such as “What should I eat to increase my energy?” and other queries.
The company further teases that, in the future, Meta AI on smart glasses will be able to “automatically log your food,” presumably using the camera to do so. That would come with “ongoing software updates,” but the starting point with a food log and image recognition is set to arrive in the US “soon” for users over the age of 18. As Gizmodo points out, there are certainly privacy concerns with Meta’s plans of future automatic recognition, so an age limit is probably for the best.
Meta explains:
First up, we’re making nutrition tracking on our AI glasses easier. With a simple voice prompt or quick photo, you can log what you eat hands-free, and Meta AI will extract key nutrition details and add them to your food log in the Meta AI app. Over time, your food log powers increasingly personalized insights that get more useful, helping you make healthier, more informed choices. When you want guidance in the moment, you can ask Meta AI questions about what to eat next — like “What should I eat to increase my energy?” — with answers that take your food log and goals into account. This will be available to users aged 18+ in the U.S. on Ray-Ban Meta and Oakley Meta glasses soon, and Meta Ray-Ban Display glasses later this summer.
As we shared this past Connect, with ongoing software updates, Meta AI on glasses will transition from something you have to prompt with a question each time, to a more continuous, in-the-moment assistant that can help throughout the day. For example, in the future, we’ll get to a point where your AI glasses can understand what you’re eating and automatically log your food. So you can get even richer, more personalized nutrition insights without having to remember to log every meal.
As noted, Meta’s Ray-Ban Display glasses won’t get the feature until “later this summer.”