Meta Inches Into Health Wearables with New Food Logging Feature for Ray-Ban Smart Glasses

Meta announced it’s pushing an update to Ray-Ban and Oakley Meta smart glasses that’s slated to make nutrition tracking easier by letting Meta AI visually suss out food before you eat it.

The News

Over time, the company says that a user’s food log will inform “increasingly personalized insights that get more useful, helping you make healthier, more informed choices.”

Meta says it will be something of a manual process, though, as users need to prompt Meta AI to log their food in addition to inputting specific nutrition goals.

Ray-Ban Meta (Gen 2) | Image courtesy Meta

While we’re not there yet, Meta says in the future glasses will be able to understand what you’re eating and automatically log your food, which in turn opens up even more personalized nutrition insights since you don’t have to remember to log every meal.

For now though, the company envisions users asking Meta AI questions like “What should I eat to increase my energy?” which will output a suggestion based on your food log and fitness goals.

Meta says the new feature will be available to users aged 18+ in the US “soon” across all Ray-Ban Meta and Oakley Meta smart glasses, with its Meta Ray-Ban Display glasses getting the update sometime later this summer.


My Take

Meta doesn’t do health tracking; its smart glasses don’t track your heart rate, steps, activity, sleep (of course not), calories burned, blood oxygen levels—nothing.

Granted, they can link with Garmin smart watches, which do track those things, although the glasses themselves essentially act as an audio relay, repeating info sensed and stored by the Garmin app—meaning Meta can’t do anything truly useful with the bulk of your health data. Notably, Meta smart glasses don’t tie into Samsung Health or Apple Health either, putting the majority of users’ health data out of Meta’s reach.

Meta Ray-Ban Display Glasses & Neural Band | Image courtesy Meta

But it probably won’t always be that way. Meta seems to be leveraging what it can feasibly (and cheaply) do right now without having to cut any expensive licensing deals with dominant players in the smart watch segment.

The company does have a vector to get all of that data one day, though. Meta Ray-Ban Display comes with a wrist-worn Neural Band controller that uses surface electromyography (sEMG), which lets users quietly write out messages and manipulate UI. I can imagine a near future where Neural Band packs a suite of sensors similar to a smart watch’s, albeit without the display.

Provided Meta goes that specific route, the company wouldn’t need to integrate with existing health ecosystems at all for its future smart glasses. It will already have everything it needs to close the loop on what you’re eating and how you’re burning it off.
