Meta Adds Real-Time AI Video to Its Smart Glasses



Ray-Ban’s Meta smart glasses have rolled out several new upgrades, including real-time AI video capability, live language translation, and Shazam.

The new features are included in the v11 software update for Ray-Ban’s Meta smart glasses, which began rolling out on Monday.

The company first teased live AI and real-time translation during its annual Meta Connect conference in September.

Live AI adds video to Ray-Ban’s Meta smart glasses and lets wearers ask the device questions about what it can see in front of them.

Image: A man wearing glasses, a blue jacket, and a matching hat smiles while looking at a green aloe vera plant indoors. The upgraded Ray-Ban Meta glasses let the wearer ask questions about what the device can see in front of them.

Live AI allows the wearer to naturally converse with Meta’s AI assistant while it continuously views their surroundings. For example, if a wearer is gardening, they would theoretically be able to ask Meta’s AI what plant they’re looking at.

Alternatively, if the wearer is grocery shopping, they could ask Live AI for meal prep ideas based on the food they’re looking at.

Meta says users can run Live AI for roughly 30 minutes at a time on a full charge. The company adds that the feature will eventually offer useful suggestions to the wearer before they even ask.

Meanwhile, the live translation update means that Ray-Ban’s Meta smart glasses will be able to translate speech in real-time between English and either Spanish, French, or Italian.

“When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers or viewed as transcripts on your phone, and vice versa,” Meta writes in a press release.

The wearer must download language pairs in advance and specify their own language as well as the language spoken by their conversation partner.

Meta also added Shazam, an app that lets users identify songs, to the smart glasses. When a user hears a track, they can say, “Hey Meta, what is this song?” and Shazam will try to find the tune that’s playing.

The live AI and live translation features are currently limited to members of Meta’s Early Access Program in the U.S. and Canada. Meanwhile, Shazam support on Ray-Ban’s Meta smart glasses is available for all users in the U.S. and Canada.

In a news release, Meta warns that the new features, especially Live AI and live translation, might not always get things right.

“We’re continuing to learn what works best and improving the experience for everyone,” the company writes.

Image credits: All photos via Meta.
