Meta Ray-Ban Display Hands-On: The Smart Glasses You Were Waiting For


There’s one thing people want to know when they see my first-gen Ray-Ban smart glasses, and it’s got nothing to do with AI, or cameras, or the surprisingly great open-ear audio they put out. They want to know what’s probably front-of-mind right now as you’re reading this: Do they have a screen in them? The answer? Sadly, no… until now.

At Meta Connect 2025, Meta finally unveiled its Ray-Ban Display smart glasses that, as you may have gathered from the name, have a screen in them. It doesn’t sound like much on the surface—we have screens everywhere, all the time. Too many of them, in fact. But after using them ahead of the unveiling, I regret to inform you that you will most likely want another screen in your life, whether you know it or not. But first, you probably want to know exactly what’s going on in this screen I speak of.

The answer? Apps, of course. The display, which is actually full-color and not monochrome like previous reporting suggested, acts as a heads-up display (HUD) for things like notifications, navigation, and even pictures and videos. For the full specs of that display, you can read the news companion to my hands-on here. For now, though, I want to focus on what that screen feels like. The answer? A little jarring at first.

Ray-Ban Glasses Top Down © James Pero / Gizmodo

While the Ray-Ban Display, which weigh 69g (about 10 grams more than the screenless first-gen glasses), do their best not to shove a screen in front of your face, it’s still genuinely there, hovering like a real-life Clippy, waiting to distract you with a notification at a moment’s notice. And, no matter how you feel about smart glasses with a screen, that’s a good thing, since the display is the whole reason you might spend $800 to own a pair. Once your eyes adjust to the screen (it took me a minute or so), you can get cracking on doing stuff. That’s where the Meta Neural Band comes in.

The Neural Band is Meta’s sEMG wristband, a piece of tech it’s been showing off for years that has now been shrunk down to the size of a Whoop fitness band. It reads the electrical signals in your hand to register pinches, swipes, taps, and wrist turns as inputs for the glasses. I was worried at first that the wristband might feel clunky or too conspicuous on my body, but I can tell you that’s not the case—this is about as lightweight as it gets. The smart glasses also felt light and comfortable on my face despite being noticeably thicker than the first-gen Ray-Bans.

Meta Ray-Ban Display © James Pero / Gizmodo

More important than being lightweight and subtle, it’s very responsive. Once the Neural Band was snug on my wrist (it was a little loose at first, but better after I adjusted it), using it to navigate the UI was fairly intuitive. An index-finger-and-thumb pinch is the equivalent of “select,” a middle-finger-and-thumb pinch is “back,” and for scrolling, you make a fist and then run your thumb over the top of it like a mouse made of flesh and bone. It’s a bit of Vision Pro and a bit of Quest 3, but with no hand tracking needed. I won’t lie to you: it feels like a bit of magic when it works fluidly.

Personally, I still saw some variability in inputs—you may have to try a gesture once or twice before it registers—but I’d say it works well most of the time (at least much better than you’d expect for a literal first-of-its-kind device). I suspect the experience will only get more fluid over time, and even better once you really train yourself to navigate the UI properly. Not to mention the applications for the future: Meta is already planning a handwriting feature, though it won’t be available at launch. I got a firsthand look… kind of. I wasn’t able to try the handwriting myself, but I watched a Meta rep use it, and it seemed to work, though I have no way of knowing how well until I use it for myself.

Meta Ray-Ban Display © James Pero / Gizmodo

But enough about controls; let’s get to what you’re actually doing with them. I got to briefly experience pretty much everything the Meta Ray-Ban Display have to offer, and that includes the gamut of phone-adjacent features. One of my favorites is taking pictures in a POV mode, which superimposes a window on the glasses’ display showing exactly what you’re shooting, right in the lens—finally, no guess-and-check when you’re snapping pics. Another “wow” moment here is the ability to pinch your fingers and twist your wrist (like you’re turning a dial) to zoom in. It’s a subtle thing, but you feel like a wizard when you can control a camera just by waving your hands around.

Another standout feature is navigation, which superimposes a map on the glasses’ display to show you where you’re going. Obviously, I was limited in testing how that feature works since I couldn’t wander off with the glasses during my demo, but the map was quite sharp and bright enough to be used outdoors (I did test this stuff in sunlight, and the 5,000-nit brightness was sufficient). Meta is leaving it up to you whether you use navigation while you’re in a vehicle or on a bike, but it will warn you of the dangers of looking at a screen if it detects that you’re moving quickly. It’s hard to say how distracting a HUD would be while biking, and it’s something I plan to eventually test in full.

Meta Neural Band © James Pero / Gizmodo

Another interesting feature you might actually use is video calling, which pulls up a video of the person you’re calling in the bottom-right corner. The interesting part is that the call is POV for the person on the other end, so they can see what you’re looking at. It’s not something I’d use in many situations, since the person you’re calling usually wants to see you and not just what you’re looking at, but I can confirm that it works at least.

Speaking of just working, there’s also a live transcription feature that can listen in on your environment and superimpose what the other person is saying onto the glasses’ display. I had two thoughts while using this feature: the first is that it could be a game-changer for accessibility. If your hearing is impaired, being able to actually see a live transcript could be hugely helpful. The second is that such a feature could be great for translation, which Meta has already thought of. I didn’t get a chance to use the smart glasses for translating another language, but the potential is there.

One problem I foresee here, though, is that the smart glasses may pick up other conversations happening nearby. Meta thought of this too and said that the microphones in the Ray-Ban Display actually beamform to focus just on who you’re looking at, and I did get a chance to test that out. While one Meta rep spoke to me in the room, others had their own conversations at a fairly normal volume. The results? Kind of mixed. While the transcription focused mostly on the person I was looking at, it still picked up stray words here and there. This feels like a bit of an inevitability in loud scenarios, but who knows? Maybe beamforming and AI can fill in the gaps.

Meta Ray-Ban Display © Meta

If you’re looking for a killer feature of Meta’s Ray-Ban Display smart glasses, I’m not sure there necessarily is one, but one thing I do know is that pairing the glasses with the Neural Band should be nothing short of a game-changer. Navigating the UI has been a constant issue in the smart glasses space, and until now, I hadn’t seen what I thought was a killer solution. But based on my early demos, I’d say Meta’s “brain-reading” wristband could be the breakthrough we were waiting for—at least until hand or eye tracking at this scale becomes possible.

I’ll know more about how everything works when I get a chance to use the Meta Ray-Ban Display on my own, but for now I’d say Meta is still clearly the frontrunner in the smart glasses race, and its head start just got a lot bigger.
