Google’s first stab at smart glasses since Google Glass will be here sooner rather than later, and we may have just gotten our first good look at what the glasses can do.
The unexpected preview comes courtesy of a user on Reddit who says they discovered an early version of Google’s smart glasses companion app hidden inside a preview build of Android Studio. While the companion app doesn’t reveal everything, it does point to a few features and choices that make Google’s smart glasses stand out.
According to Android Authority, which pored over strings of code in the app, there are references to features like "conversation detection," which sounds novel compared to the smart glasses I've used over the past year. Basically, the idea is that the glasses can use the on-device microphone to detect when you're speaking and silence notifications so they don't distract you from your conversation. That's not game-changing, but it's thoughtful, and it could be nice to have for anyone trying to minimize distractions.
Google's first smart glasses since Google Glass are slated to arrive this year. © Raymond Wong / Gizmodo
As for privacy, strings in the code also indicate that Google is taking measures to mitigate the security risks inherent in a device that hears what you're saying all the time. As Android Authority notes, one string reads, "To protect your privacy, all conversation detection processing happens on your glasses. No raw audio, images, or conversation data is shared with Google or other services." To be honest, that's better than Meta, which does use data to train its AI, in particular images gathered when its smart glasses are used for computer vision. We'll have a clearer picture of just how privacy-sensitive Google's glasses are when they're released, though.
Speaking of distractions, it looks like Google's smart glasses might also let you head off a major one by turning the display off when it's not in use and running the glasses in an audio-only state. That capability is evidenced by a string of code found by Android Authority titled "displayless mode." Again, that's not groundbreaking, but it's thoughtful, and as someone who's used both display and non-display glasses extensively over the past year, I could see it coming in handy.
What if you’re on a bike, for example, and you want to listen to audio, but you don’t want a notification flying across your vision and sending you flying into a car? Tiny feature tweaks like that could elevate smart glasses in a big way. (Turning off the screen could also extend battery life.)
Outside of those standouts, there are references to features already found in Ray-Ban Meta AI glasses and the Meta Ray-Ban Display, like the ability to record 3K video. And, just like with Meta's glasses, it looks like you won't be able to record video or take pictures when the privacy indicator light (an LED that lets people know you're recording) is obscured.
There’s not a ton to go off of yet, but if there’s one thing that’s clear, it’s that Google has learned a thing or two since the Google Glass days, at least in terms of protecting privacy and designing thoughtful features. Leave it to Meta to make Google look privacy-conscious, I guess?