I Tried Google and Samsung's Android XR Headset and Glasses: Gemini AI Can See My Life Now


Android XR, coming in 2025, is arriving along with a Samsung mixed reality headset, and glasses after that. I tried a ton of demos, and the always-on AI that can see what I'm seeing is amazing, and full of unanswered questions.

Google has announced Android XR, Samsung has a mixed reality headset for Google, and there are glasses in the works too. This is a lot to digest, and I just got out of a demo of all of it, so let me talk you through it. Google has been promising a collaboration with Samsung and Qualcomm for a year now, talking about this future of mixed reality. We already have Apple in the landscape, we already have Meta in the landscape, and there are other players as well. Now Google is back. They had been out of VR and AR for the most part, and they're back with Android XR, a new platform that's going to work with phones and with other things, and it's Gemini-powered: Gemini AI is going to run throughout it, not only on mixed reality headsets but, Google says, on AI glasses, AR glasses and all sorts of other emerging wearables for your face.

What I got to try, because right now it's only for developers, is a mixed reality headset made by Samsung, and also a pair of notification glasses that Google calls Project Astra. These have been in the works for a little while, and they're going to start being field tested soon. Now, bear with me, because I'm going to be talking a lot and we don't have any photos or video of my demo; they weren't allowed, which is pretty standard issue for early VR and AR in my experience.

First of all, Project Moohan, which is the name (for now) of the mixed reality headset that Samsung is going to release next year, looks a lot like a Vision Pro, or like the Meta Quest Pro, if you remember that headset. It's a visor-type headset that fits on your head, with lenses up front that can block out light if you want and a fitting that tightens in the back. It has a nice, crisp display, hand tracking and eye tracking. But that's not the exciting part; a lot of that feels very familiar.
But when I tried demos in this headset, bringing up 2D apps in a very Android-like interface, I was able to open familiar Google apps, navigate using my hands and bring up different panes. I could also turn on Gemini to be my companion throughout the whole journey. Now, you already have things like Siri working on Apple Vision Pro, and there are some voice AI features on the Meta Quest, but this is something that can also see what you're doing. Why that gets fascinating is that I could actually point to something and say, "Hey, what is that?" or "Tell me more about this," move my hand there, and Gemini would tell me. I was able to browse the web, ask it a question about something, or ask about where I live, and it pulled up a map.

Google also has immersive apps for this. At the moment, YouTube and Maps were the most notable; Apple doesn't even have an immersive Vision Pro maps app yet. Maps let me throw up a big 3D immersive environment and zoom into my home, but I was also able to point out things in the landscape and ask about them, and Gemini could tell me what they were. That combination got really interesting; I felt like I was starting to get lost in the apps as I navigated around. Gemini can not just recognize what you're doing, it can be your memory. When I got done with a whole bunch of demos, I asked Gemini to recap what I'd been doing, and it told me all the different things. It also has tricks I haven't seen before: I played a YouTube video, and Gemini improvised captions for it.

Google has a few other tricks up its sleeve with Android XR. One of them is auto-converting 2D photos to 3D, which is something Apple also does now with Vision Pro, but Google also does it for videos. I looked at a video clip on the headset that was already pre-converted, Google said, but I also watched a YouTube clip that had been converted.
I didn't get to see the live conversion process, but it was pretty wild to see that that capability is going to be there. So that's all the mixed reality stuff, which is going to be on this developer headset that's meant for people to start getting a feel for the platform and build a starting point for everything else.

But then there are the glasses. I put on a pair that looks like Meta's Ray-Bans: kind of chunky, but pretty normal, and wireless. There was one display on these glasses, a little area etched in with waveguides, and a microLED display projector that gave me a heads-up display. What was I able to do with that? A whole bunch of stuff. I could activate Gemini with a button on the side and turn it into a kind of always-on Gemini mode: identifying works of art on the wall of a mock living room set, answering questions about objects and about books. I was again able to indicate certain things and have it tell me about them. And translation: you could translate things and then also ask for the language back. What I found fascinating is that it also does live interactive translation. Somebody in the room came up and spoke Spanish to me, and it instantly brought up captions in English for what they were saying, and kept captioning in English as they continued speaking in Spanish.

All of this stuff assumes you're going to have an always-on Gemini assistant in your life. Now, with conversational responses, that could feel very intrusive, but on the glasses you're able to pause it by simply tapping on the side, basically suspending it. It was very different from something like Meta's Ray-Bans, which assume you won't be using the assistant until you ask for it. Here, it assumes you have it on until you want to pause it. And the same thing goes for VR and mixed reality.
Am I going to turn on Gemini and then suddenly have it run all the time? That could really impact the experience of how I use apps. It's fascinating for things like, say, playing a game where I want a tutorial or have questions about something. It kind of breaks the fourth wall of VR and AR, and it may change the way apps are developed.

But let's back up a step. This is all for developers right now, and Android XR is going to be unveiled in a fuller state next year, according to Google. Samsung is going to announce and go into more detail on what its potentially very expensive next mixed reality headset will be. Again, it's expected to be kind of like a Vision Pro, and a lot of people may not necessarily want to buy it, but it's going to be there. The other thing that's really interesting is where all the glasses go, and that could be the year after, starting with these assistive types of glasses. There are already other partners in the works, like Xreal; we just saw some glasses made by them. Google is going to have a lot of different partners making AI and AR glasses that work with phones.

Now the ball is going to be in Apple's court. When does Apple introduce generative AI to the Vision Pro? And when will Apple start allowing connections to the iPhone for Vision products? Because right now that doesn't happen, but I think it's really key for what comes next. It looks like this is going to make the idea of everyday glasses feel a lot more immediate, but we don't really know about battery life yet, so that's one wild card that could be pretty significant.

That's a lot to digest, and I'm still digesting it in my head. If you have questions, which I'm sure you do, leave them below, and I'll be following up as we learn more about all of this. But I was just glad to get the demo, and I'm asking a lot more questions about what AI is going to be in VR and AR than I ever have before. Thanks for watching.
