At Meta Connect 2025 I got to try the Meta Ray-Ban Display glasses and the technology honestly blew my mind. It was one of those moments where a piece of hardware stops feeling like a concept and starts feeling real.
While I was wearing them, what really clicked for me was not just the glasses themselves but what they point to next. Having a display on your face that sits over your real world already starts to show how this kind of technology could enhance everyday life. It is not hard to imagine checking information, getting guided prompts, or having digital content appear when you need it, all without fully pulling yourself out of your surroundings.
That is what made me start thinking about AR glasses properly.
Quest MR has already shown the behaviour
A big reason this felt so real to me is that Quest MR has already demonstrated a lot of the behaviour that makes this useful.
A good example is the new Meta Quest virtual keyboard. We can already start integrating 2D apps into the headset and placing them into real world space. That is a small thing on its own, but it matters because it shows how digital interfaces can live around you instead of just on a flat monitor or phone screen.
That is where MR starts to matter. It is not just about novelty. It is about digital content existing in your space in a way that still lets you stay aware of the world around you.

What I mean by true AR glasses
When I say true AR glasses, I do not just mean display glasses.
Right now display glasses can show content, but they do not really understand your surroundings in a spatial way. True AR glasses would be different. Digital objects would exist in your real world space. They could sit on a wall, stay locked in place, and be occluded correctly when something in the real world passes in front of them.
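That occlusion behaviour has a fairly standard conceptual shape: compare the depth of the virtual object against the depth of the real scene at each pixel, and only draw the virtual content where nothing real is closer to the viewer. Here is a minimal Python sketch of that idea. This is purely illustrative, not how Meta's hardware actually implements it, and all the names and depth values below are hypothetical.

```python
# Conceptual sketch of per-pixel occlusion for a world-locked virtual object.
# Hypothetical inputs: the depth of the real scene at a pixel (e.g. from the
# headset's sensors) and the depth of the virtual object at that same pixel.

def composite_pixel(real_depth, virtual_depth, virtual_color, passthrough_color):
    """Show the virtual pixel only if nothing real is closer to the viewer."""
    if virtual_depth is not None and virtual_depth < real_depth:
        return virtual_color       # virtual object is in front: draw it
    return passthrough_color       # real world occludes it: keep the camera view

# A virtual panel 2.0 m away, with a real hand at 0.5 m covering part of it:
pixels = [
    (3.0, 2.0),   # wall at 3 m, panel at 2 m  -> panel visible
    (0.5, 2.0),   # hand at 0.5 m, panel at 2 m -> hand occludes the panel
    (3.0, None),  # no virtual content here     -> passthrough only
]
for real_d, virt_d in pixels:
    print(composite_pixel(real_d, virt_d, "panel", "passthrough"))
```

Running this prints `panel`, `passthrough`, `passthrough`: the panel shows through where it is the nearest thing, and the real hand correctly cuts it off. That per-pixel depth comparison is what turns a floating overlay into something that feels anchored in your space.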
That is the part that changes it from a wearable display into actual augmented reality.
Why displays in glasses matter so much
For me, display glasses are a major step because they make this kind of technology feel more accessible and more wearable.
They also represent a big jump in what I think of as the Wearable Acceptability Range. Headsets can do a lot, but glasses are something people are far more likely to accept in everyday life if the experience is useful enough. That is why getting displays into glasses matters so much. Even before full AR is here, it is already moving the form factor closer to something people would actually wear day to day.
Why this feels closer now
This feels a lot closer than it did a few years ago.
For a long time the closest version of this idea was either single-colour displays or screen mirroring from another device. That always felt like an early step, but still a limited one.
What stood out to me here is that this is running natively on the glasses. That changes the feeling of it completely. It starts to feel less like a companion gadget and more like the beginning of a real platform.
When you put that together with what Quest MR is already proving, it does not feel that hard to imagine where this is going next.
Final thoughts
Trying the Meta Ray-Ban Display glasses made one thing very clear to me: this is a real step in the right direction.
They are not full AR glasses yet, but they do feel like an important move toward that future. And after trying them for myself, I cannot wait to experience a pair of glasses that can deliver full AR capability in a way that feels natural and useful.
Thanks for reading. If you’re interested in collaborating on digital delivery, XR, or AI-enabled platforms, I’m always open to a conversation.
You can find me on LinkedIn, browse more writing on the blog, or explore my recent work.