Meta Connect 2025 is barely two days away at the time of writing, but Snap’s latest AR glasses update is threatening to make me forget all about Meta before it even has a chance to showcase its new tech.
The current top model of Snap’s Spectacles AR glasses is one of the most technically impressive gadgets I’ve been able to demo. Unfortunately, they’re intended as a developer device, meaning they aren’t yet available to regular folks – nor are they the most stylish or cheapest tech out there.
Thankfully Snap promises “to introduce lightweight immersive Specs to the public next year,” and ahead of that launch I got to demo the upgraded operating system – complete with some new apps and tools – that will bring its AR glasses to life.
Boy was I impressed.
My Snap OS 2.0 demo began with the spatial tips app – a feature powered by AI. It starts out much like the Look and Ask tool on Meta’s smart glasses, where you ask the glasses to tell you about the things they can see.
Thanks to its displays, Snap’s AR glasses could in fact label everything of note in my vision with surprising accuracy – including a custom ‘modular couch’ in the shape of the Snap logo.
It took things a step further by also providing me with tips on how to use objects when asked – for example, I asked for a trick I could try using a yellow skateboard the office had as decoration (unfortunately I wasn’t able to follow the instructions, as the demo organiser was, perhaps rightly, worried I might injure myself).
We then moved on to the Specs’ AR translation tools.
The first demo was cool, but a tad on the basic side. I could drag my hand over a menu written in Mandarin to highlight it, and then have the glasses provide me with a translation of what it said.
I call this basic because, while the app was very easy to use, the translation appeared in a separate window rather than next to the menu or overlaid on it, like with, say, the Google translation tools on my phone. In the real world this setup would make it difficult to know which translation corresponded to which item on the menu – though I appreciate the app is still a work in progress, so I hope an update will solve my one gripe.
This stood in contrast to the live conversation translations, which were a dream.
As my host spoke in Mandarin, I saw a line of English text automatically appear below their head like real-life subtitles.
What was especially neat is that the app supports more than two users at once, and over 40 languages. So you could have a small group of people all talking in their respective native languages, and everyone else (provided they’re using Snap’s AR glasses) would see translated subtitles in the language they understand.
Seeing the demo in action was like seeing into the future – a feeling Snap’s AR glasses gave me the last time I demoed them.
Lastly, I got to experience some new games like Synth Riders – an AR rhythm game ported from the VR version – and also browse the internet and watch videos on Snap’s Spotlight platform.
Nothing here was especially mind-blowing, but these are the sorts of day-to-day tools that will help make consumer AR specs feel like a generally useful gadget, rather than a hyper-specialized device.
I could imagine myself digitally flicking through the news on my AR glasses’ browser while eating my breakfast each morning.
The AR future is coming
This demo reaffirmed my belief that AR glasses will likely be the next big thing – maybe even supplanting smartphones as our go-to gadget in the next decade or sooner.
Sure, the hardware is a little goofy looking right now, but the utility Snap’s Spectacles provided me – and also the inventive social interactions they facilitate – more than made up for how silly they might make me look.
And as the tech improves, they should only get slimmer and more normal looking.
Snap’s rivals should be on notice – especially Meta, with Connect fast approaching – because the AR revolution is coming, and at least one of the players aiming for the top spot is already bringing its A game.