The race to put augmented reality smart glasses on your face is heating up. Snap's Spectacles are being rebranded as “Specs” and will launch as lighter, more powerful glasses in 2026.
CEO Evan Spiegel announced the brand-new Specs on stage at the XR event AWE, promising smart glasses that are smaller, significantly lighter, and have “a ton more capability.”
The company didn't spell out a specific timeframe or price, but the 2026 launch plan puts Meta on notice as it prepares its much-anticipated Orion AR glasses for 2027. Snap's Specs will also likely face Samsung/Google Android XR-based glasses, which are expected sometime in 2026 as well.
As for what consumers can expect from Specs, Snap says they run the same Snap OS used in its fifth-generation Spectacles (and will probably still use a pair of Qualcomm Snapdragon XR chips). That means the existing interface and interaction metaphors, such as gesture-based controls, remain. But a significant number of new features and integrations, including AI, begin rolling out this year, long before Specs arrive.
Platform upgrades
Spiegel framed the updates by first noting that Snap started working on glasses “before Snapchat” was even a thing, and that the company's overall goal is “to make computers more human.” He added that, with advances in AI, computers think and function more like humans than ever before.
Snap's plan with these Snap OS updates is to bring AI platforms into the real world. It's bringing Gemini and OpenAI models into Snap OS, which means multimodal AI capabilities will soon be part of the fifth-generation Spectacles and, eventually, Specs. These tools could be used for things like text translation and currency conversion on the fly.
The updated platform also adds tools for Snap Lens builders that integrate with Spectacles' and Specs' waveguide-based AR displays.
For example, a new SNAP3D API will let developers use generative AI to create 3D objects in Lenses.
The updates will also include an AI depth module that can interpret 2D information to build 3D maps, which will help anchor virtual objects in the 3D world.
Companies that deploy Spectacles (and eventually Specs) may appreciate the new Fleet Management app, which will let developers manage and remotely monitor multiple devices at once, as well as the ability to set up guided navigation experiences, for example through a museum.
Later, Snap will add WebXR support to Snap OS for building AR and VR experiences in web browsers.
Making it interesting
Spiegel claimed that Snap, through Lenses in Snapchat, has the largest AR platform in the world: “People use our Lenses in our camera 8 billion times a day.”
That's a lot, but almost all of it happens through smartphones. Currently, only developers use the bulky Spectacles and their Lenses.
The consumer release of Specs could change that. When I tried the glasses last year, I was impressed with the experience and found them, while not as good as Meta's Orion glasses (the lack of eye tracking stood out to me), full of potential.
A lighter form factor that approaches or surpasses what I found with Orion, and what I've seen in some Samsung Android XR glasses, could vault Snap's Specs to the front of the AR glasses race, provided they don't cost $2,000.