- Meta's new Aria Gen 2 smart glasses are packed with sensors and AI features
- The smart glasses can track your gaze, movement, and even your heartbeat to gauge what's happening around you and how you feel about it
- For now, the smart glasses are research tools, helping scientists train robots and build better AI systems that could eventually be integrated into consumer smart glasses
Ray-Ban Meta Smart Glasses are still relatively new, but Meta is already pushing ahead with its new Aria Gen 2 Smart Glasses. Unlike the Ray-Bans, these smart glasses are research-only for now, but they're packed with enough sensors, cameras, and processing power that it seems inevitable some of what Meta learns from them will be incorporated into future consumer eyewear.
Project Aria's research-grade tools, including the new smart glasses, are used by people working on computer vision, robotics, or whatever hybrid of contextual AI and neuroscience catches Meta's attention. The idea is for researchers to wear the glasses as they devise more effective methods of teaching machines to navigate, contextualize, and interact with the world.
The first Aria smart glasses came out in 2020, and Aria Gen 2 is far more advanced in both hardware and software. The new glasses are lighter, more accurate, and more powerful, and they look much more like the glasses people wear in everyday life, though they still wouldn't be mistaken for a standard pair.
The four computer vision cameras can see an 80° arc around you and measure depth and relative distance, so the glasses can tell both how far your coffee mug is from your keyboard and where a drone's landing gear might come down. That's just the start of the onboard sensory equipment, which includes an ambient light sensor with an ultraviolet mode, a contact microphone that can pick up your voice even in noisy environments, and a pulse detector embedded in the nose pad that can estimate your heart rate.
Future facewear
There's also plenty of eye-tracking technology, capable of telling where you're looking, when you blink, how your pupils change, and what you're focusing on. The glasses can even track your hands, measuring joint movement in a way that can help train robots or teach them motions. Combined, the glasses can work out what you're looking at, how you're holding an object, and whether what you see raises your heart rate because of an emotional reaction. If you're holding an egg and spot your sworn enemy, the AI might deduce that you want to throw the egg at them and help you aim it precisely.
As mentioned, these are research tools. They're not for sale to consumers, and Meta hasn't said whether they ever will be. Researchers have to apply for access, and the company is expected to start taking applications later this year.
But the implications are far bigger. Meta's plans for smart glasses go well beyond checking messages. The glasses will connect human interactions with the real world to machines and teach them to do the same. In theory, robots trained this way could see, hear, and interpret the world around them the way humans do.
It won't happen tomorrow, but the Aria Gen 2 smart glasses prove it's much closer than you might think. And it's probably only a matter of time before some version of Aria Gen 2 ends up for sale to the average person. Imagine a powerful AI brain sitting on your face that remembers where you left your keys and sends a robot to pick them up for you.