Meta has unveiled its Aria Gen 2 experimental smart glasses, a research-only device packed with advanced sensors to drive breakthroughs in AI, machine perception, and robotics. This initiative is key to developing foundational technologies for future augmented and mixed reality, potentially speeding up the arrival of the next computing platform.
Marking a significant evolution from its 2020 predecessor, the Aria Gen 2 emphasizes improved wearability with a lighter design, eight size variations for a better fit, and newly added folding arms for easier portability, according to Meta’s announcement. Project Aria’s core mission, as outlined by Meta, is to enable researchers worldwide to advance the state of the art in machine perception, contextual AI, and robotics by providing access to this cutting-edge hardware alongside open-source datasets, models, and tooling. Meta believes this combination will drive the innovations that define the next computing platform.
Researchers utilizing Aria Gen 2 gain access to a formidable suite of data-capturing tools. The device integrates four computer vision cameras, twice its predecessor’s count, each equipped with a global shutter sensor offering a high dynamic range of 120 decibels (dB), up from 70 dB in Gen 1, for superior performance in varied lighting conditions. This configuration also facilitates advanced 3D hand and object tracking.
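For a sense of scale, image-sensor dynamic range in dB is conventionally converted to a linear brightest-to-darkest ratio via 20·log10. The short sketch below applies that convention to the quoted figures; the convention itself is the assumption here, not a ratio Meta has published.

```python
def dynamic_range_ratio(db: float) -> float:
    """Convert an image-sensor dynamic range in decibels to a linear
    brightest-to-darkest signal ratio, using the 20 * log10 convention."""
    return 10 ** (db / 20)

print(f"Gen 1 (70 dB):  {dynamic_range_ratio(70):>12,.0f} : 1")   # ~3,162 : 1
print(f"Gen 2 (120 dB): {dynamic_range_ratio(120):>12,.0f} : 1")  # 1,000,000 : 1
```

In other words, the quoted jump from 70 dB to 120 dB corresponds to roughly a 300-fold increase in the brightness range a single exposure can capture.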
Meta also highlighted a substantial boost in stereo camera overlap to 80 degrees, a significant increase from Gen 1’s 35 degrees. This enhancement improves depth perception and spatial awareness, vital for crafting immersive mixed reality environments.
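Wider stereo overlap enlarges the portion of the scene seen by both cameras, which is the only region where depth can be triangulated at all. The snippet below is a minimal sketch of the standard rectified-stereo relation Z = f·B/d; the focal length, baseline, and disparity values are purely illustrative and are not Aria Gen 2’s actual calibration.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic rectified-stereo relation: Z = f * B / d.
    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two cameras in metres
    disparity_px -- horizontal pixel offset of the same point in both views
    """
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only; Aria Gen 2's calibration is not public here.
print(depth_from_disparity(focal_px=600, baseline_m=0.12, disparity_px=24))  # -> 3.0 m
```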
Advanced Sensing And On-Device Intelligence
Beyond its visual capabilities, Aria Gen 2 incorporates several novel sensors. These include a calibrated Ambient Light Sensor (ALS) with an ultraviolet mode to differentiate between indoor and outdoor settings, and a contact microphone embedded in the nosepad for clearer audio in noisy conditions. A photoplethysmography (PPG) sensor, which measures changes in blood volume to estimate heart rate, also sits in the nosepad.
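Meta has not described how it processes the PPG signal, but heart-rate estimation from such a waveform is commonly done by counting pulse peaks. The sketch below illustrates that generic approach on hypothetical synthetic data; the sampling rate, prominence threshold, and signal are all assumptions for the example.

```python
import numpy as np
from scipy.signal import find_peaks

def estimate_heart_rate(ppg: np.ndarray, fs: float) -> float:
    """Estimate heart rate (beats per minute) from a PPG waveform by
    counting pulse peaks -- one prominent peak per cardiac cycle."""
    peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)
    beat_intervals = np.diff(peaks) / fs          # seconds between beats
    return 60.0 / beat_intervals.mean()

# Hypothetical 10 s recording at 100 Hz with a 1.2 Hz (72 bpm) pulse component.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(round(estimate_heart_rate(ppg, fs)))        # ~72
```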
Meta further explains that the glasses use SubGHz radio technology for precise, sub-millisecond time alignment between multiple Aria Gen 2 devices, a crucial feature for complex, synchronized research projects.
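Meta does not detail the synchronization protocol, but a common way to align clocks across devices is a two-way timestamp exchange in the style of NTP. The sketch below shows that generic offset calculation with hypothetical timestamps; it is not a description of Aria Gen 2’s actual SubGHz scheme.

```python
def clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Classic two-way time-transfer estimate (as in NTP):
    t1 -- request sent, sender's clock
    t2 -- request received, receiver's clock
    t3 -- reply sent, receiver's clock
    t4 -- reply received, sender's clock
    Returns the receiver clock's offset relative to the sender,
    assuming symmetric propagation delay.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

# Hypothetical timestamps (seconds): receiver runs 0.3 ms ahead, 0.1 ms one-way delay.
print(clock_offset(t1=10.0000, t2=10.0004, t3=10.0005, t4=10.0003))  # -> 0.0003
```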
Powered by Meta’s custom energy-efficient coprocessor, Aria Gen 2 executes advanced machine perception tasks directly on the device. This encompasses Visual Inertial Odometry (VIO), a process that uses camera and motion sensor data to track the device’s position and orientation, also known as six-degrees-of-freedom (6DOF) tracking.
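Meta’s on-device VIO implementation is not public, but the idea of 6DOF tracking can be illustrated with a minimal pose structure and a single inertial dead-reckoning step; in a full VIO pipeline, camera observations continually correct the drift this step accumulates. Everything below (class names, gravity handling, the first-order rotation update) is a simplified assumption for illustration, not Meta’s algorithm.

```python
import numpy as np

class Pose6DOF:
    """Position (3 translation DOF) plus orientation (3 rotation DOF)."""
    def __init__(self):
        self.position = np.zeros(3)        # metres, world frame
        self.velocity = np.zeros(3)        # m/s, world frame
        self.rotation = np.eye(3)          # body-to-world rotation matrix

def imu_predict(pose: Pose6DOF, gyro, accel, dt: float) -> None:
    """Dead-reckon one IMU step (the inertial half of VIO).
    A real system would fuse camera observations to bound the drift."""
    gravity = np.array([0.0, 0.0, -9.81])
    # Small-angle rotation update from gyroscope rates (rad/s).
    wx, wy, wz = np.asarray(gyro) * dt
    skew = np.array([[0, -wz, wy], [wz, 0, -wx], [-wy, wx, 0]])
    pose.rotation = pose.rotation @ (np.eye(3) + skew)
    # Integrate accelerometer (body-frame specific force) into velocity and position.
    world_accel = pose.rotation @ np.asarray(accel) + gravity
    pose.position += pose.velocity * dt + 0.5 * world_accel * dt ** 2
    pose.velocity += world_accel * dt

# A level, stationary device: the measured reaction to gravity cancels out
# and the pose stays put, aside from the small commanded yaw rate.
p = Pose6DOF()
imu_predict(p, gyro=[0.0, 0.0, 0.1], accel=[0.0, 0.0, 9.81], dt=0.001)
```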
An advanced camera-based eye-tracking system captures detailed gaze information, including per-eye gaze direction, vergence point, blink detection, and pupil center and diameter. Furthermore, a sophisticated hand-tracking solution generates articulated 3D hand-joint poses. Meta suggests these advanced signals provide a deeper understanding of a wearer’s visual attention and intentions, thereby “unlocking new possibilities for human-computer interaction.”
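Meta has not published its gaze-fusion method, but one standard way to turn two per-eye gaze rays into a vergence point is to take the midpoint of the shortest segment between the (generally skew) rays. The sketch below implements that textbook construction with hypothetical eye positions and gaze directions.

```python
import numpy as np

def vergence_point(origin_l, dir_l, origin_r, dir_r):
    """Estimate a 3D vergence point as the midpoint of the shortest segment
    between the two per-eye gaze rays."""
    o_l, o_r = np.asarray(origin_l, float), np.asarray(origin_r, float)
    d_l = np.asarray(dir_l, float); d_l /= np.linalg.norm(d_l)
    d_r = np.asarray(dir_r, float); d_r /= np.linalg.norm(d_r)
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b                      # ~0 when the rays are parallel
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    return (o_l + t_l * d_l + o_r + t_r * d_r) / 2.0

# Eyes 6.4 cm apart, both looking at a point one metre straight ahead.
print(vergence_point([-0.032, 0, 0], [0.032, 0, 1],
                     [0.032, 0, 0], [-0.032, 0, 1]))   # ~ [0. 0. 1.]
```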
Meta intends to open applications for researchers to engage with Aria Gen 2 later this year. The company will present the device at the CVPR 2025 conference in Nashville in June. While Aria Gen 2 remains a research tool, its technological strides will undoubtedly influence Meta’s future consumer products. According to Project Aria, applications for the existing Aria Research Kit with Gen 1 glasses are still being accepted.
Meta’s Evolving XR Strategy And Market Context
The Aria Gen 2 development unfolds within a dynamic and challenging XR market for Meta. The company has found consumer traction with its Ray-Ban Meta smart glasses, which surpassed one million units sold in 2024. Meta is also developing more advanced consumer models, such as the codenamed ‘Hypernova’ glasses, anticipated to include a display and gesture controls. However, the ambitious Orion AR glasses project has been delayed due to production costs.
Recent industry observations suggest a strategic realignment in Meta’s consumer headset plans. Meta is now prioritizing an ultralight Horizon OS headset, codenamed ‘Puffin,’ for a potential 2026 release, possibly deferring or canceling traditional Quest 4 designs.
This Puffin device, confirmed by Meta CTO Andrew Bosworth to The Verge, is conceptualized as exceptionally lightweight. It may utilize a tethered compute puck and is geared towards productivity and virtual screen applications. This strategic shift occurs against a backdrop of financial headwinds in Meta’s Reality Labs division, which has faced substantial operating losses and has undergone restructuring, including layoffs. Despite these cuts, Meta has been actively hiring AI engineers.
Ethical Considerations And The Competitive Arena
The sophisticated sensing technologies embedded in devices like Aria Gen 2 inevitably spotlight ongoing discussions about privacy and ethics. Concerns regarding facial recognition in smart eyewear were notably highlighted by projects like the ‘I-XRAY’ system demonstration. Meta is reportedly re-evaluating such features for upcoming consumer products.
Andrew Bosworth commented on this delicate balance to CNBC, stating, “If people don’t want this technology, we don’t have to supply it. The product is going to be fine either way. There are some nice use cases out there, if it’s something people are comfortable with.”
The competitive field for AI-enhanced eyewear is also intensifying. Google is partnering with Warby Parker on AI glasses, though Warby Parker co-CEO Dave Gilboa told PYMNTS.com that these products will not arrive in 2025, even as he described them as potentially “mind-blowing.” Other significant entities, including Apple with its ‘Atlas’ research initiative, and Baidu, are actively exploring the AI eyewear domain.
Meanwhile, Meta continues to refine its current Ray-Ban smart glasses through AI-powered software updates, such as live translation and enhanced visual assistance. These features are managed via the recently launched Meta AI app, which is built on the Llama 4 AI models. Those models have themselves faced scrutiny over their training data, including lawsuits alleging copyright infringement.