Meta is set to unveil Ray-Ban smart glasses with integrated displays in 2025, marking a significant evolution in the company’s wearable technology. According to sources familiar with Meta’s plans cited by the Financial Times, the displays would offer basic features such as notifications and responses from the company’s AI assistant.
This highlights Meta’s intent to enhance the interactivity of its smart glasses without yet entering full augmented reality (AR) territory.
AI Innovations Build the Foundation
Meta recently introduced advanced AI tools to its current line of Ray-Ban glasses, adding live visual assistance, real-time translation, and Shazam-powered music recognition. These features leverage the glasses’ built-in camera and speakers to provide hands-free functionality, from identifying objects to interpreting speech during conversations.
Unlike traditional voice assistants, Meta’s AI removes the need for a wake word, allowing fluid, contextual interactions. For example, a user can point the camera at products in a store and instantly receive recipe suggestions.
Real-time translation covers four languages—English, Spanish, French, and Italian—and works offline with pre-downloaded language packs, ensuring seamless usage during travel or in low-connectivity environments.
Privacy Concerns Intensify
Despite these advancements, Meta’s smart glasses have faced criticism over privacy risks. In October, Harvard students demonstrated how the glasses could be paired with public facial recognition tools, such as PimEyes, to identify individuals and retrieve their personal information in real time.
Their demonstration, dubbed “I-XRAY,” revealed that Meta’s indicator light, designed to alert others when the glasses are recording, is often overlooked in crowded or bright environments. Critics argue that relying on users to obtain consent before recording is insufficient and have called for stronger safeguards.
Privacy advocates have highlighted the need for regulatory measures as wearable AI technology becomes increasingly sophisticated.
Competitive Strategies in the Wearable Market
Meta’s aggressive approach to enhancing smart glasses functionality positions it as a leader in the sector, but competitors are making strides.
- Baidu Xiaodu AI Glasses: Launched in November 2024, these glasses integrate real-time object recognition and live translations at a price point of under $290, appealing to cost-conscious consumers.
[Image: Baidu’s Xiaodu AI Smart Glasses offer AI analysis of visible objects (Baidu)]
- Apple Atlas Project: Apple is adopting a cautious approach, focusing on extensive testing to refine its eventual smart glasses offering, which is still years away from release.
- Google and Samsung Collaboration: Google’s Android XR, designed for AR and mixed-reality devices, will debut on Samsung’s Project Moohan headset in 2025, showcasing multimodal AI capabilities.
- Solos AirGo Vision: Solos, a U.S.-based startup, prioritizes privacy with modular frames that let users detach the camera entirely.
Meta’s Incremental Approach to AR
Meta’s focus on gradual advancements reflects its strategy to bridge the gap between current wearables and future AR systems. The upcoming Ray-Ban glasses aim to provide practical enhancements while setting the stage for more sophisticated products, such as the Orion prototype.
The Orion glasses, still in development, feature advanced AR capabilities, including holographic displays and a wristband interface for gesture control. However, the high production cost has delayed their commercial release.
By integrating displays into its Ray-Ban glasses, Meta seeks to strike a balance between affordability, functionality, and the broader vision of AR wearables.