Meta has announced significant AI-driven enhancements for its Ray-Ban Meta smart glasses on Global Accessibility Awareness Day, aimed at users who are blind or have low vision. The company is rolling out more detailed environmental audio descriptions via Meta AI and expanding its “Call a Volunteer” service, a feature developed with Be My Eyes, to all 18 countries where Meta AI is active. These updates are designed to foster greater independence and provide crucial navigational support by leveraging the glasses’ onboard camera and AI to interpret and articulate the visual world.
For individuals using the glasses, this translates to the AI offering richer, more context-aware descriptions of their surroundings, moving beyond basic object identification to a more holistic understanding of a scene. The new features will launch in the US and Canada over the coming weeks, with a broader international rollout anticipated.
Enhanced Visual Understanding Through AI
A cornerstone of the accessibility upgrade is Meta AI’s enhanced ability to provide detailed responses when users inquire about their environment. This function, activated through the Accessibility settings in the Meta AI app, allows the system to process and interpret visual scenes with greater contextual understanding.
Meta illustrated this by explaining that the AI could describe a waterside park, noting subtle details such as “well manicured” grassy areas, context that supports navigation and situational awareness. While the capability is launching first in the US and Canada, Meta confirmed it will expand to additional markets in the future.
‘Call a Volunteer’ Expands Its Global Reach
The “Call a Volunteer” feature, a significant collaboration with the Be My Eyes organization, is also set for a major expansion. The service has been in a limited rollout since November 2024; Meta announced it will launch in all 18 countries where Meta AI is supported later this month. It connects Ray-Ban Meta smart glasses wearers to a network of more than eight million sighted volunteers.
By using a simple voice command, such as asking Meta to “Be My Eyes,” users can initiate a live video feed from their glasses’ camera. This allows volunteers to offer verbal guidance for a variety of everyday tasks, from reading product labels and identifying items to navigating unfamiliar locations.
Building on an Evolving AI Ecosystem
The new accessibility tools are integrated into Meta’s continuously advancing AI platform and leverage the existing functionalities of the Ray-Ban Meta smart glasses. The new Meta AI app acts as the central control system for the glasses and their AI features, having replaced the earlier Meta View software.
Foundational AI capabilities, such as globally available live language translation and the “Look and ask” visual AI, paved the way for these more specialized assistive functions. The “Look and ask” feature, which lets users query the AI about their visual field, is the core component now enhanced to deliver more detailed descriptions. Meta has characterized these ongoing developments as part of its broader strategy toward creating a more intuitive and personalized AI assistant.
Balancing Innovation with Ethical Considerations
As Meta continues to innovate in the smart eyewear domain, the company faces ethical questions surrounding such powerful technology. While the current announcement centers on accessibility benefits, the broader conversation about smart glasses increasingly touches on privacy implications.
A demonstration by Harvard students, for example, showed how smart glasses could be paired with publicly available facial recognition tools to identify strangers. Meta has reportedly explored integrating facial recognition into future iterations of its glasses.
Meta’s CTO, Andrew Bosworth, commented on this complex issue to CNBC, stating that “If people don’t want this technology, we don’t have to supply it.” He added that while “the product is going to be fine either way,” there are “nice use cases out there, if it’s something people are comfortable with.” These ongoing dialogues highlight Meta’s efforts to balance technological advancement with user trust as its wearable AI offerings mature.