Meta’s Ray-Ban AI Smart Glasses Get Smarter: Live Translation for Everyone, Visual AI Expands

Meta's Ray-Ban smart glasses now offer live translation for all users, with live AI visual assistance beginning its global rollout.

Meta is pushing software updates to its Ray-Ban smart glasses lineup, making the previously beta-tested live language translation feature available to all owners globally. The update also marks the beginning of a wider rollout for the glasses’ multimodal AI capabilities, which allow users to ask questions about what they are seeing through the built-in camera. Alongside these functional upgrades, Meta introduced new frame styles and announced plans to bring the glasses to more markets. 

The glasses, equipped with a 12MP ultra-wide camera, 32GB of storage, and a five-microphone array for spatial audio capture, aim to blend standard eyewear aesthetics with connected features. Adding to the options, Meta introduced the Skyler frame, a cat-eye style available in Shiny Chalky Gray or Shiny Black with various lens choices, described as suited for smaller faces.

A low bridge fit option for the classic Wayfarer style is also now available. Further expanding their reach, Meta intends to bring the glasses to Mexico, India, and the United Arab Emirates shortly.

Translation Unlocked for All Users

Previously limited to an early access program initiated in December 2024, the live translation feature is now available worldwide. Users can initiate real-time translation between English, Spanish, French, and Italian.

Meta describes the experience: "When you’re speaking to someone in one of those languages, you’ll hear what they say in your preferred language through the glasses in real time, and they can view a translated transcript of the conversation on your phone."

Activation is handled via voice command: “Hey Meta, start live translation.” Meta specified that language packs must be downloaded in advance for the feature to function without a Wi-Fi or cellular connection, a benefit for travelers.
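To make that offline requirement concrete, here is a minimal Python sketch of the download-in-advance pattern Meta describes: fetch language packs while connected, then translate locally when no network is available. Every class and function name here is hypothetical; Meta has not published an API for the glasses, so this only illustrates the flow.

```python
# Hypothetical sketch of the offline-first translation flow: download
# language packs ahead of time, then translate locally without a
# connection. Not Meta's implementation.

SUPPORTED = {"en", "es", "fr", "it"}  # English, Spanish, French, Italian


class TranslationSession:
    def __init__(self, preferred_lang: str):
        self.preferred_lang = preferred_lang
        self.packs: set[str] = set()  # language packs stored on device

    def download_pack(self, lang: str) -> None:
        """Fetch a language pack while online (e.g., before traveling)."""
        if lang not in SUPPORTED:
            raise ValueError(f"{lang} is not a supported language")
        self.packs.add(lang)

    def translate(self, speech: str, source_lang: str, online: bool) -> str:
        """Render incoming speech in the wearer's preferred language."""
        if source_lang in self.packs:
            return f"[local {source_lang}->{self.preferred_lang}] {speech}"
        if online:
            return f"[cloud {source_lang}->{self.preferred_lang}] {speech}"
        raise RuntimeError(f"No {source_lang} pack downloaded and no connection")


session = TranslationSession(preferred_lang="en")
session.download_pack("es")  # done over Wi-Fi before the trip
print(session.translate("¿Dónde está la estación?", "es", online=False))
```

The point of the pattern is the fallback order: a pre-downloaded pack makes the connection check irrelevant, which is exactly the traveler benefit Meta highlights.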

AI Gets Eyes: Visual Assistance Rolls Out

Meta is also broadening access to its “live AI” features, moving beyond the initial Early Access Program participants in the US and Canada and the UK rollout that started on April 10. The core of this is the “Look and ask” capability, where the glasses use their camera feed to understand visual context.

Users can ask questions about objects or scenes within their view, receiving audio responses. Meta suggests use cases like getting recipe ideas based on ingredients seen, identifying plants, or navigating unfamiliar places. The AI is designed to handle follow-up questions without needing the wake phrase repeatedly, maintaining conversational flow. Meta plans for the AI to eventually offer proactive suggestions based on the user’s environment.
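The conversational behavior described above amounts to session state: the wake phrase opens a session, and follow-up questions reuse the captured visual context and prior turns. Below is a minimal Python sketch of that loop; all names are hypothetical and this is not Meta's implementation.

```python
# Hypothetical sketch of a "Look and ask" conversation loop: one wake
# phrase starts the session, and follow-ups keep the frame and history.

from dataclasses import dataclass, field


@dataclass
class LiveAISession:
    frame: bytes = b""                                    # latest camera frame
    history: list[tuple[str, str]] = field(default_factory=list)

    def look(self, camera_frame: bytes) -> None:
        """Capture the current view as context for the conversation."""
        self.frame = camera_frame

    def ask(self, question: str) -> str:
        """Answer a question; follow-ups reuse prior turns and the frame."""
        # A real system would send the frame plus history to a
        # vision-language model; here we just echo enough to show the
        # flow of state between turns.
        answer = f"(answer about {len(self.frame)}-byte frame: {question!r})"
        self.history.append((question, answer))
        return answer


session = LiveAISession()
session.look(b"\x89PNG...")                  # "Hey Meta" starts the session
print(session.ask("What plant is this?"))    # first question
print(session.ask("Is it safe for cats?"))   # follow-up, no wake phrase
```

Keeping the frame and history in one session object is what lets follow-ups work without repeating the wake phrase, matching the conversational flow Meta describes.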

The expansion started this week in several European countries (Germany, Austria, Belgium, Denmark, Norway, Sweden, and Finland), with the visual AI component activating there starting next week. Other updates include expanded access to integrated music apps such as Spotify and Apple Music (for English-language users) and upcoming support for handling Instagram Direct messages and calls.

Privacy Questions Linger Amidst Feature Expansion

The reliance on an always-ready camera for live AI functions directly engages with ongoing privacy debates surrounding smart eyewear. An October 2024 demonstration highlighted how the glasses could potentially be used with external facial recognition tools for identification, sparking debate about surveillance risks.

While Meta includes an LED recording light, its effectiveness as a notice mechanism has been questioned. As these visual AI features become more widely available, balancing utility against bystander privacy remains a key consideration, and that inherent tension in camera-based wearables gives competitors an opening.

Competing Visions for Smart Eyewear

Competitors like Solos emphasize this point, offering modular glasses where the camera can be removed. Solos co-founder Kenneth Fan previously stated, “One thing we promised to deliver on was allowing consumers to have control of their experience with AI and smart technology, particularly with privacy options in mind.” Other market players include Baidu with its utility-focused Xiaodu glasses and Halliday’s camera-less projection display glasses.

Meanwhile, Apple is reportedly exploring the smart glasses space as part of its longer-term AR ambitions, distinct from its current Vision Pro headset strategy. Google is also building its Android XR platform for headsets like Samsung’s upcoming device.

User feedback suggests the current Ray-Ban Meta glasses offer around four hours of mixed usage, with the charging case providing additional power. While the current software updates enhance the existing hardware, reports suggest Meta is already working on future iterations, possibly integrating displays next year or launching a more advanced model like ‘Hypernova’ with gesture controls. For now, the focus is on delivering these new translation and AI experiences to a wider audience.

Source: Meta
Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.