Meta CEO Mark Zuckerberg announced updates to the company's Ray-Ban Meta smart glasses on Wednesday at Meta Connect 2024. Continuing its case that smart glasses could be the next blockbuster consumer device, Meta announced that the Ray-Ban Meta will come with some new AI capabilities and smartphone-familiar features later this year.
New features for the Ray-Ban Meta include real-time AI video processing and live language translation. Other announcements, such as QR code scanning, reminders, and integrations with iHeartRadio and Audible, promise to give Ray-Ban Meta users smartphone features they already know and love.
Meta says its smart glasses will soon have real-time AI video capabilities, meaning you can ask the Ray-Ban Meta glasses questions about what's in front of you and the Meta AI will respond verbally in real time. Currently, the Ray-Ban Meta glasses can only take photos and describe them or answer questions, but the video upgrade should, at least in theory, make for a more natural experience. These multimodal capabilities are expected to be available later this year.
In the demo, users were able to ask Ray-Ban Meta questions about the meal they were cooking or the cityscape in front of them, with real-time video capabilities allowing Meta's AI to process the live action and respond with voice.
However, this is easier said than done, and we need to see just how fast and seamless this feature is in practice. We've seen demos of these real-time AI video features from Google and OpenAI, but Meta is the first to deploy such a feature in a consumer product.
Zuckerberg also announced live language translation for Ray-Ban Meta. English-speaking users will be able to talk to French, Italian, and Spanish speakers, and the Ray-Ban Meta glasses will be able to translate what they're saying into the language of their choice. Meta says the feature will be available later this year, with more languages to come.
The Ray-Ban Meta glasses will also gain a reminder feature that lets users ask Meta AI to remember something they saw through the smart glasses. In the demo, a user asked the glasses to remember a jacket they were looking at and later shared the image with a friend.
Meta also announced that integrations with Amazon Music, Audible, and iHeartRadio are coming to its smart glasses, letting users listen to music and audiobooks from their preferred services through the glasses' built-in speakers.
The Ray-Ban Meta glasses will also be able to scan QR codes and phone numbers: users can tell the glasses to scan a code, and the link will open on their phone without any further interaction.
The smart glasses will also be available with new Transitions lenses, which react to UV light and adapt to changing lighting conditions.