For years, Silicon Valley and Wall Street have questioned Mark Zuckerberg's decision to invest tens of billions of dollars in Reality Labs. This week, Meta's wearables division unveiled a prototype of its Orion smart glasses, a form factor the company believes could one day replace the iPhone. The idea seems crazy…but maybe a little less crazy than it was a week ago.
Orion is a prototype pair of glasses that combines augmented reality, eye and hand tracking, generative AI, and a gesture-detecting wristband. With its micro-LED projectors and (very expensive) silicon carbide lenses, Meta appears to have broken through a long-standing AR display challenge: you can see through Orion like ordinary glasses, but you can also see application windows projected onto the lenses that appear embedded in the world around you. Ideally, you navigate that environment using your hands, eyes, and voice.
Orion smart glasses require a wristband and wireless computing pack to function. Image credit: Meta
Obviously, Meta's Orion smart glasses are thicker than your average pair of readers, are said to cost $10,000 each, and won't be sold anytime soon; a consumer version is still a few years away. All of Orion's technology is relatively new, and all of it needs to get cheaper, better, and smaller before it can make its way into smart glasses you can buy at the mall. Zuckerberg said the company has been working on Orion for 10 years, and it still hasn't found a path to a marketable product.
But Meta isn't the only company trying to offer users an alternative to smartphones.
This month, Snap announced its latest generation of Spectacles smart glasses. They are larger than Orion and have a more limited field of view. One former Snap engineer called the latest Spectacles “obviously bad,” but you can actually order them. Google hinted at its I/O conference in May that it, too, is working on smart glasses, likely an improvement on the failed Google Glass experiment of the past decade. Apple is reportedly working on AR glasses very similar to Orion. And we can't rule out Jony Ive's new startup, LoveFrom, either; he recently confirmed that he's working on an AI wearable with OpenAI (though it's unclear whether it will be glasses, a pin, or something else entirely).
There's a race among Big Tech's wealthiest companies to develop sophisticated smart glasses that can do everything a smartphone can do, and hopefully more. Meta's prototype revealed two things: there's something there, but we're not there yet.
These devices are very different from the Quest virtual reality headsets Meta has been pushing for years, and from Apple's Vision Pro. They share many of the same technologies, such as eye tracking and hand tracking, but the experience is completely different. VR headsets are bulky, uncomfortable to wear, and staring at their displays can make you feel nauseous. Sunglasses and eyeglasses, on the other hand, are relatively comfortable and are worn by millions of Americans every day.
To Zuckerberg's credit, he's been pushing the eyewear form factor for quite some time, well before it was common. Meta's CEO has long been reported to hate having to reach his company's popular social media apps through Apple's phones (perhaps that's what led to the ill-fated Facebook Phone). Now, Meta's competitors are also making serious inroads into eyewear computing.
Andrew Bosworth, Meta's CTO and head of Reality Labs, wears clear Orion smart glasses. (David Paul Morris/Bloomberg via Getty Images)
Meta's initial investment here seems to be paying off. Zuckerberg's Orion keynote on Wednesday is one we won't soon forget; even a room full of skeptical journalists was buzzing with electricity and excitement. TechCrunch hasn't had a chance to demo Orion yet, but initial reviews have been very positive.
Meta's current offering is the Ray-Ban Meta: a pair of glasses with a camera, microphone, speakers, sensors, an on-device LLM, and the ability to connect to your phone and the cloud. The Ray-Ban Meta is much simpler than Orion, but at a relatively affordable $299, it isn't priced that differently from a regular pair of Ray-Bans. Although similar to the Spectacles 3 that Snap released a few years ago, the Ray-Ban Meta glasses seem to be more popular.
Despite the huge difference in price and features, Orion and Ray-Ban Meta are more related than you might think.
“Orion is the future, and we are ultimately aiming for a fully holographic experience. Consider Ray-Ban Meta as the first step in that journey,” Li-Chen Miller, Meta's vice president of product, who leads the company's wearables team, said in an interview with TechCrunch. “We need to really establish the fundamentals: it’s comfortable, people want to wear it, people find value in it every day.”
One of the things Meta is trying to get right with the Ray-Ban Meta is AI. Currently, the smart glasses use Meta's Llama models to answer questions about what's in front of you: in response to a verbal request, they take a photo and run the image through an AI system. The Ray-Ban Meta's current AI capabilities are far from perfect. The lag is worse than with OpenAI's natural-feeling Advanced Voice Mode. Meta AI requires very specific prompts to work properly. It hallucinates. It's also not tightly integrated with many apps, making it less convenient than simply picking up your iPhone (probably by Apple's design). However, an update coming later this year seeks to address these issues.
Li-Chen Miller, VP of Product, at Meta Connect 2023. (David Paul Morris/Bloomberg via Getty Images)
Meta has announced that it will soon release live AI video processing for the Ray-Ban Meta. That means the smart glasses will stream live video and verbal requests to one of Llama's multimodal AI models, which will generate verbal responses in real time based on that input. Basic features like reminders and app integrations have also been improved. If it works, the whole experience should be much smoother. Miller says these improvements will carry over to Orion, which runs on the same generative AI system.
“Sometimes one form factor makes more sense than another, but we're certainly cross-pollinating,” Miller says.
Similarly, she says some of Orion's features may be pared back as her team focuses on making AR glasses more affordable. Orion's assortment of sensors and eye trackers is not cheap technology, and the challenge is that Orion needs to get both better and more economical.
Another challenge is typing. Smartphones have keyboards; smart glasses do not. Before joining Meta, Miller spent nearly 20 years working on keyboards at Microsoft, and she says Orion's lack of one is “freeing.” She claims that using smart glasses is a more natural experience than using a phone: you navigate Orion just by talking, gesturing with your hands, and looking at things, all of which come naturally to most people.
Another device that was criticized for lacking a keyboard was, ironically, the iPhone. Former Microsoft CEO Steve Ballmer infamously mocked the iPhone in 2007, saying its lack of a physical keyboard made it unappealing to business customers. But people adapted, and more than 15 years later his comments sound naive.
I think making Orion feel natural is more of a goal than a reality at this point. In its hands-on review, The Verge noted that at times the windows filled the entire lens of the glasses, completely blocking the user's view of the world around them. That is far from natural. To get there, Meta will need to improve its AI, typing, AR, and a long list of other features.
“With Ray-Ban Meta, we really narrowed it down to a few things, and it worked out really well,” Miller said. “On the other hand, if you want to build a new futuristic computing platform [with Orion], we have to do a lot of things, and we have to do them all well.”