Meta’s announced a new update for its Ray-Ban smart glasses, which will now be able to provide more info based on what you can see around you. The glasses are also getting new styles, along with live video calling via Meta’s messaging apps.
The main new functional addition is multimodal AI, which will enable you to ask your glasses about what you’re seeing, and get info then and there.
Multimodal Meta AI is rolling out widely on Ray-Ban Meta starting today! It's a huge advancement for wearables & makes using AI more interactive & intuitive.

Excited to share more on our multimodal work w/ Meta AI (& Llama 3), stay tuned for more updates coming soon.

— Ahmad Al-Dahle (@Ahmad_Al_Dahle) April 23, 2024
That capability also extends to translations, which could be very handy while traveling.
We’re launching new @rayban meta styles + features. You can now make video calls with your glasses via WhatsApp and Messenger and ask your glasses about what you’re seeing to get helpful information, hands-free. https://t.co/wP7SQXkNHP

— Meta Newsroom (@MetaNewsroom) April 23, 2024
Meta launched multimodal functionality for the glasses in testing back in December, before expanding that test to more functions, like info on popular landmarks, in March.
It’s now bringing this expanded capacity to all Ray-Ban Meta smart glasses in the U.S. and Canada in beta, ahead of a global launch at a later date.
You’ll also now be able to conduct video calls via WhatsApp and Messenger from the device, completely hands-free.
As per Meta:
“At the grocery store and not sure which brand of kombucha to buy? Can’t tell if that pineapple is ripe? Now you can hop on a video call with your mom and get her advice based on what you see. Video calling on Ray-Ban Meta smart glasses is rolling out gradually, so if you don’t see the update right away, sit tight — it’s on the way!”
The kombucha example seems to speak directly to the target market for a new tech device, but functionally, it’s another way to use your glasses to connect in real time.
Meta added the capacity to live-stream from the glasses back in December.
Finally, Meta’s also adding a range of new colors and styles, providing more options for those looking to buy their own smart glasses.
The new styles include “Skyler” frames, with a cat-eye design, and a low-bridge option for its “Headliner” variant.
That’ll give you more options to choose from, and if you have an odd-shaped head, you’ll now have a better chance of finding a pair that suits.
Meta’s also trying out its first limited edition variant, with an exclusive Scuderia Ferrari colorway for Miami 2024.
“Ray-Ban Meta for Scuderia Limited Edition brings together the legacy of Ferrari, timeless Ray-Ban design and cutting-edge tech from Meta, available April 24, 2024.”
Adding a collectible element could be another way to increase hype around the product, which Meta says is already seeing growing demand.
Indeed, Meta says that the latest version of Ray-Ban Stories has been selling out “faster than we can make them”. That’s likely also been boosted by the arrival of wearable AR, with Apple’s Vision Pro device. And while Meta’s Ray-Ban Stories may not be fully AR-enabled as yet, that is coming, and these new glasses are another step towards that next stage.
Providing functional, helpful digital elements that can interact with real-world environments is another advance in Meta’s broader vision, and Meta does seem to be taking the right approach, with good-looking, viable alternatives to regular sunglasses that enhance the appeal of the device.
I mean, if you’re buying new sunglasses anyway, maybe you should consider upgrading to a digitally enhanced option.
The more functional additions Meta can make in this respect, the more likely consumers will be to make this exact call, and that could see Meta eventually winning out in the broader wearables race.