Meta has announced updates to its line of Ray-Ban smart glasses, which can take photos and videos, livestream, and more.
Now, the tech giant is rolling out multimodal AI, a feature Meta has been testing since December.
In short, multimodal AI lets the glasses “see what you’re seeing,” so users can ask questions about whatever is in front of them and get an automated response in real time. With five microphones and a built-in camera, the glasses can identify a plant or a bird song, name a specific landmark, or suggest a recipe based on an assortment of ingredients the user is looking at.
The AI feature will also handle real-time translation:
“Say you’re traveling and trying to read a menu in French,” Meta explains. “Your smart glasses can use their built-in camera and Meta AI to translate the text for you, giving you the info you need without having to pull out your phone or stare at a screen.”
As the feature rolls out gradually, users in the US and Canada will be able to conduct video calls via WhatsApp and Messenger directly on their Ray-Bans, hands-free.
Alongside multimodal AI, Meta also announced new frame designs for the line, including cat-eye and low-bridge options, as well as a limited-edition collectible pair in a Scuderia Ferrari colorway for Miami 2024.