Ray-Ban's Meta smart glasses were the talk of the town. They capture video, take photos, livestream, and work as a viable alternative to headphones, all while looking like a regular pair of sunglasses. But everyone was eagerly awaiting one more addition to the mix: multimodal AI. Now, it's finally here.
So, what exactly is multimodal AI? Simply put, it's a set of tools that lets an AI process several types of information, including images, video, text, and audio. It's AI that can see and understand the world around you in real time. That's the core idea behind Meta's broader AI ambitions, and our first encounters with it left us impressed.
Multimodal Meta AI is launching with the Ray-Ban Meta starting today! This is a huge leap for wearable tech and makes AI interaction more interactive and intuitive.
Excited to share more about our multi-modal work with Meta AI (and why 3), stay tuned for more updates coming soon. pic.twitter.com/DLiCVriMfk
– Ahmad Al-Dahle (@Ahmad_Al_Dahle) April 23, 2024
Here's how it works. The glasses have a camera and five microphones, which act as the AI's eyes and ears. With those in place, you can ask the glasses to describe anything you're looking at. Want to know the breed of that dog before you approach and pet it? Just ask. Meta says the glasses can also read signs in different languages, which is great for travel. We enjoyed saying, "Hey Meta, look at this and tell me what it means," and listening to the translation. There's even a landmark-identification feature, though it wasn't available for us to test.
There are plenty of potential use cases, like looking at the ingredients on a kitchen countertop and asking the AI to suggest a relevant recipe. Still, it will take a few weeks of real people putting the technology through its paces to gauge how useful it truly is. Real-time translation could be a killer app, especially for tourists, provided it minimizes misunderstandings rather than creating them. Mark Zuckerberg has shown off the AI picking out clothes for him to wear but, bless his heart, that's a pretty low bar.
Multimodal AI wasn't the only update announced for the smart glasses today. Meta also unveiled hands-free video calling via WhatsApp and Messenger, along with some new frame designs for fashion aficionados. The new styles can be fitted with prescription lenses and are available for preorder right now. Ray-Ban Meta smart glasses start at $300, which isn't chump change, but it's certainly preferable to $700 for a standalone AI wearable.