Gadgets

Meta is rolling out live AI and Shazam integration to its smart glasses

The glasses already work well as a head-mounted camera and a pair of open-ear headphones, but now Meta is updating them with access to live AI without the need for a wake word, live translation between several languages, and Shazam integration for identifying music.

Meta announced most of these features in September. Live AI lets you start a “live session” with Meta AI that gives the assistant access to whatever you’re seeing and lets you ask questions without saying “Hey Meta.” If you need your hands free to cook or fix something, Live AI should keep your smart glasses useful even while you focus on whatever you’re doing.

Live translation lets your smart glasses translate between English and French, Italian, or Spanish. When live translation is enabled and someone speaks to you in one of those languages, you’ll hear what they’re saying in English through the smart glasses’ speakers, or you can read it as transcribed text in the Meta View app. You’ll need to download a model for each language pair, and live translation has to be enabled ahead of time before it can work as an interpreter, but it seems more natural than holding out your phone to translate something.

With Shazam integration, your Meta smart glasses will also be able to identify any song you hear playing around you. A simple “Meta, what’s this song?” will get the smart glasses’ microphones to pick up whatever you’re listening to, just like using Shazam on your smartphone.

All three of these updates take the wearable a step closer to Meta’s ultimate goal: a true pair of augmented reality glasses that can replace your smartphone. Pairing AI with VR and AR seems to be an idea many tech giants are toying with, too. Google’s new XR platform is built on the idea that generative AI like Gemini can be the glue that makes VR or AR compelling. It will be years before any company is willing to fill your field of vision with holographic images, but in the meantime smart glasses seem like a reasonably useful stopgap.

All Ray-Ban Meta smart glasses owners will get Shazam integration as part of Meta’s v11 software update. To get live translation and live AI, you’ll need to be part of Meta’s Early Access Program, which you can join right now.

