
Meta smart glasses can now tell you where you parked your car

Meta is rolling out some of the promised features for Ray-Ban’s AI-powered smart glasses to users in the US and Canada. CTO Andrew Bosworth posted on Threads that today’s update to the glasses includes more natural language recognition, which means the rigid “Hey Meta, look and tell me” commands should go away. Users will be able to engage the AI assistant without the “look and” portion of the command.

Many of the other AI tools shown off during last month’s Connect event also hit the frames today, including voice messages, timers and reminders. The glasses can also be used to have Meta AI call a phone number or scan a QR code. CEO Mark Zuckerberg demonstrated the new reminders feature in an Instagram reel as a way to find your car in a parking garage. One notable omission from this update is live translation, and Bosworth didn’t offer a timeline for when that feature will be ready.

Meta’s smart glasses already made headlines once today, after two Harvard University students used them to identify strangers. By pairing the glasses with facial recognition technology and a large language model, the students were able to surface addresses, phone numbers, family member information and partial Social Security numbers.

