
Meta’s AI: Object Recognition and Language Translation for Ray-Ban


Saying “Hey Meta” while wearing the Ray-Ban smart glasses summons a virtual assistant that can see and hear what’s happening around you.

Meta will finally let people try its splashiest AI features for the Ray-Ban Meta smart glasses, though only in an early access test. The company announced that it will start rolling out multimodal AI features that let its assistant describe what it can see and hear through the glasses’ camera and microphones.

Mark Zuckerberg demonstrated the update in an Instagram reel, asking the glasses to suggest pants that would match a shirt he was holding. The assistant responded by describing the shirt and offering a couple of suggestions for pants that might complement it. He also had the glasses’ AI assistant translate text and showed off a couple of image captions.

Zuckerberg first revealed multimodal AI features like these for the Ray-Ban glasses in a September Decoder interview with The Verge’s Alex Heath. Zuckerberg said that people would talk to the Meta AI assistant “throughout the day about different questions you have,” suggesting that it could answer questions about what wearers are looking at or where they are.

In a separate video, CTO Andrew Bosworth showed the AI assistant accurately describing a lit-up, California-shaped wall sculpture. He explained some of the other features, which include asking the assistant to help caption photos you’ve taken or to provide translation and summarization, all fairly common AI features seen in other products from Microsoft and Google.

The US test period will be limited to “a small number of people who opt-in,” Bosworth said, with instructions for opting in available from Meta.
