Last year, Ray-Ban’s Meta sunglasses introduced AI-powered visual search with some impressive (and worrying) features, and a new feature in the latest beta shows just how useful it can be. Meta CTO Andrew Bosworth wrote in a Threads post that the glasses can identify landmarks in different locations and tell you more about them, acting as a tour guide for travelers.
Bosworth shared several sample images in which the glasses explain why the Golden Gate Bridge is orange (it’s easier to see in fog), the history of San Francisco’s “Painted Ladies” houses, and more. In these examples, the description appears as text below the image.
On top of that, Mark Zuckerberg took to Instagram to show off the new feature with some videos shot in Montana. This time around, the glasses use audio to verbally describe the history of Big Sky Mountain and the Roosevelt Arch, and to explain, in caveman speak, how snow is formed.
Meta previewed the feature at last year’s Connect event as part of a new “multimodal” capability that lets the glasses answer questions based on your surroundings. It became possible once Meta’s smart glasses gained access to real-time information (rather than having a knowledge cutoff in 2022, as before), powered in part by Bing search.
The feature is part of Meta’s Google Lens-like visual search, which lets users “show” the AI what they see through the glasses and ask questions about it, such as identifying a piece of fruit or translating foreign text. It’s available to anyone in Meta’s early access program, though spots are still limited. “For those who still don’t have access to the beta, you can add yourself to the waitlist while we work to make the beta available to more people,” Bosworth said in the post.
This article contains affiliate links; if you click on such links and make a purchase, we may earn a commission.