Meta’s announced some new features for its Ray-Ban Meta glasses, which are already seeing strong sales momentum heading into the holiday season.
Meta’s stylish smart glasses are quickly becoming a must-have for the tech-savvy, and Meta’s looking to build on this by adding more AI power to the device, broadening its responsive and interactive capabilities.
First off, Meta’s adding “Live AI”, which will give you a constant AI companion for up to 30 minutes at a time.
As you can see in this example, posted by Meta CTO Andrew Bosworth, Live AI will enable you to interact with Meta AI hands-free, so that you can ask questions in a more natural, conversational way.
As per Meta:
“During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying “Hey Meta,” reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask.”
Which is an interesting use of generative AI, though I do question whether we’ve fully considered the mental health impacts of creating AI companions.
Like, what happens if people come to rely on AI tools as their friends, and then the provider cuts them off? Could simulating connection in this way, in an experience that feels like a real conversation with a real person, actually end up harming real human connection and engagement?
I guess we’ll find out, but like social media before it, my concern is that we’re only going to recognize these harms in retrospect, with the desire to innovate overruling the need for related impact assessment.
As Bosworth notes, the Ray-Ban Meta glasses are also getting a new Shazam integration for users in the U.S. and Canada, so you can ask Meta AI what song is playing at any time.
Yep, that’s Extreme Zuck, at it again, fresh from a bout of jiu-jitsu and out in the paddock, thrashing his dirt buggy. I don’t really know why Meta is so intent on giving Zuck more personality these days, though I do know that his recent donation to President-elect Trump’s inaugural fund hasn’t helped in this department.
Finally, the Ray-Ban Meta glasses are also getting live translation, which could be an especially handy update:
“Through this new feature, your glasses will be able to translate speech in real time between English and either Spanish, French, or Italian. When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers or viewed as transcripts on your phone, and vice versa.”
So now, if you’re in a situation where one of these languages is being spoken, you’ll know whether they’re talking about you, and what they’re saying. I mean, you’ll likely be disappointed, because they’re probably not talking about you, and you’ll have to wear your sunglasses inside like a weirdo to get the translation. But it’ll be handy in many situations nonetheless, and as Meta adds more languages, this could become a killer application for the device.
As noted, sales of Meta’s Ray-Ban glasses have been rising steadily over time, and you can bet that a lot of people are going to find a pair under the Christmas tree next week. And as Meta continues to evolve the device, it may well become an essential connector in many ways, which would boost the value of Meta’s AI tools and help guide its product development direction.
And the use of Meta AI in this context is definitely more valuable than a chatbot on Facebook or IG.

This is where Meta’s AI development gets more interesting, because the company has more places and ways to use AI to bridge the gap between how we connect now and how we’ll connect in the future.

So while Meta shoving its AI chatbot into all of its apps seems unnecessary, and even annoying, there are other ways in which it’ll be able to maximize the use of its AI tools.