At Meta Connect, CEO Mark Zuckerberg presents a host of new mixed reality hardware and AI updates
At the 2024 Meta Connect developer conference yesterday, CEO Mark Zuckerberg and other executives debuted a number of new products and features to power the tech giant’s vision for AI, mixed reality and online content.
Here’s a look at what the company showcased during its annual keynote.
Quest 3S mixed reality headset
Zuckerberg unveiled the new Meta Quest 3S mixed reality headset, a follow-up to the Quest 3 released a year ago. Priced starting at $299, the 3S has the same mixed-reality features as the original model but for $200 less. (The lower price prompted Zuckerberg to say “Hell yeah” in response to audience applause.)
The Quest 3S also features various hardware and software upgrades, including spatial audio by Dolby Atmos, and it can double as a remote desktop for Windows 11 PCs through Meta’s partnership with Microsoft. Beyond blending digital content with real-world environments, Meta also previewed a new feature called Hyperscape that lets people recreate physical spaces as virtual environments. To demo the feature, Zuckerberg showed virtual recreations of artist Daniel Arsham’s art studio and the recording studio where Green Day recorded their “Dookie” album.
Other content partnerships around the Quest 3S include new apps for Prime Video, Amazon Music and Twitch. Zuckerberg also showed off new mixed reality social features for Meta Horizon, including gaming and co-watching YouTube content.
“A lot of the magic in mixed reality is you can feel a sense of presence,” Zuckerberg said. “It’s different from anything any other platform has ever built and different from anything any other platform can provide.”
While the lower price of Meta’s headsets might make VR more accessible, cost isn’t the only hurdle to adoption, noted Forrester analyst Mike Proulx. Headsets are still too cumbersome, he said, and wear people out when worn for longer periods, which is why upgrades to smart glasses could be especially promising. There aren’t yet any clear marketing applications for AR glasses, but Proulx said it’s more about preparing for what’s possible in the future, adding that Meta “pretty convincingly demonstrated a future 3D computing platform that solves for the many headwinds that VR and AR headsets won’t overcome.”
“Since Meta’s glasses see and interact with the context around them, there are natural connection points for engagement with brands’ advertising whether physical or online,” Proulx continued. “I can imagine print ads or billboards easily coming to life, but that’s just the start. I think the real brand opportunity lies with improving the customer experience. This kind of computing interface makes it far more easy and instant to transact with brands.”
Meta AI updates
Meta also announced a range of updates for its AI products, including the debut of its newest AI model, Llama 3.2, which can understand both images and text. Users can also now converse with Meta AI by voice, similar to the voice mode OpenAI recently added to ChatGPT. Along with a set of system voices, Meta AI’s voice options include several celebrity voices: John Cena, Dame Judi Dench, Kristen Bell, Keegan-Michael Key and Awkwafina.
The celebrity voices come just a few months after OpenAI debuted an AI voice that sounded so much like Scarlett Johansson that the actress threatened to sue the startup for appropriating her voice without permission.
“I think that voice is going to be a way more natural way of interacting with AI than text,” Zuckerberg said. “And I think it has the potential to be one of — if not the most — frequent ways we all interact with AI.”
Meta AI’s voice capabilities also include real-time language translation through Meta’s Ray-Ban glasses (see below) and a companion app. To demo the feature, Meta brought UFC fighter Brandon Moreno onstage for a brief conversation with Zuckerberg, with Meta AI translating between Spanish and English. Another previewed AI feature was automatic video dubbing, which translates video content into users’ native languages on Instagram and Facebook.
Updated features like language translation could be especially beneficial for content creators and marketers looking to reach new audiences, said Gartner analyst Nicole Greene. Multimodal features like video AI could also power more shoppable media, and the AI assistant in the Meta Ray-Bans could give people new ways not just to create content but also to discover it. However, the new AI features also come with ongoing risks such as impersonation, disinformation and data privacy concerns.
“For business, how do they showcase their unique brand differentiation in a way that is relevant to customers in an increasingly cluttered content environment?” Greene said. “How does their brand stand out when consumers might turn toward convenience, and how do they use the new technology to seamlessly integrate into experiences rather than chasing after customers with hyper-personalized messaging through the platform?”
Llama 3.2 will be available everywhere starting this week except in the European Union, following Meta’s decision in July not to release upcoming AI models in the EU because of regulatory issues. However, Zuckerberg said, “I remain eternally optimistic that we will figure that out,” which prompted a few people in the audience to clap awkwardly.
Although advertising wasn’t the focus of Meta Connect, the company said more than 1 million advertisers are using Meta’s generative AI tools for producing ad creative – with more than 15 million ads created with the tools last month. On stage, Zuckerberg said Meta AI now has nearly 500 million monthly active users.
Ray-Ban smart glasses updates
Meta also debuted an upgraded version of its Ray-Ban smart glasses, with updates including easier ways to talk with Meta AI and have it identify real-world objects. Other new features include video AI, real-time translation, better memory capabilities and ways to send audio messages through WhatsApp and Messenger. The glasses also have deeper content integrations with Calm, Spotify and Amazon Music, along with new partnerships with Audible and iHeart.
Orion AR glasses
Perhaps the biggest and most surprising news came at the end, when Zuckerberg debuted a prototype of Orion, Meta’s first “true AR” glasses. Previously developed under the code name Project Nazare, the glasses have been in the works for years. Rather than using traditional screens, Orion uses light diffraction to project holographic displays onto the wearer’s view of the world.
“The display is different from every other screen you have ever used and that is because it is not actually a screen,” Zuckerberg said. “It is a completely different kind of display architecture with these tiny projectors in the arms of the glasses that shoot light into waveguides and nanoscale 3D structures etched into the lenses so they can diffract light and put holograms at different depths and sizes into the world in front of you.”
Although no release date has been set, the expanding suite of smart eyewear signals to some that Meta is looking to create a more cohesive connection between smartphones and other devices.
Meta’s AR glasses could open up new possibilities for product integration, data-driven insights and location-based applications, giving brands new ways to personalize content, said Sasha Wallinger, founder of Blockchain Style Lab. Wallinger, a longtime marketer focused on the intersection of creativity and innovation, also imagined a potential “Pokemon Go meets Google Ads, with the optimization of curating categories that the wearer actually wants to see.”
“Aside from the marketing and future technology elements, I’m also really excited about how Meta’s collection of frames has the potential to make technology more accessible to the fashion, beauty and eyewear space,” Wallinger said.