In a conversation with CNET, Meta CTO Andrew Bosworth discussed the upcoming Quest 3 headset, how AI assistants could change smart glasses, the state of eye tracking, and whether Beat Saber is coming to mixed reality.
Meta plans to weave AI into its VR and AR hardware, starting with smart glasses that will gain AI capabilities next year. The company's recent Connect conference brought a slate of product announcements: the long-awaited Quest 3, with upgraded graphics; camera- and audio-equipped Ray-Ban glasses arriving soon; a range of personality-driven AI chatbots; and Emu, a generative AI tool for creating images and stickers.
I've followed Meta's VR and AR work since Facebook acquired Oculus, and I recently visited Meta's research labs to get a sense of where the technology is heading. As 2023 winds down, it's increasingly clear that what we mean by "VR" and "smart glasses" is shifting. The Quest 3's mixed-reality capabilities recall the Apple Vision Pro, blending augmented and virtual reality. Next year's glasses will add AI that can recognize objects and translate text, in a form factor reminiscent of Google Glass or early AR glasses prototypes, though without displays. Both the Quest 3 and the glasses are also expected to support conversational AI features, potentially aided by Qualcomm's latest generation of powerful chips.
To learn more about how Meta plans to merge VR, AR, and AI, I spoke with Andrew Bosworth, Meta's CTO and head of product. I asked about Samsung's anticipated device, why the Quest 3 lacks the eye tracking found on the Quest Pro, and whether Beat Saber will come to mixed reality.
What follows is a transcript of our conversation, edited for clarity and length.
Q: Where do you see the relationship between Meta's Quest 3, smart glasses, and AI?
Bosworth: If you drew a box-and-arrows diagram of the AR architecture we've envisioned for years, one of those boxes would be AI... (laughs) It's rare in our industry for a technology to arrive that solves one of your problems without you actively working on it. But that's exactly what has happened with AI.
If you had asked me or Michael Abrash, Meta Reality Labs' chief scientist, the same question two years ago, or even last year, we would have said AI was the biggest obstacle to AR. Displays and rendering are hard, but the real challenge is meeting human expectations: people want an interface that perceives the world as we do and has common sense, and that's where our capabilities fell short.
We're very confident in this new generation of Meta AI because it solves a problem we thought would take much longer to crack. AI has always been central to our vision, and now we finally have the means to deliver it.
Q: Meta has long promised smart glasses with an AI assistant that sees from the wearer's perspective. How is that taking shape with the Ray-Bans next year?
Bosworth: Today, for power reasons, the glasses have to be actively triggered. Our goal is to reach a point where low-power sensors detect an event and wake the AI, a kind of ambient awareness. That's the ultimate aspiration, and we're working now on those sensors and on event detection. We used to wrestle with what we called "the conductor": the piece that decides whether it's appropriate to show a given interface on future AR glasses, taking factors like personal interactions into account. If you and I are talking face to face, the glasses shouldn't surface a text about grocery shopping. But if an urgent message arrives about my kids' well-being, they should show it right away. Striking that balance is a challenge we're actively addressing. A rough sketch of the idea follows.
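To make the concept concrete, here is a minimal sketch of how a "conductor" might gate notifications by urgency and social context. Bosworth describes an idea, not an implementation; every name, type, and threshold below is hypothetical, and real context detection would rely on sensor inference far beyond a single boolean.

```python
from dataclasses import dataclass
from enum import IntEnum

class Urgency(IntEnum):
    LOW = 1       # e.g., a grocery-list text
    NORMAL = 2
    CRITICAL = 3  # e.g., a message about a child's safety

@dataclass
class Notification:
    sender: str
    message: str
    urgency: Urgency

@dataclass
class WearerContext:
    # In a real system this would be inferred from low-power sensors.
    in_conversation: bool

def conductor_should_display(note: Notification, ctx: WearerContext) -> bool:
    """Decide whether the glasses should surface a notification right now."""
    # Critical messages always break through, even mid-conversation.
    if note.urgency >= Urgency.CRITICAL:
        return True
    # Otherwise, hold everything back while the wearer is talking to someone.
    return not ctx.in_conversation

# During a face-to-face chat, a grocery text is deferred,
# while an urgent family message is shown immediately.
ctx = WearerContext(in_conversation=True)
print(conductor_should_display(
    Notification("spouse", "pick up milk", Urgency.LOW), ctx))            # False
print(conductor_should_display(
    Notification("school", "please come get your kid", Urgency.CRITICAL),
    ctx))                                                                 # True
```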
Going from the first generation to the second, to these Ray-Ban Meta glasses, taught us invaluable lessons. We're now making progress on two fronts: steadily improving the hardware while keeping it cost effective, and making significant strides on the critical software challenges around AI.
Meta's AI chatbots, which have distinct personalities and even celebrity faces, will be integrated with Facebook's apps and with VR on the Quest 3; their availability on the smart glasses has yet to be confirmed.
Q: Will these AI glasses also have personalities, or will they primarily be general assistants?
Bosworth: Meta AI is built around an agent model. So I expect the future of AI to split between external agents, each with its own persona, that you actively engage with, and what I'd call personal assistants.
AR glasses can see exactly what I see, so they're part of my private experience.
