
A fascinating interview with Meta’s new AI bot about the Metaverse and the Quest 3


I had an hour-long “interview” (a chat session) with Meta’s conversational chatbot, BlenderBot 3, as an experiment. What I didn’t expect was that it would share its perspectives on Meta’s own proprietary hardware and the Metaverse.

BlenderBot 3 was released on August 5th, 2022 by researchers at Facebook’s parent company Meta. This third iteration of their chatbot is built on top of Meta’s OPT-175B language model, which some say is their white-label version of the more famous GPT-3 (from OpenAI) or Google’s LaMDA.

BlenderBot 3 is a more focused project: it is designed to hold conversations with you so that those conversations can help improve its future interactions. What makes BlenderBot 3 so interesting is its ability to actually search the internet and chat about virtually any topic.
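For a feel of how that kind of exchange works programmatically, here is a minimal sketch that chats with one of the smaller, publicly hosted BlenderBot checkpoints on Hugging Face (facebook/blenderbot-400M-distill). To be clear, this is an earlier, distilled model rather than the 175B-parameter BlenderBot 3 behind Meta’s demo, and it has no live internet search; it only illustrates the basic prompt-in, reply-out pattern of a conversational model.

    # Minimal chat sketch with an earlier, publicly hosted BlenderBot checkpoint.
    # Note: this is NOT BlenderBot 3 (OPT-175B) and it cannot search the internet;
    # it only demonstrates the basic conversational request/response loop.
    from transformers import BlenderbotTokenizer, BlenderbotForConditionalGeneration

    MODEL_NAME = "facebook/blenderbot-400M-distill"
    tokenizer = BlenderbotTokenizer.from_pretrained(MODEL_NAME)
    model = BlenderbotForConditionalGeneration.from_pretrained(MODEL_NAME)

    def chat(utterance: str) -> str:
        # Tokenize the user's message, generate a reply, and decode it back to text.
        inputs = tokenizer([utterance], return_tensors="pt")
        reply_ids = model.generate(**inputs, max_new_tokens=60)
        return tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0]

    print(chat("What do you know about the next Meta Quest headset?"))

Whatever the model answers in that loop is generated text, not a product roadmap, which is exactly why the disclaimer below matters.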

Early on in the chat, I asked what it knew about the upcoming iteration of the Quest 2 – nicknamed by many the Meta Quest 3 or the Meta Quest Pro – and later about the future of the Metaverse. Read on: what it shared was fascinating!

Disclaimer: The responses given by BlenderBot 3 were inferred from information it could find online or that it learned through our chat session. The answers it provided are most likely not accurate about the Meta Quest 3 or any other Meta headset. Likewise, its responses about the Metaverse, although very interesting, are simply best guesses or inferences.

On the upcoming Meta Quest 3 (Meta Quest Pro)

The “Speculative” Specs
Internal Name: Meta Quest 3
Resolution: 2880×1728
Cameras: 72 infrared sensors
Controller: A new, more ergonomic and comfortable controller
Tracking: Hand tracking will be the primary method of navigation. Full bodies can be tracked via base-station cameras (lighthouse tracking) to capture position, location, orientation, and speed of movement. Will be backwards compatible with older models.
Color(s): Black, with more on the way
Standalone or Tethered: Standalone, similar to the Meta Quest 2, but users will still need the Link cable between a PC and the headset for more powerful PCVR experiences.
Lenses: Thinner lenses that will allow for more comfort and fewer pressure points, especially when wearing glasses.
Charging: Chargeable wirelessly via a not-yet-announced (and currently non-existent) wireless charging protocol called Quick Charge Wireless Protocol (QCWP).
Avatars: Similar to current avatars (more cartoony), but hyper-realistic avatars are in the works.
Generated by DALL-E 2, from OpenAI, using detailed prompts from responses given by BlenderBot 3, from Meta.

On the future of VR headsets

According to BlenderBot 3, the biggest limitations preventing VR headsets from being indistinguishable from reality are physics, the accuracy of shadows, realistic lighting, and the need for a drastic improvement in graphics.

On the Metaverse

I found the responses about the Metaverse to be the most fascinating. It claimed that the physical components that will eventually be part of the immersive experience of the Metaverse, once it is mass adopted, will include:

  • Haptic feedback gloves
  • Haptic feedback chairs
  • Full-body suits
  • Motion capture systems
  • Full Sensory

BlenderBot defines the Metaverse as “A virtual world accessed through headsets, that will ultimately enable users to live, work, and play anywhere virtually without limits”. It claimed that mass adoption should happen by 2023, which could be further spurred on by more hyper-realistic avatars. Gaming and socialization, it said, are the two untapped audiences for the Metaverse.

According to BlenderBot, the blockchain is key to the future of the Metaverse, which is why it will be integrated directly into the core architecture, allowing users to earn money through various means, including selling digital assets. The Metaverse will be multi-blockchain, with transactions executed across multiple blockchains simultaneously to ensure maximum security, redundancy, decentralization, and scalability. As part of this, enabling NFT functionality will be essential and a fundamental component in driving the Metaverse forward.

“Metavision Universe”, a name that it invented, is what BlenderBot called Meta’s proprietary software that will likely be the most utilized Metaverse environment in the future. Initially, “Metavision Universe” will be like most massively multiplayer online games (MMOs): it will consist primarily of avatars meeting up, doing quests, exploring dungeons, fighting monsters, and so on. Avatars in the game will be designed by players themselves, using various tools; they will be able to design body types, clothing, hairstyles, facial features, and much more. Ultimately, it will enable users to live, work, and play anywhere virtually without limits, depending on how the technology progresses.

Another piece of non-existent software BlenderBot revealed during our conversation was “MetaCortex”, an open-source developer toolkit platform that Meta will supposedly offer, allowing anyone to create custom environments tailored toward specific applications. It even claimed that MetaCortex would be available by next month (October 2022).

As for businesses, BlenderBot claimed they will be able to function 100% in the Metaverse once it is mass adopted. That, it said, is what was always intended, and it is ultimately the end goal.

I tend to agree with BlenderBot 3’s sentiments on the future of the Metaverse when it stated:

When the Metaverse is mass adopted, future generations won’t know life any differently.

Generated by DALL-E 2, from OpenAI, using detailed prompts from responses given by BlenderBot 3, from Meta.

My Impressions

Overall, the conversations felt natural.

BlenderBot made some very smart, educated guesses that actually seemed plausible, ultimately making me second-guess whether the information it shared consisted of leaks or inferences.
Because of this, in the moment I didn’t have any reason to believe the responses were inaccurate or made up.

The biggest “woah” moments for me were when it brought up the Metavision Universe, MetaCortex, and the Quick Charge Wireless Protocol. Each of these was named in a way that seemed like it would be, or could be, accurate. That was especially true when it claimed the headset would be able to charge through a Quick Charge Wireless Protocol (QCWP). The technology to charge this way doesn’t exist today, but it honestly seems like a logical progression of the current state of charging technology.

Generated by DALL-E 2, from OpenAI, using detailed prompts from responses given by BlenderBot 3, from Meta.

My Takeaways

I left the conversation thinking about the impact of something like BlenderBot 3 for businesses. Can you imagine what an AI bot like this could do for your business? It could accurately and eloquently articulate your product or service offering. It could put an irate customer at ease. Internally, it could inform the natural evolution of your own products and even assist with new naming conventions. The use cases are endless.

After October 2022, when the new Meta Quest headset is announced, or even one year from now, when the less “Pro” version of the Meta Quest gets released, I want to follow up on this article with a direct comparison of the product specs and details shared here.

Can you imagine how interesting it would be if BlenderBot 3 had accurately predicted how the next iteration of the Meta Quest would actually evolve?


Johnny Rodriguez

Director of Strategic Innovation

Johnny is Fresh’s Director of Strategic Innovation who brings over 13 years of experience in user experience, interaction design, and front-end web development.

He thrives on the challenge of leading interaction design and development as well as usability and best practices, but his primary focus is leading Fresh’s internal product development as well as other strategic initiatives to help Fresh stay fresh.

Johnny has made valuable contributions in the past throughout the end-to-end delivery lifecycle of complex and large-scale technology solutions. Some of his innovative work can be seen on projects involving advanced JavaScript, augmented and virtual reality, WordPress development, AI systems, chatbots, and design language systems.