Facebook (Meta) Ambition on Metaverse

Facebook… oh, we probably should say Meta, just revealed its new vision for the metaverse at Facebook Connect (its annual developer conference) last week. We can see that Meta has been working on different technologies to make human interaction in the metaverse richer. However, when it comes to a truly metaverse-like experience, there are still a lot of barriers to overcome. Today, let's look at Meta's vision and the challenges it will have to overcome. By the way, I strongly recommend watching Meta's keynote speech so you can more easily understand how the applications work and look.

1. How do we connect to the metaverse?

In Meta's vision, you should be able to enter the metaverse via different devices, such as a headset, AR glasses, or even just a PC or mobile phone. Of course, with different devices, users will have totally distinct experiences and different levels of interactivity.

This time Mark revealed two products they are working on: a VR headset (Project Cambria) and AR glasses (Project Nazare). I think it is obvious that Meta has ambitions to control the entrance to the metaverse.

Project Cambria is an upgraded version of the Oculus Quest 2. This high-end, more advanced mixed-reality headset can track your eye movement and detect your facial expressions, so your avatar in the metaverse can make eye contact with others and reflect your real expressions. This will increase the level of interactivity and make the avatar look more like an actual human.

The next piece of hardware is a pair of AR glasses called Nazare. You probably noticed that Facebook just released smart glasses co-branded with Ray-Ban. But this time, Mark mentioned that Nazare will be true AR glasses, with hologram displays, projectors, batteries, radios, custom silicon chips, cameras, speakers, and sensors to help users map the world around them.

2. Avatar

An avatar is like today's profile picture, but a 3D, living representation of you in the metaverse. You can redesign your avatar at any time, and change your clothes or wear different accessories in different places. Moreover, your avatar can be photorealistic and look exactly like you do in the real world, or it can be robot-like or completely surreal. It all depends on how you want to interact with others.

Basically, your avatar is you in the metaverse, and to make you feel more present, a lot of technology is needed: eye tracking, facial expression detection, gesture recognition, voice interaction, and so on.

3. Horizon Worlds (virtual space)

One of the most important elements of the metaverse is virtual space, and users can teleport from place to place at will. Meta proposed three different scenarios: Horizon Worlds, Horizon Home, and Horizon Workrooms, all of them VR- or MR-based.

  • Horizon Worlds allows you to create a virtual space and participate in it with other people. You could create a game and let people join it. Apart from gaming, Horizon Worlds could also be a tool to test, simulate, or train any 3D application or AI model, like Nvidia's Omniverse, Epic Games' applications, or Roblox.
  • Horizon Home is a virtual home that you can build and customize on your own. You can invite your friends over to watch movies, play games, or just chill.
  • Horizon Workrooms helps you create an efficient workspace for remote work. You can see your colleagues and talk with them in a virtual room. It will also let you use 2D applications like Dropbox, Slack, Facebook, or Instagram inside it.

4. Presence Platform & developer tools (SDKs)

The Presence Platform is a broad range of machine perception and AI capabilities that empower developers to build mixed-reality experiences. These capabilities include environmental understanding, content placement and persistence, voice interaction, and standardized hand interactions.

Meta will provide an Interaction SDK (software development kit) and a Voice SDK to help creators add hand interaction and integrate voice input into VR applications more easily. This should shorten the build time for new and existing titles, letting people use their hands and voice more naturally in more virtual experiences.
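To make the voice-input idea concrete, here is a minimal sketch of the pattern such SDKs rely on: the SDK transcribes speech, and the app maps the recognized utterance to a registered command. All names here (`parse_intent`, `HANDLERS`, the commands themselves) are my own illustration, not Meta's actual Voice SDK API.

```python
# Hypothetical intent-to-action mapping for voice input in a VR app.
# A real voice SDK would supply the transcript; everything below is an
# illustrative sketch of what the app does with it.

def parse_intent(transcript: str) -> tuple[str, str]:
    """Split a recognized utterance into a (verb, argument) pair."""
    words = transcript.lower().split()
    verb = words[0] if words else ""
    argument = " ".join(words[1:])
    return verb, argument

# Command table: the app registers one handler per supported verb.
HANDLERS = {
    "teleport": lambda arg: f"moving avatar to {arg}",
    "wave":     lambda arg: "playing wave animation",
    "open":     lambda arg: f"launching 2D panel: {arg}",
}

def handle_utterance(transcript: str) -> str:
    """Dispatch a transcript to its handler, or report it as unrecognized."""
    verb, argument = parse_intent(transcript)
    handler = HANDLERS.get(verb)
    if handler is None:
        return f"unrecognized command: {transcript!r}"
    return handler(argument)

print(handle_utterance("teleport Horizon Home"))
# → moving avatar to horizon home
```

The point of the table-driven design is that creators only register handlers; the speech recognition and intent parsing stay in the SDK's hands.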

Challenges Meta needs to conquer

  • How do we control the avatar in the metaverse?
    As I mentioned above, hand and voice interaction could be the key ways of controlling the avatar and increasing interactivity, but we still have a long way to go. Furthermore, if we want an even easier and more intuitive way to interact with virtual content, a neural interface is wild but feasible. One day it might be possible to translate neuromotor signals into digital commands that let us control our devices directly.
  • Environmental understanding & object recognition
    Another crucial barrier is how to accurately capture the real world and feed that data back to the device for content placement. Environmental understanding requires data about every single object in a scene: not only its location, but also its texture, geometry, and function. Detecting the real world and loading the data back to the device takes time, and scanning and transmitting real-time data instantly is hard.
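The per-object data described above can be sketched as a simple record. The `SceneObject` name and its fields are my own illustration of what an environment-understanding system would need to capture, not any real Meta API.

```python
# A minimal sketch of the per-object data needed for environmental
# understanding: location, geometry, texture, and function (affordances).

from dataclasses import dataclass, field

@dataclass
class SceneObject:
    label: str                              # e.g. "desk", "wall"
    position: tuple[float, float, float]    # location in room coordinates (meters)
    dimensions: tuple[float, float, float]  # bounding-box geometry (meters)
    texture_id: str = "unknown"             # reference to a scanned surface texture
    affordances: list[str] = field(default_factory=list)  # what can be done with it

# Example: a scanned desk the headset could anchor virtual content to.
desk = SceneObject(
    label="desk",
    position=(1.2, 0.0, -0.5),
    dimensions=(1.4, 0.75, 0.7),
    affordances=["place_content", "sit_at"],
)
```

With records like these, content placement becomes a query: find an object whose affordances include "place_content" and anchor the virtual screen to its surface. The hard part, as noted above, is filling in these fields accurately and in real time.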

This week, please forgive me for letting Meta cut the line, because the keynote speech really knocked my socks off. Mark said they would spend at least $10B on developing metaverse-related technology, which will make the competition even fiercer. It will be fun to watch what happens in this industry. OK, next week let's dig into Roblox together!

Reference:
Facebook Connect keynote speech
Facebook Connect keynote speech transcript
Meta official website