
How to build a metaverse

October 17, 2022 | By David Wilkins

The metaverse, despite its singular name, will encompass millions of metaverses by 2030. Some will be standalone virtual destinations. Others will be open so visitors can move easily among them – leveraging their digital identities, assets and payment options across environments. Even in its current fledgling form, the metaverse has the potential to significantly impact the ways in which we work, learn, shop, socialize and do business.

Technological advances are converging to make metaverses possible, but tech limitations still prevent a fully realized version. In its most basic form, a metaverse has a sense of immersion, real-time interactivity, user agency and persistence — it’s always on. A fully developed metaverse will be interoperable across platforms and devices and will support concurrency, so thousands of people can interact simultaneously, with use cases spanning human activity and industry segments.

So how do we get there? Dave Fleming, the executive vice president for research and development at Mastercard Foundry, reveals what it will take to bring this vision to life.

What is the current state of technologies needed to build the metaverse? How do you see it evolving over the next few years?

Fleming: Today’s metaverse offerings, such as The Sandbox, Horizon Worlds and Fortnite, primarily use a combination of existing technologies with some additions and variations. They are built from 3D environments — in some cases with AR or VR technology and low-cost VR headsets — plus blockchain, NFTs and AI, using cloud computing and today’s always-on networks. In the end-state vision of a metaverse, however, a lot more is envisaged that will require significant development to become mainstream. This will include desirable, cost-effective AR glasses — these will be a tipping point for adoption, in my opinion — as well as bodysuits, holo-screens, omnidirectional treadmills and other accessories to improve metaverse navigation. But now we’re starting to stray into a certain movie that paints a compelling picture of the metaverse future!

What’s required to achieve rich, immersive and intelligent experiences? What are the biggest hurdles to overcome?

Fleming: One of the main obstacles impeding the mass adoption of metaverse devices is the processing power and energy consumption required to render high-definition graphics and run spatial computing applications. It’s a challenge we will overcome as CPU and GPU chipsets continue to achieve higher performance and greater efficiency. Miniaturizing this technology will be needed to reach the full potential of the metaverse.

In addition, reliable, low-latency communication to the cloud and at the edge will be vital to ensure a flawless, lossless experience. Next-generation wireless networks will play a big role in allowing VR/AR wearables to stream high-quality 3D content on the fly and power AI-driven experiences. The metaverse will also depend heavily on AI to build impactful user experiences. As a fundamental building block of the metaverse, AI will power voice recognition and natural language processing, help us understand our surroundings through advanced computer vision models, and more.

What challenges do developers face as they enter this domain and begin building new iterations of the metaverse?

Fleming: A significant amount of an engineer’s existing skill set will be transferable when building for the metaverse, but they will need to learn some new technologies — which engineers love — adapt to some architectural differences and work with additional partners. For example, 3D development will require mastery of new tools such as Unity and Unreal Engine. The good news for engineers is that Unity primarily leverages the C# programming language, while Unreal Engine uses C++.
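
To make that shift concrete, here is a minimal sketch of the kind of scene-graph code 3D development involves. It uses Three.js in TypeScript rather than Unity or Unreal, so it runs in a plain browser page, and the object, color and camera values are illustrative only; the concepts of a scene, camera, mesh and render loop carry over directly to the engines mentioned above.

    import * as THREE from 'three';

    // Scene, camera and renderer: the basic building blocks of most 3D engines.
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    document.body.appendChild(renderer.domElement);

    // A stand-in "virtual asset": a simple cube with a lit material.
    const cube = new THREE.Mesh(
      new THREE.BoxGeometry(1, 1, 1),
      new THREE.MeshStandardMaterial({ color: 0x0066ff })
    );
    scene.add(cube);
    scene.add(new THREE.DirectionalLight(0xffffff, 1));
    camera.position.z = 3;

    // The render loop: update world state and redraw every frame.
    function animate(): void {
      requestAnimationFrame(animate);
      cube.rotation.y += 0.01;
      renderer.render(scene, camera);
    }
    animate();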

From an architectural point of view, engineers will need to become familiar with new building blocks including the devices themselves, world-generating engines, customization engines that allow application developers to create new experiences, and asset creation tools to create virtual land, buildings, vehicles, apparel and avatars, among other things. There is also a need to become comfortable with cloud-oriented storage and blockchains, which will be used in many metaverse environments to support payments and decentralized finance capabilities and to enable NFTs that make virtual assets and objects transferable and tradable. We’ll also need to engage partners to design compelling 3D objects to bring these worlds to life. This is an exciting journey for engineers and their product partners.
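
As a hedged illustration of the blockchain piece, the sketch below uses TypeScript with the ethers library to ask an ERC-721 contract who owns a given token, the standard mechanism that makes a virtual asset transferable and tradable. The RPC endpoint, contract address and token ID are placeholders, not real deployments.

    import { Contract, JsonRpcProvider } from 'ethers';

    // Placeholder endpoint and contract address, for illustration only.
    const RPC_URL = 'https://example-rpc.invalid';
    const ASSET_CONTRACT = '0x0000000000000000000000000000000000000000';

    // Minimal ERC-721 ABI fragment: just enough to query ownership.
    const ERC721_ABI = ['function ownerOf(uint256 tokenId) view returns (address)'];

    async function ownerOfVirtualAsset(tokenId: bigint): Promise<string> {
      const provider = new JsonRpcProvider(RPC_URL);
      const asset = new Contract(ASSET_CONTRACT, ERC721_ABI, provider);
      // ownerOf is part of the ERC-721 standard; ownership changes hands
      // through the same standard's transfer functions.
      const owner: string = await asset.ownerOf(tokenId);
      return owner;
    }

    ownerOfVirtualAsset(42n).then((owner) => console.log(`Token 42 is owned by ${owner}`));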

What’s the difference between a centralized and decentralized metaverse, from an engineering perspective?

Fleming: The tech stacks will vary, as will the architectural choices. Centralized metaverse platforms will make some open choices (Fortnite, for example, is built on Unreal Engine, and Oculus supports both Unreal and Unity), while others will make less transferable choices. Decentralized metaverses use blockchains and decentralized storage such as IPFS to distribute as much of the environment as possible across many servers; this is where the metaverse movement intersects with the Web3 movement. The terms “metaverse” and “Web3” are often used interchangeably, but although they are related, they are different concepts. The metaverse can be thought of as the new experience layer of the internet, while Web3 provides the decentralized technologies and protocols used to build the back end of the metaverse and enable new ecosystems, communities and economies to develop.
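
For the decentralized-storage side, here is a small sketch, assuming a local IPFS (Kubo) node and the ipfs-http-client package: it publishes a piece of asset metadata and gets back a content identifier (CID) that any peer can use to retrieve the same bytes. The metadata fields and the daemon URL are illustrative assumptions.

    import { create } from 'ipfs-http-client';

    // Assumes a local IPFS (Kubo) daemon exposing its HTTP API on the default port.
    const ipfs = create({ url: 'http://127.0.0.1:5001/api/v0' });

    // Hypothetical world content: metadata describing a virtual object.
    const assetMetadata = JSON.stringify({
      name: 'Storefront sign',
      description: 'Sign for a virtual shop entrance',
    });

    async function publishAsset(): Promise<void> {
      // add() returns a CID, a hash of the content that is the same on every node that stores it.
      const { cid } = await ipfs.add(assetMetadata);
      console.log(`Asset metadata available at ipfs://${cid.toString()}`);
    }

    publishAsset();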

Is it plausible for remote engineering teams to one day collaborate successfully in the metaverse, using VR and other technologies?

Fleming: Engineers already collaborate successfully today entirely through digital tools. As the original digital natives who built the digital world, they are already plugged in for hours on end while working with colleagues in a 100% digital environment. Despite the graphical UI revolution of the past few decades, engineers still primarily work with keyboard and text to create code and collaborate with each other through code, so it’s likely they will continue creating code that way for the foreseeable future. That holds true for metaverse development, with a few additions — such as 3D object design and some of the low-code application design engines. But introducing an “active” way of developing could be a net positive by getting a lot of engineers out of their seats.

David Wilkins, Director, Product Development, Mastercard Foundry