The Metaverse Will Need Computing Power No One Knows How to Build

Over the past six months, a disconnect has formed between how corporate America talks about the nascent concept of the metaverse and the plausibility of actually delivering it, given the computing power that would be required. Getting there will demand immense innovation, on the order of the decades-long effort that shrank personal computers down to the size of an iPhone.

Microsoft framed its $68.7 billion bid for Activision Blizzard as a metaverse play last month. In October, Facebook transformed its entire corporate identity to revolve around the metaverse. Last year, Disney even promised to create its own version of the metaverse to “enable storytelling without borders.”

These ideas rest on our ability to build the chips, data centers and networking equipment needed to deliver the required computing power. And at the moment, we can’t. No one knows how or where to start, or even whether the devices will still be semiconductors. There aren’t enough chips right now to build everything people want today, let alone what metaverse evangelists promise.

“The most important things that we look at today in supercomputers still need to be improved in order to be able to provide [a metaverse] type of experience,” Jerry Heinz, the former head of Nvidia’s Enterprise Cloud unit, told Protocol.

Zuckerberg poured fuel on the fire

What we now describe as the metaverse is at least as old as early-20th-century speculative fiction.

E. M. Forster’s 1909 story “The Machine Stops,” for example, renders a pre-chip, pre-digital version of the metaverse. Fast-forward 75 years, and science fiction writer William Gibson called the concept “cyberspace” in his 1984 novel “Neuromancer”; Neal Stephenson popularized the word “metaverse” in his 1992 novel “Snow Crash”; and Ernest Cline called it the OASIS (an acronym for Ontologically Anthropocentric Sensory Immersive Simulation) in “Ready Player One.” Few of these stories describe a utopia.

It is possible that what we now call the metaverse will forever remain in the realm of science fiction. But like it or not, Mark Zuckerberg pushed the idea into the mainstream.

Zuckerberg’s explanation of what the metaverse will ultimately look like is vague, but it includes some of the tropes its boosters roughly agree on: He called it “[an] embodied internet that you are in rather than just watching,” one that would offer everything you can already do online plus “some things that don’t make sense on the internet today, like dancing.”

If the metaverse seems vague, that’s because it is. The description could stretch over time to cover many things that might eventually happen in technology. And arguably, something like the metaverse already exists in an early form, produced by video game companies.

Roblox and Epic Games’ Fortnite host live concerts watched online by millions of people – albeit split into virtually separate instances of a few hundred each. Microsoft Flight Simulator has created a 2.5-petabyte virtual replica of the world that’s updated in real time with flight and weather data.

But even today’s most complex metaverse-like video games require a tiny fraction of the processing and networking performance we would need to realize the full vision: a persistent world accessible simultaneously to billions of people, across many devices and screen formats, in virtual or augmented reality.
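
To see the scale of that gap, consider a back-of-envelope sketch in Python. The numbers below are illustrative assumptions, not figures from anyone quoted here: a 100-player game instance stands in for today’s sharded worlds, and 1% of a billion users stands in for concurrent load on a single shared world.

```python
# Back-of-envelope sketch: today's sharded game instances vs. one shared,
# persistent world. All figures are illustrative assumptions.

players_per_instance = 100             # e.g., one battle-royale match today
target_concurrent_users = 1e9 * 0.01   # assume 1% of a billion users online at once

# Sharded games keep costs down by isolating players into separate copies.
# A single shared world can't: naively, the work of keeping everyone
# mutually visible grows with the square of the population in one space.
scale_factor = target_concurrent_users / players_per_instance
naive_interaction_growth = (target_concurrent_users ** 2) / (players_per_instance ** 2)

print(f"Concurrent users in one world: {target_concurrent_users:,.0f}")
print(f"Population vs. one instance:   {scale_factor:,.0f}x")
print(f"Naive pairwise-work growth:    {naive_interaction_growth:.0e}x")
```

Real systems use interest management and spatial partitioning to avoid the quadratic blowup, but the point stands: shared state at that scale is a different class of problem from anything games solve today.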

“For something that is truly mass market, [with people] spending many hours a day [on that] kind of activity, we’re looking at generations of computing leaping forward to do so,” Ben Bajarin, CEO of Creative Strategies, told Protocol. “What you’re going to see over the next few years is an evolution of what you see today, with perhaps a little more emphasis on augmented reality than virtual reality. But it won’t be this rich, simulated 3D environment.”

A generational leap

In the beginning, chips powered mainframes. Mainframes spawned servers, personal computers, and smartphones: smaller, faster, and cheaper versions of more or less the same technology that preceded them.

If the metaverse is next, no one can specifically describe its system requirements, because it would be a departure from previous shifts in computing. But it has become clear that to get anywhere close to the optimistic version, chips of nearly every type will need to be an order of magnitude more powerful than they are today.

Intel’s Raja Koduri took a stab at the question in a recent editorial, writing: “Truly persistent and immersive computing, at scale and accessible to billions of humans in real time, will require even more: a 1,000x increase in computing efficiency over today’s state of the art.”

It’s hard to overstate how difficult it will be to achieve a thousandfold increase in computing efficiency. If anything, Koduri’s estimate might be conservative; the real requirement could easily be 10 times that.
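
As a sanity check on that figure, here is a rough sketch that converts the 1,000x target into efficiency doublings and assumes a Moore’s-law-like cadence of two to three years per doubling. The cadence is an assumption for illustration, not something Koduri specifies.

```python
import math

# How many efficiency doublings does a 1,000x gain imply, and how long
# might that take? The doubling cadences are assumptions for illustration.

target_gain = 1_000
doublings = math.log2(target_gain)  # ~10 doublings

for years_per_doubling in (2.0, 2.5, 3.0):
    total_years = doublings * years_per_doubling
    print(f"{doublings:.1f} doublings at {years_per_doubling} years each "
          f"≈ {total_years:.0f} years")

# If the real requirement is 10x Koduri's estimate, as suggested above:
print(f"10,000x implies {math.log2(10_000):.1f} doublings")
```

Even at the fastest assumed cadence, the arithmetic lands in the 20-to-30-year range, which is what makes the 1,000x figure so daunting.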

Even assuming these onerous hardware requirements can be met, better communication between all layers of the software stack — from chips at the bottom to end-user applications at the top — will also be needed, Pedro Domingos, professor of computer science at the University of Washington, told Protocol.

“We can get away with [inefficiency] right now, but we’re not going to get away with it in the metaverse,” he said. “The whole [software] stack is going to be more tightly integrated, and that’s already happening in areas like AI and, of course, graphics.”

This is not quantum computing

The generational leap to the metaverse probably won’t come from quantum computing, at least as we think of it today: a largely theoretical platform, still decades from practical use, that requires calculations to run in room-sized machines at temperatures colder than deep space. But a performance breakthrough on the scale quantum computing promises will be needed.

Google is exploring the use of algorithms to design more powerful chips, which could help move the needle. Specialized processors for AI models exist today, but creating even more specialized chips could extract still more performance, Domingos said. Such designs can sidestep the barriers to raising the raw performance of general-purpose silicon: think of an application-specific integrated circuit built to handle physics calculations.

“These companies — the chipmakers, or the metaverse vendors, or who knows — will make more and more advanced chips for that purpose,” Domingos said. “For every level of the stack, from physics to software, there are things you can do.”

Domingos noted that in the 1990s real-time ray tracing was considered impossible, yet decades later it is being done by the chips that power the PlayStation 5 and Xbox Series X. Google’s AI chips, known as Tensor Processing Units, are another example of the kind of specialized silicon that will only become more abundant, and that the metaverse will need.
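
As a loose illustration of what that kind of specialization looks like from the software side, here is a minimal sketch using Google’s JAX library, whose XLA compiler targets TPUs among other accelerators. The sketch is an illustrative addition, not an example from anyone quoted, and the matrix sizes are arbitrary.

```python
import jax
import jax.numpy as jnp

# A dense layer (matrix multiply + ReLU): the building block of the AI
# workloads TPUs were designed for. jax.jit hands it to the XLA compiler,
# which emits code for whatever accelerator is present (CPU, GPU, or TPU).

@jax.jit
def dense_layer(x, w):
    return jnp.maximum(jnp.dot(x, w), 0.0)

key_x, key_w = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(key_x, (1024, 1024))
w = jax.random.normal(key_w, (1024, 1024))

print("Backend:", jax.devices()[0].platform)  # reports "tpu" on Google's chips
print("Output shape:", dense_layer(x, w).shape)
```

The point of the design: the code above says nothing about the chip it runs on, which is exactly what lets ever-more-specialized hardware slot in underneath existing software.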

A fab-ulous future

But generational changes in computing also require equivalent changes in manufacturing technology. Companies such as TSMC and Intel are already pushing the boundaries of physics with extreme ultraviolet lithography machines to print the most advanced chips.

The latest EUV machines are designed to cram more transistors, with ever-smaller features, onto each chip, continuing the path set decades ago. But at some point in the future, chipmaking machines will become too expensive, or shrinking features any further will become impossible.

“If you look at where the architecture is, if you look at where performance per watt is, I’m not saying we need a breakthrough, but we’re about to need a breakthrough,” Bajarin said. “Below one nanometer is about four or five years away, and it’s not going to solve this problem.”

Without a generational leap in computing, a lower-fidelity version of the Zuckerverse is feasible. Assuming users will settle for graphics only somewhat better than what Second Life achieved a decade ago, it should be possible over the longer term to build something that meets some of the goals, such as a persistent virtual world connected to the internet. Building this version of the metaverse will require better networking technology, the specialized chips Domingos described, and perhaps something like artificial intelligence to handle some of the more complex but mundane workloads.

“There’s a lot of scaling to do, which means today’s data centers are going to look tiny compared to tomorrow’s,” Domingos said.

But it will take a long time to get there. Zuckerberg’s vision for the metaverse could be decades away, and after losing $20 billion on the effort so far, it’s unclear whether Meta will have the money to turn that vision into reality.
