Azat Hafizov

Originally published at metastack.ml

AOL for Metaverse

When was the last time you heard the terms "information superhighway" or "infobahn"? If you are hearing them for the first time: these concepts were the "web3" and "NFT" of the mid-90s, the era when people on the Internet were trying to find a suitable metaphor for the concept of the World Wide Web.

People at the time dreamed of "cyberspaces" as a founding spatial metaphor for the Internet and wanted to build "digital houses with a thousand rooms" rather than websites as we know them now. Part of the plan was to use VRML, a special markup language for representing 3D vector graphics, designed specifically with the Internet in mind. You can feel a very specific Mark Zuckerberg vibe in the original 1994 VRML paper.

There were no left-menu websites, none of the hamburger menus you are now used to, no headers or footers. VRML files were referred to as "worlds". It was, indeed, the first Metaverse summer (by analogy with the so-called AI summers). Sony Corporation (the Apple, Meta, and Google of that era, combined) even built a showcase VRML website called Colony City (later CyberTown) that included elements of a full society: jobs, currency, home ownership, purchasable items, security, and social hierarchy.

I'm sure you can see many parallels with the current buzz surrounding NFTs, virtual land FOMO purchases, the creator economy and so on.

So, fast forward a few years. The search for metaphors had failed miserably, as had people's (and investors') hopes of applying building-architecture concepts to the Internet and a 3D-based approach to designing web interfaces. Around the time Google was born in a dorm room, flat 2D websites had completely won the battle.

The next VR summer happened in the mid-2000s with Second Life. The hype was colossal: tens of thousands of users regularly attended virtual conferences (think of Travis Scott's virtual concert), and there were huge prospects for industrial use. However, 2D services like Google Docs won again. It turned out that communicating by means of 2D windows and voice is much easier than looking at a 3D virtual whiteboard with other users at skewed angles (say hi to Meta's Horizon Workrooms).

We are now in the full swing of the third VR summer.

Omniverse

Second Life was designed for humans only. They wanted us to live, teach, collaborate, socialize, and chill there. NVIDIA's Omniverse is designed for non-humans rather than humans. Or, as they put it, "...for humans as well." And today, there are far more non-humans than humans, with the gap growing exponentially each year.

Anyway, in order to build the infrastructure pillars for the Metaverse and qualify as a provider of Metaverse services, you would need:

  • Large-scale parallelization. We need to run millions of simulations with domain randomization, i.e. introducing small deviations into each world (see chaos theory, best known for the butterfly effect), and then sort the results to select the most favorable deviations (a minimal sketch of this idea follows this list).
  • Photorealism. Ray tracing and similar rendering techniques require a large amount of computational power. Furthermore, in order for us humans to benefit from simulated evolution or RL results, such systems' internal time must run significantly (billions or trillions of times) faster than real time.
  • Proper physics. Not only mechanics but also multi-physics is involved. This requires advanced math and algorithmic solvers; that, in turn, requires advanced modeling languages, which require advanced compilers for various types of physical computers. In many ways, this is similar to what we understand as traditional CS: a discipline that binds math to physical computing and optimizes for the available resources and timings. Precise physical modeling that runs faster than real time is extremely computationally demanding (see how NVIDIA handles it).
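
To make the parallelization point more concrete, here is a minimal Python sketch of domain randomization under toy assumptions: it spawns many small "worlds" that differ only in a randomly perturbed gravity constant and friction coefficient, simulates each one in parallel, and keeps the deviations that produce the best outcome. It is an illustration of the idea only, not NVIDIA's Omniverse API; all names and numbers here are made up.

```python
import random
from dataclasses import dataclass
from multiprocessing import Pool

@dataclass
class WorldParams:
    gravity: float   # m/s^2, randomly perturbed per world
    friction: float  # sliding friction coefficient, also perturbed

def randomize_world(base_gravity: float = 9.81, base_friction: float = 0.3) -> WorldParams:
    """Domain randomization: every world gets slightly different physics."""
    return WorldParams(
        gravity=base_gravity * random.uniform(0.95, 1.05),
        friction=base_friction * random.uniform(0.8, 1.2),
    )

def simulate(params: WorldParams) -> tuple[float, WorldParams]:
    """Toy 'simulation': how far does a pushed block slide before it stops?
    Closed form: distance = v^2 / (2 * mu * g) for an initial speed v."""
    v = 2.0  # m/s, initial speed of the block
    distance = v ** 2 / (2 * params.friction * params.gravity)
    return distance, params

if __name__ == "__main__":
    random.seed(42)
    worlds = [randomize_world() for _ in range(100_000)]

    # Run the simulations in parallel across all CPU cores.
    with Pool() as pool:
        results = pool.map(simulate, worlds)

    # Sort the results and keep the most favorable deviations.
    best = sorted(results, key=lambda r: r[0], reverse=True)[:5]
    for distance, params in best:
        print(f"slid {distance:.2f} m with g={params.gravity:.3f}, mu={params.friction:.3f}")
```

Swap the closed-form formula for a real physics engine and the list of worlds for GPU batches, and you get the general shape of the simulation farms described above.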

Prior to the advent of GPUs, it was impossible to discuss such uses of virtual worlds: it would have been pure speculation. Today, all of this is real, and Jensen Huang is simply making a classic entrepreneurial move: he is building an ecosystem with a full technological stack, covering the hardware layer with software (often free) and teaching programmers (often for free) how to use that hardware. He is doing the same for the ride-hailing industry, which is expected to grow eightfold to $285 bln by 2030.

In this sense, you can see some parallels with Musk's SpaceX, which also builds the entire technological stack for flying to Mars on its own rather than waiting for the conservative space industry to catch up with Elon.

Open-source to dominate

Some of you may even recall Elon open-sourcing all Tesla patents "in the spirit of the open source movement," a move made on exactly the same entrepreneurial grounds as Jensen Huang's sharing of all of Omniverse's current work. Nobody knows what share of the Metaverse NVIDIA's Omniverse will end up with, but thanks to Pixar's USD and Huang's multilevel strategy, the chances of it being a significant one are actually quite high.

AOL for Metaverse

Just like the first ISPs in the early 90s, the first MSPs (Metaverse service providers) really have everything they need for huge commercial success. Humans will visit these worlds through VR, and the non-humans living there will visit our world through AR and robots. Our world, though, has no domain randomization and none of the Metaverse perks like changeable physical constants, varying gravity, adjustable time flow, or concurrent simulations... In other words, it lacks everything that makes the whole concept of the Metaverse so appealing to us.

Why should I care?

The good news is that, within the next 5-7 years, anyone interested in expanding their cognitive capabilities will be able to do so with an exobrain, i.e. a cybercortex (purchased from a manufacturer or built and assembled yourself), bringing the owner to the next level of metacognition, which always leads to a change in occupation, career, lifestyle, and so on. Much like how personal computers enabled many people to switch from blue-collar to white-collar jobs in the 1990s. Or how laptops spawned an army of digital nomads in the 2010s. Or how smartphones are allowing thousands of creators to earn six figures on YouTube and TikTok right now.

If you're reading this, you probably want to be among the first to learn how to combine and employ such tools to build your own exobrain, in order to maintain a competitive edge in the creator economy for years to come.

By tools, I mean:

  • On-chain apps (smart contracts or dapps, i.e. everything Web3-related);
  • Software (programs written by you or some cloud-based no-code tools);
  • Machines (your own or remote; hint: very likely equipped with NVIDIA GPUs);
  • Edge devices (ultra-low-power microcontrollers capable of on-device ML inference, producing analytics that are not stored on any server and belong solely to you);
  • Bots that can perform actions in our world (physical robots, drones or vehicles);
  • Avatars in Metaverse worlds for collaborating with other humans;
  • Digital twins of your body, smart home, and business that run billions of concurrent simulations to advise you based on predictions of your health next week, your bills next month, your profits next quarter, and so on (a toy sketch follows this list);
  • Systems thinking for exploring the options, resources, and predicted best possible outcomes that are available to you in your current life situation;
  • Computational thinking for formulating problems and their solutions in ways that involve software or hardware, not only for automation but also for exploring, analyzing, and comprehending processes that you are currently focused on.
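
As a small illustration of the digital-twin item above, here is a toy Python sketch under made-up assumptions: a "budget twin" that runs many concurrent Monte Carlo simulations of next month's bills and reports the expected range. The bill categories and numbers are invented; the point is only the simulate-many-futures-and-advise pattern.

```python
import random
import statistics
from concurrent.futures import ProcessPoolExecutor

# Hypothetical monthly bills: (mean, standard deviation) in dollars.
BILLS = {
    "rent": (1200.0, 0.0),       # fixed
    "electricity": (90.0, 25.0),
    "groceries": (450.0, 80.0),
    "transport": (120.0, 40.0),
}

def simulate_month(seed: int) -> float:
    """One simulated future: draw every bill from a normal distribution."""
    rng = random.Random(seed)
    return sum(max(0.0, rng.gauss(mean, std)) for mean, std in BILLS.values())

if __name__ == "__main__":
    n_futures = 100_000

    # Run the simulated futures concurrently across CPU cores.
    with ProcessPoolExecutor() as pool:
        totals = sorted(pool.map(simulate_month, range(n_futures), chunksize=1000))

    p5, p95 = totals[int(0.05 * n_futures)], totals[int(0.95 * n_futures)]
    print(f"expected bills next month: ~${statistics.mean(totals):,.0f}")
    print(f"90% of simulated futures fall between ${p5:,.0f} and ${p95:,.0f}")
```

A real digital twin would replace the static bill table with live data from your bank, smart home, or wearables, but the loop stays the same: simulate thousands of possible futures, then summarize them into advice.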

You will soon be able to purchase branded exobrains from Meta, Microsoft, Apple, or, say, Elon Musk, but you may want to build your own metacognition stack for a variety of reasons, such as enhancing the standard capabilities, protecting the privacy of your data, or pursuing specific goals in health, business, or lifestyle.

Subscribe below to get notified early when I publish new chapters.
