Why The Universal Simulation Demands INFINITE Compute

πŸ‘¨β€πŸŽ“ Source: This first appeared in my free email newsletter.

The successive application of very simple computational rules is the underlying mechanism that builds the Universe:

“[…] The ultimate machine code of the Universe – yes, it’s computational. Computation isn’t just a possible formalization, it’s the ultimate one.” – Stephen Wolfram

The human brain calculates the equivalent of roughly 1 ExaFLOP/s (more on this later).

Yes, we are in a massive simulation. As we speak, the Universe tests how well your genetic code is able to survive and procreate. Let’s call it the Universal Simulation.

You give birth to many things, such as ideas, companies, emotions, and children.

All of these recursively procreate through time and in proportion to their organic growth rates.

The Universal Simulation is a web of interwoven calculations: Your ideas birth new ideas in other people’s minds. Your children create new ideas, companies, emotions, and children. Your ideas might create companies that create emotions that create children.

Highly efficient ideas, like superior genetic material, spread faster – exponentially increasing computational work.

The Universal Simulation algorithm is programmed to do more of what works.

Humans actually do work, so the simulation throws more humans into the system.

In fact, we’re at a point where we recursively spin up miniature universes to compute things for us. These universes are built from myriads of little rules, like computing an XOR between the two binary numbers 001010 and 100000.

(Side Quest: What is the answer?)
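For the curious, the side quest can be checked in a couple of lines of Python (spoiler ahead):

```python
a = 0b001010      # 10 in decimal
b = 0b100000      # 32 in decimal
result = a ^ b    # bitwise XOR: a bit is 1 where exactly one input bit is 1

print(f"{result:06b}")  # 101010
print(result)           # 42
```

Fittingly, the answer to this little universe turns out to be 42.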

The name we’ve given these miniature computational universes is artificial intelligence (AI).

Nowadays, AIs have begun to work even better than humans, so the Universal Simulation’s algorithmic response is to throw infinitely more computation at them.

As NVidia reports earnings later today, I’m surprised that so many people believe we’ve reached some point of convergence and that AI is “overhyped”.

Let’s do the rough math:

  • 1 human brain calculates roughly 1 ExaFLOP/s, i.e., 10^18 operations per second
  • 8,000,000,000 humans collectively calculate 8,000,000,000 ExaFLOP/s
  • Today’s global computing capacity across all GPUs and TPUs is on the order of 1,000 ExaFLOP/s (±1 order of magnitude)

So, to replicate the computational experiment that is humanity, we need to scale up computing by a factor of roughly one million (closer to eight million with these exact numbers, but well within the error bars).

1,000,000x
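The back-of-the-envelope arithmetic above can be sketched in a few lines; all figures are the rough estimates from the text, not measured values:

```python
# Rough estimates from the article above.
brain_flops = 1e18                # ~1 ExaFLOP/s per human brain
population = 8e9                  # ~8 billion humans
humanity_flops = brain_flops * population   # 8e27 FLOP/s collectively

global_compute = 1e21             # ~1,000 ExaFLOP/s of GPUs/TPUs (±1 order of magnitude)
scale_factor = humanity_flops / global_compute

print(f"{scale_factor:.0e}")      # 8e+06 – a roughly million-fold gap
```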

If anything, AI is massively underhyped in the long term. NVidia may be a great long-term investment after all, especially considering breakthroughs such as the latest OpenAI Sora text-to-video model (bye actors) or Apple Vision Pro (hello Ready-Player-One).

While the stock price seems pricey to some, I don’t expect NVidia chip demand to be severely impacted in the foreseeable future. (No investment advice.)

In the long term, however, computing demand will be off the charts, as we have seen in this and last week’s email:

“The rate of growth in each of these fields is very high, sticky, and persistent over decades. Billionaire investor Richard Koch advises getting rich by investing your time and money in businesses that are leaders in high-growth industries (>15% CAGR).”

Note that the AI chip market is sustainably growing at a rate north of 38% per year (CAGR)…
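To get a feel for what 38% CAGR means, here is a quick compounding sketch (assuming, purely for illustration, that the rate holds for a decade):

```python
cagr = 0.38        # AI chip market growth rate cited above
years = 10         # assumption: rate held constant for a decade
multiple = (1 + cagr) ** years

print(f"{multiple:.1f}x")  # ~25.0x over ten years
```

In other words, a market compounding north of 38% per year grows roughly 25-fold in a decade.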

Be on the right side of change. πŸš€
Chris