
Mortal-Region t1_j7gvfkz wrote

Furthermore... if it's a simulation of a technological society, it might have to be shut down eventually, because if the simulated people advance far enough to create their own simulations -- perhaps millions of them -- then the strain on the computer would be too great. The simulation would slow to a crawl and, more urgently, run out of memory.

2

peterflys t1_j7kw4xx wrote

That could be true, but you could also end up in a situation where the hardware running the primary sim keeps getting upgraded and expanded, increasing its capacity to hold more information (that is, the society it's simulating, along with that society's own simulations). Computation should get cheaper and cheaper too. Just another thought.

2

Mortal-Region t1_j7m2k8o wrote

Yeah, the argument assumes that the simulation is the "detail-on-demand" kind, meaning that when the simulated people run their own simulation, the real computer in base reality has to provide a tremendous amount of new detail -- roughly the same amount as is allocated for their world (assuming they run the same kind of simulation as the one they occupy).

So, for example, if the sims have just 10 simulations running simultaneously, the simulation they occupy will consume 11 times the computational resources (including memory) it did before. Even if the computer in base reality grows to the size of an entire galaxy, just one more level down means you now need ten more galaxies. All this just to keep a single simulation -- and the sub-simulations nested inside it -- running.
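To put rough numbers on it, here's a toy sketch (Python; the function name and branching factor are just illustrative, and it assumes every sub-simulation costs as much as its parent, per the detail-on-demand reading above):

```python
# Toy model of "detail-on-demand" nesting: each simulation spawns
# `branching` sub-simulations, each as expensive as its parent.
# Names and numbers here are illustrative, not from any real system.

def total_cost(branching: int, depth: int, unit_cost: float = 1.0) -> float:
    """Resources for one top-level sim plus all nested sims down to
    `depth` levels, each level costing as much per sim as the last."""
    return unit_cost * sum(branching ** level for level in range(depth + 1))

print(total_cost(branching=10, depth=1))  # 11.0  -- one sim + 10 subs
print(total_cost(branching=10, depth=2))  # 111.0 -- add sub-sub-sims
```

Each extra level multiplies the bill by roughly the branching factor, so the growth is geometric no matter how big the base-reality computer gets.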

I think it's more likely that the simulators will nip the problem in the bud by halting the sim just before sub-simulations become possible, thus also preventing sub-sub-simulations, sub-sub-sub-simulations, and so on. After all, the thing they're probably simulating is their own historical singularity, so once sub-simulations become possible, the simulation has pretty much run its course.

2