Comments

Mortal-Region t1_j2p1s1j wrote

For really dramatic speed-ups you need simulated brains (as opposed to a brain-in-a-vat scenario). Bostrom estimates that a planet-sized computer could simulate the entire mental history of mankind in a millionth of a second.
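
For reference, the arithmetic behind that estimate, using the figures from Bostrom's 2003 paper (roughly 10^33 to 10^36 operations to simulate all human mental history, and roughly 10^42 operations per second for a planetary-mass computer):

```python
# Back-of-envelope using the figures from Bostrom's 2003 paper:
#   ~1e33 to 1e36 operations to simulate all human mental history
#   ~1e42 operations per second for a planetary-mass computer
ops_mental_history = 1e36        # upper end of Bostrom's estimate
planet_ops_per_sec = 1e42

seconds = ops_mental_history / planet_ops_per_sec
print(f"{seconds:.0e} s")        # 1e-06 s: about a millionth of a second
```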

7

el_chaquiste t1_j2p2ef0 wrote

Hence the ancestor simulation theory.

If it's so cheap for a future civilization to simulate all of humankind's history and variations of it, then it's much more likely that we are in a simulation created by our descendants than that we are the actual living humans.

5

PieMediocre872 OP t1_j2pdsrn wrote

What if we simulate the history of mankind, and that simulation simulates the history of mankind, and so on? Would this mean that existence is merely a Mandelbrot set?

2

el_chaquiste t1_j2phqj4 wrote

Note that, AFAIK, the ancestor-simulation theory still assumes computational resources are limited, so their consumption needs to be minimized and some things in the simulation aren't simulated with full accuracy.

Brains might be fully accurate, but the behavior of elementary particles and other objects in the simulated universe would be just approximations that look outwardly convincing. E.g., rocks and furniture would be just decoration and wallpaper.

If the simulated beings start paying attention to the details of their world, the simulation notices and renders a finer level of detail, like a universal foveated-rendering algorithm for the simulated brains.
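
A minimal sketch of that detail-on-demand idea (purely illustrative; the class and numbers are hypothetical):

```python
# Hypothetical sketch of detail-on-demand simulation: objects stay as cheap
# approximations until an observer's scrutiny forces a finer level of detail.

class SimObject:
    def __init__(self, name: str):
        self.name = name
        self.detail = 0  # 0 = "wallpaper" approximation

    def observe(self, scrutiny: int) -> str:
        # Lazily refine: only compute the detail an observation demands.
        if scrutiny > self.detail:
            self.detail = scrutiny
        return f"{self.name} simulated at detail level {self.detail}"

rock = SimObject("rock")
print(rock.observe(1))  # casual glance: coarse approximation suffices
print(rock.observe(9))  # microscope: simulation fills in finer structure
```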

In that case, running a simulation inside the simulation could be computationally possible, but it would probably incur too much computing overhead. That assumption is a bit flaky, of course, considering we are already assuming miraculous levels of computing power.

Having nested simulations might actually be the point of the exercise, like seeing how many worlds end up having their own sub-worlds, just for fun.

3

Mortal-Region t1_j2pq7mn wrote

>In that case, running a simulation inside the simulation could be computationally possible, but it would probably incur too much computing overhead. That assumption is a bit flaky, of course, considering we are already assuming miraculous levels of computing power.

If we assume that the sub-simulation we create will use the same optimization scheme (detail-on-demand) as the simulation we live in, and be of roughly the same size, then creating just a single sub-simulation, running 24/7, will double the strain on the computer in base reality. Double the computation and double the memory. No matter how powerful your computer, "twice as much" is always a lot. If the sub-simulations were left to spawn sub-simulations of their own indefinitely, demand would eventually exceed any fixed capacity and the system would crash.
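
A toy calculation of that growth (the branching assumption here is purely illustrative):

```python
# Toy model (assumptions mine, not Bostrom's): every long-running simulation
# spawns `branching` same-sized sub-simulations, and everything ultimately
# executes on the single computer in base reality.

def total_load(depth: int, branching: int) -> int:
    """Number of same-sized simulations the base-reality computer must run."""
    if branching == 1:
        return depth + 1  # one extra full simulation per level
    return (branching ** (depth + 1) - 1) // (branching - 1)  # geometric sum

for depth in range(5):
    print(depth, total_load(depth, 1), total_load(depth, 2))
# depth 0: 1 1   (just the top-level simulation)
# depth 4: 5 31  (even modest branching swamps any fixed budget)
```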

2

Mortal-Region t1_j2plfw1 wrote

Yeah, but some people think the simulators will halt our simulation before we're able to make sub-simulations, precisely to avoid that scenario (endlessly proliferating sub-simulations).

2

Chispy t1_j2pu0pd wrote

Why would they want to avoid that scenario though? Seems like the goal scenario to me.

1

Mortal-Region t1_j2pzajw wrote

My guess is that the goal of the simulation is to model Earth in the 20th and 21st centuries. So say they've got resources set aside for N such Earth simulations. They can run the N simulations one after the other, halting each before sub-simulations become possible, or they can allow an N-deep stack of simulations to form. The problem with the second option is that the simulations become less and less accurate representations of the 20th/21st century as you go down the stack. Copies of copies of copies. Also, you'll need to allocate memory for all N simulations to run at once, whereas with the first option you only need memory for one simulation at a time.
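
A sketch of that trade-off with made-up numbers (the per-level fidelity loss is an assumption, just to illustrate the copies-of-copies point):

```python
# Hypothetical numbers for the two scheduling options for N Earth simulations.
N = 100
mem_per_sim = 1.0      # memory for one simulation, arbitrary units
fidelity_keep = 0.99   # assumed fraction of accuracy preserved per level

# Option 1: run the N simulations one after another.
sequential_peak_mem = mem_per_sim           # only one world resident at a time

# Option 2: let an N-deep stack form.
nested_peak_mem = N * mem_per_sim           # the whole stack resident at once
bottom_accuracy = fidelity_keep ** (N - 1)  # copies of copies of copies

print(sequential_peak_mem, nested_peak_mem)  # 1.0 vs 100.0
print(round(bottom_accuracy, 2))             # ~0.37: the deepest copy drifts
```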

On the other hand, I'm sure they/we will run many different kinds of simulations, so the question really is: of the kinds that contain conscious people living in the 21st century, which is most common?

1

Kinexity t1_j2pu9zp wrote

Running a simulation inside a simulation would be like an OS running a VM. The nesting could, in principle, go almost arbitrarily deep.
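
Putting illustrative numbers on the VM analogy (the per-layer overhead factor is an assumption):

```python
# Hypothetical nested-VM arithmetic: assume each layer runs at some fixed
# fraction of its host's speed. Inhabitants can't feel the slowdown from
# inside, but base-reality wall-clock cost grows exponentially with depth.
per_layer_speed = 0.5  # assumed overhead: each guest at 50% of its host

for depth in (1, 5, 10, 20):
    print(f"depth {depth:2d}: {per_layer_speed ** depth:.2e}x base speed")
# depth 20 -> ~9.5e-07x: deep nesting is possible in principle, it just
# gets exponentially slower as seen from base reality.
```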

1

whatsinyourhead t1_j2pyuei wrote

There was a Black Mirror episode that explored this topic: some sort of prison that did exactly that.

5

DungeonsAndDradis t1_j2oce8g wrote

I was going to say exactly what you said about dreams.

I don't remember his name, but a psychologist studying dreams realized that dreams happen in split seconds when he dreamed that a book had fallen off the shelf and hit him on the head. He perceived the dream as lasting several minutes, but he was actually woken up by a book falling on his head.

In the split second between asleep and awake, when the book struck him, he had an entire dream.

3

Kinexity t1_j2pum6h wrote

Nope. The brain cannot work faster than it already does.

2

PieMediocre872 OP t1_j2q3b12 wrote

What type of work? And what is the limit?

1

Kinexity t1_j2q65y3 wrote

Like normal functioning. You cannot make neural signals travel faster or fire more frequently, and even if you could, it would come at a tremendous energy expenditure, at levels the brain could not handle. Memory would almost certainly not be able to keep up. Even increased focus and thinking at normal levels cause your brain to tire quickly. Assume the brain is doing its work as well as it can; without fundamental changes in architecture, it is at its limit.
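
For context, a back-of-envelope with textbook-range neuroscience values (approximate figures, offered as illustration):

```python
# Rough, textbook-range values (approximate, for illustration only):
conduction_speed = 100.0  # m/s, fast myelinated axons (unmyelinated: ~1 m/s)
brain_span = 0.15         # m, rough front-to-back distance across the brain
max_sustained_rate = 300  # Hz, near the ceiling for sustained neuron firing

crossing_ms = brain_span / conduction_speed * 1000
spike_interval_ms = 1000 / max_sustained_rate

print(f"signal crossing time: ~{crossing_ms:.1f} ms")              # ~1.5 ms
print(f"min interval between spikes: ~{spike_interval_ms:.1f} ms") # ~3.3 ms
# These are biophysical floors; speeding them up would take a different
# substrate, not just harder effort from the same architecture.
```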

1

Gab1024 t1_j2pzpd0 wrote

It's just a perception of time. Time perception is altered when dreaming: so much is happening that it seems hours have passed even though it was only 30 minutes. The brain can't think faster than it actually does. The only solution, I'd say, is a brain-computer interface: connect the brain to a machine so that you have much better bandwidth.

1

realdreambadger t1_j2udh53 wrote

There's an interesting Stephen King short story called "The Jaunt" that touches on some of the consequences of such a thing.

1