Comments


digitalthiccness t1_iwfqsxb wrote

>But what if we compress everything but the stuff needed for processing a certain event happening, then do the same for the next event, etc.?

Then it's not a 1:1 simulation of our universe.

9

Ivan_The_8th OP t1_iwfrbum wrote

As long as it's lossless compression, why not? Functionally it can be. There's no need to process two events that do not influence each other in any way at the same time.
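A minimal sketch of that idea, assuming events form a dependency graph (the event names here are made up): events that don't influence each other impose no ordering constraint, so nothing forces the simulator to process them at the same time.

```python
from graphlib import TopologicalSorter

# Hypothetical event dependency graph: an event only has to be
# processed after the events that influence it.
deps = {
    "supernova": set(),
    "shockwave": {"supernova"},   # influenced by the supernova
    "planet_weather": set(),      # shares no edge with the chain above
}

# Any order respecting the dependencies is a valid processing order.
order = list(TopologicalSorter(deps).static_order())
```

Since "planet_weather" shares no edge with the supernova chain, the simulator is free to defer it indefinitely; only "supernova" before "shockwave" is required.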

2

Cryptizard t1_iwfs80m wrote

You are possibly describing the idea of the cosmological horizon? It is possible (some believe very likely) that our universe is infinitely large, but since distant regions are receding from us faster than light, we will never be able to see or interact with anything past a certain distance. This leaves our visible universe finite.

If some alien were living in a non-expanding infinite universe, it could be possible to simulate our “finite” portion of the universe.

3

WikiSummarizerBot t1_iwfs8sd wrote

Cosmological horizon

>A cosmological horizon is a measure of the distance from which one could possibly retrieve information. This observable constraint is due to various properties of general relativity, the expanding universe, and the physics of Big Bang cosmology. Cosmological horizons set the size and scale of the observable universe. This article explains a number of these horizons.


2

digitalthiccness t1_iwfs0t5 wrote

>There's no need to process two events that do not influence each other in any way at the same time.

No two such events exist. To simulate anything perfectly, you need to account for everything.

1

Ivan_The_8th OP t1_iwfstme wrote

All the results of an event can be stored in compressed form until they are needed to determine how another event plays out.
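A toy illustration of storing an event's results losslessly until needed, using Python's zlib (the event record itself is made up):

```python
import json
import zlib

# Hypothetical event result, serialized and compressed losslessly.
state = json.dumps({"event": "collision", "energy_j": 4.2e17}).encode()
blob = zlib.compress(state, level=9)

# Later, when another event depends on this one, decompress on demand.
restored = zlib.decompress(blob)
assert restored == state  # lossless: bit-for-bit identical
```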

1

[deleted] t1_iwftigs wrote

[deleted]

1

Ivan_The_8th OP t1_iwfucdz wrote

I said that in the post. However, there is a possibility that the compressed data would still be small enough, and multiple compression algorithms can be used to increase that possibility.
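One way to read "multiple compression algorithms": try several lossless codecs and keep whichever output is smallest. A sketch using Python's standard-library codecs (the particular codec choice is illustrative):

```python
import bz2
import lzma
import zlib

def best_compress(data: bytes) -> bytes:
    # All three codecs are lossless; keep whichever happens to
    # shrink this particular data the most.
    candidates = [
        zlib.compress(data, 9),
        bz2.compress(data, 9),
        lzma.compress(data),
    ]
    return min(candidates, key=len)
```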

1

kayama57 t1_iwg3n2q wrote

1:1 is possible, but only if you’re willing to look the other way about all the decimals that are going to need to be cut off and hidden away to make it possible. You might be able to do a 1:1 model of specific scales of the universe, but being able to zoom from a gluon on Earth to an overhead view of the entire Big Bang and back down to a Higgs boson at the core of Andromeda, and everything in between, without absolutely massive error margins, is likely not gonna happen.

4

Ivan_The_8th OP t1_iwg596s wrote

Yes, the worst part about this is that we might not know that it's not 1:1.

2

kayama57 t1_iwg6x6k wrote

What’s worse: every single modellable system in the world as we know it is modeled under the assumption that our assumptions about acceptable error in the model are correct.

2

Meg0510 t1_iwgb5cd wrote

Not an expert (just a humble physics bachelor's), but the problem isn't whether you can "fit" one into the other: the open interval (0,1) is in one-to-one correspondence with the whole real line, for example. The problem is that no infinite system can be simulated by us mortal finite beings.
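The bijection mentioned above can be written out explicitly; a standard construction (not from the comment itself) uses the tangent function:

```python
import math

def unit_to_reals(x: float) -> float:
    # Maps the open interval (0, 1) one-to-one onto the real line:
    # tan is continuous and strictly increasing on (-pi/2, pi/2),
    # covering every real value exactly once.
    return math.tan(math.pi * (x - 0.5))
```

Points near 0 map to arbitrarily large negative reals and points near 1 to arbitrarily large positive ones, so the "small" interval covers the whole line.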

For a slightly more boring situation: the Church–Turing–Deutsch principle states that if you assume every physical process can be completely described by quantum mechanics, then every finitely realizable physical system can be simulated by a universal quantum computer. So if you don't mind a reasonable finite approximation to whatever physical system you're interested in simulating, and you have the means to build a machine that can simulate it, there you have it.

Something else I heard in passing: apparently a string theorist found what look like error-correcting codes in his work (https://www.space.com/32543-universe-a-simulation-asimov-debate.html). Now I don't know what that means, cuz I know 0 string theory. But maybe it shows our universe itself is simulated (wherever the damn thing is running), who knows lol

3

Rezeno56 t1_iwg18i6 wrote

Instead of a 1:1 simulation of the universe, how about a 1:1 simulation of our galaxy? If not that, then a 1:1 simulation of a region about 1-100 light years across?

2

Ivan_The_8th OP t1_iwg4njk wrote

The problem is that our galaxy isn't completely cut off from other galaxies. All outside influences need to somehow be accounted for.

1

Brangible t1_iwgkjf1 wrote

Not without knowing everything it consists of, which we still don't. You'd need all the correct initial seed values that were present at the beginning of its timeline.
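A toy illustration of why the initial seed values matter, under the (big) assumption that the universe's "random" events behave like a seeded pseudo-random generator:

```python
import random

def roll_events(seed: int, n: int = 5) -> list[int]:
    # With the same seed, the sequence of "random events" replays
    # identically; with an unknown seed, the timeline diverges.
    rng = random.Random(seed)
    return [rng.randrange(6) for _ in range(n)]
```

With the original seed the timeline replays exactly; without it, you're simulating a different universe.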

1

botfiddler t1_iwgsned wrote

You would need to know every random event in our universe and replicate that in your simulation. Hmm.

1

ArgentStonecutter t1_iwgk4pc wrote

Your level-of-detail algorithms would produce observable artifacts unless you have some super AI managing them that's aware of the meaning of things like electronics and sensors, so it can simulate the few cubic centimeters of rock that are the processors in a space probe while not bothering to simulate the millions of cubic meters of rock in the asteroid it's flying past. It also needs to know that this chunk of rock is part of a gravitational wave detector, so you need to sync it up with the simulated supernova that happened 30 years ago, 30 light years away... while that other chunk is just part of the Earth and can be treated as a uniform mass of undifferentiated basalt.
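A toy sketch of a level-of-detail rule like the one described (the function, names, and distance cutoff are all hypothetical):

```python
SENSOR_RANGE = 10.0  # assumed cutoff, in arbitrary units

def fidelity(position: float, sensor_positions: list[float]) -> str:
    # Matter near an active sensor gets full-detail simulation;
    # everything else is approximated as undifferentiated bulk.
    near = any(abs(position - s) <= SENSOR_RANGE for s in sensor_positions)
    return "full" if near else "bulk"

fidelity(0.0, [1.0])    # probe processor near a sensor -> "full"
fidelity(500.0, [1.0])  # asteroid interior far away    -> "bulk"
```

The comment's point is that a crude distance rule like this isn't enough: a rock far from any obvious "sensor" may still be part of a detector, so the artifacts leak through.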

I don't think it's possible to hide the error artifacts from even our level of technology.

0