
Simulated universes.


Sorcerer


This topic overlaps physics and maths equally, but since physics is essentially applied maths, I thought it fit best here.

 

My question is:

 

1) Assume our universe isn't a simulation and its physical laws are a prerequisite for all simulations.

 

2) We create a simulated universe sometime in the future, using a portion of this universe's energy.

 

3) Within that simulation, simulated beings eventually create a simulated universe of their own.

 

4) All simulations are modeled on the original physical laws.

 

5) This process is repeated in each subsequent simulation.

 

Is there a physical (thermodynamic) limit to the number of possible universes or is it infinite?

 

Consider:

 

The sum to infinity of a geometric series with a common ratio between -1 and 1 converges; e.g. 1/2 + 1/4 + 1/8 + 1/16 + ... = 1.

 

The sum of the harmonic series (1/2 + 1/3 + 1/4 + 1/5 + 1/6 + ...) diverges.
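For reference, a quick numerical sketch (Python, illustrative only) of the contrast between the two series: the geometric partial sums settle at 1, while the harmonic partial sums keep growing without bound.

```python
# Illustrative only: partial sums of the two series mentioned above.

geometric = sum(1 / 2**k for k in range(1, 60))          # 1/2 + 1/4 + 1/8 + ... (59 terms)
harmonic_1e6 = sum(1 / n for n in range(2, 10**6 + 1))   # 1/2 + 1/3 + ... + 1/10^6
harmonic_1e7 = sum(1 / n for n in range(2, 10**7 + 1))   # ten times more terms

print(geometric)     # ~1.0 (converges to 1)
print(harmonic_1e6)  # ~13.4
print(harmonic_1e7)  # ~15.7 (still growing; the harmonic series diverges)
```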

 

That time is required for us and for simulated beings to create a simulation, but simulated time need not run at the same rate as in our universe. For instance, we could run time slower to conserve energy; e.g. the rate of time could be proportional to the energy content.

 

The possible futures of our universe: heat death, crunch, bounce, rip.

 

Aliens creating simulations, and/or us creating multiple sims, with subsequent sims doing the same.

 

Premises 1 and 4 might be false: we are a simulation and our physical laws aren't the same as those of our parent sim (or the original universe), but they are still ultimately dependent on those laws.

 

I just thought this might be an interesting topic. It's a mix of science and speculation; move it if you want. I think the original question can be answered using our current knowledge of science, but the considerations probably push it into speculation.

Edited by Sorcerer


It's exactly what I said many years ago:

http://www.scienceforums.net/topic/86677-properties-of-photons-split-from-looking-in-telescope-to-distant-star/#entry839195

:)


Yes, I too have had similar thoughts since I was young, playing sim games, but it has become a mainstream idea recently and even less sci-fi with the advent of more powerful computing, the possibility that Moore's law continues to hold, and the possibility of functional quantum computers.

However, Sensei, do you think the possible number of simulations could be infinite, or must it be finite?


At some point it's ok to just say "I don't know" ...

I think this could be answered by someone more knowledgeable than I am, based on my premises. Maybe energy isn't infinitely divisible, or all possible future conditions of the universe provide an end point at which all sims will cease to function.

Just because I don't know and you don't know doesn't mean someone else doesn't know. I could probably figure it out myself, but I'm being lazy... and also promoting discussion.



I have wondered about the hypothesis of a black hole tearing a hole in our universe and creating another one. It does raise problems of infinite energy as well, but it seems reasonable to me that a tear in space-time might release far more energy than it took to create it...


However, Sensei, do you think the possible number of simulations could be infinite, or must it be finite?

That depends on what exactly is simulated.

 

In games, the engine doesn't render the entire world around the players where nobody is looking.

Only what the player sees at the moment is processed and rendered.

Once the player turns their head or moves somewhere else, it's gone, no longer processed.

 

A technique such as back-face culling https://en.wikipedia.org/wiki/Back-face_culling

saves roughly 50% of CPU/GPU time straight away, even for geometry directly in front of the player.

 

To handle a very large landscape, the game engine splits it into blocks. Say we have 100x100 blocks.

One block is what the player can see with the naked eye.

If there are, say, 1 million polygons per block, all the blocks together contain 10 billion polygons, but the player only sees the 1 million in the block where he/she is at the moment: a 10,000x speed-up.

Everything is done to limit the calculations needed.

 

Things like fusion in a star don't need to be simulated down to every particle, as nobody will observe those reactions in such detail.
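A minimal sketch (Python; hypothetical, not from any real engine) of the two ideas above: skipping back-facing polygons, and only processing the block the player currently occupies.

```python
# Hypothetical illustration of the optimisations described above.

def backface_cull(faces, view_dir):
    """Keep only faces whose normal points towards the camera; for a closed
    mesh this discards roughly half the polygons (back-face culling)."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [f for f in faces if dot(f["normal"], view_dir) < 0.0]

def blocks_to_render(world, player_block):
    """Of the whole 100x100 grid of blocks, only the block the player is in
    (a real engine would also include its visible neighbours) is processed."""
    return {player_block: world[player_block]}

# Toy usage: two faces, one facing a camera that looks along +z, one facing away.
faces = [
    {"normal": (0.0, 0.0, -1.0)},  # towards the camera -> kept
    {"normal": (0.0, 0.0, 1.0)},   # away from the camera -> culled
]
print(len(backface_cull(faces, view_dir=(0.0, 0.0, 1.0))))  # -> 1
```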

Edited by Sensei


I remember a book with that premise! Reality only existed when god wasn't watching; when he was looking, only what was in his direct line of sight was real, so he had to be put back to sleep... >:D



And could these kinds of simulated universes be nested infinitely within one another? At some point, wouldn't the simulator not have enough information from its surroundings to provide a simulation that works?

 

I mean, if there were 1 million rules governing a universe and the first simulation cut that in half to 500,000 rules by not simulating parts unseen, wouldn't there at some point not be enough rules left for a being to exist that could continue the process?

 

This is interesting, from wiki:

 

 

 

 

Computability of physics

A decisive refutation of any claim that our reality is computer-simulated would be the discovery of some uncomputable physics, because if reality is doing something that no computer can do, it cannot be a computer simulation. (Computability generally means computability by a Turing machine. Hypercomputation (super-Turing computation) introduces other possibilities which will be dealt with separately.) In fact, known physics is held to be (Turing) computable,[14] but the statement "physics is computable" needs to be qualified in various ways, as a recent result[15] shows.

Before symbolic computation, a number, thinking particularly of a real number, one with an infinite number of digits, was said to be computable if a Turing machine will continue to spit out digits endlessly, never reaching a "final digit".[16] This runs counter, however, to the idea of simulating physics in real time (or any plausible kind of time). Known physical laws (including those of quantum mechanics) are very much infused with real numbers and continua, and the universe seems to be able to decide their values on a moment-by-moment basis. As Richard Feynman put it:[17]

"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do? So I have often made the hypotheses that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities".

The objection could be made that the simulation does not have to run in "real time".[18] It misses an important point, though: the shortfall is not linear; rather it is a matter of performing an infinite number of computational steps in a finite time.[19]

Note that these objections all relate to the idea of reality being exactly simulated. Ordinary computer simulations as used by physicists are always approximations.

These objections do not apply if the hypothetical simulation is being run on a hypercomputer, a hypothetical machine more powerful than a Turing machine.[20] Unfortunately, there is no way of working out if computers running a simulation are capable of doing things that computers in the simulation cannot do. The laws of physics inside a simulation and those outside it do not have to be the same, and simulations of different physical laws have been constructed.[21] The problem now is that there is no evidence that can conceivably be produced to show that the universe is not any kind of computer, making the simulation hypothesis unfalsifiable and therefore scientifically unacceptable, at least by Popperian standards.[22]

All conventional computers, however, are less than hypercomputational, and the simulated reality hypothesis is usually expressed in terms of conventional computers, i.e. Turing machines.

Roger Penrose, an English mathematical physicist, presents the argument that human consciousness is non-algorithmic, and thus is not capable of being modeled by a conventional Turing machine-type of digital computer. Penrose hypothesizes that quantum mechanics plays an essential role in the understanding of human consciousness. He sees the collapse of the quantum wavefunction as playing an important role in brain function. (See consciousness causes collapse).


That's exactly right, Sensei...

 

"In games, engine doesn't render entire world around players, where nobody look.

Just what player see at moment is processed and rendered."

 

But consider that...

 

In Quantum Mechanics, all possible outcomes or worlds are encoded in the wave function, but only as probabilities.

But when someone looks (an interaction occurs), the wave function collapses and reality is 'rendered'.


I want to point out that I'm not saying this universe is a simulation, as that would most likely be unfalsifiable. However, I found this in the wiki... so maybe not:

Testing the hypothesis physically
A long-shot method to test one type of simulation hypothesis was proposed in 2012 in a joint paper by physicists Silas R. Beane from the University of Bonn (now at the University of Washington, Seattle), and Zohreh Davoudi and Martin J. Savage from the University of Washington, Seattle.[10] Under the assumption of finite computational resources, the simulation of the universe would be performed by dividing the continuum space-time into a discrete set of points. In analogy with the mini-simulations that lattice-gauge theorists run today to build up nuclei from the underlying theory of strong interactions (known as Quantum chromodynamics), several observational consequences of a grid-like space-time have been studied in their work. Among proposed signatures is an anisotropy in the distribution of ultra-high-energy cosmic rays, that, if observed, would be consistent with the simulation hypothesis according to these physicists (but, of course, would not prove that the universe is a simulation). A multitude of physical observables must be explored before any such scenario could be accepted or rejected as a theory of nature.[11]


I just want to know if infinite simulations are possible. Sensei, does my question about your partially simulated universes prevent an infinite number of simulations, or am I overlooking something?

If there were 1 million rules governing a universe and the first simulation cut that in half to 500,000 rules by not simulating parts unseen, wouldn't there at some point not be enough rules left for a being to exist that could continue the process?


That's exactly right, Sensei...

"In games, the engine doesn't render the entire world around the players where nobody is looking.
Only what the player sees at the moment is processed and rendered."

But consider that...

In Quantum Mechanics, all possible outcomes or worlds are encoded in the wave function, but only as probabilities.
But when someone looks (an interaction occurs), the wave function collapses and reality is 'rendered'.

You suggest our universe is already a simplified one, but could the way our universe currently works be simplified further, and would this enable an infinite number of nested simulations? Or does this mean that nested simulations need not be simplified and could still allow an infinite continuation of sims, since even our universe doesn't require complete "rendering"?

While browsing the Google search "simulated universe infinite" I found this video; about half an hour in, there was still no answer to my question of whether it can be infinite or not.

 

Edited by Sorcerer

Given infinite time and infinite useable energy, the number of sub-simulations is unbounded. We very probably don't have infinite useable energy in the universe, though.

 

You have to remember that every nested simulation must be physically simulated in full in the "primary" universe. Everything that happens in the first simulation is simulated by stuff happening in the real universe. Everything in the second simulation is simulated by stuff in the first simulated universe, all of which must itself be simulated in the real universe. And so on.

 

To have an infinite regression of simulated universes, you would need to be able to simply simulate an infinite number of universes to begin with. Nesting them one within another doesn't free you from that requirement.
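To make that concrete, here is a back-of-envelope sketch (Python; the budget, fraction and minimum-energy figures are made up for illustration, not taken from the thread): if each simulation gets only a fixed fraction of its host's energy budget, and any simulation needs some minimum energy to run at all, the nesting depth comes out finite.

```python
def max_nesting_depth(parent_budget, fraction, min_energy):
    """Count how many nested simulations fit, assuming each level gets a fixed
    fraction of its parent's energy budget and any simulation needs at least
    min_energy to run at all. All values here are hypothetical."""
    depth, budget = 0, parent_budget
    while budget * fraction >= min_energy:
        budget *= fraction
        depth += 1
    return depth

# Made-up numbers: 10^40 J available at the top level, each nested level gets
# 1% of its parent's budget, and the smallest workable simulation needs 10^21 J.
print(max_nesting_depth(1e40, 0.01, 1e21))  # -> 9 levels, then the chain stops
```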


And could these kinds of simulated universes be nested infinitely within one another? At some point, wouldn't the simulator not have enough information from its surroundings to provide a simulation that works?

As I said, it depends on WHAT is simulated.

 

If every single particle is simulated, there must be storage to keep them all in memory.

Normal programs allocate memory when they need it and release it when it's no longer needed.

This leads to memory fragmentation

https://en.wikipedia.org/wiki/Fragmentation_(computing)

(you can have 1 GB of memory free, but if the largest contiguous block is, say, 1 MB, trying to allocate anything larger than 1 MB will fail, even though 1000x that much free memory is available)

or to out-of-memory situations

https://en.wikipedia.org/wiki/Out_of_memory

In modern times (since the invention of the MMU), virtual memory is used when physical memory runs out

https://en.wikipedia.org/wiki/Virtual_memory

This memory can be extended while the computer is running by plugging in more disks (USB).

One could press pause, wait for the disk to arrive, and install it.

That's not a problem when the program has been written to support such extension and pausing.
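A toy illustration (Python; not a real allocator) of the fragmentation point above: total free memory can be large while no single contiguous hole is big enough for a new allocation.

```python
# Free holes left scattered through memory after many allocate/release cycles
# (sizes in MB; hypothetical numbers).
free_holes = [1, 1, 1, 1]

total_free = sum(free_holes)       # 4 MB free in total
largest_hole = max(free_holes)     # but the biggest contiguous hole is only 1 MB

request = 2                        # MB needed as one contiguous block
print(total_free, largest_hole, request <= largest_hole)  # 4 1 False
```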

 

OTOH,

you have one gamma photon with energy > 1.022 MeV on input before pair production;

after pair production, there are two particles, an electron and a positron, to simulate.

An increased number of particles to keep track of.

The situation is even more "funny" if you try to analyze proton-antiproton annihilation:

up to 13 (unstable) mesons can be made in this reaction,

and each meson can decay into even more particles (13*2 = 26, 13*3 = 39).

From 2 input particles to ~40 output particles?

Good thing they decay and annihilate so quickly...

 

I mean, if there were 1 million rules governing a universe and the first simulation cut that in half to 500,000 rules by not simulating parts unseen, wouldn't there at some point not be enough rules left for a being to exist that could continue the process?

I would say the number of rules (physical laws?) is very small. Very, very small. That's the code (program) of the simulator.

It's the number of particles/quantum objects that is tremendously high,

if the simulator has to simulate the behavior of every single one.

(One can imagine plenty of optimization techniques: e.g. instead of processing every single particle independently, group them by velocity vector; if a particle's velocity changes, it moves to a different group.)
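A rough sketch (Python; names and numbers invented for illustration) of that grouping idea: bucket particles that share a velocity vector and compute the displacement once per bucket instead of once per particle. A particle whose velocity changes simply lands in a different bucket on the next call.

```python
from collections import defaultdict

def advance(particles, dt):
    """Move particles forward by dt, grouping them by shared velocity so the
    displacement is computed once per group rather than once per particle."""
    groups = defaultdict(list)
    for p in particles:
        groups[p["v"]].append(p)           # bucket by velocity vector
    for v, members in groups.items():
        dx, dy = v[0] * dt, v[1] * dt      # one displacement per group
        for p in members:
            p["x"] = (p["x"][0] + dx, p["x"][1] + dy)

# Three particles sharing one velocity vector -> a single group.
particles = [{"x": (0.0, 0.0), "v": (1.0, 0.0)} for _ in range(3)]
advance(particles, dt=0.5)
print(particles[0]["x"])  # (0.5, 0.0)
```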

 

The mass of the observable universe is about 10^53 kg

https://en.wikipedia.org/wiki/Observable_universe

and the mass of a single proton is ~1.66*10^-27 kg

https://en.wikipedia.org/wiki/Proton

which gives about 6.02*10^79 protons.
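A quick check of that arithmetic (Python, using the values quoted above):

```python
observable_universe_mass = 1e53   # kg, figure quoted above
proton_mass = 1.66e-27            # kg, approximate value used above
print(observable_universe_mass / proton_mass)  # ~6.02e+79 protons
```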

 

After fusion, the simulator doesn't have to keep track of two particles, just one fused nucleus.

Edited by Sensei
