The theory, which I probably misunderstand because I have a similar level of education to a macaque, states that because a simulated world would eventually develop to the point where it creates its own simulations, it’s then just a matter of probability that we are in a simulation. That is, if there’s one real world, and a zillion simulated ones, it’s more likely that we’re in a simulated world. That’s probably an oversimplification, but it’s the gist I got from listening to people talk about the theory.

But if the real world sets up a simulated world which more or less perfectly simulates itself, the processing required to create a mirror sim-within-a-sim would need at least twice that much power/resources, no? How could the infinitely recursive simulations even begin to be set up unless more and more hardware is constantly being added by the real meat people to their initial simulation? It would be like that cartoon (or was it a silent movie?) of a guy laying down train track struts while sitting on the cowcatcher of a moving train. Except in this case the train would be moving at close to the speed of light.

Doesn’t this fact alone disprove the entire hypothesis? If I set up a 1:1 simulation of our universe, then just sit back and watch, any attempts by my simulant people to create something that would exhaust all of my hardware would just… not work? Blue screen? Crash the system? Crunching the numbers of a 1:1 sim within a 1:1 sim would not be physically possible for a processor that can just about handle the first simulation. The simulation’s own simulated processors would still need to have their processing done by Meat World, you’re essentially just passing the CPU-buck backwards like it’s a rugby ball until it lands in the lap of the real world.

And this is just if the simulated people create ONE simulation. If 10 people in that one world decide to set up similar simulations simultaneously, the hardware for the entire sim reality would be toast overnight.
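To make the "passing the CPU-buck backwards" accounting concrete, here's a toy sketch. All the numbers and the function name are made up; the only assumption baked in is the one from above, that a full-fidelity sim of a universe costs at least as much compute as that universe itself, and that every layer's work ultimately lands on real-world hardware.

```python
# Toy accounting for nested 1:1 simulations. Assumption (from the
# argument above): each full-fidelity simulated universe costs at
# least one "real world's worth" of compute, and all of it ultimately
# runs on Meat World hardware.

REAL_WORLD_COST = 1.0  # normalize the cost of one universe to 1

def total_cost(sims_per_layer: int, depth: int) -> float:
    """Real-world compute needed if every universe at each layer
    spawns `sims_per_layer` full-fidelity copies of itself, nested
    `depth` layers deep. Costs accumulate because the simulated
    processors' work still has to be done by the real hardware."""
    cost = 0.0
    universes = 1  # start with the one real universe
    for _ in range(depth):
        universes *= sims_per_layer       # each universe spawns copies
        cost += universes * REAL_WORLD_COST  # each copy costs >= 1 world
    return cost
```

A single chain of nested sims already demands `depth` real worlds of hardware (`total_cost(1, 3)` is `3.0`), and with ten sims per layer it blows up geometrically (`total_cost(10, 3)` is `10 + 100 + 1000 = 1110.0`), which is the "toast overnight" scenario.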

What am I not getting about this?

Cheers!

  • bunchberry@lemmy.world · 4 months ago

    My issue is similar: each “layer” of simulation would necessarily be far simpler than the layer in which the simulation is built, so complexity would drop off exponentially, such that even an incredibly complex universe would not be able to support conscious beings in simulations beyond a few layers down. You could imagine that maybe the initial universe is so much more complex than our own that it could support millions of layers, but at that point you’re just guessing, as we have no reason to believe there is even a single layer above our own, and the whole notion that “we’re more likely to be in a simulation than not” just ceases to be true. You can’t actually put a number on it, or even a vague description like “more likely.” It’s ultimately a guess.
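    The geometric drop-off can be sketched numerically. Every number here is invented purely for illustration (the function name, the "fraction retained per layer", and the consciousness threshold are all hypothetical), but it shows how quickly nested layers fall below any fixed threshold:

    ```python
    # Toy model: each layer of simulation can only afford some fixed
    # fraction of its host's complexity, so fidelity decays
    # geometrically. All values are made up for illustration.

    def layers_supported(base_complexity: float,
                         fraction_per_layer: float,
                         min_for_consciousness: float) -> int:
        """Count how many nested layers stay at or above a
        (hypothetical) complexity threshold for conscious beings."""
        layers = 0
        c = base_complexity
        while c * fraction_per_layer >= min_for_consciousness:
            c *= fraction_per_layer
            layers += 1
        return layers
    ```

    Even a base universe a million times more complex than the threshold supports only `layers_supported(1e6, 0.5, 1.0)` = 19 layers when each layer retains half its host's complexity, so the "zillion simulated worlds" premise needs the top universe to be unimaginably larger than ours, which is exactly the unsupported guess described above.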