Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • Grimy@lemmy.world · 10 months ago

    Atoms and photons wouldn’t actually exist, they would be generated whenever we measure things at that level.

    Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the big bang is fun, but it doesn’t make for good conversation since it would be indistinguishable from reality.

    I was thinking more of a video-game-like simulation, where the sim doesn’t render things it doesn’t need to.

    • Blue_Morpho@lemmy.world · 10 months ago

      where the sim doesn’t render things it doesn’t need to.

      That can’t work unless it’s a simulation made personally for you.

      • Grimy@lemmy.world · 10 months ago

        I don’t follow. If there are others, it would render for them just as much as for me. I’m saying it wouldn’t need to render at an atomic level except for the few who are actively measuring at that level.

        • Blue_Morpho@lemmy.world · 10 months ago

          Everything that interacts is “measuring” at that level. If the quantum-level effects weren’t being calculated correctly for you all the time, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.

          • Grimy@lemmy.world · 10 months ago

            If it was a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.

            None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were used, in which case they would be properly simulated.

            We wouldn’t know the difference.

            • Blue_Morpho@lemmy.world · 10 months ago

              If it was a simulation, there would be no need to go that far

              But you already said you have to go that far whenever someone is doing something where they could notice microscopic effects.

              So it’s not so much a simulation as a mind-reading AI that continuously reads every sentient mind in the entire universe, so as to know whether someone is making a microscopic observation that needs the fine-grained result or whether an approximation can be returned.

              • Grimy@lemmy.world · 10 months ago

                There would be no need to go that far at all times is what I’m saying. It’s the equivalent of a game rendering distant objects in detail only when you use a scope. Why render everything at all times if it isn’t being used and doesn’t affect the experience? It would increase the overhead by an insane amount for little to no gain.

                This is also just a thought exercise.

                • Blue_Morpho@lemmy.world · 10 months ago

                  Why render everything at all times if it isn’t being used and doesn’t affect the experience?

                  But how does the simulation software know when it needs to calculate that detail? If you are the only person in the simulation, it’s obvious, because everything is rendered from your perspective. But if there’s more than one person in the universe, an AI program has to inspect the state of mind of everyone in the universe to make sure they aren’t doing something where they could perceive the difference.

                  Am I microwaving a glass of water to make tea, or am I trying out that YouTube video where I saw how you can use a microwave to measure the speed of light? Did I just get distracted and not follow through with the measurement? Only something constantly monitoring my thoughts can know. And it has to do that for everyone, everywhere, in the entire universe.
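                  For reference, the microwave trick mentioned above: with the turntable removed, standing waves melt hot spots half a wavelength apart, so the speed of light is roughly twice the hot-spot spacing times the magnetron frequency (2.45 GHz on most ovens). A quick sanity check with typical numbers (the 6.1 cm spacing is an illustrative assumption, not a quoted measurement):

```python
frequency_hz = 2.45e9      # standard magnetron frequency, printed on most ovens
hotspot_spacing_m = 0.061  # assumed distance between melted spots (half a wavelength)

wavelength_m = 2 * hotspot_spacing_m
c_estimate = frequency_hz * wavelength_m  # wave equation: c = f * lambda

c_true = 2.998e8
error = abs(c_estimate - c_true) / c_true  # well under a few percent here
print(f"c ≈ {c_estimate:.3e} m/s, error ≈ {error:.1%}")
```

With careful kitchen measurements the error stays in the low single-digit percent range, which is what the tolerance discussed below refers to.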

                  • Grimy@lemmy.world · 10 months ago

                    The way I see it, it would be coupled to the tool, not to the intention someone has for it. So every microwave would render properly at all times, as would most electronics just by their very nature, regardless of what the person plans to do with them.

                    Actually, I think it could probably just approximate the microwave stuff and keep precise electrical instruments, like oscilloscopes, rendering fully.

                    It only needs to render properly for things that give an exact measurement; the microwave trick has a 3% tolerance, which is huge in the scope of things.

                    It seems like a lot, but it’s less than simulating every single atom imo.