• sweng@programming.dev · 7 months ago

    In theory, if you have the same inputs, you get reproducible outputs, modulo perhaps some small deviations due to non-deterministic parallelism. But if those effects are large enough to make your model perform differently, you already have big issues, no different from a piece of software performing differently each time it is compiled.
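
    A minimal sketch of those "small deviations" (my illustration, not from the comment): floating-point addition is not associative, so summing the same values in a different order, as a parallel reduction might, can nudge the result by a tiny amount.

    ```python
    # Summing identical floats in two different orders; the results can
    # differ by a tiny epsilon because float addition is not associative.
    import random

    random.seed(0)
    values = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

    in_original_order = sum(values)
    in_sorted_order = sum(sorted(values))   # same numbers, different order

    print(in_original_order, in_sorted_order,
          in_original_order - in_sorted_order)
    # The difference is on the order of 1e-12 here; if a reordering like
    # this changed how a model performs, the setup would already be fragile.
    ```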

    • delirious_owl@discuss.online · 7 months ago

      That’s the theory for some paradigms that were specifically designed to be deterministic.

      Most things in the world, even computers, are non-deterministic.

      Non-determinism isn’t necessarily a bad thing for systems like AI.
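
      One way to read that last point (a hedged sketch of mine, not from the thread): sampling with a temperature deliberately injects randomness, so the same model scores produce varied picks, and that variety is often exactly what you want.

      ```python
      # Temperature sampling over hypothetical token scores ("logits").
      # The randomness is intentional: identical inputs yield varied choices.
      import math
      import random

      def sample(scores, temperature=1.0):
          # Softmax with temperature, then draw one index by weight.
          weights = [math.exp(s / temperature) for s in scores]
          return random.choices(range(len(scores)), weights=weights, k=1)[0]

      scores = [2.0, 1.5, 0.3]   # hypothetical scores for three tokens
      print([sample(scores, temperature=0.8) for _ in range(10)])
      # Runs differ unless random.seed(...) is fixed; that variety is the point.
      ```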