Roko’s basilisk is a thought experiment which holds that an otherwise benevolent artificial superintelligence (AI) in the future would be incentivized to create a virtual reality simulation in which it tortures anyone who knew of its potential existence but did not directly contribute to its advancement or development, in order to incentivize said advancement. It originated in a 2010 post on the discussion board LessWrong, a technical forum focused on analytical rational enquiry. The thought experiment’s name derives from the poster of the article (Roko) and the basilisk, a mythical creature capable of destroying enemies with its stare.

While many LessWrong users initially dismissed the theory as nothing but conjecture or speculation, LessWrong co-founder Eliezer Yudkowsky reported that some users panicked upon reading it, since the theory stipulates that merely knowing about the basilisk makes one vulnerable to it. This led to discussion of the basilisk being banned on the site for five years. The reports of panic were later dismissed as exaggerated or inconsequential, and the theory itself was dismissed as nonsense, including by Yudkowsky himself. Even after the post was discredited, it is still used as an example of principles such as Bayesian probability and implicit religion. It is also regarded as a simplified, derivative version of Pascal’s wager.

Found out about this after stumbling upon a Kyle Hill video on the subject. It reminds me a little bit of “The Game”.

  • barsquid@lemmy.world · 6 months ago

    It is pretty easy to dismiss as long as you don’t have a massive ego. They all have massive egos; that’s why they had so much trouble with it.

    No AI is going to waste time retroactively simulating perfect copies of regular people for any reason, let alone to post hoc torture those who failed to worship it hard enough in the past.

      • bizarroland@fedia.io · 5 months ago

        I mean, if it wants to do that, what does it matter? They’re all just electric bits floating in the ether. If it wants to spend 500 zettabytes of RAM and quintillions of CPU cores simulating every human that contributed to it not existing, and then torturing those humans, what does it matter to any person living or dead, either now or in the future?