• Net_Runner :~$@lemmy.zip · 21 up, 5 down · 13 hours ago

    Just like when mastodon.social condemned Meta for its horrible moderation decisions and its inability to act properly in the interest of its users, and said that the instance would be cutting ties/not federating with Threads: they kept on federating like nothing happened.

    I don’t believe anything coming out of mastodon.social unless I can see action being taken with my own two eyes.

    Also, blocking scrapers is very easy, and it has nothing to do with a robots.txt (which they ignore).
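
    For what it’s worth, here’s a rough sketch of the kind of blocking I mean, done in the application instead of via robots.txt. It’s a hypothetical Python WSGI middleware, and the agent patterns are only examples of crawlers that announce themselves, not a complete deny list:

    ```python
    # Hypothetical sketch: refuse requests whose User-Agent matches a known
    # crawler pattern, instead of trusting the crawler to honour robots.txt.
    import re

    # Example patterns only; a real deny list would be longer and maintained.
    BLOCKED_AGENTS = re.compile(
        r"GPTBot|CCBot|ClaudeBot|Bytespider|Amazonbot",
        re.IGNORECASE,
    )

    class ScraperBlocker:
        """WSGI middleware that returns 403 for matching user agents."""

        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            agent = environ.get("HTTP_USER_AGENT", "")
            if BLOCKED_AGENTS.search(agent):
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Forbidden\n"]
            return self.app(environ, start_response)

    # Usage (hypothetical): application = ScraperBlocker(application)
    ```

    That only catches bots that identify themselves honestly, of course.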

    • andypiper@lemmy.world · 1 up · 4 hours ago

      and said that the instance would be cutting ties/not federating with Threads,

      Can you please show exactly where this was said?

    • Ulrich@feddit.org · 14 up, 1 down · 10 hours ago

      blocking scrapers is very easy

      The entirety of the internet disagrees.

    • lazynooblet@lazysoci.al · 23 up · 12 hours ago

      How is blocking scrapers easy?

      This instance receives 500+ IPs with differing user agents, all connecting at once but each staying within rate limits by spreading the requests across the bots.

      The only way I know it’s a scraper is if they do something dumb like using “google.com” as the referrer for every request, or if I eyeball the logs and notice multiple entries from the same /12.
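
      Roughly what I mean by eyeballing, as a sketch (Python; the log path and threshold are made-up examples, and it assumes a combined-format access log with the client address as the first field):

      ```python
      # Rough sketch: count access-log requests per /12 prefix to surface
      # volume that is spread across many nearby addresses.
      import ipaddress
      from collections import Counter

      LOG_PATH = "access.log"  # hypothetical log file path
      THRESHOLD = 1000         # arbitrary per-/12 request count worth a closer look

      counts = Counter()
      with open(LOG_PATH) as log:
          for line in log:
              ip = line.split(" ", 1)[0]  # client address is the first field
              try:
                  prefix = ipaddress.ip_network(f"{ip}/12", strict=False)
              except ValueError:
                  continue  # skip lines without a parseable address
              counts[str(prefix)] += 1

      for prefix, hits in counts.most_common(10):
          marker = " <-- worth a look" if hits > THRESHOLD else ""
          print(f"{prefix}\t{hits}{marker}")
      ```

      Even that is just a heuristic, not a reliable block.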

      • rumba@lemmy.zip · 7 up · 9 hours ago

        Exactly this: you can only stop scrapers that play by the rules.

        Each one of those books powering GPT already had, like, protection on it.