Highlighting the recent report that users and admins are unable to delete images, and how Trust & Safety tooling is currently lacking.

  • bloup@lemmy.sdf.org · 10 months ago

    I have to say, I think the article actually does address what you’re saying, in particular here:

    There are a couple of reasons why this is so surprising. Firstly, the Trust & Safety aspect: a few months ago, several Lemmy servers were absolutely hammered with CSAM, to the point that communities shut down and several servers were forced to defederate from one another or shut down themselves.

    Simply put, the existing moderation tooling is not adequate for removing illegal content from servers. It’s bad enough to have to jump through hoops dealing with local content, but when it comes to federated data, it’s a whole other ball game.

    The second, equally important aspect is one of user consent. If a user accidentally uploads a sensitive image, or wants to wipe their account off of a server, the instance should make an effort to comply with their wishes. Federated deletions fail sometimes, but an earnest attempt to remove content from a local server should be trivial, and attempting to perform a remote delete is better than nothing.

    I also just want to point out that the knife cuts both ways. Yes, it’s impossible to guarantee that the nodes you’re federating with aren’t simply ignoring remote delete requests. But there is a benefit to acting in good faith, which I think is easy to infer from the CSAM example the article presents.
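
    For what it’s worth, the “earnest attempt” the article describes isn’t exotic at the protocol level. Here’s a rough sketch (just an illustration, not Lemmy’s actual code; the helper names, inbox list, and omission of request signing are all my own simplifications) of what broadcasting a deletion over ActivityPub looks like: the server removes the content locally, then sends a Delete activity to each known peer inbox on a best-effort basis.

```python
# Hypothetical sketch of a best-effort federated delete over ActivityPub.
# The actor ID, object ID, inbox list, and lack of HTTP-signature signing
# are simplifications; this is not Lemmy's actual implementation.
import json
import requests


def build_delete_activity(actor_id: str, object_id: str) -> dict:
    """Construct a minimal ActivityPub Delete activity for a local object."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Delete",
        "actor": actor_id,
        "object": object_id,  # e.g. the URL of the image or post being removed
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    }


def broadcast_delete(activity: dict, peer_inboxes: list[str]) -> None:
    """Deliver the Delete to each peer inbox; failures never block local removal."""
    for inbox in peer_inboxes:
        try:
            # Real servers sign these requests (HTTP Signatures); omitted here.
            requests.post(
                inbox,
                data=json.dumps(activity),
                headers={"Content-Type": "application/activity+json"},
                timeout=10,
            )
        except requests.RequestException:
            # The remote server may be down or may ignore the request entirely;
            # the local copy has already been removed, which is the part
            # the instance can actually guarantee.
            pass
```

    None of this guarantees remote compliance, which is exactly the good-faith point: the local removal is trivial and always worth doing, and the remote attempt costs almost nothing.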