• Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

    The cover-your-ass scenario.

    In the Philosophy Crash Course there was a scenario like this. I’ll paraphrase:

    You’re a traveler exploring a semi-developed nation in South America. Coming out of the wilderness, you come across a squad of soldiers. They are forcing twenty villagers to dig a mass grave. The officer in charge tells you these villagers committed the state crime of supporting a rival to their leader and are to be executed. But as you are a guest in their country, he will make you an offer: if you shoot one of them yourself, he will set all the rest free, and they can then hike to the border and beg for asylum. (A rough trek, but the neighboring country may take them.)

    Do you shoot one of the villagers?

    Actually killing someone is rather hard on the psyche, and most of us cannot bear the thought (and might suffer trauma as a result). But then, perhaps this is a small price to pay for nineteen human lives.

    Thomas Aquinas and Kant were happy to let the soldiers kill the villagers so as to avoid committing the sin of murder themselves. (They would not even lie to the murderer at the door, or to Nazi Jew-hunters, to save the lives of fugitives hidden in their home, since lying was sin enough, and they would count on God to know His own.) Many of their contemporaries felt it was proper to suffer the trauma and do what was necessary (assuming the officer seemed inclined to keep his word and spare the remaining villagers).

    So, the cover-your-own-ass response has a long history of backers, including well-known philosophers.

    • Sean@liberal.city · 7 months ago

      @uriel238 @mondoman712
      In the days before the Wannsee Conference (where the Nazis set up the death camps), but after the invasion of Poland, when most executions were carried out by firing squad, there were German tourists who would travel to take part in the firing squads. So the trauma is not universal across the human experience, and there are circumstances that lead individuals to kill. Lynchings and massacres in the US are examples of this occurring without a war to give cover to the killings.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

        We’ve seen a similar phenomenon in some of the red states during the ideological conflict here in the US. There are people eager to kill someone just to have the experience, who volunteer to hunt targeted groups (trans folk, lately) or to take part in an execution by firing squad. I remember that in John Oliver’s first segment on the death penalty (he did a second one recently), executions were stalled due to difficulties obtaining the drugs used in lethal injections, and firing squads were brought up. The expert pointed out the difficulty of finding one executioner, let alone seven. The officials suggested recruiting volunteers from the gun-enthusiast citizenry, which the expert saw as naïve.

        I can’t speak to firing-squad executions during the German Reich and the early stages of the Holocaust, but I can speak to the Einsatzgruppen, who were tasked with emptying villages (into mass graves) that harbored Jews, harbored enemies of Germany, or were otherwise deemed unworthy of life. The mass executions were hard on the troopers, and as a result Heydrich contended with high turnover rates.

        This figured largely into the movement toward the industrialized genocide machine that pivoted around the Auschwitz proof of concept. Earlier phases included wagons with an enclosed back into which the engine exhaust was piped. The process was found to be too slow, and it exposed too many service people to the execution process. The death camps were staffed so that no one had to both interact with the prisoners and process the bodies, so no one would have to confront the visceral reality of before and after. They were staffed so that anyone who engaged a mechanism was two steps away from the person authorizing (and taking responsibility for) the execution. The guy who flipped the switch was just following orders.

        Interestingly, we’d see a repeat of this during the International War on Terror, specifically the Disposition Matrix, which led to executions of persons of interest in the field by drone strike (a Hellfire missile launched from a Predator drone). During the CIA drone-strike programs in Afghanistan and Pakistan, the drone operation crews suffered from high turnover rates, with operators suffering combat PTSD from having pulled the trigger on the missile launches. It didn’t help that they were also required to scan the damage to assess the carnage and identify the casualties.

        Interestingly, this also presented an inverted demonstration of how the human mind can tell the difference between violent video games and the real thing. Plenty of normies play Call of Duty without dealing with the mental after-effects of war, but even when we conduct war operations from continents away, our brains recognize that we are killing actual human beings, and suffer trauma from the act. War continues to be Hell, and video games not so much.

    • Kwakigra@beehaw.org · 7 months ago

      The soldiers are killing them in either case. Basically I’m being asked to choose whether they kill one or all of them, and they’ll make me shoot someone if I choose to spare most of them. Considering the situation, I’m not sure these soldiers won’t just kill me with everyone else anyway, since I’m a witness, after they’ve had their fun.

      • KevonLooney@lemm.ee · 7 months ago

        This is the actual answer. Don’t look them in the face and try to leave ASAP. Philosophical questions are usually very simplified to illustrate a point, and there are usually more than two choices.

        In a real life “trolley” scenario, you should wave to the driver and get them to stop. There should be zero people tied to the tracks. And that’s only what should happen if you think quickly enough. In real life you may actually just freeze and do nothing.

        • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

          IRL we typically do what we feel and justify it later. But then IRL there is no right or wrong, except what we construct in the process of organizing with each other to cooperate against outward threats such as predators and the elements. We have agreed to poor conditions because our lords were kinder than the winter and the bears, but we’ve also overthrown our lords when we worked out that they need us more than we need them.

          But yes, if you want to pretend that moral philosophy is just cerebral masturbation, that’s valid. All of our philosophy is past thinkers’ opinions about the parameters of right and wrong. It will give you a clear answer about as well as religious philosophy might tell you which pantheon of gods is the true one.

          These scenarios are less about what is right or wrong than about how you, individually and personally, decide ad hoc what is right or wrong. You might distrust the soldiers, but if they were inclined to betray your trust with a lie, they might never have intended to let you go free either, and the whole story becomes irrelevant.

          Another trolley-like scenario features a stranger come to town who is a perfect match for five transplant patients awaiting organs. The surgeon / hospital administrator has a friend in organized crime who can abduct the stranger and harvest his organs quietly and cleanly, so the authorities won’t notice he disappeared. Although IRL, needing a transplant is a mortal condition: having the organ buys more time than not having it. This also doesn’t get into the risks of other complications of transplant surgery, which can occur even when an organ is a good match.

          These scenarios are not about real life, but about becoming more self-aware of how you’d weigh them. And yes, this may mean looking for third options, hoping to find one better than the two obvious ones.

          • KevonLooney@lemm.ee · 7 months ago

            There are almost always better options than the given ones. I remember an answer a friend gave to the “Ship of Theseus” problem; he recommended calling it a different ship once more than 50% of it had been replaced. I asked why, and he said that all definitions are just made up, and you have to draw a line somewhere.

            That’s what people do in real life. They don’t just sit there perplexed by a “paradox”.
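            That 50% rule is arbitrary enough to write down. Here’s a minimal sketch in Python; the `Ship` class, the plank counting, and the 0.5 threshold are all invented for illustration, which is sort of the friend’s point: you make up a definition and draw a line somewhere.

```python
class Ship:
    """Toy model of the Ship of Theseus with the made-up ">50% replaced" rule."""

    def __init__(self, name: str, num_planks: int):
        self.name = name
        self.original_planks = num_planks  # planks still from the original build
        self.total_planks = num_planks

    def replace_plank(self) -> None:
        """Swap one original plank for a new one (no-op once all are replaced)."""
        if self.original_planks > 0:
            self.original_planks -= 1

    @property
    def is_same_ship(self) -> bool:
        # "Same ship" while a strict majority of original material remains.
        return self.original_planks / self.total_planks > 0.5


ship = Ship("Theseus", 100)
for _ in range(49):
    ship.replace_plank()
print(ship.is_same_ship)  # 51 of 100 original planks -> True

ship.replace_plank()
ship.replace_plank()
print(ship.is_same_ship)  # 49 of 100 -> False: time for a new name
```

Nothing profound happens at the boundary; the answer just flips because the definition says so.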

    • JohnDClay@sh.itjust.works · 7 months ago

      How are you sure the soldiers will follow through with their end of the bargain? Once they give you the gun, can you try to shoot the soldiers? Could you bribe the soldiers to release all the prisoners?

      Thought experiments like this have two options, but real life never has only two options. Getting into that mindset can lead people to accept things “for the greater good” without exploring all the options.

    • mondoman712@lemmy.ml (OP) · 7 months ago

      You just described an alternative version of the well known trolley problem, which the post is referencing.

      The philosophers’ answers to the problem are interesting.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone · 7 months ago

        The trolley problem is a schoolbook example of the failure of creed-based philosophy (deontological ethics), but its various scenarios are also used to illustrate that circumstances which don’t affect the basic setup or outcome do affect our feelings and our response to the scenario.

        It’s easier to pull a lever from a remote position than to actually assault someone or kill them by your own hand, for example.

        There are other scenarios that don’t involve trolleys but pose the same question of committing a wrongful act to produce a better outcome: Ozymandias in Watchmen killing millions of New Yorkers to prevent a nuclear exchange, thereby saving billions of people. (Alan Moore left it open-ended whether that was the right thing to do in the situation, but it did have the intended outcome.)

        We like the trolley problem because you can draw it easily on the blackboard, but other situations are much better at illustrating how subtle nuance can drastically change the emotions behind it.

        Try this one:

        The Queen of the land dies. On the day of her sister’s coronation, the new queen declares that Anglicanism is now the state faith and Catholicism is now unlawful (a reversal of the old order): Catholics are to report to a town or city hall to convert or be executed. You are Catholic. Do you obey the law or flee? And if you obey the law, do you convert, or perish at the hand of the state? Do you lie about your faith to state agents or on the national census?

        To a naturalist like myself, I’m glad to lie or convert to spare my own life, but to the devout, pretending to hold another faith, or converting under force, was a terrible sin, so it’s a very sober (and historically relevant) look at religious principle.

  • KillingTimeItself@lemmy.dbzer0.com · 7 months ago

    this is funny and all, but it doesn’t matter what you’re doing here; you’re technically liable for all of them, so uh.

    I’ll wait for a better version of this.

    • marcos@lemmy.world · 7 months ago

      I think that’s the point. There’s a follow-up about killing the people tying others to the rails that fits.

    • Biyoo@lemmy.blahaj.zone · 7 months ago

      Autopilot turns off before a collision because physical damage can cause unpredictable effects that could cause another accident.

      Let’s say you run into a wall and the autopilot is damaged, so the car thinks it needs to go backwards. You’ve now killed 3 more people.

      I hate Elon Musk and Teslas are bad, but let’s not spread misinformation.

      • Programmer Belch@lemmy.dbzer0.com · 7 months ago

        It seems reasonable for the autopilot to turn off just before a collision; my point was more along the lines of “You won’t get a penny from Elon”.

        People who rely on Full Self-Driving, or whatever it’s called now, should be liable for letting a robot control their cars. And I also think the company that develops and advertises said robot shouldn’t get off scot-free, but it’s easier to blame the shooter than the gun manufacturer.

        • Biyoo@lemmy.blahaj.zone · 7 months ago

          Yeah I agree. Both parties should be liable. Tesla for their misleading and dangerous marketing, drivers for believing in the marketing.

    • UndercoverUlrikHD@programming.dev · 7 months ago

      Autopilot turns off because the car doesn’t know what to do and the driver is supposed to take control of the situation. The autopilot isn’t an autopilot; it’s driving assistance, and you want it to turn off if it doesn’t know what it should do.
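      That hand-back behavior can be sketched as a toy decision function. To be clear, this is purely illustrative: the function name, the confidence threshold, and the return values are all invented here, not anything from Tesla’s or any other vendor’s actual stack. The point is that the system never weighs victims trolley-style; below some confidence it simply gives up authority.

```python
# Hypothetical sketch of a driver-assistance handover policy.
# Threshold and names are invented for illustration only.
CONFIDENCE_THRESHOLD = 0.8

def control_step(plan_confidence: float, driver_hands_on: bool) -> str:
    """Decide who holds control authority for this control tick."""
    if plan_confidence < CONFIDENCE_THRESHOLD:
        # The system doesn't choose between victims; it disengages
        # and alerts the driver to take over.
        return "driver"
    if not driver_hands_on:
        # Assistance keeps working but nags the driver to stay engaged.
        return "alert"
    return "assist"

print(control_step(0.95, True))   # assist
print(control_step(0.50, True))   # driver (handover)
print(control_step(0.95, False))  # alert
```

The uncomfortable part, as the thread notes, is that the handover can arrive moments before a crash, when the human has the least time to act.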

        • UndercoverUlrikHD@programming.dev · 7 months ago

          Sure, what I meant though was that Tesla doesn’t have self-driving cars the way they try to market them. They are no different from what other car manufacturers have; they just use a more deceptive name.

  • DNOS@lemmy.ml · 7 months ago

    Imagine having a car that doesn’t pretend to drive itself but is enjoyable to drive; a car that doesn’t pretend to be a fucking movie theater, because it’s just a car; a car without two thousand different policies to accept, in which you’ll never know what’s written, but a car you can drive even though you decided to wear a red shirt on a Thursday morning, which in your distorted future society is a political insult to some shithead CEO; a car that you own, not a subscription-based loan; a car that keeps polluting the environment very slowly instead of polluting it with heavy chemicals dug up by children, while still emitting as much CO2 as the next 20 years of the slow-polluting one (not to mention where the current comes from); a car that will run forever if you treat it well, with minor fixes and relatively minor environmental impact, and that doesn’t need periodic battery replacement, which by the way is like building a new vehicle… These are not only critical thoughts about greenwashing but are meant to make you reflect on the different meanings of ownership in different time periods.

    And yes, I will always think that all environmentalists who absolutely need a car should drive a 1990s car: fix it, save it from the dump fields, and drive it till it crashes into a wall…

    • SinJab0n@mujico.org · 7 months ago

      Imagine not being forced to need a car at all.

      Imagine being able to just sit down, watch memes, read something, watch a movie, maybe take a nap, or even take advantage of the journey and get ahead on some tasks on ur way to our jobs.

      Imagine being able to eat dinner on ur way home if our daily commute is kinda long. Wouldn’t that be a dream?

      Brothers, sisters, lets get some trains in our lives.

      • DNOS@lemmy.ml · 7 months ago

        Totally agree…
        The dream would be to see them arrive on time, maybe clean (not of graffiti, I’m a huge fan; I mean of trash…). I don’t know about other places in the world, but we definitely need more of them, especially during peak hours, and the infrastructure should be in the state’s hands, not the monopoly of a single private low-paying dickhead… (We’ve regularly had a strike almost every Friday since my parents were born.)

        • SinJab0n@mujico.org · 7 months ago

          Where r u from, my friend? Even ours in the 3rd world ain’t that bad; actually they r really reliable (and clean). Our usual demand is more lines.

          • DNOS@lemmy.ml · 7 months ago

            Italy. We probably have the worst local train system… The long-distance ones are actually better… Maybe it’s my standards, you know, people keep wanting more…

  • Dr. Moose@lemmy.world · 7 months ago

    Even with autopilot, I feel it’s unlikely that the driver would not be liable. We haven’t had a case yet, but once one happens and goes up through the courts, it’ll immediately establish a liability precedent.

    Some interesting headlines:

    So I’m pretty sure that autopilot drivers would be found liable very fast if this developed further.

    • stom@lemmy.dbzer0.com · 7 months ago

      You’re still in control of the vehicle, therefore you’re still liable. Like plopping a 5-year-old on your lap to drive while you nap: if they hit people, it’s still your fault for handing control to something incapable of driving safely while you were responsible for the vehicle.

      • Norodix@lemmy.world · 7 months ago

        But a reasonable person would not consider a child capable of driving. An “extremely advanced algorithm that is better and safer than humans and everyone should use it” is very different in this case. After hearing all the stupid fluff, it is not unreasonable to think that self-driving is good.

        • stom@lemmy.dbzer0.com · 7 months ago

          Tesla’s own warnings and guidance assert that drivers should remain ready to take control when using these features. They do not claim the system is infallible. Oversight and judgement still need to be used, which is why this argument wouldn’t hold up at all.

          • LovesTha🥧@floss.social · 7 months ago

            @stom @Norodix Pity Tesla hasn’t taken reasonable precautions to ensure the driver is driving.

            It isn’t unreasonable to have customers expect the thing they were sold to do the thing they were told it does.

    • SkyezOpen@lemmy.world · 7 months ago

      They’re most likely liable. “FSD” is not full self-driving; it’s still a test product, and I guarantee the conditions for using it include paying attention and keeping your hands on the wheel. The legal team at Tesla definitely made sure they weren’t on the hook.

      Now where there might be a case for liability is Elon and his stupid Twitter posts and false claims about FSD. Many people have been misled, and it’s probably contributed to a few of the autopilot crashes.

    • SinJab0n@mujico.org · 7 months ago

      It was possible to pin Musk with his own mess before, but after the latest lawsuits over false advertising they changed the wording from “fully automated” to “assisted driving”, and now even the manuals say:

      "dude, this is some fucky woocky shit, and it’s gonna kill u and everyone involved if u leave us in charge. So… pls always be on the edge of ur seat, ready to jump! We warned u (even if we did everything to be as misleading as possible), u can’t pass us the bill, nor sue us now.

      K, bye."

      So yeah, they ain’t liable anymore.

    • dejected_warp_core@lemmy.world · 7 months ago

      I am not a lawyer.

      I think an argument can be made that a moving vehicle is no different than a lethal weapon, and the autopilot, nothing more than a safety mechanism on said weapon. Which is to say the person in the driver’s seat is responsible for the safe operation of that device at all times, in all but the most compromised of circumstances (e.g. unconscious, heart attack, taken hostage, etc.).

      Ruling otherwise would open up a transportation hellscape where violent acts are simply passed off to insurance and manufacturer as a bill. No doubt those parties would rush to close that window, but it would be open for a time.

      Cynically, a corrupt government in bed with big monied interests would never allow the common man to have this much power to commit violence. Especially at their expense, fiscal or otherwise.

      So just or unjust, I think we can expect the gavel to swing in favor of pushing all liability to the driver.

      • Hagdos@lemmy.world · 7 months ago

        Making that argument completely closes the door on fully autonomous cars, though, which are sort of the holy grail of vehicle automation.

        Fully autonomous doesn’t really exist yet, aside from some pilot projects, but give it a decade or two and it will be there. Truly being a passenger in your own vehicle is a huge selling point, you’d be able to do something else while moving, like reading, working or sleeping.

        These systems can probably be better drivers than humans, because humans suck at multitasking and staying focused. But they will never be 100% perfect, because the world is sometimes wildly unpredictable and unavoidable accidents are a thing. There will be some interesting questions about liability though.

  • Sibbo@sopuli.xyz · 7 months ago

    Reminds me of the Chinese issue: you run someone over, but they are likely not dead. Do you save their life, accepting that you’ll have to pay whatever healthcare costs they incur until they recover? Or do you run them over again, to make sure they die and your punishment is a lot lighter?

  • samus12345@lemmy.world · 7 months ago

    Strange to assume that swerving will definitely kill one of them. What if you swerve off the road, or slam on the brakes? The reason the trolley problem works is that it’s on rails and you’re not operating it.

    • Honytawk@lemmy.zip · 7 months ago

      The Trolley problem is just a hypothetical situation with only 2 options.

      It being on rails just adds flavour; it doesn’t matter. You can’t choose anything else.

    • voldage@lemmy.world · 7 months ago

      That’s because it’s a Tesla, silly. It only allows minimization of victims down to a minimum of one. I’ve heard that newer models have a prediction module that will deploy a rear-mounted gun and shoot down any survivors in case of a narrowly avoided car crash. The seat still devours the driver if that happens, though, for some legacy backwards-compatibility reasons. As for the disembodied Voice that recites all your sins and threatens to reveal them to the public should you NOT take the wheel and kill those people yourself, it’s apparently in Spanish as well now. Such an age of wonders.