• kava@lemmy.world

    Is the investigation exhaustive? If these are all the crashes they could find related to the driver-assist / self-driving features, then it is probably much safer than a human driver. 1,000 crashes out of 5M+ Teslas sold in the last 5 years is actually a very small number.

    I would want an article to try to find the rate of accidents per 100,000, group it by severity, and then compare and contrast that with human-caused accidents.

    Because while it’s clear by now Teslas aren’t the perfect self driving machines we were promised, there is no doubt at all that humans are bad drivers.

    We lose over 40k people a year to car accidents. And fatal car accidents are rare, so multiply that by something like 100 to get the total number of car accidents.
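
    A rough back-of-the-envelope version of that per-100,000 comparison might look like the sketch below. It is only illustrative: the Tesla figures are the ones cited above, while the human-driver baseline numbers are assumed placeholders, and it ignores severity, miles driven, and how many Teslas actually use the feature.

    ```python
    # Minimal sketch of the per-100k comparison suggested above.
    # The Tesla figures come from this thread; the human-driver baseline
    # numbers are rough placeholder assumptions, not data from the article.

    TESLA_CRASHES = 1_000        # crashes tied to driver assist in the investigation
    TESLA_VEHICLES = 5_000_000   # Teslas sold over the period
    YEARS = 5

    # Assumed all-driver baseline (illustrative only):
    US_CRASHES_PER_YEAR = 6_000_000   # police-reported crashes, rough figure
    US_VEHICLES = 280_000_000         # registered vehicles, rough figure

    tesla_rate = TESLA_CRASHES / YEARS / TESLA_VEHICLES * 100_000
    human_rate = US_CRASHES_PER_YEAR / US_VEHICLES * 100_000

    print(f"Driver-assist crashes per 100k Teslas per year: {tesla_rate:.1f}")
    print(f"All crashes per 100k vehicles per year (assumed): {human_rate:.1f}")
    ```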

    • Blackmist@feddit.uk

      The question isn’t “are they safer than the average human driver?”

      The question is “who goes to prison when that self driving car has an oopsie, veers across three lanes of traffic and wipes out a family of four?”

      Because if the answer is “nobody”, they shouldn’t be on the road. There’s zero accountability, and because it’s all wibbly-wobbly AI bullshit, there’s no way to prove that the issues are actually fixed.

        • Blackmist@feddit.uk

          Accountability is important. If a human driver is dangerous, they get taken off the roads and/or sent to jail. If a self-driving car kills somebody, it’s just “oops, oh well, these things happen, but shareholders make a lot of money so never mind”.

          I do not want “these things happen” on my headstone.

          • Tja@programming.dev

            So you would prefer to have higher chances of dying, just to have “Joe Smith did it” written on your headstone?

          • ipkpjersi@lemmy.ml

            But if a human driver is dangerous and gets put in jail or taken off the roads, there are likely already more dangerous human drivers taking their place. Not to mention, genuine accidents, even horrific ones, do happen with human drivers. If the rate of accidents and the rate of fatal accidents with self-driving vehicles are way down versus human drivers, you are actually risking your life more by trusting in human drivers. Having someone be accountable for your death doesn’t matter if you’ve already died because of them.

            Is it any better if you have “Killed by Bill Johnson’s SUV” on your headstone?

      • dream_weasel@sh.itjust.works

        The answer is the person behind the wheel.

        Tesla makes it very clear to the driver that you still have to pay attention and be ready to take over at any time. Full Self-Driving engages the in-cabin nanny cam to enforce that you pay attention, above and beyond the frequent reminders to apply turning force to the steering wheel.

        Now, once Tesla goes Mercedes and says you don’t have to pay attention, it’s gonna be the company that should step in. I know that’s a big old SHOULD, but right now that’s not the situation anyway.

        • exanime@lemmy.today

          Now, once Tesla goes Mercedes and says you don’t have to pay attention, it’s gonna be the company that should step in

          That doesn’t give me warm and fuzzies either… Imagine a poor dude having to fight Mercedes or Tesla because he was crippled by a sleeping driver and bad AI… Not even counting the lobbying that would certainly happen to reduce and then eliminate their liability.

          • dream_weasel@sh.itjust.works

            There will be legal battles for sure. I don’t know how you can argue for anything besides the manufacturer taking responsibility. I don’t know how that doesn’t end up with autopilot fatalities treated as a class where there’s a lookup table of payouts, though. This is the intersection of liability and money/power, so it’s functionally uncharted territory, at least in the US.

      • ipkpjersi@lemmy.ml

        The question isn’t “are they safer than the average human driver?”

        How is that not the question? That absolutely is the question. Just because someone is accountable for your death doesn’t mean you aren’t already dead; it doesn’t bring you back to life. If a human driver is actively dangerous and gets taken off the road or put in jail, there are very likely already plenty more taking that human driver’s place. Plus genuine accidents, even horrific ones, do happen with human drivers. If the death rate for self-driving vehicles is really that much lower, you are risking your life that much more by trusting in human drivers.

        • ShepherdPie@midwest.social

          Yeah, that person’s take seems a little unhinged, as throwing people in prison after a car accident only happens if they’re intoxicated or driving recklessly. These systems don’t have to be perfect to save lives. They just have to be better than the average driver.

          • Tja@programming.dev

            Hell, let’s put the threshold at “better than 99% of drivers”, because every driver I know thinks they are better than average.

        • sugar_in_your_tea@sh.itjust.works

          Exactly.

          We should solve the accountability problem, but the metric should be lives and accidents. If the self-driving system proves it causes fewer accidents and kills fewer people, it should be preferred. Full stop.

          Throwing someone in jail may be cathartic, but the goal is fewer issues on the road, not more people in jail.

      • Maddier1993@programming.dev

        I don’t agree with your argument.

        Making a human go to prison for wiping out a family of 4 isn’t going to bring back the family of 4. So you’re just using deterrence to hopefully make drivers more cautious.

        Yet, year after year… humans cause more deaths by negligence than tools can cause by failing.

        The question is definitely “How much safer are they compared to human drivers?”

        It’s also much easier to prove that the system has those issues fixed, compared to training a human and hoping that their critical faculties are intact. Rigorous software testing and mechanical testing are within legislative reach and can be made strict requirements.

      • kava@lemmy.world

        Because if the answer is “nobody”, they shouldn’t be on the road

        Do you understand how absurd this is? Let’s say AI driving results in 50% fewer deaths. That’s 20,000 people every year who aren’t going to die.

        And you reject that for what? Accountability? You said in another comment that you don’t want “shit happens sometimes” on your headstone.

        You do realize that’s exactly what’s going on the headstones of those 40,000 people that die annually right now? Car accidents happen. We all know they happen and we accept them as a necessary evil. “Shit happens”

        By not changing it, ironically, you’re advocating for exactly what you claim you’re against.

        • exanime@lemmy.today

          Hmmm, I get your point, but you seem to be taking the cavalier position of someone who’d never be affected.

          Let’s propose this alternative scenario: AI is 50% safer and would reduce deaths from 40k to 20k a year if adopted. However, the 20k left will include your family and, unfortunately, there is no accountability; therefore, nobody will pay to help raise your orphaned nephew or help grandma now that your grandpa has died, run over by a Tesla… Would you approve AI driving going forward?

          • kava@lemmy.world

            A) you do realize cars have insurance and when someone hits you, that insurance pays out the damages, right? That is how the current system works, AI driver or not.

            Accidents happen. Humans make mistakes and kill people and are not held criminally liable. It happens.

            If some guy killed your nephew’s parents and made him an orphan, and the justice system determined the driver was not negligent, then your nephew would still be an orphan and would still get a payout from the insurance company.

            The exact same thing happens in the case of an AI-driven car hitting someone.

            B) if I had a button to save 100k people but it killed my mother, I wouldn’t do it. What is your point?

            Using your logic, if your entire family were among the 20,000 who would be saved, would you prefer them dead? You’d rather have them dead with “accountability” than alive?

              • kava@lemmy.world

                Your thought experiment doesn’t work. I wouldn’t accept any position where my family members die, and beyond that, it’s immaterial to the scope of the discussion.

                Let’s examine the different scenarios under which someone dies in a car accident.

                1. Human driver was negligent and causes a fatal car accident.

                Human gets criminal charges. Insurance pays out depending on policy.

                2. Human driver was not negligent and causes a fatal car accident.

                Human does not get criminally charged. Insurance pays out depending on policy.

                3. AI driver causes a fatal accident.

                Nobody gets criminal charges. Insurance pays out depending on policy.


                You claim that you would rather have 20,000 people die every year because of “accountability”.

                Tell me, what is the functional difference for a family member of a fatal car accident victim in those 3 above scenarios? The only difference is under 1) there would be someone receiving criminal charges.

                They receive the same amount of insurance money. Scenario 2) already happens right now; you don’t mention that when you talk about the lack of accountability.

                You claim that being able to pin some accidents (remember, some qualify under 2) on an individual is worth 20,000 lives a year.

                Anybody who has ever lost someone in a car accident would rather have their family member back instead.

                • exanime@lemmy.today

                  Your thought experiment doesn’t work

                  The point of a thought experiment is to think about that proposition, not to replace it with whatever you think makes sense.

                  3. AI driver causes a fatal accident.

                  Nobody gets criminal charges. Insurance pays out depending on policy.

                  Now here is my concern… You are reducing a human life to a dollar amount, just like Ford did with the Pinto. If Mercedes (who is apparently liable) decides they are making more money selling their cars than they are paying out to people injured or killed by their cars, what’s left to force them to recall/change/fix their algorithm?

                  PS: I also never claimed I’d rather have 20,000 more people die for accountability… So, I guess you have to argue that with the part of your brain that made it up.

                  • kava@lemmy.world

                    PS: I also never claimed I’d rather have 20,000 more people die for accountability…

                    You said it’s not a question of how much safer it is. You said it’s a question of accountability. So even if it were 50% safer, you claimed it was wrong.

                    And here’s the thing, man: I understand where you’re coming from in that you shouldn’t reduce a life to numbers. But how does AI driving fundamentally change the current situation?

                    Car companies already do this. They calculate whether or not fixing a safety problem will cost more or less than the lawsuits from all the dead people. There’s a famous documented case of this. Maybe it’s the Ford / Pinto thing you are referencing.

                    If you think of AI driving as a safety feature - like seatbelts - would you support it? I don’t know what the actual statistics are, but presumably it’s only going to get better over time.

          • sugar_in_your_tea@sh.itjust.works

            Yes, unless you mean I need to literally sacrifice my family. But if my family were randomly part of the 20k, I’d defend self-driving cars if they are proven to be safer.

            I’m very much a statistics-based person, so I’ll defend the statistically better option. In fact, me being part of that 20k gives me a larger than usual platform to discuss it.

            • exanime@lemmy.today

              No, I do mean literally your family. Not because I’m trying to be mean to you; I’m just trying to highlight that you’d agree to a contract when you think the price does not apply to you… But in reality the price will apply to someone, whether they agreed to the contract and enjoy the benefits or not.

              It’s the exact same situation in real life with the plane manufacturers. They lobby the government to allow recalls to be handled during the planes’ regular maintenance instead of immediately. This saves money, but it literally means that some planes are out there with known defects that will not be addressed for months (or years, depending on the maintenance schedule).

              Literally, people who’d never have a loved one on one of those flights decided that was acceptable to save money. They agreed it’s OK to put your life at risk, statistically, because they want more money.

              • Tja@programming.dev

                If there are 20k deaths vs 40k, my family is literally twice as safe on the road, so why wouldn’t I take that deal?

                • exanime@lemmy.today

                  Read the proposition… It’s a thought experiment that we were discussing.

                  • Tja@programming.dev

                    The proposition is stupid. If you told me that ALL future accidents would be prevented if I agreed to kill my family, I would still not do it; that’s just a bad-faith trolley problem. Let alone just reducing it by half.

                    I reduced it to a more realistic experiment, where my family might be killed, with the same probability as anyone else’s.

              • sugar_in_your_tea@sh.itjust.works

                Then it’s not a fair question. You’re not comparing 40k vs 20k, you’re comparing 40k vs literally my family dying (like the hypothetical train diversion thing); that’s fear mongering and not a valid argument.

                The risk does not go up for my family because of self-driving cars. That’s innate to the 40k vs 20k numbers.

                So the proper question is: if your family was killed in an accident, what would be your reaction if it was a human driver vs AI? For me:

                • human driver - incredibly mad because it was probably preventable
                • AI - still mad, but supportive of self-driving improvements because it probably can be patched

                The first would make me bitter and probably anti-driving, whereas the second would make me constructive and want to help people understand the truth of how it works. I’m still mad in both cases, but the second is more constructive.

                Seeing someone go to jail doesn’t fix anything.

                • exanime@lemmy.today

                  Yes, it’s a thought experiment… Not a fair question, just trying to put it in perspective.

                  Anyone who understands stats would agree 40k deaths are worse than 20k, but it also depends on other factors. All things being equal to today, the 20k proposition is pure benefit.

                  But if we look into the nuance and details emerge, the formula changes. For example, it’s been discussed here that there may be nobody liable. If that’s the case, we win by halving deaths (absolutely a win), but now the remaining 20k may be left with no justice… Worse, it absolutely creates a perverse incentive for these companies, without liability exposure, to do whatever maximizes profit.

                  So, not trying to be a contrarian here… I just want to avoid the polarization that is now the rule online… Nothing is just black and white

                  • sugar_in_your_tea@sh.itjust.works

                    left with no justice

                    But they’d get restitution through insurance. Even if nobody is going to jail, there will still be insurance claims.

                    I agree that there is nuance here, and I think it can largely be solved without a huge change to much of anything. We don’t need some exec or software developer to go to jail for justice to be served, provided they are financially responsible. If the benefits truly do outweigh the risks, this system should work.

                    Tesla isn’t taking that responsibility, but Mercedes seems to be. Drivers involved in an accident where the self-driving feature was engaged have the right to sue the manufacturer for defects. That’s not necessarily the case for Level 2 driving, since the driver is responsible for staying alert and needs to be in contact with the steering wheel. With Level 3, that goes away, so the driver could legitimately not be touching the wheel at all when the car is in self-driving mode. My understanding is the insurance company can sue on their customer’s behalf.

                    So the path forward is to set legal precedent assigning fault to manufacturers to get monetary compensation, and let the price of cars and insurance work out the details.

      • slumberlust@lemmy.world

        The question for me is not what margins the feature is performing on, as they will likely be better than human error rates, but how they market the product irresponsibly.

    • machinin@lemmy.world

      I was looking up info for another comment and found this site. It’s from 2021, but the information seems solid.

      https://www.flyingpenguin.com/?p=35819

      This table was probably most interesting, unfortunately the formatting doesn’t work on mobile, but I think you can make sense of it.

      Car                     2021 Sales So Far   Total Deaths
      Tesla Model S           5,155               40
      Porsche Taycan          5,367               ZERO
      Tesla Model X           6,206               14
      Volkswagen ID           6,230               ZERO
      Audi e-tron             6,884               ZERO
      Nissan Leaf             7,729               2
      Ford Mustang Mach-e     12,975              ZERO
      Chevrolet Bolt          20,288              1
      Tesla Model 3           51,510              87

      So many cars with zero deaths compared to Tesla.

      It isn’t whether Tesla’s FSD is safer than humans; it’s whether it’s keeping up with the automotive industry in terms of safety features. It seems like they are falling behind (despite what their marketing team claims).
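
      One way to read that table is to normalize deaths by units sold, as in the minimal sketch below. It uses only the numbers from the table (2021 sales against cumulative deaths), so it illustrates the disparity rather than giving a proper exposure-adjusted rate; it ignores fleet age, total units on the road, and miles driven.

      ```python
      # Deaths per 10,000 units sold, using only the figures from the table above.
      # Caveat: sales are 2021-only while deaths are cumulative, so treat this as
      # a rough illustration, not an exposure-adjusted fatality rate.

      sales_and_deaths = {
          "Tesla Model S": (5_155, 40),
          "Porsche Taycan": (5_367, 0),
          "Tesla Model X": (6_206, 14),
          "Volkswagen ID": (6_230, 0),
          "Audi e-tron": (6_884, 0),
          "Nissan Leaf": (7_729, 2),
          "Ford Mustang Mach-e": (12_975, 0),
          "Chevrolet Bolt": (20_288, 1),
          "Tesla Model 3": (51_510, 87),
      }

      # Sort models from highest to lowest deaths per unit sold.
      for model, (sales, deaths) in sorted(
          sales_and_deaths.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True
      ):
          print(f"{model:22s} {deaths / sales * 10_000:6.1f} deaths per 10k sold")
      ```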

        • petrol_sniff_king@lemmy.blahaj.zone

          If not, that would indicate that this newfangled self-driving is more dangerous than a little ol’ “caught in the stone-age” Nissan Leaf, wouldn’t it?

      • dream_weasel@sh.itjust.works

        That’s kind of a tough article to trust if I’m being honest. It may in fact be true, but it’s an opinion piece.

        I find it a little weird to look only at sales for the year and also not to discuss the forms of autopilot or car use cases.

        For example, are we talking about highway-only driving, lane-keeping assist, end-to-end residential urban, rural unmarked roads? Some of these are harder problems than others. How about total mileage as well? I’m not sure what the range is on a Nissan Leaf, but I think comparing it to a Taycan or Mach-E seems disingenuous.

        All that being said, yeah, Tesla has a lot of deaths comparatively, but still way fewer than regular human drivers. I worry that a truly autonomous experience will not be available until and unless a manufacturer like Tesla pushes the limits on training data and the feds respond by making better laws. Considering Elon’s douchiness, I’m also kinda happy Tesla is doing that and catching flak, but paving the way for more established manufacturers.

        We were early adopters of Tesla, and trust me, the cars are made cheaply and the “autopilot” drives like shit even now, but it’s amazing how much progress has been made in the last 6 years.

        • machinin@lemmy.world

          You’re happy that a racist, misogynist billionaire whose companies have some of the worst employee safety data in the industries he’s involved in is pushing these cars onto public roads? Musk doesn’t care about our safety. Like everything else, he lies about it to make money.

          We have no clue if Teslas are safer than human drivers in any other car. Tesla publishes those charts, but the data is nowhere to be found.

          Musk lies to make money. You can’t trust anything Tesla publishes.

          I don’t want Tesla testing their shit on the public roads and putting me at risk so that Musk can make more money. I don’t opt in to be one of his beta testers.

          • dream_weasel@sh.itjust.works

            We get it, you hate Elon Musk. That’s a fine position to take.

            You are beta testing for anyone and everyone who is doing anything on the road. You can say “look at this LendingTree report” and see accident rates, or look at the article you posted, and compare to human drivers to know which is safer. Or you can say it’s an unknowable lie, in which case why are we citing anything besides you saying “I hate Musk”? Again, valid.

            He’s making money regardless, so yeah, I’m glad that SpaceX lands reusable boosters and Tesla pushes the limits of what is possible with an EV, so at least we get something back. Considering how many other people hate the shit out of Tesla, I’m sure every time someone hits a raccoon in a Tesla we will get to read about it.

            • machinin@lemmy.world

              It’s not just hatred for Musk. Yes, he is a racist that had a place in his factory called “the plantation” for black workers. He swatted the wife and children of a whistleblower. There is so much shit he does, but that isn’t what makes Teslas dangerous.

              Teslas are dangerous because he creates a culture that despises safety engineering practices. When someone has sex on autopilot and endangers everyone on the road around them, does Musk rebuke them? No, he makes a joke. Now, good followers think that the silly little warning that pops up every time probably doesn’t mean much. If a worker says that something probably needs more testing before release, do you think he pauses to consider the safety implications? I can guarantee he doesn’t care.

              So, you get someone who runs into a fireman on the road and kills them because they were using autopilot while distracted. Or they back over a motorcyclist and kill them, or plow into a firetruck and kill some more people.

              Musk and sycophants like you think it’s okay to have a cavalier attitude about safety because people just have to be sacrificed for technology. You are menaces. We don’t have to sacrifice passengers to make airlines safer. We have proper testing and systems in place to integrate better technology at very little risk. In the same way, we don’t have to sacrifice motorcyclists, first responders, other drivers, or pedestrians just because you think your technology is worth it. Other car manufacturers have implemented those safety test systems. Tesla just doesn’t want to spend the money so Musk can get his payout.

              • vaultdweller013@sh.itjust.works

                You also forgot to mention that the damned things are rolling death traps since the doors aren’t properly mechanical. Why the fuck should I trust something that requires power to work in an emergency? Any number of things can knock out power and disable the doors. If I backed my 20+ year old Jeep into a fucken river I could still open the door, and the seals are all shot as well, so there’d be fewer pressure issues.

              • dream_weasel@sh.itjust.works

                There is no obligation to sacrifice anybody. This is a question of risk vs. law vs. driver requirements which has got to be sorted out. Sure, point out that Musk is shit and his factories are shit, it’s true. He’s also a liar. All true. What I take issue with is saying that the cars are 4-wheeled death machines killing everyone in their path. That is not true. It is also not true that other companies are solving the same problem without risk. They are solving a different problem of highly mapped cities and solutions for specific scenarios.

                It’s a people problem, and drivers (people) are irresponsible. I bet lift kits with unadjusted headlights have killed more people than Tesla has had autopilot accidents. People are gonna fuck up. It has to happen, then laws have to be implemented and revised. There’s no hop, skip, and jump that solves autopilot on a closed course and has zero incidents in practice. Conditions on the whole are just too varied. Of course, machine learning is my job, so maybe I’m just a pessimist in this regard.

                • machinin@lemmy.world

                  What I take issue with is saying that the cars are 4 wheeled death machines killing everyone in their path. That is not true. It is also not true that other companies are solving the same problem without risk.

                  I never said that. It isn’t black or white. I said Musk creates a culture that despises safety engineering. Other companies like Volvo embrace it. Different companies embrace it to different degrees. As a result, you have wildly different fatality rates. Teslas happen to be the worst (although, like you said, it’s impossible to get good data that accounts for all the factors).

                  Yes, it is a people problem, but it is also a systems problem. Volvo has aimed for zero fatalities in their cars. They engineer for problematic people. They went 16 years without a fatality in the UK in one of their models. Tesla simply doesn’t care about problematic people. In fact, problematic people may even get a boost from a Musk re-tweet.

                  I agree, zero incidents may be impossible and people are problematic. But attitudes, practices, cultures and systems can either amplify those problems or dampen their effects. Musk and Tesla amplify the negative effects. It doesn’t have to be that way.

    • ipkpjersi@lemmy.ml

      I know this is going to sound bad, but bear with me and read my entire post. I think in this case it might be that people are trying to hate on Tesla because it’s Elon (and fair enough) rather than because of self-driving itself. Although there’s also the argument that self-driving vehicles are already very likely safer than human-driven ones, with lower rates of accidents, etc., people expect there to be zero accidents whatsoever with self-driving, which is why I think self-driving may never actually take off and become mainstream. Then again, there’s the lack of accountability; people prefer being able to place the blame and liability on something concrete, like an actual human. It’s possible I’m wrong, but I don’t think I am wrong about this.

      edit: I looked further into this, and it seems I am partially wrong. It seems that Tesla is not keeping up with the industry average in terms of safety statistics; the self-driving in their vehicles seems less safe than their competitors’.

      • NιƙƙιDιɱҽʂ@lemmy.world

        For example, I don’t really trust mine and mostly use it in slow bumper-to-bumper traffic, or so I can adjust my AC on the touchscreen without swerving around in my lane.

    • suction@lemmy.world

      Only Elon calls his Level 2 automation “FSD” or even “Autopilot”. That alone proves that Tesla is more guilty of these deaths than other makers who choose less evil marketing terms. The dummies who buy Elon’s crap take those terms at face value, and the Nazi CEO knows that; he doesn’t care, though, because just like Trump he thinks of his fans as little more than maggots. Can’t say I blame him.