- cross-posted to:
- [email protected]
Tesla braces for its first trial involving Autopilot fatality::Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk’s assertions about the technology.
The feature is called “Autopilot”, meaning that the car automatically pilots itself, rather than using a human pilot. The definition of autopilot is literally “a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot.” I’m not sure how he could have more explicitly misrepresented the product.
No, it doesn’t. Even an airplane autopilot only maintains the course set by the pilot; it’s not capable of making decisions and navigating autonomously.
All of the technologies in vehicles sold to the public today, and in recent years, are driver-assistance features that require the driver’s attention. Anybody using the tech without paying attention is being negligent.
Autopilot is capable of navigating, though, and by design it does make decisions like when to merge and when to execute a turn. I don’t think it’s adequately equipped to make those decisions, but by design, it does. They even advertise it on their official YouTube channel, with a clip of them just plugging in a destination and letting the car get them there. Tesla is responsible for the advertising it does, and for claims it makes about its product that simply aren’t true.
This is FSD, not Autopilot. Also note the driver is paying attention.
They are two different modes of the same system; one just has more features enabled than the other. You also can’t tell whether the driver is paying attention, as they are mostly out of frame. Even if they are, their hands are entirely off the wheel, and it’s unlikely they would be able to react in time to prevent an accident even if they were paying attention.
Autopilot is capable of basically flying the plane itself. A human is there for when shit goes wrong.
Like if another plane is nearby? It’s not exactly just “shit going wrong.” Autopilot doesn’t follow TCAS or ATC commands for instance.
So, to be comparable, a driving autopilot would only need to work while there are no other cars that might be in your path. Which is why we’ve had some degree of airplane autopilot for nearly a century and are only just starting to get some degree of car autopilot: the assumption that no cars might be in your path is pretty much always false.
Only if you ignore traffic. Autopilot doesn’t take ATC directions. But it’s not a useful comparison; air traffic and navigation are much simpler than driving on the ground.