
Tesla crash kills driver: Is an autopilot society healthy for the human race?

  • Autopilot will be much safer than manual driving; the death was not due to Autopilot.

    First, I'd like to point out that Autopilot did not cause the accident; it only failed to avoid it. The feature was marked as beta and Tesla warned that it shouldn't be completely relied upon, because Tesla knows it is not ready for full autonomy. In the near future, autonomy will be much safer than human drivers (and there are already many cases of Autopilot saving lives). In addition to being better than human drivers, Autopilot will bring many more benefits to society: automatic taxis (with the potential to reduce the total number of cars on the road), self-parking (helpful in areas with little parking), and so on.

  • Innovation has its cost

    After the Tesla crash that killed the driver, we have to admit that there is some kind of hiccup in the vehicle's autopilot system. It is very unfortunate and heart-wrenching that this happened, but we can't stop innovating and growing. Of course the engineers need to figure out what happened in the accident and fix it immediately. Then we can move forward with autopiloted vehicles.

  • Yes, Autopilot is safe.

    Like any technology, there are some kinks to be worked out, but so far, autopilot cars have proved to be safe. Accidents occur every day, many of them fatal. The fact that one accident has occurred with an autopilot car does not mean the idea is unsafe. By cutting out human error, it has the potential to be safer.

  • An autopilot society is not healthy for the human race.

    An autopilot society is not healthy for the human race. Our thoughts and actions are becoming devalued in an ever-changing society fueled by technological advances. We rely too heavily on innovations that take away our ability to reflect deeply and act for ourselves - we are becoming less aware of our environment and the people around us.

  • It's not healthy and it's not safe.

    Simply relying on a computer for your safety - and the safety of everyone in the car, including any children - is not safe or healthy at all. There are so many things that can go wrong, as this crash has shown, and a malicious hacker could cause havoc with just a laptop if every car ran on autopilot. Sure, a computer that takes over if you're about to crash and safely avoids the collision when the driver is unable to would be a good idea, but the costs and risks of autopilot greatly outweigh its meagre benefits.

  • We can't rely on autopilot alone.

    This is why I don't like these new self-driving cars. Some of them have no steering wheel at all - in other words, no way for a person to take control if necessary. What happened with Tesla was human error, though, because this Autopilot car does have a steering wheel, and Tesla recommends that the driver keep their hands on it, regardless of whether the car is doing the driving. Maybe this whole thing happened as a warning to those who are creating self-driving cars without a steering wheel that have yet to hit the market, to make modifications, because cars can't be trusted to do it all.

