Posted on Jun 28, 2017
The ethical dilemma of self-driving cars - Patrick Lin
Responses: 7
I'm not OK with putting my life or anyone else's at the mercy of a computer. They talk about computers removing human error... but who programs computers? Humans. So at some level there's still going to be human error.
CPT Jack Durish
Don't look now, but your life is already in the hands of countless computers. Every mode of transportation, from elevators to jet airliners (especially those built by Airbus), is controlled by computers. Medical robots are becoming the hands of surgeons. I could go on. Do you want me to, or have I frightened you sufficiently?
SN Greg Wright
The truth is, Sarge, a self-driving car is going to react MUCH faster than any human possibly can, and therefore it's going to be vastly safer than a human-piloted car. There are issues that need to be addressed, as you say, but once they are, autonomous cars will be MUCH better than human drivers.
The premise is flawed. If there were a large group of self-driving cars, they would also be programmed to maintain a proper braking distance behind the vehicle ahead of them, based on distance and speed. Therefore the most logical choice would be for the car simply to brake in its own lane.
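To make that concrete, here's a back-of-the-envelope sketch of the following-distance logic in Python. The friction coefficient, latency, and safety margin are made-up illustrative numbers, not values from any real controller:

# Rough sketch: what gap should the car keep so it can always
# brake in its own lane? Constants below are assumptions.

G = 9.81            # gravity, m/s^2
FRICTION = 0.7      # assumed tire-road friction on dry pavement
LATENCY = 0.1       # assumed sensor-to-brake delay, seconds

def stopping_distance(speed_mps: float) -> float:
    """Distance covered during the reaction delay plus braking to a stop."""
    reaction = speed_mps * LATENCY
    braking = speed_mps ** 2 / (2 * FRICTION * G)
    return reaction + braking

def safe_gap(speed_mps: float, margin_m: float = 5.0) -> float:
    """Following gap the car should maintain, with a safety margin."""
    return stopping_distance(speed_mps) + margin_m

# At highway speed (~30 m/s, about 67 mph) the car should hold roughly:
print(round(safe_gap(30.0), 1), "meters")   # ~73.5 meters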
Secondly, if you truly want driverless vehicles, then all of the vehicles would really need to be networked together, so that traffic flows better and cars can react to and sequence other cars for on- and off-ramps and for changes in the number of cars on a particular road. With networking, the car in the center that suddenly had to brake would also send a signal to the cars around it, and they would brake in a manner that lets the car in danger avoid not only the hazard ahead but also the vehicles to its left and right.
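As a toy illustration of that networking idea, here's a sketch where a braking car broadcasts an event and nearby trailing cars match its deceleration. The message fields and the 50-meter "nearby" radius are invented for the example:

# Toy vehicle-to-vehicle sketch: one car brakes hard and tells
# the others; cars close behind it match the deceleration.

from dataclasses import dataclass

@dataclass
class BrakeEvent:
    sender_id: str
    position_m: float    # position along the lane, meters
    decel_mps2: float    # how hard the sender is braking

class Car:
    def __init__(self, car_id: str, position_m: float, network: list["Car"]):
        self.car_id = car_id
        self.position_m = position_m
        self.network = network
        self.decel_mps2 = 0.0

    def emergency_brake(self, decel: float) -> None:
        """Brake hard and broadcast the event to every networked car."""
        self.decel_mps2 = decel
        event = BrakeEvent(self.car_id, self.position_m, decel)
        for car in self.network:
            if car is not self:
                car.on_brake_event(event)

    def on_brake_event(self, event: BrakeEvent) -> None:
        """Cars within 50 m behind the braking car match its deceleration."""
        behind = self.position_m < event.position_m
        if behind and event.position_m - self.position_m < 50.0:
            self.decel_mps2 = max(self.decel_mps2, event.decel_mps2)

cars: list[Car] = []
cars += [Car("A", 0.0, cars), Car("B", 30.0, cars), Car("C", 60.0, cars)]
cars[2].emergency_brake(8.0)     # car C slams the brakes...
print(cars[1].decel_mps2)        # ...and B, 30 m behind, matches: 8.0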
But let's go with their premise. There really is no ethical dilemma at all. There is no difference between someone else making the decision through programming and you making the decision yourself: in the end the same thing happens, an accident occurs and most likely someone is injured or killed. In fact, the computer-controlled car will react faster than the human and will most likely avoid the other vehicles, since the computer can run through millions of calculations to determine the best possible maneuver before a human could even react.
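In miniature, that decision process is just "score every option and pick the least bad one." Here's a deliberately simplified sketch; the maneuver list and cost numbers are invented, and a real planner would score thousands of candidate trajectories against live sensor data:

# Toy maneuver selection: assign each option a cost, heavily
# penalize options that hit something, pick the cheapest.

def collision_cost(maneuver: str, world: dict) -> float:
    """Lower is better. Blocked maneuvers get a large penalty."""
    blocked = world.get(maneuver + "_blocked", False)
    base = {"brake_in_lane": 1.0, "swerve_left": 2.0, "swerve_right": 2.0}
    return base[maneuver] + (100.0 if blocked else 0.0)

def choose_maneuver(world: dict) -> str:
    options = ["brake_in_lane", "swerve_left", "swerve_right"]
    return min(options, key=lambda m: collision_cost(m, world))

# Obstacle dead ahead and a car in the left lane; right lane is clear:
world = {"brake_in_lane_blocked": True, "swerve_left_blocked": True}
print(choose_maneuver(world))   # swerve_right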
Lastly, with the advent of learning artificial intelligence, computer-controlled cars would have a much easier time driving than humans do. We'd see a significant reduction not only in accidents but also in DUIs and, most importantly, in deaths on the highways.