Can self-driving cars apply human ethical principles?
by Anaptyx on September 12, 2016 - 5:14pm
Ethical debates are contentious because both sides can appeal to legitimate moral claims. The article "The ethical dilemmas facing self-driving cars" by Rebecca Lee of CBS News shows how the self-driving car exemplifies the tension between technological progress and a machine's capacity to take human moral claims into account. This new product is expected to revolutionize the world and completely change the road experience. Researchers acknowledge that every road situation is different and that programming a computer with ethical reflexes will be a difficult task. The self-driving car project is a very controversial subject because researchers cannot be sure which ethical and moral principles the car should follow if a disaster were unavoidable. One moral claim would save the passengers' lives, while another would sacrifice them in order to protect the other people on the road. The self-driving car will cause more problems than improvements to human life.
According to Rebecca Lee of CBS News, self-driving cars "can increase traffic efficiency, reduce pollution, and eliminate up to 90 percent of traffic accidents". The frequency of road accidents is a serious problem, and these cars could prevent many of them. People would be driven by their cars, eliminating human error on the road: drivers distracted by their cellphones, drunk drivers, and drivers who fall asleep at the wheel. This is a great idea because it follows the ethical principle of doing no harm. Drunk driving, texting, and falling asleep while driving cause many accidents and bring harm to passengers and their families. A self-driving car would counter these human errors and save many people's lives. Human life is fundamentally valuable; the self-driving car would protect its passengers and thereby follow this moral claim.
The downside of having self-driving cars in our society is that the cars would be programmed in advance to know what to do in certain situations. Researchers point out that every situation is different, and if programmers have not anticipated the right reflex for a specific situation, there could be more accidents. Ethical and moral principles would also go unrespected. One claim holds that we should act to promote our own best interest, while another holds that we should always act for the greatest good possible. Self-driving cars put these two ethical principles in conflict: in a situation where an accident is unavoidable, the car would have to choose between them. Should the self-driving car protect its passengers at all costs and serve their best interest, or should it sacrifice them in order to do no harm to innocent people on the road? In addition, according to Rebecca Lee of CBS News, "the ethical dilemma also brings with it legal questions. In the case of any accidents, lawyers could blame the car manufacturers". People would be used as a means to an end, which violates another moral claim.
I think the self-driving car should not be created because it restricts people's autonomy. The original idea of owning a car was to give humans autonomy. Driving a car requires skill, analysis, and instinct. Computers may have skills and may be able to analyze quickly, but they lack the human instinct that allows a driver to react quickly while following ethical principles. Moreover, a mistake by a human driver can be debated, but a mistake by a computer would not be excused, because we expect it to always do the right thing. However, computers are not conscious of moral claims the way humans are, and this is a major problem for the self-driving car project. Will self-driving cars ever be able to apply all human ethical principles?
Lee, Rebecca. "The ethical dilemmas facing self-driving cars." CBS News, June 2016, http://www.cbsnews.com/news/can-self-driving-cars-be-programmed-to-make-.... Accessed 7 Sept. 2016.