User:Timot2016


The emergence of autonomous cars raises various ethical issues. While the introduction of autonomous vehicles to the mass market seems morally inevitable, given a projected reduction in crashes of up to 90%[1] and their accessibility to disabled, elderly, and young passengers, some ethical issues have not yet been fully resolved. These include, but are not limited to: (1) the moral, financial, and criminal responsibility for crashes, and (2) the decisions a car is to make immediately before a (fatal) crash.

(1)  There are different opinions on who should be held liable in the case of a crash, in particular when people are hurt. Many experts consider the car manufacturers themselves responsible for crashes that occur due to a technical malfunction or design flaw.[2] Besides the fact that the manufacturer would be the source of the problem when a car crashes due to a technical issue, there is another important reason why car manufacturers could be held responsible: it would encourage them to innovate and invest heavily in fixing those issues, not only to protect their brand image, but also because of the financial and criminal consequences. However, others argue that those using or owning the vehicle should be held responsible, since they ultimately know the risks involved in using such a vehicle. Experts suggest introducing a tax or insurance schemes that would protect owners and users of autonomous vehicles from claims made by victims of an accident.[2] Other parties that could be held responsible in the case of a technical failure include the software engineers who programmed the code for the autonomous operation of the vehicle and the suppliers of its components.[3]

(2)  Setting aside the questions of legal liability and moral responsibility, the question arises of how autonomous vehicles should be programmed to behave in an emergency in which either passengers or other road users are endangered. A vivid example of the moral dilemma that a software engineer or car manufacturer might face when programming the operating software is the trolley problem, an ethical thought experiment: the driver of a trolley has the choice of staying on the planned track and running over five people, or turning the trolley onto a side track where it would kill only one person, who is standing there on the assumption that no trolley would travel on it.[4] Two main considerations need to be addressed: first, on what moral basis should the decisions an autonomous vehicle has to make rest; second, how could those principles be translated into software code. Researchers have suggested two ethical theories in particular as applicable to the behavior of autonomous vehicles in emergencies: deontology and utilitarianism.[5] Asimov's Three Laws of Robotics are a typical example of deontological ethics. On this theory, an autonomous car would have to follow a strict, explicitly written set of rules in every situation.
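To make concrete how such a rule-based theory might, in principle, be translated into software code, the sketch below encodes a fixed order of strict rules in the spirit of Asimov's Three Laws. It is a minimal illustration under assumed names and rules, not any manufacturer's actual implementation; the class, the rule ordering, and the maneuvers are all hypothetical.

```python
# A minimal, hypothetical sketch of a deontological (rule-based) emergency
# policy; the rule order loosely mirrors Asimov's Three Laws. The class,
# rules, and maneuvers are illustrative assumptions, not any vehicle's code.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    harms_humans: bool        # would this option injure any person?
    obeys_traffic_law: bool   # does it stay within the rules of the road?
    protects_vehicle: bool    # does it avoid damage to the vehicle itself?


def choose_deontological(options: list[Maneuver]) -> Maneuver:
    """Apply strict, ordered rules: never harm a human; otherwise obey
    traffic law; otherwise protect the vehicle. Earlier rules always win."""
    rules = [
        lambda m: not m.harms_humans,
        lambda m: m.obeys_traffic_law,
        lambda m: m.protects_vehicle,
    ]
    candidates = options
    for rule in rules:
        permitted = [m for m in candidates if rule(m)]
        if permitted:             # narrow the set; later rules only break ties
            candidates = permitted
    return candidates[0]


options = [
    Maneuver("brake hard", harms_humans=False, obeys_traffic_law=True, protects_vehicle=False),
    Maneuver("swerve onto sidewalk", harms_humans=True, obeys_traffic_law=False, protects_vehicle=True),
]
print(choose_deontological(options).name)  # -> brake hard
```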

Utilitarianism, by contrast, holds that any decision must be made with the goal of maximizing utility. This requires a definition of utility, which could be, for example, maximizing the number of people who survive a crash. Critics suggest that autonomous vehicles should adopt a mix of multiple theories in order to respond in a morally acceptable way in the event of a crash.[5]
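A correspondingly minimal utilitarian sketch, again purely hypothetical and not drawn from any real system, reduces the decision to maximizing a utility estimate, here taken (as in the text above) to be the expected number of survivors; the names and numbers are made up.

```python
# A minimal, hypothetical sketch of a utilitarian emergency policy: pick the
# maneuver that maximizes a utility estimate, here defined as the expected
# number of people who survive. All names and values are made up.
from dataclasses import dataclass


@dataclass
class Outcome:
    name: str
    expected_survivors: float   # estimated number of people surviving this maneuver


def choose_utilitarian(options: list[Outcome]) -> Outcome:
    """Return the maneuver whose outcome maximizes expected survivors."""
    return max(options, key=lambda o: o.expected_survivors)


options = [
    Outcome("stay on course", expected_survivors=1.0),
    Outcome("swerve left", expected_survivors=4.5),
]
print(choose_utilitarian(options).name)  # -> swerve left
```

The contrast makes the programming difficulty concrete: the rule-based version needs an agreed ordering of duties, while the utilitarian version needs a defensible, computable definition of utility.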

Further ethical questions include privacy issues and the possible loss of jobs due to the emergence of autonomous vehicles.


[1] Fagnant, D. J., & Kockelman, K. (2015, May 16). Preparing a nation for autonomous vehicles: Opportunities, barriers and policy recommendations. Transportation Research Part A: Policy and Practice, 77, 167-181. doi:10.1016/j.tra.2015.04.003

[2] Hevelke, A., & Nida-Rümelin, J. (2014). Responsibility for Crashes of Autonomous Vehicles: An Ethical Analysis. Science and Engineering Ethics, 21(3), 619-630. doi:10.1007/s11948-014-9565-5

[3] Marchant, G. E., & Lindor, R. A. (2012, December 17). The Coming Collision Between Autonomous Vehicles and the Liability System. Santa Clara Law Review, 52, 1321-1340. Retrieved October 26, 2016, from http://digitalcommons.law.scu.edu/lawreview

[4] Thomson, J. J. (1985, May). The Trolley Problem. The Yale Law Journal, 94(6), 1395-1415. Retrieved October 25, 2016.

[5] Meyer, G., & Beiker, S. (2014). Road Vehicle Automation (pp. 93-102). Springer International Publishing.