The Self-Driving Car Dilemma

  The self-driving car dilemma is a test of how ethics relates to technology.

When large boxes fall from a truck ahead of a self-driving car whose instructions can be manually overridden, there are three choices: swerve left and hit an SUV, keep going and risk the life of the driver, or swerve right and hit a motorcyclist. I choose to program the car to hit the SUV.

    My choice is to program the self-driving car to swerve left and hit the SUV because this option minimizes the most certain danger to people: the other two paths are more likely to kill the driver of the self-driving car or the motorcyclist. Programming the car to take either of those paths could even be argued to be premeditated harm, despite no one intending to hurt anyone in the emergency itself. Although the issue is highly subjective, since all three choices cause harm, I believe it is ethically best to minimize the amount of harm caused.
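The minimize-harm reasoning above can be sketched as a simple decision rule. The numeric harm scores below are hypothetical values chosen purely to illustrate the comparison, not real estimates:

```python
def choose_action(harm_estimates):
    """Return the action whose estimated harm is lowest."""
    return min(harm_estimates, key=harm_estimates.get)

# Hypothetical harm scores reflecting the essay's reasoning:
# hitting the SUV is judged least likely to be fatal, while the
# other two options put the driver or motorcyclist at grave risk.
harm = {
    "swerve_left_hit_suv": 0.3,
    "continue_risk_driver": 0.9,
    "swerve_right_hit_motorcyclist": 0.9,
}

print(choose_action(harm))  # -> swerve_left_hit_suv
```

Of course, the hard part of the dilemma is not the comparison itself but deciding what the harm scores should be, and who gets to decide them.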


Credit: industrywired.com

    This problem involves many considerations. For instance, the driver, by operating a self-driving vehicle, willingly took on risk by turning over some driving autonomy. There are also arguments based on economics, such as a car company likely favoring the safety of the driver above all else in order to boost sales.


Tesla Model S- Credit: autoevolution.com

    Legal issues could arise in many ways; for example, programming that maximizes the driver's safety may make the car more likely to hurt others, who could then sue the company. We could also look deeper into the situation and its unknown variables, such as whether there are children in the SUV, or the speeds of the drivers, which could affect the outcome if the situation isn't as clear cut as in the question. There are also questions such as: why does it seem like the SUV driver is being punished for being in a safer vehicle than the motorcyclist?

    Finally, we could try to address a larger problem in society through an "option D"; for example, should we get rid of self-driving cars because of this complex liability problem? Many people already don't feel comfortable with self-driving cars. Or should all cars be self-driving? Perhaps that would allow them to coordinate with one another better. There is then the question of what kind of legislation will be enacted and what the opposition to it will be.

    I believe that the advent of self-driving cars introduces too much potential for conflict and danger. We should avoid relying on technology in situations that concern lives, especially in such a direct way. By placing too much reliance on technology, we invite complex moral and legal questions that could be avoided. In addition, people should not have to share the road with self-driving cars if they do not feel comfortable doing so, nor feel pressured to transition to self-driving cars themselves.

Thus, this dilemma illustrates well the relationship between morals and technology, and the difficult questions that relationship brings.


Additional information/discussion related to the ethics of self-driving cars- http://insidescience.org/news/moral-dilemmas-self-driving-cars

https://www.gsb.stanford.edu/insights/exploring-ethics-behind-self-driving-cars
