ECE 3400 Group 6: Ethics Homework
Ayomi Sanni (acs333), Drew Mera (dnm54), Emily Sun (eys27), Eric Lyne (eal234), Jacob Glueck (jng55), Joo Yeon Chae (jc2464)
Links/references on the case you are discussing:
- Tesla driver dies in first fatal crash while using autopilot mode
- Inside the Self-Driving Tesla Fatal Accident
- The driver who died in a Tesla crash using Autopilot ignored at least 7 safety warnings
- Tesla’s Statement
- Economic Effects of Automated Vehicles
Brief explanation of the case you are discussing.
The case we are discussing concerns the first known fatality involving a self-driving car: the death of Joshua Brown, whose Tesla Model S crashed while operating in Autopilot mode. There is debate over how much of the fault lies with the technical limits of Tesla's Autopilot safety features and how much lies with how Brown responded to the system's warnings. "Brown was audibly warned six times to keep his hands on the steering wheel. He was also warned visually, seven times, on his Tesla's dashboard," as described by the Washington Post.
Define who the stakeholders are.
Each driver behind the wheel of a self-driving car is at risk of injury or even death if the car malfunctions. A malfunction could kill not only the driver and passengers but also nearby drivers and pedestrians. On the other hand, Tesla has its reputation, and exposure to lawsuits, at stake whenever an accident occurs.
Try to address the situation with a:
- Utilitarian test: In terms of utility, self-driving cars have a lot of potential: they could make travel easier and safer. However, failures can kill people and destroy property. While the technology is still in development, it seems that self-driving cars will ultimately be safer than human-driven ones. Humans are subject to fatigue, intoxication, and distraction; most car crashes are caused by human error, and ostensibly, self-driving cars can prevent these crashes. However, this case exemplifies a serious issue: self-driving cars may lure people into trusting them to the point where the human driver simply lets the car drive itself with no oversight. If people become too reliant on immature technology, self-driving cars could cause many deaths, far more than conventional cars. But as the technology matures, self-driving cars will eventually perform better than human drivers, and thus they pass the utility test.
- Justice test: If self-driving cars can become reliably safer than a typical human driver, this advancement will be just. Those who pay for self-driving cars will be rewarded with added safety and comfort without imposing risk on others. However, if self-driving cars are hazardous, it is unjust for pedestrians and other drivers to be put at risk by someone else's purchase of one.
- Virtue test: Tesla's vision is to create cars that can eventually drive themselves, taking the risk out of the hands of the driver. This does pass the virtue test, despite the current imperfection of self-driving vehicles. Eventually they will become far safer, but to get there the technology needs a lot of work and cannot be taken for granted. Tesla still maintains that the operators of its autonomous vehicles should keep their hands on the wheel and remain alert for this exact reason. The company works to make the technology as effective as possible, but also ensures that the operators of its cars understand its limitations. In addition, in Tesla's statement on the incident, the company noted that its cars have been much safer than the average vehicle in the US: "This is the first known fatality in just over 130 million miles where Autopilot was activated." This compares to a death for every 94 million miles across all vehicles (a quick back-of-the-envelope comparison of the two rates follows below).
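Taking Tesla's two figures at face value, and assuming the Autopilot and national mileage pools are directly comparable (which Tesla's statement does not establish), the implied fatality rates are:

1 fatality / 130 million miles ≈ 7.7 fatalities per billion miles (Autopilot)
1 fatality / 94 million miles ≈ 10.6 fatalities per billion miles (all US vehicles)

By Tesla's own numbers, then, the Autopilot fatality rate is about 94/130 ≈ 72% of the national average, or roughly 28% lower per mile driven.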
What are the economic, social, and political constraints that play into this scenario?
- Economic: The technology for autonomous vehicles is steadily improving and will most likely dominate the automotive industry in the near future. Once self-driving cars become reliable and affordable, they will change the automotive market and create significant ripple effects throughout many industries. For example, the software components of a vehicle will matter more in determining its value than the hardware. The number of vehicles purchased per household would also most likely decrease as self-driving transportation such as ride sharing becomes widespread. Furthermore, heavy commercial trucking would most likely be automated, making freight more cost-effective and freeing drivers for other jobs. Overall demand for vehicle repairs and traffic police would also fall because of the autonomy of these systems.
- Social: One major issue in this conflict is the way in which Elon Musk has chosen to market Tesla's Autopilot system. A company like Tesla must balance maintaining its reputation against the underlying importance of safety, and that is a tricky line to walk. Companies constantly want to demonstrate cutting-edge technology to woo customers and attract investors. At the same time, safety is the key concern, and promising numbers and flashy statistics cannot substitute for how safe the autonomous system actually is.
- Political: There are also legal questions about whether self-driving cars should be allowed on the road. A major concern is whether a robotic vehicle can share lanes with uncooperative human drivers. With a self-driving car, who is to blame for an accident becomes much more ambiguous: the other human drivers involved, the operator of the autonomous vehicle, or the company that produces the self-driving system. This issue has become a legal nightmare, and until accidents involving self-driving cars become far rarer, the cars will continue to be constrained by these concerns.
Are there creative solutions that would benefit all stakeholders?
One creative solution Tesla came up with that benefits all stakeholders is a strikeout system: drivers can no longer ignore the safety warnings if they want to keep using Autopilot during their drive. This also protects Tesla from future lawsuits over accidents caused when a driver did not keep their hands on the wheel (a minimal sketch of how such a mechanism might work appears below). If the strikeout system does not work, Tesla could begin distributing the Autopilot software only to users who are trusted to respect the limitations of the technology. That way, there is control over who is testing the technology until it can be perfected and released to everyone. If that too fails, Tesla should consider ending distribution of the software to users on the open road and allow only its employees to test it until the bugs are worked out.
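To make the strikeout idea concrete, here is a minimal sketch of such a policy. Everything in it is a hypothetical illustration: the class name, the warning threshold, and the lock-until-reset behavior are our assumptions about how a strikeout could work, not Tesla's actual implementation.

```python
# Hypothetical sketch of a "strikeout" policy for ignored hands-on-wheel
# warnings. Names and thresholds are illustrative assumptions, not Tesla's
# actual implementation.

class AutopilotStrikeoutPolicy:
    """Disable Autopilot for the rest of the drive after too many
    ignored hands-on-wheel warnings."""

    def __init__(self, max_ignored_warnings: int = 3):
        self.max_ignored_warnings = max_ignored_warnings  # assumed threshold
        self.ignored_warnings = 0
        self.locked_out = False

    def record_warning(self, driver_responded: bool) -> None:
        """Call each time the car issues a hands-on-wheel warning."""
        if driver_responded:
            self.ignored_warnings = 0       # responding resets the count
        else:
            self.ignored_warnings += 1
            if self.ignored_warnings >= self.max_ignored_warnings:
                self.locked_out = True      # strike out: lock Autopilot

    def autopilot_available(self) -> bool:
        """Stays False until the lockout is cleared (e.g., a new drive)."""
        return not self.locked_out


# Example: three ignored warnings in a row lock Autopilot out.
policy = AutopilotStrikeoutPolicy()
for responded in (False, False, False):
    policy.record_warning(responded)
assert not policy.autopilot_available()
```

The key design choice is that ignoring warnings has a persistent consequence: after enough strikes, Autopilot stays disabled for the rest of the drive instead of merely chiming again, which directly addresses the failure mode in this case, a driver who tunes out repeated warnings.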