Self-Driving Cars: Car Accident Ethics and Liability


Self-driving cars sound like something of the future, but the future is here, and it has been for a while. Google’s Self-Driving Car Project claims to have self-driven more than 1.5 million miles across California, Texas, Washington, and Arizona. While self-driving cars are not yet on the streets of South Carolina, it looks like we’re certainly heading in that direction. Many people view this technological advancement as an opportunity to cut down on the 5.25 million driving accidents that the National Highway Traffic Safety Administration reports take place every year. However, there are a few questions surrounding car accident ethics and liability that must be resolved before self-driving cars can become the new norm.


Should Self-Driving Cars Be Programmed Based On Utilitarian Ethics?


Utilitarianism is a system of ethics under which an action is morally right if it is the choice that leads to happiness for the greatest number of people in society. CBS News reports that Americans generally approve of self-driving cars, or autonomous vehicles, being governed by ‘utilitarian ethics.’ This means that, when surveyed, most people generally approve of self-driving vehicles that are programmed to minimize the total number of deaths in a car crash, even if that means sacrificing the passengers in the car. However, the survey also revealed that participants did not want to ride in such a car themselves and would not support legislation requiring driverless cars to be programmed with utilitarian ethics. In other words, people generally support utilitarian thinking when given an impersonal hypothetical, but as soon as they imagine themselves or a loved one in the hypothetical collision, their opinions shift and they are much more likely to opt for self-preservation programming.


This study and the moral dilemma behind it are quite interesting, as they address a choice the public hasn’t really been faced with before. We don’t currently own appliances that are programmed to harm us if they determine that doing so is for the greater good. Would you be willing to purchase a self-driving car that was programmed to sacrifice your chance of surviving an accident in exchange for possibly saving multiple lives? It seems that most people would answer no. However, CBS News notes that if lawmakers do not pass legislation requiring utilitarian programming in self-driving vehicles, there may be a ‘race to the bottom’ in which car manufacturers start producing self-protective cars in response to consumers demanding that personal safety be prioritized.


Regulations


The moral debate surrounding self-driving cars will undoubtedly help shape whether regulations are passed requiring driverless cars to be programmed according to utilitarian standards. CBS News interviewed an assistant professor of psychology at the University of Oregon, who predicts that regulations requiring utilitarian programming may cause society to hesitate about pursuing driverless cars altogether. On the other hand, others in the field predict that people will support the driverless car movement regardless of pro-utilitarian legislation, given that self-driving cars will likely promise an increase in overall safety on our nation’s roadways.


Who Should Be Liable If a Self-Driving Car Gets Into an Accident?


Although driverless cars are programmed to avoid accidents, accidents will still inevitably happen. But when a self-driving car gets into an accident, who is legally to blame? According to an article from Vocativ, we currently do not have a standardized rule for determining who is liable. The article predicts that car manufacturers will likely have to accept more responsibility for driverless car crashes than they currently do, and that lawmakers will have to reinterpret our country’s liability laws once self-driving cars become prevalent.


However, an assistant professor from the University of South Carolina’s School of Law believes that our current liability system, known as tort law, will not have to change much in response to self-driving cars, as our current laws already allow consumers to sue manufacturers for defective products. So it seems that manufacturers will likely be liable in driverless car accidents, but will individual drivers be completely off the hook? Probably not. The assistant professor also points out that individuals will still need insurance to cover non-collision risks such as vandalism and hail.


When Will Self-Driving Cars Be Available From Google?


According to LiveScience, being able to purchase a completely driverless car from Google is still a long way off, as there are several problem areas that first need to be addressed. These problem areas include:

  1. Software: Statistically, people actually do a fairly good job of driving, so engineers will need to develop better software for self-driving cars in order to make them significantly safer than vehicles operated by humans.

  2. Maps: Google Maps has not yet mapped all of the roadways in the United States, so better maps will be necessary before self-driving cars can traverse the entire country.

  3. Sensors: Driverless cars need to have very advanced sensors so that they can distinguish between things such as dangerous potholes and harmless plastic bags in the road. This technology still has a way to go.

  4. Communication: Currently, communication between driverless cars is minimal, but ideally this technology will improve so that the cars can more efficiently assess situations on the road.

  5. Ethics: Machines are not yet able to weigh ethical decisions, and it is impossible to program a reaction for each and every possible situation that a driverless vehicle may face on the road.

While the problem areas listed above are crucial and seem to indicate that a self-driving car sitting in your driveway is a long way off, Google’s driverless cars currently on the road actually have a fairly good driving record. Google has reported that in six years, its self-driving cars have been involved in only 11 minor crashes. The report also notes that most of these incidents can be attributed to human error and may not have been preventable. Perhaps completely automated vehicles available for consumers to purchase are not all that far off.


In Need Of Legal Advice?


If you have been involved in a car accident, you likely have many questions regarding liability and your legal options. The experienced car accident attorneys in Rock Hill at the Elrod Pope Law Firm are well-versed in South Carolina’s driving laws and would be happy to answer any questions that you have.


Get in touch with us today to get started with your FREE case review. We’re only a call, click, or short drive away.