
5 Questions We Should Be Asking About Driverless Vehicles



There has been a lot of excitement and media attention around autonomous vehicles from companies like Tesla, Uber, Waymo, Baidu, and numerous others. Much of it sounds like marketing: Safer than human drivers! Eliminates human error! Saves lives! But as exciting as the technology sounds, many questions about driverless vehicles remain.

Recently, Tesla CEO Elon Musk laid out an audacious plan to have a network of fully self-driving taxis on the road by 2020.

But as personal injury attorneys, we wonder if the right questions are being asked about autonomous vehicle safety. Here’s what we want to know:

How Safe Are They, Really? (Show us the data.)

In 2017, the National Highway Traffic Safety Administration (NHTSA) released data on Tesla’s driver-assist technology, Autosteer. The NHTSA claimed that Autosteer reduced crashes by 40 percent, and Tesla has cited this incredible statistic ever since.

In fact, that number is incredible – as in difficult to believe. R.A. Whitfield, the director of a company called Quality Control Systems Corp., asked NHTSA for the supporting data. He was met with a bureaucratic wall. Eventually, QCS sued the NHTSA under the Freedom of Information Act.

A recent Los Angeles Times article quoted Whitfield saying that NHTSA’s numbers don’t add up. The methodology compared airbag deployments per mile driven before and after Autosteer installation. However, for many of the cars, Tesla’s reports were missing the miles traveled before the automated driving system was installed. Understating those miles artificially inflates the pre-Autosteer crash rate, making Autosteer appear far more effective than the data supports. Records with that flaw should not have been counted.
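To see how missing mileage skews the comparison, consider a toy calculation. The figures below are invented purely for illustration; they are not Tesla’s actual numbers.

```python
# Toy illustration of the flaw Whitfield describes. All numbers are
# hypothetical; they are NOT Tesla's actual data.

# Suppose one car logged 40,000 miles before Autosteer and 60,000 after,
# with one airbag deployment in each period.
pre_miles, post_miles = 40_000, 60_000
pre_crashes, post_crashes = 1, 1

# Correct comparison: deployments per mile in each period.
true_pre_rate = pre_crashes / pre_miles   # 2.5 per 100,000 miles
post_rate = post_crashes / post_miles     # ~1.7 per 100,000 miles

# Flawed comparison: most of the pre-Autosteer miles are missing from
# the report, so the same crash is divided by a much smaller denominator.
reported_pre_miles = 10_000
flawed_pre_rate = pre_crashes / reported_pre_miles  # 10 per 100,000 miles

print(f"true pre-Autosteer rate:   {true_pre_rate * 1e5:.1f} per 100k miles")
print(f"flawed pre-Autosteer rate: {flawed_pre_rate * 1e5:.1f} per 100k miles")
print(f"post-Autosteer rate:       {post_rate * 1e5:.1f} per 100k miles")

# With the true denominator, Autosteer cuts the crash rate by about 33%.
# With the understated denominator, it appears to cut it by about 83%.
# Same car, same crashes; the "improvement" comes from the missing miles.
```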

The records that did include pre-Autosteer mileage actually showed that Autosteer may be 60 percent more dangerous.

At any rate, the claims by Tesla and NHTSA on the safety of driverless cars are highly questionable, at best.


Can driverless vehicles differentiate between all types of vehicles, pedestrians, and other objects on our streets?

Autosteer can reportedly keep driving when the driver is incapacitated, yet it is apparently not smart enough to pull over for the police. Waymo claims its cars’ programming accounts for situations like that; the company says its vehicles respond safely when traffic lights are out and police officers are directing traffic manually. Waymo, however, has not shared that functionality with Tesla.

Bicycles, scooters, roller skaters, and skateboarders present another challenge. They come in varying sizes, travel at varying speeds, are prone to sudden changes in behavior, and don’t always follow the rules of the road. All of them complicate the safety case for driverless cars.

Andrew Ng, a former Baidu executive and prominent advocate of artificial intelligence, argues that rather than making driverless cars safer around pedestrians and smaller, human-powered vehicles, we should make the roads safer for driverless cars. He suggests partnering with government to “ask people to be lawful and considerate.”

It seems extraordinarily naïve, and more than a bit backwards, to ask people to behave differently so the streets are safe for automated vehicles.

How will autonomous vehicles be tested and regulated for public use?

RAND Corporation calculated how many miles autonomous cars would need to drive to prove statistically that they are as safe as human drivers. Their estimate was 275 million miles. So far, Tesla has logged only about 130 million; it will take many more years to reach that benchmark.
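For a sense of where a figure like 275 million miles comes from, here is a back-of-the-envelope version of the statistical argument. It assumes the human benchmark RAND worked from (roughly 1.09 traffic fatalities per 100 million vehicle miles) and applies the standard zero-failure confidence bound for a Poisson process; it is a sketch of the reasoning, not RAND’s full analysis.

```python
import math

# How many fatality-free miles must autonomous cars log to show, with
# 95% confidence, that their fatality rate is no worse than human
# drivers'? Benchmark: ~1.09 fatalities per 100 million vehicle miles.
human_rate = 1.09 / 100_000_000   # fatalities per mile
confidence = 0.95

# If fatal crashes follow a Poisson process with rate r, the chance of
# seeing zero fatalities in n miles is exp(-r * n). We can rule out
# "rate >= human_rate" at the 95% level once
# exp(-human_rate * n) <= 0.05, i.e. n >= -ln(0.05) / human_rate.
required_miles = -math.log(1 - confidence) / human_rate

print(f"required fatality-free miles: {required_miles / 1e6:.0f} million")
# -> about 275 million, matching the estimate cited above.
```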

RAND recommended developing more innovative testing methods, which would probably still leave some uncertainty. They further recommended an adaptive set of regulations to evolve with the technology.

The NHTSA has released a 15-point assessment guideline for testing driverless car technology. However, the guidelines address concerns only broadly, leaving each state to create its own evaluations based on its own interpretation. NHTSA admits it does not have enough staff who are expert in autonomous vehicle technology and qualified to make such assessments, and there are currently no plans to assemble that expertise at either the federal or state level.

When dockless scooter companies unceremoniously dumped fleets of scooters on city streets, many municipalities were forced to scramble to regulate them. Automobiles are far more dangerous than scooters; unleashing autonomous cars before public safety standards are in place could end in catastrophe.


How are choices made in no-win scenarios?

Currently, autonomous motor vehicles still require a human operator: as with cruise control, the driver can take over the wheel at any time. If the goal is to create fully autonomous vehicles, how will they make value judgments when human lives are at stake?

If a sudden obstacle makes it impossible to stop in time, what is the car programmed to do? Does it swerve into traffic, or into a crowd of pedestrians? Or does it continue straight and kill its own driver?

Moreover, who makes these decisions when creating the program that gives the vehicles the full autonomy to choose?
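To make the point concrete, here is a minimal, entirely hypothetical sketch of what such a choice looks like in code. It does not reflect any manufacturer’s actual logic; the maneuvers, risk estimates, and weights are invented to show that someone has to pick them.

```python
from dataclasses import dataclass

# Hypothetical sketch: a "no-win" maneuver choice is ultimately a
# function someone writes, with weights someone chooses. None of the
# options, risks, or weights below reflect any real vehicle's logic.

@dataclass
class Maneuver:
    name: str
    occupant_risk: float    # estimated chance of serious harm to occupants
    pedestrian_risk: float  # estimated chance of serious harm to others

def choose(options: list[Maneuver], occupant_weight: float,
           pedestrian_weight: float) -> Maneuver:
    """Pick the maneuver with the lowest weighted expected harm.

    Whoever sets these weights has answered the ethical question,
    in code, long before any crash occurs.
    """
    return min(options, key=lambda m: occupant_weight * m.occupant_risk
               + pedestrian_weight * m.pedestrian_risk)

options = [
    Maneuver("brake straight", occupant_risk=0.6, pedestrian_risk=0.1),
    Maneuver("swerve toward crowd", occupant_risk=0.1, pedestrian_risk=0.8),
]

# Equal weights favor braking straight; heavily weighting the occupants
# flips the decision toward the crowd.
print(choose(options, occupant_weight=1.0, pedestrian_weight=1.0).name)
print(choose(options, occupant_weight=5.0, pedestrian_weight=1.0).name)
```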

The Massachusetts Institute of Technology’s Media Lab invited people from all over the world to play a game designed to reveal their moral preferences in no-win scenarios like the one described above.

The results were disturbing. Different cultures demonstrated distinct tendencies regarding which types of people they would choose to run into: some respondents were more willing to run down a fat person than an athletic one; others, a homeless person rather than a business executive, or a jaywalker rather than a dog.

The ethical questions facing those who program autonomous vehicles are worth asking.


Who is liable for my injuries if I’m run down by a driverless car?

There have been a number of self-driving car accidents already, some of them fatal. If the self-driving vehicle is at fault, who is liable for the other person’s injuries?

The Department of Transportation defines the computer as the “driver” of an autonomous vehicle, so someone injured in an accident with an autonomous car may have a product liability case. The car’s owner may also bear responsibility, for instance if the vehicle wasn’t properly maintained. And in a semi-autonomous vehicle, the human driver still has a responsibility to pay attention. Sorting out the degree to which the driver, the owner, and the manufacturer are each liable could become a legal quagmire.

Autonomous vehicles present many exciting possibilities. But perhaps the most important of our questions about driverless vehicles is this: why rush to put them on the streets before fully testing them or reaching common regulatory standards? When solutions are deployed before they are fully tested, the result can be worse than the problem they were designed to solve.

At TorkLaw, we see too many accidents already. We’d prefer a safer approach.

If you are injured in an accident with another vehicle, driverless or not, we can help. Call us for a free, no obligation consultation.
