Look Out, Californians! Uber Has New Permit to Test Self-Driving Vehicles

In April 2019, we posed five key questions we’d like answered before we trust self-driving vehicles on the road. So far, those questions have not been answered. But on February 5, 2020, the California Department of Motor Vehicles granted Uber a new permit to test self-driving vehicles in California.
Currently, these permits do not allow testing of completely autonomous vehicles; trained safety drivers must be behind the wheel.
Uber has not announced any immediate plans to begin driverless road tests on California streets, and the shift toward fully autonomous vehicles is likely to be gradual.
However, it is unclear exactly how and when the general public will be notified as autonomous vehicles begin driving on public roads.
It’s also worth noting that 65 other companies have permits to test autonomous vehicles, with a driver, in California. And Waymo has been testing fully driverless vehicles on public roads since 2018.
But Uber already has a questionable record in terms of self-driving vehicle research. The National Transportation Safety Board (NTSB) reported findings that Uber had disabled the automatic emergency braking system in the self-driving vehicle that killed a pedestrian in a March 2018 crash in Arizona.
Uber suspended its testing after that incident. The company claims it has since improved its safety program and raised the required qualifications for its test drivers, including medical fitness, technological expertise, and a Commercial Driver’s License. Operators are limited to four hours behind the wheel per workday, must take a break or switch positions every two hours, and are monitored by in-vehicle camera systems and remote monitoring teams.
In response to reports in October 2018 about failures of its vehicles’ pedestrian-detection technology, Uber said it had made changes to improve detection and tracking of both pedestrians and cyclists. The company also says it has improved its emergency braking systems and made other updates to its safety technology.
Still, a 2019 Leasing Options survey indicated that 62.6% of respondents would trust a driverless car made by an automaker, such as Tesla or BMW, while only 6% would trust one made by a taxi company like Uber. A 2019 AAA survey found that 71% of people are afraid to ride in fully autonomous vehicles.
Clearly, there is a long way to go before the public has full confidence in driverless vehicles on our streets.


5 Questions We Should Be Asking About Driverless Vehicles

Originally published April 29, 2019
There has been a lot of excitement and media attention around autonomous vehicles from companies like Tesla, Uber, Waymo, Baidu, and numerous others. Much of it sounds like marketing: Safer than human drivers! Eliminates human error! Saves lives! But as exciting as the technology sounds, many questions about self-driving vehicles remain.
Recently, Tesla CEO Elon Musk laid out an audacious plan to have a network of fully self-driving taxis on the road by 2020.
But as personal injury attorneys, we wonder if the right questions are being asked about autonomous vehicle safety. Here’s what we want to know:

How Safe Are They, Really? (Show us the data.)

In 2017, the National Highway Traffic Safety Administration (NHTSA) released data on Tesla’s driver-assist technology, Autosteer. The NHTSA claimed that Autosteer reduced crashes by 40 percent, and Tesla has cited this incredible statistic ever since.
In fact, that number is incredible – as in difficult to believe. R.A. Whitfield, the director of a company called Quality Control Systems Corp., asked NHTSA for the supporting data. He was met with a bureaucratic wall. Eventually, QCS sued the NHTSA under the Freedom of Information Act.
A recent Los Angeles Times article quoted Whitfield saying that NHTSA’s numbers don’t add up. The methodology compared airbag deployments per mile driven before and after Autosteer was installed. However, for many of the cars, Tesla’s reports were missing the miles traveled before the automated driving system was installed. That gap artificially inflated the pre-Autosteer deployment rate, making the post-Autosteer numbers look like an improvement. Data that incomplete is flawed and should not have been relied on.
The subset of data that did include the pre-Autosteer mileage actually showed that Autosteer may be roughly 60% more dangerous.
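To see why the missing mileage matters, here is a minimal sketch with invented numbers (a hypothetical illustration, not NHTSA’s or Tesla’s actual figures). When a fleet’s pre-Autosteer miles are under-reported but its pre-Autosteer airbag deployments are still counted, the “before” rate is inflated, and the “after” rate looks like a big improvement even if nothing changed.

```python
# Hypothetical illustration only -- these numbers are invented, not NHTSA's or Tesla's data.

def rate_per_million(deployments, miles):
    """Airbag deployments per million miles driven."""
    return deployments / miles * 1_000_000

# Suppose a fleet actually drove 50 million miles before Autosteer and
# 50 million miles after, with 40 airbag deployments in each period.
true_before = rate_per_million(40, 50_000_000)      # 0.8 per million miles
after       = rate_per_million(40, 50_000_000)      # 0.8 per million miles -> no real change

# Now suppose most of the pre-Autosteer mileage is missing from the reports,
# so only 10 million of those 50 million miles get counted -- but all 40
# pre-Autosteer deployments still do.
reported_before = rate_per_million(40, 10_000_000)  # 4.0 per million miles

print(f"True change in deployment rate:     {(after - true_before) / true_before:+.0%}")
print(f"Apparent change with missing miles: {(after - reported_before) / reported_before:+.0%}")
# Prints +0% (true) versus -80% (an illusory "improvement" created by the missing miles)
```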
At any rate, the claims by Tesla and NHTSA on the safety of driverless cars are highly questionable, at best.

Can self-driving vehicles differentiate between all types of vehicles, pedestrians, and other objects on our streets?

While Autosteer can safely drive when the driver is incapacitated, it is apparently not smart enough to pull over for the police. Waymo claims its cars’ programming has accounted for that; in fact, it says its cars respond safely when traffic lights are out and police officers are directing traffic manually. However, Waymo hasn’t shared that functionality with Tesla.
Bicycles and other light modes of traffic, such as scooters, roller skates, and skateboards, may also present a challenge. They vary in size and speed, are prone to sudden changes in behavior, and may not always follow the rules of the road. And they all present challenges to the safe operation of self-driving vehicles.
Former Baidu executive Andrew Ng is a prominent advocate of artificial intelligence. He argues that, rather than making self-driving vehicles safer around pedestrians and cyclists, we should make the roads safer for self-driving vehicles by changing how people use them. He suggests partnering with government to “ask people to be lawful and considerate.”
It seems extraordinarily naïve, and more than a bit backwards, to ask people to behave differently so the streets are safe for automated vehicles.

How will autonomous vehicles be tested and regulated for public use?

RAND Corporation calculated how many miles autonomous cars would need to drive to prove statistically that they are as safe as human drivers. Its estimate was 275 million miles. So far, Tesla has logged only 130 million; it will take many more years to reach that benchmark.
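For context, a figure like 275 million miles is consistent with a standard statistical bound for rare events. The sketch below is our own back-of-the-envelope illustration, not necessarily RAND’s exact method: assuming a human-driver benchmark of roughly 1.09 fatalities per 100 million miles, and a fleet that drives every one of those miles without a fatality, about 275 million miles are needed to show with 95% confidence that the fleet is at least that safe.

```python
import math

# Back-of-the-envelope sketch of the statistics behind a "275 million miles" figure.
# Assumptions (ours, for illustration -- not necessarily RAND's exact method):
#   - human-driver benchmark: about 1.09 fatalities per 100 million miles
#   - goal: 95% confidence that the autonomous fleet's fatality rate is no worse
#   - best case: the fleet drives all of those miles with zero fatalities

human_rate = 1.09 / 100_000_000   # fatalities per mile
confidence = 0.95

# With zero observed fatalities, the exact Poisson/binomial bound gives
# miles_needed = -ln(1 - confidence) / human_rate (essentially the "rule of three").
miles_needed = -math.log(1 - confidence) / human_rate

print(f"Miles needed: about {miles_needed / 1e6:.0f} million")   # ~275 million
```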
RAND recommended developing more innovative testing methods, which would probably still leave some uncertainty. They further recommended an adaptive set of regulations to evolve with the technology.
The NHTSA has released a 15-step assessment guideline for the testing of driverless car technology. However, these guidelines only address concerns broadly, leaving states to use their own interpretation in creating actual evaluations. NHTSA admits they don’t have enough staff who are experts in autonomous vehicle technology and qualified to make such assessments. There are currently no plans to assemble such expertise at either federal or state levels.
When dockless scooter companies unceremoniously dumped fleets of scooters on city streets, many municipalities were forced to move urgently to regulate them. Automobiles are exponentially more dangerous than scooters. Unleashing autonomous cars before public safety standards are in place could result in a catastrophe.

How are choices made in no-win scenarios?

Currently, autonomous motor vehicles still require a human operator; as with cruise control, the driver can take over the steering wheel at any time. But if the goal is to create fully autonomous vehicles, how will they make value judgments when human lives are at stake?
If a sudden obstacle makes it impossible to stop in time, what is the car programmed to do? Does the car instead swerve into traffic, or into a crowd of pedestrians? Or does it continue straight and kill its own driver?
Moreover, who makes these decisions when writing the software that gives the vehicle the autonomy to choose?
The Massachusetts Institute of Technology’s Media Lab invited people from all over the world to play a game designed to determine their moral preferences in no-win scenarios like the one described above.
The results were disturbing. Respondents from different cultures showed troubling tendencies regarding which type of person they would choose to run into. For example, some were more willing to run down a fat person than an athletic one; others, a homeless person rather than a business executive, or a jaywalker rather than a dog.
The ethical questions facing autonomous driving programming are worth asking.

Who is liable for injuries when people are run down by self-driving vehicles?

There have been a number of self-driving car accidents already, some of them fatal. If the self-driving vehicle is at fault, who is liable for the other person’s injuries?
The Department of Transportation defines the computer as the “driver” of an autonomous vehicle. Someone who is injured in an accident with an autonomous car may have a product liability case. Or the car’s owner may be liable, if the owner didn’t maintain the vehicle properly. Also, in a semi-autonomous vehicle, the driver still has a responsibility to pay attention. Sorting out the degree to which the driver is liable versus the car’s owner or manufacturer could become a legal quagmire.
Autonomous vehicles present many exciting possibilities. But perhaps the most important of our questions about self-driving vehicles is this: why rush to put them on the streets before fully testing them or agreeing on common regulatory standards? When solutions are implemented before they are fully tested, the result can be worse than the problem they were designed to solve.
At TorkLaw, we see too many accidents already. We’d prefer a safer approach.
If you are injured in an accident with another vehicle, driverless or not, we can help. Call us for a free, no-obligation consultation.
