Will Self-Driving Cars Make Roads Safer?

Self-driving cars are a hot topic of conversation these days. As more companies begin developing autonomous technology and vehicles with self-driving features become more commonplace, safety concerns have arisen. Are self-driving cars really safer than other vehicles on the road? Can the technology be trusted? And who is responsible in a self-driving car accident?

Will Self-Driving Cars Solve Human Error Behind the Wheel?

According to the National Highway Traffic Safety Administration (NHTSA), roughly 90% of auto accidents are due to human error. Each year, about 35,000 people die in traffic crashes in the U.S. Of these fatal crashes, about 10% involve distracted driving and nearly 30% involve drunk driving. Beyond fatalities, traffic accidents also cause millions of injuries each year; in 2015 alone, an estimated 4.4 million Americans were injured in auto accidents.

Some researchers estimate that self-driving vehicles could reduce traffic fatalities by up to 90% by the middle of the century. If that holds, autonomous vehicles could save roughly 30,000 lives in the U.S. each year, a life-saving impact those researchers have compared to that of modern vaccines.
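
As a rough back-of-the-envelope check, using the figures cited above (this is an illustration, not a projection from any particular study):

0.90 × 35,000 fatalities per year ≈ 31,500 lives potentially saved each year

That is in the same ballpark as the roughly 30,000-lives estimate.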

These numbers suggest that removing human error from the driving equation could save many lives. But that is easier said than done. For one, automakers need to show that self-driving vehicles are actually safer than conventional vehicles in practice, not just in theory. And the projected safety gains depend heavily on widespread adoption: automakers need to prove the technology is safe, and consumers need to buy into it, before big changes can happen.

Measuring Autonomous Vehicle Safety

How do we measure the safety of self-driving technology? Unfortunately, we don’t yet have reliable statistics on how much safer autonomous vehicles are than other vehicles. Furthermore, regulatory agencies have not set a standard for how much safer self-driving cars need to be.

The question becomes trickier when you consider that self-driving vehicles can’t practically be road-tested long enough to establish their safety relative to other vehicles. According to the RAND Corporation, self-driving cars would need to be driven hundreds of millions of miles to provide enough data to compare their safety with that of conventional vehicles. Such testing could potentially take hundreds of years, which certainly isn’t a realistic timeframe for approving these vehicles for the public. Thus, companies need to come up with creative ways to test the safety of their self-driving vehicles.
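
To see why so many miles would be needed, consider a rough illustration (assuming roughly 35,000 fatalities and about 3 trillion vehicle miles driven per year in the U.S., figures in line with federal data from the mid-2010s):

35,000 fatalities per year ÷ 3 trillion miles per year ≈ 1.2 fatalities per 100 million miles

At that rate, a test fleet would have to log on the order of 100 million miles just to expect a single fatal crash, and far more than that to show with statistical confidence that autonomous vehicles are involved in fewer fatal crashes than human drivers.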

Do People Trust Self-Driving Vehicles?

When it comes to public opinion, self-driving cars face a tough crowd. According to a AAA survey, three-quarters of U.S. drivers say they would be afraid to ride in a self-driving vehicle. AAA reports that this hesitance likely stems from drivers’ own experience with autonomous vehicle technology.

Because current driver-assistance technology, such as Tesla’s Autopilot feature, does not work consistently enough to replace a driver, drivers might be concerned that fully autonomous vehicles will exhibit the same issues. However, AAA’s survey suggests that drivers who own vehicles with some autonomous features are far more likely to trust self-driving technology than those who do not. So perhaps exposure to autonomous driving technology is the key to quelling drivers’ fears.

Tesla Owners Sue over Autopilot Technology

Tesla, a pioneer of self-driving technology, was the subject of a class-action lawsuit filed Wednesday in California. Tesla owners involved in the lawsuit claim the company “mischaracterized” the capabilities of its vehicles’ Autopilot 2 technology. According to the lawsuit, the feature was sold to customers as “safe and stress-free,” but instead poses a danger to drivers.

Steve Berman, managing partner of Hagens Berman, the firm that filed the suit, says the Autopilot 2 program is erratic and dangerous. Tesla counters that the lawsuit is unfounded and lacks any legitimacy.

Tesla’s Autopilot feature came under significant scrutiny last year when a driver was killed in a Model S. Joshua Brown, an Ohio resident, was driving his Tesla in Autopilot mode on a Florida highway last May when the car drove under the trailer of a semi truck, and he died in the crash. According to Tesla, the semi truck cut across the divided highway perpendicular to the Model S, and Autopilot failed to distinguish the white trailer against the bright sky, so the brakes were never applied.

NHTSA launched an investigation following the fatal crash and found no defects in the Autopilot system; the agency also concluded that no recall was necessary.

However, regulators emphasize that Autopilot and similar features require drivers to stay attentive and engaged. While the system is proficient at preventing rear-end accidents, situations involving crossing traffic are beyond its capabilities. And while self-driving technology is helpful for assisting drivers, it has not yet reached the point where drivers can rely on it as a fully autonomous system.

Who Is to Blame When Self-Driving Cars Crash?

Even if self-driving vehicles reduce the human error factor, crashes will still occur. When they do, who is responsible? Experts say automakers will likely end up bearing the burden in these cases: because the manufacturer is responsible for the computerized driver in an autonomous vehicle, the companies putting these vehicles on the market will likely have to assume liability when the autonomous vehicle is at fault.

The Bottom Line About Self-Driving Vehicles

Autonomous vehicles will no doubt be sharing the roads with traditional vehicles in the near future. However, the reality is that the technology is still in its early phases. Motorists who currently drive vehicles with self-driving features must continue to pay attention behind the wheel and not rely solely on the technology for their safety on the road. Self-driving technology could eventually greatly reduce traffic fatalities and injuries, once widespread adoption takes place.

“The world of self-driving vehicle technology is certainly complex, and we are still a long way from widespread adoption of autonomous vehicles. However, we’re optimistic about the reduction of traffic fatalities and improved road safety in the future,” said Attorney Walter Clark, founder of Walter Clark Legal Group.

Our firm has been handling personal injury cases throughout the California Low Desert and High Desert communities for over 30 years. With a 95% success rate, the personal injury attorneys at Walter Clark Legal Group will fight to hold those responsible for your loss accountable and win compensation to cover medical bills, lost wages, and pain and suffering. If you have been injured in an auto accident and want to discuss your legal options, contact us today for a free consultation with an experienced personal injury lawyer. We have offices in Indio, Rancho Mirage, Victorville, and Yucca Valley and represent clients throughout the California Low Desert and High Desert communities.

DISCLAIMER: The Walter Clark Legal Group blog is intended for general information purposes only and is not intended as legal or medical advice. References to laws are based on general legal practices and vary by location. Information reported comes from secondary news sources. We do handle these types of cases, but whether or not the individuals and/or loved ones involved in these accidents choose to be represented by a law firm is a personal choice we respect. Should you find any of the information incorrect, we welcome you to contact us with corrections.
