Google Disagrees with California DMV Regulations Requiring Human Drivers in Driverless Cars
Google has been testing autonomous cars for several years now, and until recently there haven't been any major legal or regulatory hurdles in its way. However, as we get closer to the anticipated 2017 launch of the first driverless cars, legislators are starting to pay closer attention to the regulations that will govern the use of Google's cars on the road.
This week the California Department of Motor Vehicles (DMV) released an outline of draft regulations intended to make autonomous vehicles safer. Part of the draft states that fully driverless cars – with no human inside to take control if necessary – would not be legal initially. The DMV apparently has some doubt as to whether the technology is advanced enough to be considered completely foolproof, and is therefore insisting that a licensed driver be behind the wheel, at least for now.
The director of Google's self-driving car project, Chris Urmson, called the decision “perplexing,” expressing clear disappointment in the agency's cautious approach. In a recent blog post, Urmson pointed out that many people who need personal transportation are unable to drive, and that the proposed regulations would keep driverless cars from helping them. The obvious argument is that requiring a licensed driver to be present negates the benefits autonomous vehicles could offer to people who cannot obtain a license, such as those with disabilities.
Self-Driving Cars Following the Law Too Closely?
Google has repeatedly stressed that autonomous vehicles are safer because they eliminate the possibility of human error. That doesn't make accidents impossible, since other drivers can still make mistakes and crash into the perfectly driven autonomous cars. In fact, there have been several incidents of other drivers slamming into Google cars that were driving too slowly. The cars' only apparent flaw so far is that they follow speed limits and traffic laws to the letter, which can create hazardous situations when human drivers are speeding past or making mistakes.
In the past six years there have been 17 minor accidents involving Google's cars, none resulting in serious injuries. The first injury-causing accident involving a self-driving car occurred on July 17, 2015 near Google's Mountain View headquarters. However, it is worth noting that the accident was caused by the other driver, as were the 14 other rear-end collisions that have happened since the project began in 2009. The remaining two accidents were fender-benders.
Google published this short clip to illustrate how inattentive human motorists have been rear-ending self-driving cars, without even braking, according to Chris Urmson:
Right now the main problem appears to be that Google's self-driving cars move slower than the flow of traffic, because they're programmed to drive at exactly the speed limit. This has resulted in other drivers slamming into the back of some of Google's cautious autonomous test vehicles, which is undoubtedly one of the reasons why California's DMV has proposed requiring licensed drivers behind the wheel. There have even been instances of self-driving cars being pulled over for driving too slowly.
Are Self-Driving Cars Safer, More Crash-Prone, or Both?
How can driverless cars be safer than those driven by humans while also being more crash-prone? It turns out that while Google's self-driving cars are allegedly five times more likely to be involved in a crash than human-driven cars, all of their accidents so far have been extremely minor compared with the average accident involving only human motorists.
Will Self-Driving Cars Be Able to Coexist with Human Motorists?
As trivial as it may seem, the biggest challenge the project has faced is deciding whether the cars should follow the law perfectly or deviate from it at times to account for the fact that the flow of traffic is sometimes faster than the speed limit. This raises the question of whether self-driving cars will be able to drive safely alongside human motorists who speed and violate traffic laws. If a solution cannot be found, there is a possibility that autonomous driving software will start off being used as a corrective tool (e.g., helping cars stay in their lane or follow navigational directions) rather than as a full-fledged self-driving system.
Should Autonomous Cars Break the Law in the Name of Safety?
This problem seems to be a lose-lose situation for Google. There are situations in which a driver has to deviate from traffic laws to avoid colliding with another motorist who is already breaking the law. However, if the cars are programmed to break the law even just to avoid an accident, a liability concern arises, because Google would be releasing software that makes violations of the law possible by design.
Furthermore, if any such spur-of-the-moment maneuver were to fail to avoid an accident, critics could argue that the accident was Google's fault for permitting the vehicle to break the law. However, if the cars are outfitted with enough external cameras and sensors, they will be able to record the events leading up to an accident and help determine who is truly at fault.
Are the Regulations Proposed by the California DMV Justified?
From the standpoint of an agency tasked with ensuring the safety of motorists, it is understandable that the California DMV would require a licensed driver to be present, especially given that autonomous driving software is still in its early stages. The engineers and project leaders at Google appear to be confused and disappointed by the decision, but the DMV has tried to ease the company's concerns by hinting that the rules may change later, once the self-driving car project has proven itself.