Uber And California DMV Fight Over Definition Of Self-Driving Cars
from the policy-fight! dept
In highly regulated private industries, the law means what it says, right up until a regulator decides that it doesn’t. For that reason Uber, a company with a reputation for aggressively challenging legal norms, must have been particularly frustrated when the California Department of Motor Vehicles decided to publicly rebuke it for complying with the law of the Golden State.
The crux of the issue is that Uber decided to move forward with deploying some of its vehicles with automated technologies onto California’s roads without a permit which, the California DMV believes, it must first obtain before rolling out.
In a statement, the DMV said that it has a “permitting process in place” through which twenty manufacturers have obtained permits. Then, so as to leave no doubt about its position on the matter, it stated that “Uber shall do the same.”
Now, whether the new Volvo XC90s equipped with Uber’s technologies are “autonomous vehicles” as a matter of perception or regulatory projection is up for debate. Different people have different ideas about what fits that mold. But, when it comes to whether the DMV should take action to slow Uber’s work, the question turns from one of perception to one of law and textual interpretation.
California, by way of the DMV, has chosen to define an autonomous vehicle in regulation as a vehicle equipped with technology “…that has the capability of operating or driving the vehicle without the active physical control or monitoring of a natural person….” Thus, the factual question that confronted Uber before it made its decision to deploy the vehicles in California was simple: “is this vehicle capable of driving without being monitored or controlled by a driver?”
For all of their impressive capabilities, it is a matter of public record that Uber’s vehicles often require human intervention. By extension, those vehicles require constant monitoring by a human driver. On that basis, Uber likely thought that, while not toeing the industry line, its vehicles do not meet the definitional threshold necessary to trigger the state’s autonomous vehicle testing regulations.
Of course, any regulatory history that points to a different intent, one that tracks with the DMV’s argument, is no doubt informative and interesting as a matter of historical record, but it should not overcome the obvious strictures of the regulation as written.
In the meantime, the DMV has sent Uber a cease and desist letter. While the merits of regulation are often a matter of debate, the even application of the plain language of the law should not be. Unfortunately, it appears that Uber, by dint of its reputation, is facing unwanted “special treatment” by its regulator. Worse, the DMV may be expanding the reach of its regulations after the fact. If that’s the case, and certainty is lost, so too will be the very definitional purpose of the DMV’s regulations: to make regular.
Filed Under: autonomous vehicles, california, dmv, regulations, san francisco, self-driving cars
Comments on “Uber And California DMV Fight Over Definition Of Self-Driving Cars”
I wonder how likely it is that, if Uber had instead gotten the permit, it would now be defending itself from false advertising allegations, seeing as the vehicles aren’t as fully automated as the permit would lead others to believe.
So, since Uber may have believed their vehicles cannot operate autonomously, I’m certain they’ll cooperate nicely in prosecuting the drivers for allowing the vehicles to operate autonomously without a permit, as well as for the various moving violations they piled up, in addition to any internal punishments up to and including termination from the company.
No? Funny, that.
Just because they can't drive themselves safely
Doesn’t mean they can’t drive themselves. If the engineer weren’t in the car, it would still work and drive itself. Therefore it falls under the regulation.
And as a person who also uses the roads, requiring basic evidence that the systems actually work seems like a sound idea to me. Yes, yes, slippery slopes and all that, but as I said above – it really is already captured.
Finally, monitoring doesn’t work. Humans who aren’t driving a car don’t have the attention or reflexes to correct. Assisted driving (where the human is expected to take over) is almost certainly going to be worse than going straight to full self-drive.
Re: Just because they can't drive themselves safely
Yeah, the examples in the article indicated the car was driving itself and the engineer chose to override it: "Once, he’s not happy with how long the car is waiting before slowing for a pedestrian. Another time, he manually steers around a double parked truck, knowing the system will just stop and wait for it to move."
So, it can drive itself, but it might kill some pedestrians (which I imagine is the situation the DMV hopes their rules will prevent). Or it might just take too long. But if that engineer fell asleep, the car could and would keep going without any human monitoring.
Re: Just because they can't drive themselves safely
Exactly, from what I’ve read these cars are using more than adaptive cruise and lane assist. They’re supposed to stop for red lights and presumably go on green. Just because they don’t work well and need more testing and oversight isn’t a loophole. I think Uber’s interpretation of the rules would mean only companies who have already developed successful autonomous systems need permits. That’s not the way any other company proceeded, and obviously not the way the DMV will allow it.
Re: Just because they can't drive themselves safely
Not sure – what’s the difference between self-driving with monitoring human, and cruise control? Or cruise control with automatic distance control? And automatic braking for obstacles? Lane departure warning? etc.
Re: Re: Just because they can't drive themselves safely
It’s a spectrum. There are classifications vehicles fall into relating to how much human control or oversight they require. There’s also an argument that cars become more dangerous when they require less human oversight without offering full autonomy, because the operator will inevitably lose focus.
Marketing, advertisement and press releases...
These are not autonomous cars.
It appears that Uber may have deliberately led the press to believe that they have put early-stage driverless cars on the road, or are about to do so. But right now they have a prototype of a development system that might lead to a platform that could evolve into a pre-production evaluation of an eventual driverless taxi.
We are another year or two from a useful taxonomy of autonomous vehicle descriptions. There is a first cut of classification, but it’s not that useful. It appears that Uber has a map-following, lane-centering, distance-keeping, automatic braking vehicle. But they almost certainly don’t have the ability to recognize and follow the directions of construction flagmen and traffic officers, or any of the myriad other skills that human drivers do without effort.
Why not just...
…let the self-driving car take the DMV driving test? If the DMV tester gives it a passing grade, then the system should be certified to drive. Sure it may make some mistakes, but so do all the other drivers out there.
Re: Why not just...
Because it’s easy to design a system to meet a set of known metrics. The driving test is meant to identify whether a human knows the rules and possesses adequate skills. Not whether they are a thinking, functioning, adaptable being able to perceive the world and cope with unexpected circumstances.
Sure, we could devise an autonomous driver testing metric, but it should be very different from the human one.
Uber. Now this is the company that is always trying to find ways to get around our laws so it can avoid abiding by the very same laws that the rest of us have to follow? This is the company that tried to get out of paying its employees (drivers) a fair wage, so it fought like the devil to classify its drivers as “independent subcontractors”?
If the rest of us have to follow the law, and if the rest of the U.S.-based companies have to apply for a license or permit for their autonomous vehicles, then Uber is required to do the same.
Once again, Uber is trying to ignore our laws by employing the same legal trickery they used to get out of paying their drivers a fair wage. Uber is a company full of idiots.
Is the driver in the Volvo XC90 supposed to be driving it 100% of the time? If the answer is yes, it isn’t an autonomous vehicle; it’s a Volvo with a lot of fancy shit on it. If, however, the car is being allowed to drive itself while under supervision, then it is an autonomous vehicle and they should have gotten the permit.
First of all, why didn’t Uber pick a car that was already qualified as a self driving vehicle?
Secondly, when the police pull the vehicle over for their legally acceptable ‘whatever reason’, to whom do they issue the ticket? The paying passenger?
Thirdly, assuming that citizens of locales where these vehicles are deployed are aware of the issues, why would one get in it?
Fourthly, while I like the idea of self-driving cars (when they are both self-driving and safe), I am not certain that they can completely replace taxi drivers, who might at times help with your luggage or packages, where a self-driving car will not, and cannot.
1) I believe Uber sees the writing on Tesla’s wall about owning their own autonomous fleet. They’re terrified of getting completely boxed out, or at best ceding a portion of their profits to whoever actually makes their cars. Time will tell if they’re reaching too far beyond their capability.
2) I think we’ll see a paradigm shift from ticketing drivers to filing bug reports. First you’d need to confirm the vehicle was running factory software (user mods could open the owner up to operating infractions) and then send the precise details of the situation to the responsible company. Certain thresholds of errors should trigger corporate fines and other penalties.
3) Because it’s normal and faster and cheaper. Much like the transition away from horses to these automobile death traps was seen as a crazy fad, people will become more accepting of self-driving cars as they become more mainstream and proven.
4) There are already different kinds of cars for hire: taxi, limo, van, minibus, tour bus. You’d hire the car and/or driver that suits your needs.
Re: Re: Re: Connundrums
Good point, but remember the cars actually won’t be breaking the law very often, and there will be heaps of concrete data recorded about each incident. So subjective or false assessments like "you were speeding" or "you were weaving" can and will easily be disproven by the companies responsible for the code.
Jamie Zawinski, a San Francisco resident and one of the most respected names in tech, has a very different take on it: "Uber is now literally trying to murder me."
These guys have been blatantly disregarding any and every law they find inconvenient from the very beginning, and now they’re running red lights, making hook turns through bike lanes, and completely disregarding requirements for proper registration of their autonomous vehicles.
I really hope this case gives some agency an excuse to shut them down completely, because this just raised the stakes. Before, their lawless attitude only screwed people out of money and dignity. Now, they could kill someone.
I think they’re about to discover that while the cost of sensor packages is going down, autonomous driving is still really hard. They’re being put in a position of “we have to do it too” because there’s no reason a successful fully autonomous vehicle manufacturer won’t just eat their lunch. Hopefully no one gets hurt as a result.
Do you have cruise control?
Then you may need a DMV permit.
Clearly you own a vehicle “…that has the capability of operating or driving the vehicle without the active physical control or monitoring of a natural person….”