Uber's Video Shows The Arizona Crash Victim Probably Didn't Cause Crash, Human Behind The Wheel Not Paying Attention

from the everyone-error dept

In the wake of a Tempe, Arizona woman being struck and killed by an Uber autonomous vehicle, there has been a flurry of information coming out about the incident. Despite that death being one of eleven in the Phoenix area alone, and the only one involving an AV, the headlines were far closer to the “Killer Car Kills Woman” sort than they should have been. Shortly after the crash, the Tempe Police Chief went on the record suggesting that the victim bore at least some culpability in the incident, having walked outside of the designated crosswalk, and that the collision would have been difficult for either a human or an AI to avoid.

Strangely, now that the video from Uber’s onboard cameras has been released, the Tempe police are trying to walk that back, suggesting that reports of the Police Chief’s comments were taken out of context. That is likely because the footage shows that claims the victim “darted out” in front of the car are flatly incorrect.

Contrary to earlier reports from Tempe’s police chief that Herzberg “abruptly” darted out in front of the car, the video shows her positioned in the middle of the road lane before the crash.

Based on the exterior video clip, Herzberg comes into view—walking a bicycle across the two-lane road—at least two seconds before the collision.

Analysis from Bryan Walker Smith, a professor at the University of South Carolina who has studied autonomous vehicle technology, indicates that this likely represents a failure of the AV’s detection systems, and that there may indeed have been enough time to avoid the collision if everything had worked properly.

Walker Smith pointed out that Uber’s LIDAR and radar equipment “absolutely” should’ve detected Herzberg on the road “and classified her as something other than a stationary object.”

“If I pay close attention, I notice the victim about 2 seconds before the video stops,” he said. “This is similar to the average reaction time for a driver. That means an alert driver may have at least attempted to swerve or brake.”
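A quick back-of-envelope check puts that two-second window in perspective. Assuming the roughly 38 mph reported elsewhere for the car’s speed (an assumption here, not something stated in the quote), here is a sketch comparing how far the car travels in two seconds with the room a full stop would need:

```python
# Rough check on the two-second window. The 38 mph speed is taken
# from later reports and is an assumption; 0.7 g (~6.9 m/s^2) is a
# typical hard-braking deceleration on dry pavement.
MPH_TO_MS = 0.44704

def stopping_distance(speed_mph, reaction_s=1.5, decel_ms2=6.9):
    """Perception-reaction distance plus braking distance, in meters."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

v = 38 * MPH_TO_MS                           # ~17 m/s
print(f"distance covered in 2 s:  {2 * v:.1f} m")
print(f"distance to a full stop:  {stopping_distance(38):.1f} m")
```

On those assumed numbers the car covers about 34 meters in two seconds while a full stop needs about 46 meters, so an alert driver could not have stopped completely, but could have shed much of the impact speed, which fits Walker Smith’s “at least attempted to swerve or brake.”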

The problem, of course, is that AVs are in part attractive because drivers far too often are not alert. They are texting, playing with their phones, fiddling with the radio, or looking around absently. We are human, after all, and we fail to remain attentive with stunning regularity.

So predictable is this failure, in fact, that it shouldn’t surprise you all that much that the safety operator behind the wheel of this particular Uber vehicle appears in the video to have been distracted in the moments before the crash.

A safety operator was behind the wheel, something customary in most self-driving car tests conducted on public roads, in the event the autonomous tech fails. Prior to the crash, footage shows the driver—identified as 44-year-old Rafaela Vasquez—repeatedly glancing downward, and is seen looking away from the road right before the car strikes Herzberg.

So the machine might have failed. The human behind the wheel might have failed. The pedestrian may have been outside the crosswalk. These situations are as messy and complicated as we should all expect them to be. Even if the LIDAR system did not operate as expected, the human driver that critics of AVs want behind the wheel instead was there, and that didn’t prevent the unfortunate death of this woman.

So, do we have our first pedestrian death by AV? Kinda? Maybe?

Should this one incident turn us completely off to AVs in general? Hell no.

Companies: uber


147 Comments
Anonymous Coward says:

So wait humans have LIDAR?

Maybe the human “safety” person was not paying attention. Didn’t the sheriff already say it would have been unavoidable because she “came out of the shadows”? Maybe that’s grounds for firing her for not doing her job, but SELF DRIVING CAR. That’s what they are TESTING, right? So they learn nothing and try to blame everyone except their software. Active sensors don’t have the limitations of people with only passive ocular vision. Don’t make excuses or try to shift blame; do an analysis, FFS. There is presumably (much, much, much) more data than with air crashes, and they can figure those out, IF you give it to the NTSB.

crade (profile) says:

Re: So wait humans have LIDAR?

Obviously the car’s programming also failed to avoid the accident, and the people who develop it are learning from that result. But that is 100% beside the point, since the car isn’t cleared to go live autonomously. You know, because it isn’t finished testing yet? No one is shifting blame. The whole point of the human “safety” driver is to be able to get those test results without people dying. Otherwise it’s not really testing, is it?

Anonymous Coward says:

Re: Re: So wait humans have LIDAR?

Wrong. The safety driver is there because automated vehicles are illegal, not to provide backup; after all, if you need a backup you may as well require a driver. In any case, it does not go to my point, which is that ACTIVE sensor systems don’t have a problem seeing in the dark. So unlike the human that the sheriff concluded would have been unable to avoid the collision because “she came out of the shadows,” the LIDAR and IR camera computer vision systems don’t have that excuse, so it is blame shifting. If you think it’s good enough to be live off a test track, you’re saying you’re taking responsibility… except they are not.

PaulT (profile) says:

Re: Re: Re: So wait humans have LIDAR?

“automated vehicles are illegal”

Citation needed.

“after all if you need a back up you may was well require a driver”

Yeah, when I was learning to drive, the instructor had his own controls in case of any incident. There was obviously no point in me learning, the instructor should have just taken my test for me.

( /s in case anyone needs it, but hopefully the dense among us will get the point)

Anonymous Coward says:

Re: Re: Re: So wait humans have LIDAR?

How do you get from manual driving to automatic driving without gradually introducing systems to make the process more automatic? What is your definition of “automated vehicle”? The responsibility is complex, and both the technician/driver and the car’s system failed here.

Legally you need a guilty party with “personhood” to blame, and that is normally the manual driver. You cannot put the car in jail. The company Uber and their executives, though… I know economic penalties can befall them, but how would jailable offenses play out?

k says:

Re: Re: Re:2 So wait humans have LIDAR?

“How do you get from manual driving to automatic driving without gradually introducing systems to make the process more automatic?”

Google did it. They started their self-driving program with the philosophy that the vehicles would be completely autonomous from the start and limited them to the speeds they could safely operate. The first model could only travel 2 miles per hour (walking speed is around 3.5). I’ve been seeing them on the roads around here for over a decade and they’re now tooling around in the high 30’s, possibly over 40 mph. A friend of mine who often hits the sleep alarm too many times and is usually hurrying to work still gets annoyed by their extremely slow turns at intersections where they detect pedestrians, but he’s no longer hurling profanity at them for going slowly on the straight bits.

I think the ‘add more assistance until the driver is no longer necessary’ approach is fatally flawed (literally) because of the dangerous valley between mostly don’t need a driver and really don’t need a driver. A human who’s not actually needed 90%+ of the time will not be paying attention.

Anonymous Coward says:

Re: So wait humans have LIDAR?

Didn’t the sheriff already say it would have been unavoidable because she “came out of the shadows”

Yes, he said that. And then analysis of the video showed that the stretch of road in question has no shadows and is actually well lit.

My pet theory as to why LIDAR failed is that at that particular place in the road, it’s actually too well lit; the building in front and the structures on the sides are reflecting a bunch of radiation down on where the car was located, which may have blinded the LIDAR.

If this was the case, the fault lies with the designers, as such a blinding should result in control going back to the supervisor and alerts going off, not the navigation system silently attempting to gain resolution of the road again.

Anonymous Coward says:

Uber cuts corners...

I’m guessing that Uber’s lidar and radar sensing systems are complete shit…

No surprise really, they’re new to this autonomous vehicle game, and they have a track record for being a shitty company with poor decision-making from the top-down.

Uber thinks they can get in front of everyone else with their fleet of autonomous cars, and in order to do so, I bet they’ve cut every corner they can. I wouldn’t be surprised if we learn next week that Uber’s Lidar system is simply for looks… and it’s non-functional.

Waymo (Google) has been at this game for nearly a decade now… and they still seem cautious about their approach compared to Uber and other companies trying to get in on the action recently.

Anonymous Coward says:

Re: Re: Dear cheerleader for Uber & Google: we don't want your shit.

I’m confused as to how a Luddite is commenting on an internet website.

You don’t even know what a “Luddite” is! That was a social protest against LOW wages paid by the emerging mercantilists. Smashing machines was only a tactic; it’s not that they were afraid of them. And then the British tyrants sent in the military and KILLED them.

You’ve been told a pack of lies by royalists and corporatists. YOU are a “Luddite” if you believe in “A fair day’s pay for a fair day’s work.”

Anonymous Coward says:

Re: Re: Re:2 Dear cheerleader for Uber & Google: we don't want your shit.

Um.. what do you think the British empire was? Granted, the Luddites were in an era that was still mostly serfdom and aristocracy; a small progressive working class was beginning to develop and allow people to have lives that were not mostly disconnected and miserable, as tradesmen making everything from clothes and fabrics, to block and tackle, to guns (the prototype assembly line was an American invention to make guns, which the British navy adapted to make block and tackle). If you want to pretend that we have not seen this before and that people who are upset about it are crazy, just remember: those that do not learn from history are doomed to repeat it. Only this time it won’t just be the skilled trades, it will be everyone. A Masters is now required for most stable jobs; that is at least 16 years of confirming you are loyal to the state. That is really not OK, and to get angry at people because they don’t like what direction their society is going is unpatriotic and anti-human.

Anonymous Coward says:

While I agree that the car likely should have noticed, I keep hearing how, after repeated viewings, people see the victim as many as 2 seconds before the crash. The first time I saw it, it was a second, maybe less. I certainly hadn’t fully registered her presence before the impact. It’s only after repeated viewings, knowing what size object I’m looking for, that I’ve begun to see shadows shifting earlier. And despite some other reports’ suggestions, I certainly wouldn’t have expected a pedestrian to be crossing in front of a car like that. The safety driver probably wouldn’t have seen her in time to prevent her death.

That said, again, the lidar/radar should have noticed the woman.

Also, it is important to note that the ex-Waymo employee who designed Uber’s systems left Google because he thought Google was playing it way too safe. Google didn’t think the tech was ready; he wanted it out on the streets *now* and damn the consequences. It’s also why Uber moved to Phoenix – lax testing regulations.

Anonymous Coward says:

Re: Re: Re:

Autonomous vehicles should never rely solely on cameras, at any range!

Lidar range is actually quite good, and should have seen this pedestrian.

To see how the combination of lidar + camera sensing works, you can get a decent overview here:

https://www.youtube.com/watch?v=tiwVMrTLUWg

There are some crazy scenarios in there, several in which the vehicle sensing equipment and software makes a better decision than most humans would.

Anonymous Coward says:

Re: Re: Re: Re:

Looking at that video, the laser range is somewhat limited, which makes sense: at longer ranges it needs more angular resolution to identify a target and a slower scan rate, and the reflected power for the same target drops to a sixteenth as the range is doubled. Given where the woman was, a human had little chance of spotting her (at least not without knowledge that she was there), the camera probably a poorer chance, and the lidar almost none when she came out of the shadows, with her dark clothing not helping. Indeed, the first parts of her visible in the film are just her feet, which could be small animals or birds in the road.
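The one-sixteenth figure is the radar-equation scaling for a target smaller than the beam: return power falls as 1/R^4, so doubling the range divides the signal by 16. An extended target that fills the beam degrades more gently, as 1/R^2. A toy sketch (the ranges and reference value are arbitrary; only the ratios matter):

```python
# Relative return power vs. range. exponent=4 models a point-like
# target (radar equation); exponent=2 models a beam-filling target.
def return_power(range_m, ref_range_m=10.0, exponent=4):
    """Signal strength relative to its value at ref_range_m."""
    return (ref_range_m / range_m) ** exponent

for r in (10, 20, 40):
    print(f"{r:3d} m: point {return_power(r):.6f}, "
          f"extended {return_power(r, exponent=2):.4f}")
# Doubling the range (10 m -> 20 m) leaves 1/16 of the point-target signal.
```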

Anonymous Coward says:

Re: Re:

Ars has a story too, stating the feet are visible 1.4 seconds before the crash. Based on that picture—dark road, no oncoming traffic—I’d say the driving conditions called for highbeams. They’re not just brighter, they literally point higher and could have illuminated the victim’s body. Or the body of a large wild animal that could have killed the safety driver.

I certainly hadn’t fully registered her presence before the impact. Its only after repeated viewings

Had you been the driver, you’d have been "overdriving your headlights". It’s dangerous and illegal. "Overdriving your headlights means not being able to stop inside the illuminated area ahead. It is difficult to judge other vehicles’ speeds and distances at night. Do not overdrive your headlights—it creates a blind "crash area" in front of your vehicle. You should be able to stop inside the illuminated area ahead."
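The manual’s rule can be turned into arithmetic. A minimal sketch, assuming a ~50 m low-beam reach and 0.7 g hard braking (both generic textbook figures, not measurements from this incident):

```python
# Which speeds "overdrive" low beams? Compare total stopping distance
# (perception-reaction plus braking) against an assumed beam reach.
LOW_BEAM_REACH_M = 50.0   # assumed typical low-beam illumination
MPH_TO_MS = 0.44704

def total_stop_m(speed_mph, reaction_s=1.5, decel_ms2=6.9):
    """Total distance to stop from speed_mph, in meters."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

for mph in (25, 38, 45):
    d = total_stop_m(mph)
    verdict = "ok" if d <= LOW_BEAM_REACH_M else "overdriving"
    print(f"{mph} mph: stops in {d:.0f} m -> {verdict}")
```

On these assumptions, 38 mph is marginal and 45 mph clearly outruns low beams; the verdict flips with modest changes to the assumed beam reach, so treat this as illustration rather than evidence.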

That said, again, the lidar/radar should have noticed the woman.

Yes. Scotchlite is a magical substance that could have saved her life—sometimes I’ll see a 1x1cm strip on the back of a shoe, seconds before I see anything else—but this car was equipped with the actual magic of LIDAR and other sensors. Think Terminator 2 here, and notice that the safety driver was fully visible to the camera.

Anonymous Coward says:

Re: Re: Re:

If as prior posters have noted there was another car travelling roughly 3 seconds in front of the Uber vehicle then driving with the highbeams on would have been illegal.

Your statement about “overdriving your headlights” would mean that on any road or any time where an individual cannot drive with their highbeams they should drop their speed to below 38 mph (the speed at which the Uber car was travelling). This is patently ridiculous. Speed limits on roads are chosen based on how dangerous the area is and how likely crashes will occur at those speeds — the choice takes into consideration all normal adverse effects such as (but not limited to) light rain, night, and slight fog. If the speed limit was 35 mph which changed to 45 mph slightly before the area where the accident occurred then it is safe to assume that 45 mph is a safe driving speed with or without a car’s highbeams activated.

It may be that the city planners re-visit this speed limit posting and change it down to 35 mph due to the (prohibited) use of the “crosswalk.” But given the posted speed limit it is eminently reasonable for the driver to choose to go 45 mph.

ECA (profile) says:

Re: Re:

TOO MANY QUESTIONS..
not enough answers yet..

1. BIKE SAFETY RULES??? Where are the reflectors??
2. Lidar, radar, IR, UV… whatever… and HOW WIDE A BEAM??
If it’s focused on the front… it SHOULD HAVE SEEN HER..
3. DID THE WALKER SEE THE CAR??? She could have STOPPED BEFORE SHE CROSSED IN FRONT..
4. She is about 100 feet away.. Stopping range?? NOT going to miss it. Even with a 30mm lens, SHE IS TOO CLOSE..

Anonymous Coward says:

Re: Re: Re:

  1. BIKE SAFETY RULES??? where are the reflectors??

Not sure about Arizona…. bike safety rules normally apply to bikes-as-vehicles, not bikes-as-cargo. It was being pushed not ridden. Still, reflectors are easy to get; I’ve tried to buy them but multiple shops just gave them for free. Maybe they were there but not visible because the headlight was aimed too low.

Volvo3 says:

Re: lidar/radar should have noticed

…detection & analysis of bicycles has been a BIG problem for all these automated vehicles. LIDAR has had great difficulty with the various bike shapes, reflectivity, and human bodies clinging to them.

The Uber AV was going 40mph in a 35mph speed limit zone. How does that happen with computers fully controlling the vehicle?

The brief accident video being reviewed has much less clarity & field of view than was available to a normal human driver — it is not a definitive record of the event.

Anonymous Anonymous Coward (profile) says:

Re: Re:

Isn’t there a significant difference between electronics identifying a person off the road and predicting what that person will do next?

Some scenarios I see possible are:

1) The electronics identify a person, not on the road, and the person remains off the road. Should the vehicle stop because that person might wander onto the road?

2) The electronics identifies the person off the road, takes multiple readings of their approximate location and direction of travel (how many seconds?) and then makes a determination as to slowing down or ignoring the person who is still off the road.

3) Take that second scenario up to the point where at the last second the person off the road changes direction and moves into the road rendering all the calculations previously accomplished null and void and no time left to maneuver.

Are we really gonna ask these AVs to predict the behavior of every human within range, or have some expectation that those others will follow the rules/laws? Sure, detectors can help, but just as with humans, processors need time to process information and react. And things that happen faster than either human or processor can react to are still gonna happen sometimes.

Anonymous Coward says:

You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.

First, you haven’t seen the video, just relied on a likely slanted characterization by a cop who wants this experimentation to proceed. We’ll need to dig into any payoffs made by Uber to council members or the city to allow this menace on the roads.

"On the median" doesn’t fit with "stepped from the shadows", nor with the superior low-light ability of a video camera, nor right front of SUV dented.

This ain’t over until lawyers have argued to jury that the heartless giant corporation loosed this faulty robot to attack unsuspecting humans.

Also, my take that a human would expect a person on the median to be unpredictable and slow down or turn to clear her is still accurate (a minion makes a point of stating that the SUV didn’t slow or swerve…): if she’s in the video for more than half a second, then the car with its "superior" reaction time is faulty.


So why was that censored? — The more right you are here, the more censored.

Anonymous Coward says:

Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.

Probably because you repeatedly and admittedly spam nonsense all over every single post.

A) It’s tactical. I have a right to protest the unfair treatment.

B) I don’t admit that it’s “spam nonsense”; it’s my own personal views, given on a forum that solicits them with an HTML input form, advertising “comment is open to all”.

C) THOUSANDS of my comments have been censored for no articulable reason, while fanboys get away with vile ad hom — and Techdirt won’t even admit that an Administrator (which YOU may well be, “AC”) has approved each of those censorings.

D) Techdirt is violating its form contract (see CRFA) and its stated principle of free speech.

Anonymous Coward says:

Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.

I also reported you. You have a right to protest and I have a right to report your protest.

As for why, it is because your posts are egotistical and condescending, not because of the view you express. Also, at least learn what censorship is before you claim to be censored. Your view isn’t suppressed, only made easy to ignore. Kinda makes your whole censorship crusade seem pointless.

Anonymous Coward says:

Re: Re: Re: You believe police when exculpates Uber. -- WAIT FOR THE TRIAL.

“A) It’s tactical. I have a right to protest the unfair treatment.”

It’s a private website. Fuck off with your “rights.” TD has the right to ban you the same way a private homeowner can kick you out of their house. That you can still post and your posts can be read by anyone who is curious by clicking a single button is a testament to how generous TD is with its comment moderation. You’d be instantly banned on other sites for the same behavior.

“D) Techdirt is violating its form contract (see CRFA) and its stated principle of free speech.”

Please oh please find a greedy lawyer who will take your money and sue TD over this contract violation. I would love to read a TD article about how you outed yourself as a troll and wasted money on a lawyer with dubious legal claims. You wouldn’t even have to give me a Xmas present for the next five years. That would be so awesome.

PaulT (profile) says:

Re: Re: Re:

…and that is going to be exactly what the Uber staff are doing right now. Nobody on that team will want to think that a failure on their system is leading to people getting killed, and the management don’t want this kind of publicity getting in the way of them selling the tech to the end market. It’s a tragic event, but this is the entire point of public testing, learning lessons to improve the product.

PaulT (profile) says:

Re: Re: Re:2 Re:

Exactly. It sounds heartless but the fact is that this appears to be a generally safer technology than standard road vehicles. Flaws in those that cause deaths have to occur many times before action is taken, be they problems with the road system or the vehicles themselves, and corporations are noted for holding back on improvements because it costs them more money than letting people die. The fallout from this incident is impossible for them to ignore, so they have to make things safer for everybody else.

Anonymous Coward says:

Re: Re: Re:2 Re:

Right, accidents are bug reports. And as immoral as that sounds, it’s actually huge progress.

Agreed, but… is "pedestrian in dark clothing crossing a dark street" not already part of their test suite? If you asked me what to check before sending your autonomous car into the night, it would be in my top 10 (more generally, person or deer).

PaulT (profile) says:

Re: Re: Re:3 Re:

It’s the “suddenly wandering in front of the car” part that wasn’t in the test data. The lack of visible clothing just means that the cameras didn’t fail and the result would likely have been the same if a human was driving.

Now we move on to the next set of details – did the car simply fail to react to something the other sensors picked up, or was the whole incident unavoidable? Can something be improved with those sensors, or teach the AI something about how to deal with people doing things like this?

You’re fixating on the “dark clothes” part, but missing the actual argument. That’s only mentioned to reinforce the fact that a human driver would not have made a difference.

Anonymous Coward says:

Re: Re: RED ALERT! RECORD-BREAKING ZOMBIE!

BAlbrecht or Bruce A.: SEVEN AND HALF YEAR GAP! LAST SEEN IN 2010! https://www.techdirt.com/user/balbrecht

HA, HA! — OH, NO, NO ASTRO-TURFING ON THIS SITE!

This one is unusual because he makes several comments, but below those here, right on the first page, it’s back to 2010!

Only real question is whether I’m the only human here!

MDT (profile) says:

Misleading title

I think the title is a bit misleading. The victim ’caused’ the accident by wearing dark clothes at night and jaywalking.

The uber car ’caused’ the accident by not correctly identifying the potential obstacle using its non-visual sensors.

The uber driver ’caused’ the accident by being distracted rather than doing their job.

In other words, this thing was pretty much a mutual fault all the way around. People need to start taking responsibility for their actions. If you dress in dark clothes and jaywalk late at night in a dark area rather than crossing under the street lights, you are gambling that oncoming cars will see you. That’s called taking an unnecessary risk, and it caught up with this woman. It’s a shame, and we can and should question why the sensors didn’t catch it, but a human being wouldn’t have done any better than the uber car.

Yes, the uber car failed at what it should have been, which is better than a human, but if it had been a human driver, the fault would have been **entirely** the pedestrian’s fault. Only the fact that this is a self driving car makes it a mutual all around fault.

Anonymous Coward says:

Re: Misleading title

See the invention of jaywalking. Do we know whether the person was actually jaywalking? In Arizona, the law says:

28-793. Crossing at other than crosswalk

A. A pedestrian crossing a roadway at any point other than within a marked crosswalk or within an unmarked crosswalk at an intersection shall yield the right-of-way to all vehicles on the roadway.

B. A pedestrian crossing a roadway at a point where a pedestrian tunnel or overhead pedestrian crossing has been provided shall yield the right-of-way to all vehicles on the roadway.

C. Between adjacent intersections at which traffic control signals are in operation, pedestrians shall not cross at any place except in a marked crosswalk.

I’d call C jaywalking, and A and B failure to yield. Maybe the victim failed to yield, or maybe the car wasn’t visible when they started crossing. It wouldn’t be jaywalking unless the adjacent intersections were both signalled.

Also:

28-794. Drivers to exercise due care

Notwithstanding the provisions of this chapter every driver of a vehicle shall:

  1. Exercise due care to avoid colliding with any pedestrian on any roadway.
  2. Give warning by sounding the horn when necessary.
  3. Exercise proper precaution on observing a child or a confused or incapacitated person on a roadway.

Anonymous Coward says:

Re: Misleading title

If you look at this from an engineering point of view you have reason.

If you look at this from a legal point of view you are nuts.
The car has a driver.
The driver was not paying attention and was actively engaged in other activity such as playing video games.
A pedestrian was killed.
If the driver was driving drunk he would be charged with vehicle homicide.
I bet before this is over the driver is charged just like he would be if he had been drunk and not in control.

PaulT (profile) says:

Re: Re: Misleading title

“was actively engaged in other activity such as playing video games”

…or doing something else, such as checking the centre console display that some of these cars have, which wouldn’t be visible on the video. We’ll wait for an investigation to find out what she was actually doing, but lying about it because you fantasized about what she might have been doing is not going to help your case.

“I bet before this is over the driver is charged just like he would be if he”

Your grasp of the gender of the driver is as accurate as your grasp of what was happening in the car.

It’s a funny trend during these discussions, actually. A lot of the people spewing bile over what the driver or car should have been doing based on the video can’t seem to get the most basic facts about the identity of either the driver or victim correct.

If someone refers to either of them as “him” I know I can safely ignore their other observations.

Anonymous Coward says:

Re: Re: Re: Misleading title

re PaulT
Arguably someone could refer to the driver as “he”, if they are going on chromosomes rather than the gender the driver currently identifies as. I saw the driver’s face in a media article and thought the driver a “he” until I read the article describing the driver as trans and now identifying as female.

PaulT (profile) says:

Re: Re: Re:2 Misleading title

Exactly. If someone’s jumping to conclusions without knowing all the details of the incident, their assumptions about what’s happening in the video can probably be ignored. A quick reading of the case background will reveal the truth.

It’s not just the driver, I’ve seen lots of people talking about the cyclist as if she were male too. From my observation, those are usually the people with little worthwhile in their other observations.

Anonymous Coward says:

Re: Re: Re: Misleading title

If the driver was looking at the center display that presumably showed what the car was “seeing”, then it’s really telling that the car certainly didn’t “see” the lady, as the driver clearly hadn’t noticed anything until she looked up and spotted the lady with the bike, based on how she reacts.

PaulT (profile) says:

Re: Re: Re:2 Misleading title

The centre console has other information, from what I’m aware of, but that’s a possible scenario if the video stream is mixed with other information that the human support needed to take into account.

The point is, people are attacking this person with the claim that she was not doing her job, but as far as we know she could have been doing exactly what was required. She just happened to be doing something other than looking through the windscreen during the last couple of seconds before the first-ever pedestrian collision, during which time it’s unclear whether she could have made a difference if she had been.

PeterScott (profile) says:

Uncharacteristically poor slanting by techdirt.

Techdirt:
Check your own previous story. The chief didn’t say darted.

She said “based on how she came from the shadows right into the roadway.”

That is a direct quote from the previous Techdirt story.

What the chief said in that previous story is a fair characterization of what happened in the video. She does appear right out of the shadows when the headlights hit her.

First: the victim certainly did cause this collision. She is casually crossing a 45 MPH road, in the dark, while wearing dark clothes. That is fatally reckless. You don’t count on a car to stop for you in broad daylight on a 25 MPH road; it’s suicidal to do so on a 45 MPH road in the dark.

Second: Uber’s vehicle shouldn’t be on the road if it can’t pick out one person crossing an empty road. All the blame belongs to the pedestrian and Uber.

Third: The safety driver would have had no chance to react in time, even if she was paying attention. She still isn’t driving. Reaction time would be longer than the typical 1.5 seconds of someone actually driving instead of watching. Also, I strongly dispute that anyone would have seen the victim a full 2 seconds before impact. You don’t get to rewind the video multiple times in real life until you see what you are expecting. The first time I watched that video, it was “shit, boom”, and I knew it was coming.

The safety driver is in an impossible situation and should not be a scapegoat for a fatally reckless pedestrian and poor Uber technology.

Thad (user link) says:

Re: Uncharacteristically poor slanting by techdirt.

You don’t count on car to stop for you in broad daylight on a 25 MPH road, it’s suicidal to do on 45 MPH road in the dark.

That stretch of Mill is 35, though it used to be 45.

She also presumably didn’t know there was a car coming when she started across the street; the road curves there and it was 10 PM.

MDT (profile) says:

Re: Re: Re:2 tl;dr

This appears to have happened because the road was recently changed from 45 to 30. The current supposition I’ve seen is that the Uber road specs in its computer didn’t have the change.

This brings up a different issue, which is: how do we ensure that AVs have the correct road information in a timely fashion? This is likely going to require changes to how states handle speed changes.

Almost certainly it’s going to require a standard for temporary speed limits, with devices that broadcast the temporarily lowered speed limit to the cars. This is, of course, an issue because you just know someone will hack it and put out Pi devices that broadcast 120 or 5 mph speed limits to mess with the cars.

Thad (user link) says:

Re: Re: Re:3 tl;dr

This appears to have happened because the road was recently changed from 45 to 30. The current supposition I’ve seen is that the Uber road specs in its computer didn’t have the change.

It’s 35 and the car was going 38. I think it’s more likely that it was operating within 3 mph of the current, correct speed limit than going 7 mph under an outdated one.

Almost certainly it’s going to require a standard for temporary speed limits, with devices that broadcast the temporarily lowered speed limit to the cars. This is, of course, an issue because you just know someone will hack it and put out Pi devices that broadcast 120 or 5 mph speed limits to mess with the cars.

There’s all kinds of mischief people can get into by painting weird shit on roads and signs, and that’s before we even start asking about network security. We’re just getting started finding out all the different ways autonomous cars can fail.

Which is not to say that I’m opposed to them on principle. Just that we’re going to have plenty of problems as they’re deployed, and some of them haven’t even been thought of yet.

Anonymous Coward says:

Re: Re: Re:3 tl;dr

with devices that broadcast the temporarily lowered speedlimit to the cars.

Why bother with all that complicated expensive tamperable equipment? Just keep a government database, updated with the most recent changes listed first. The cars connect to the database fairly regularly (say, checking a county-specific list every time they start, every time they cross a county line, and every hour) and read any changes that have occurred, and then adjust their system accordingly.

And of course, still keep all the proper verification measures in place on the database’s end. You can’t put in a change that would go above the federal speed limit of 80, you can’t put a limit below 15, you can’t drop more than (10? 15?) mph with a single change on one road.

Physical measures could still supplement this, but they’d require proper encryption, like Amber Alerts.
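As a sketch of what that database check might look like on the car’s side, here’s a hypothetical client routine assuming a feed of changes listed newest-first and the sanity bounds suggested above. The feed format, function names, and exact thresholds are all invented for illustration, not a real system:

```python
# Hypothetical sketch of the commenter's idea: the car pulls a county-level
# speed-limit change feed and applies sanity checks before accepting any
# update. All names, limits, and the feed format are assumptions.

FEDERAL_MAX_MPH = 80       # assumed upper bound from the comment
FLOOR_MPH = 15             # assumed lower bound from the comment
MAX_SINGLE_DROP_MPH = 10   # assumed cap on a one-step reduction

def validate_change(current_mph: int, proposed_mph: int) -> bool:
    """Reject implausible updates rather than trusting the feed blindly."""
    if not (FLOOR_MPH <= proposed_mph <= FEDERAL_MAX_MPH):
        return False
    if current_mph - proposed_mph > MAX_SINGLE_DROP_MPH:
        return False
    return True

def apply_feed(limits: dict, feed: list) -> dict:
    """Apply the newest change per road; ignore anything that fails validation."""
    seen = set()
    for change in feed:  # feed lists the most recent changes first
        road = change["road"]
        if road in seen:
            continue  # an earlier feed entry is newer; keep it
        seen.add(road)
        if road in limits and validate_change(limits[road], change["mph"]):
            limits[road] = change["mph"]
    return limits
```

A hacked broadcast of a 120 or 5 mph limit would fail the bounds check here, though (as noted above) bounds checking alone can’t stop a plausible-looking malicious change.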

MDT (profile) says:

Re: Re: Re:4 tl;dr

When I say temporary reductions in speed, I’m talking about construction zones or emergency zones, where they’ve reduced the speed within the last hour. You can’t update the databases fast enough unless every car is mandated to have always-on wireless, and even then it can still have issues with network congestion.

PeterScott (profile) says:

Re: Re: Re:2 tl;dr

The paper is wrong.

Here is the street view, with a green X to show the accident location. You can see the accident location from the 45 MPH limit sign:
https://i.imgur.com/oN57tu2.jpg

The bottom-right corner of the image is a vidcap from a video uploaded yesterday, showing that the limit sign is still 45 MPH.

And here is the video uploaded yesterday; the sign is visible around 26 seconds in.

https://youtu.be/1XOVxSCG8u0?t=24s

I don’t see how that can be a 35 MPH zone with a clearly marked 45 MPH limit sign right in front of it.

crade (profile) says:

Re: Uncharacteristically poor slanting by techdirt.

First, Uber should be allowed on the road if the car is at least as safe as the human drivers we allow on the road.

Second, the human driver isn’t a scapegoat; he/she is there to make sure the car is at least as safe as a human driver (which is the standard the law holds us to). If the car was less safe than that standard, it’s the human driver’s fault.

To say that the human driver would have no blame, but Uber does is to say that automated vehicles should be automatically more at fault than human drivers.

PeterScott (profile) says:

Re: Re: Uncharacteristically poor slanting by techdirt.

First, Uber should be on the road if the car is at least as safe as the human driver that they allow on the road.

There is no evidence that Uber AVs are safe. You fail to grasp the "watcher" problem, which makes a human observer much less effective than a normal human driver.

Second, the human driver isn’t a scapegoat; he/she is there to make sure the car is at least as safe as a human driver (which is the standard the law holds us to). If the car was less safe than that standard, it’s the human driver’s fault.

As someone already posted, the human backup is NOT required under Arizona law. So you can’t claim her role is to intervene and make it safer.

The role of such backup/safety drivers is mostly about providing the illusion of safety. A real driver who is already operating the controls of a car will usually take ~1.5 seconds to react to an emergency. Someone who isn’t actually driving the car is obviously going to take MUCH longer, so there is no way this backup driver can be as safe as a normal driver.

If an AV isn’t considered safe enough to operate on its own without a backup/safety driver, then it should not be considered safe enough to operate with one, because the difference between the two is mostly illusory.

Thad (user link) says:

Re: Re: tl;dr

To say that the human driver would have no blame, but Uber does is to say that automated vehicles should be automatically more at fault than human drivers.

No it isn’t.

No "automatically" about it; he’s saying that the automated vehicle is more at fault in this specific instance, given the facts we know.

I think that’s a fair read. His argument, as I understand it, is that the pedestrian appeared too quickly for a human driver to react, but the autonomous car should have done something — it should have recognized the presence of an obstacle and swerved or braked before impact.

The automated vehicle is not automatically more at fault than the human behind the wheel; it’s more at fault because this appears to be a clear case where the LIDAR or the image recognition malfunctioned.

Anonymous Coward says:

Let's Pretend Autonomous Vehicles Aren't Involved...

Driver A and Pedestrian 1 aren’t named and we know nothing about them. They are both fully insured with similar tiers of coverage. Which insurer pays?

Whichever insurer pays is who we say is at fault. Not because it is true, but because that choice determines how road-legal AVs can be in the future and how insurable they are.

Anonymous Coward says:

Re: Re: Re:

The whole point is to be a scapegoat when something goes wrong?

Uh… to prevent things from going wrong. But it turns out:

Even if the driver did have some legal responsibility, it doesn’t mean Uber wouldn’t.

Because if you are watching a car drive, you are really in no position to take over fast enough if something like this goes wrong.

It’s common for Driver’s Ed cars to have a brake pedal in the passenger seat, and these do get used.

Anonymous Coward says:

Re: Re: Re: This is a watcher problem

… Because if you are watching a car drive, you are really in no position to take over fast enough if something like this goes wrong.

It’s common for Driver’s Ed cars to have a brake pedal in the passenger seat, and these do get used.

The difference there is that the person with the brake pedal is actively planning to use it during a run with a student driver. The observer in the driver’s position is not. (The passenger of your Driver’s Ed car isn’t either, when the driver is known to be experienced.)

The person in the driver’s position in the car has been watching the car do its thing for however long, uneventfully. There is little or no traffic, no sign of people. While it isn’t time to break out a novel, there’s nothing to warn a person that a second’s inattention is going to be crucial.

And even if there were, "the car should handle it". You have to go through the recognition that "oops, the car isn’t handling it".

It isn’t humanly possible to be prepared to do something "instantly" for hours on end. And there’s no guarantee that braking by itself would have been enough.

Anonymous Coward says:

Re: Re: Re:2 This is a watcher problem

The difference there is that the person with the brake pedal is actively planning to use it during a run with a student driver. The observer in the driver’s position is not.

So… we don’t know that the state of Arizona expected the "driver" to do anything. What was Uber’s view of her responsibility? Was she expected to keep her eyes on the road, or was she looking down at some Uber software? If the former, Uber should have employed gaze detection to make sure she was paying attention. That camera was aimed at her for a reason, right? They have much more video than we do and can look for patterns of attention over her history.

Anonymous Coward says:

Re: Re: Re:3 This is a watcher problem

So it would be completely unfair to expect them to react as a normal driver would.

Why are we calling them the "safety driver" then? It seems to me that’s exactly what the public expects of them. "Normal drivers" aren’t necessarily "engaged" either. You don’t get away with killing someone because you had cruise control on, or Tesla’s autopilot.

PeterScott (profile) says:

Re: Re: Re:4 This is a watcher problem

Why are we calling them the "safety driver" then? It seems to me that’s exactly what the public expects of them. "Normal drivers" aren’t necessarily "engaged" either. You don’t get away with killing someone because you had cruise control on, or Tesla’s autopilot.

Misnomers and bad assumptions don’t change the realities of human reaction.

The person in the seat is NOT the driver. The car is doing all the driving.

The person is monitoring the car. We should call them Autonomous Vehicle Monitors.

AV Monitors are effectively passengers. If an emergency happens, the car is supposed to handle it. The monitor is not pressing the brakes every time the AV gets close to something. They are really assuming the AV will hit the brakes because the AV is driving.

If an emergency happens an AV monitor would have to:

1: Recognize it. But since they are in relaxed passenger mode, this will almost certainly take longer than for an actual driver, who would be more engaged.

2: Recognize that the car isn’t going to handle it. Because they are used to the car doing all the driving and handling every situation, this could be a long pause.

3: Shift into driving mode, grab the controls and take action, and this will take time because unlike a normal driver, they aren’t already using the controls.

Regardless of how you try to legislate this, there are many more cognitive and physical steps for an AV monitor to take than for a regular driver, and it will take them a multiple of the time a regular driver would need to react.

Thus considering them a safety element in any fast developing emergency is absurd.

As I said in a previous post:

If an AV isn’t considered safe enough to operate on its own without a backup/safety driver, then it should not be considered safe enough to operate with one, because the difference between the two is mostly illusory.

IMO, given the failure by this Uber platform, it does NOT meet that safe enough standard, and shouldn’t be permitted on the road with or without a backup/safety/monitor in the seat.

Anonymous Coward says:

What I don’t get is how people are partially blaming the victim for wearing dark clothes. Shouldn’t the LIDAR be able to pick up people in dark clothes, even at night? How many other automated cars are running similar tech that can’t pick out someone in dark clothes like that? That seems like a problem that should’ve been worked on and fixed before letting these cars loose on the open roads.

When you ask if we’ve had our first pedestrian death by AV, the answer isn’t “Kinda? Maybe?”, but simply “Yes.” I don’t understand the need to be so wishy-washy about it.

And honestly, I’m sure that quite a few people will justifiably be turned off of AVs completely if clothing choice factors into your chances of getting run over.

Aaron Walkhouse (profile) says:

Re: Dark clothing is still dark to infrared LIDAR…

…and RADAR is little help when it’s tuned for vehicles and solid obstacles. They should look into military antipersonnel RADAR such as these:
https://en.wikipedia.org/wiki/Man-portable_radar

No doubt they can add a simple stripped-down version of these to detect pedestrians and wildlife at low cost.

All people should be careful crossing streets in dark clothing, because even good drivers and the best AVs will still hit them. ;]

PaulT (profile) says:

Re: Re:

“What I don’t get is how people are partially blaming the victim for wearing dark clothes.”

That’s as much for the people claiming that it performed worse than a human driver, I think. While you can talk about whether or not she should have been visible with the other sensors, the fact that a human driver would also not have seen her until it was too late is an important thing to consider. That changes the discussion from “why didn’t the AI perform properly” to “why didn’t the AI perform better than we expect people to perform”, which is a different question.

“When you ask if we’ve had our first pedestrian death by AV, the answer isn’t “Kinda? Maybe?”, but simply “Yes.” “

Define “by AV” first. If you mean “an AV was involved in a fatal collision”, then yes. If you mean “an AV caused a fatal collision”, then no. That’s an extremely important distinction for this discussion.

“I’m sure that quite a few people will justifiably be turned off of AVs completely if clothing choices factors into your chances of getting run over.”

Do they have the same reaction to the other deaths across the country that week where people say the victim should have been more visible, or just the one where a scary new technology was involved? There’s a reason why high visibility clothing is being considered as mandatory for cyclists in some countries.

PaulT (profile) says:

Re: Re: Re: Re:

We’ll find out, and maybe people will stop trying to second guess everything before all the information is known. We don’t actually know what the sensors on the car picked up yet, nor the full timescale of the car’s reactions to it.

But, it’s notable that the narrative has now changed to the other sensors now that it’s clear that a human driver wouldn’t have reacted much differently to the footage in the video. Meaning that we’re now criticising the car for not being superhuman enough, rather than saying it was worse than a human would have been.

Anonymous Coward says:

RED ALERT! RECORD-BREAKING ZOMBIE! "BAlbrecht" or "Bruce A."

SEVEN AND HALF YEAR GAP! LAST SEEN IN 2010! https://www.techdirt.com/user/balbrecht

HA, HA! — OH, NO, NO ASTRO-TURFING ON THIS SITE!

This one is unusual because makes several comments, but below those here, right on first page, back to 2010!

Only real question is whether I’m the only human here!

Anonymous Coward says:

Re: RED ALERT! RECORD-BREAKING ZOMBIE! "BAlbrecht" or "Bruce A."

Whoops. Laughing so hard confused “Bruce” with “PeterScott”, which has only a 15 month gap, though many on this topic so MAY be human.

Anyhoo “Bruce” is all the more remarkable for ONE comment after 7.5 years!

And again, is most likely that these zombies will show up on Timmy’s pieces, and they all seem to have his views. Weird, huh?

PeterScott (profile) says:

Disingenuous expert?

“If I pay close attention, I notice the victim about 2 seconds before the video stops,”

I have checked this, and that claim seems very disingenuous. I used a stopwatch and timed it over multiple runs.

I get 1.0 seconds average from the time I see the first visible indication, and when the video freezes.

1.0 and “about 2 seconds” are significantly different. Different enough that I question the motivations of an “expert” who is that far off from the truth when making public statements.

Check it yourself.

Normal driver reaction time to a surprise event is 1.5+ seconds. There is no way in hell an average driver would even touch the brakes before hitting her.
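For scale, here’s a quick back-of-envelope calculation, assuming the car’s reported 38 mph speed and the 1.5 second reaction figure above (both numbers are taken from the thread, not the official investigation):

```python
# Rough distance covered during driver reaction time, before any braking.
# 38 mph is the speed reported for the Uber vehicle; 1.5 s is the average
# reaction time cited above. Both figures are assumptions for illustration.

MPH_TO_MPS = 0.44704  # metres per second per 1 mph

def reaction_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Distance travelled before the driver even touches the brakes."""
    return speed_mph * MPH_TO_MPS * reaction_s

# At 38 mph with a 1.5 s reaction time, the car covers about 25 metres
# (roughly 83 feet) before braking can even begin.
print(round(reaction_distance_m(38, 1.5), 1))  # → 25.5
```

With only ~1 second of visibility in the footage, the car would have closed roughly 17 of those metres before a driver could react at all, which is consistent with the point being made here.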

I am not trying to exonerate Uber. The car should NOT be limited to the visible beam of the headlights. LIDAR should have picked her out of the shadows. The car failed. The fleet should be grounded indefinitely.

Hopefully the official investigators will ignore the media circus around this.

It probably would have been best to say nothing, and not release video until after the investigation was concluded.

Anonymous Coward says:

Pro-tips for Techdirt Zombie Killers (TM):

#1: This gets latest, starting on ALL over a million comments:

https://www.techdirt.com/comments.php?start=0

#2: The lite mode lets see all comments, though apparently loses useful gravatar so can’t tell among the many ACs, ALSO, sets a cookie so you’re stuck in that mode:

https://www.techdirt.com/?_format=lite

#3 Most important: don’t take this site seriously! It’s just entertainment, like pro-wrestling.

Anonymous Coward says:

Fascinating how so many people seem to imagine that technology just appears fully formed, with no mistakes during its creation.

Don’t get me wrong, this is tragic. Somebody died. As did thousands of other people over the course of human history to give us everything we take for granted today.

Trial and error sometimes means big fucking errors, which ideally result in big fucking leaps forward.

Ed (profile) says:

I’m comforted by the apparent fact (from reading the pearl-clutching diatribes from so many) that pedestrians are never hit or killed by human drivers, that it is never the pedestrian’s fault even though they’re wearing black clothes, crossing outside the crosswalks in an unlit area on a pitch-black night, on a 45mph stretch of road. Every human driver would have avoided that poor woman, she’d be alive today! Right? Right??? Yes, technology likely should be able to detect every ignorant fool but it didn’t in this case. That doesn’t mean the technology should be abandoned, just modified.

Anonymous Coward says:

Where did all these self-driving car apologists come from?

Yes, cars driven by people kill pedestrians & cyclists all the time. And sadly, juries filled with other car drivers won’t convict even really bad drivers.

But this situation in no way excuses the self-driving car companies and their lobbyist-paid politicians who removed their liability.

I’m not interested in self-driving cars that also kill 30,000 people each year; I’d like to see self-driving cars do a heck of a lot better than human drivers.

But I don’t want them on the public roads while they’re still practicing with their “L” (“learner”) tags.

GERALD L ROBINSON (profile) says:

Re: Re: Roads are for cars

The driver will not be alert nor paying attention in any half-usable set-up, so it’s just another point of failure. After all, when suddenly needing to make a snap judgement, he is at least as likely to make the wrong one as the right. Seems like an obvious failure in the AI or sensors. The bike should have been obvious to radar or lidar, and the car should have reacted accordingly. A fail in the sensor design, the sensors, or the AI? We likely will never know, as an NDA makes examination of the code illegal!

PaulT (profile) says:

Re: Re:

That’s one of the aims. Given that this is literally the first death related to such cars, this doesn’t change that aim.

Again, I’ll note that nobody knows the details of the 10 other deaths in that city over the same week, because they’re so ordinary and haven’t really been reported on. This death is international news because it hasn’t happened before.

Andrew D. Todd (user link) says:

Cars And Roads

About five hundred years ago, Leonardo da Vinci, the universal genius of the Italian Renaissance, drew up a plan for a city with two sets of streets at different levels, one for walking and one for vehicles. He was presumably aware of Venice with its canals, but this was a dryland version, suitable for employment in, say, Milan.

An automobile is part of a transportation system, together with the roads it runs on, the gas stations, the drive-throughs, the parking lots, etc. Changes in one part of the system inevitably dictate changes in other parts of the system. This means that the ultimate owner of land, the government, is necessarily a key partner; in fact, the leading partner. You cannot do an electric car or a self-driving car as a tech product. You must have government sponsorship and large sums of government money to pick up the loose ends.

Almost anything built under the tech-entrepreneur system tends to develop a critical shortage of physical public goods, because the greed of the tech-entrepreneur is incompatible with the spending of public money. The major limiting constraint on computers has become the internet, and Comcast has become one of the most hated companies. I think that most Techdirt people would agree that we cannot put up with Comcast much longer, and that the telecommunications system, which undergirds the internet, needs to be socialized and placed under the control of public water authorities and the Post Office. The internet is too important to be run by businessmen who want to control everyone. However, considered as civil engineering projects, telecommunications networks are kid stuff compared to roads. There are lots of ways you can snake a cable which is only half an inch in diameter and can easily be bent to a six-inch radius. Anything involving automobiles is much messier.

A smart car is going to require a smart road, with lots of built-in electronics. An electric car is going to require an electric road, with built-in power supply wires. These things are not particularly impossible. It’s not usually a major issue for public railroads, such as American commuter railroads or European national railroads, where everything belongs to one government entity. However, such things cannot be done with Elon Musk or Travis Kalanick as sole beneficiary and proprietor.

The State of Arizona allowed just about anyone to start running self-driving cars on the public streets, without regulation, but the state was not prepared to meet the expense of building pedestrian overpasses on a large scale, say for every road where the speed limit is at least two-thirds of that on an urban freeway. The state swallowed the tech-entrepreneur Kool-Aid and abdicated its responsibility to organize transportation for the public good. You cannot build public works as private luxury goods. They have to be for everyone, because everyone has a vote, and people driving two-thousand-dollar “beaters” cannot be expected to vote for roads which require an eighty-thousand-dollar car to use.

Often, change of technological systems involves military sponsorship. The first jet airliner, the Boeing 707, was also known as the KC-135. It was an air-refueling tanker to go with the Boeing B-52 bomber. The United States Air Force acted as the general patron and sponsor of jet aircraft. The United States Army was the institutional sponsor of the Interstate Highway system.

Unlike an airplane, an automobile or a train does not usually have a very good field of view of the path it is traveling on. This means that sensors need to be located where they do have a good field of view. This might mean mounting cameras/radar/lidars on telephone poles or lamp poles, where they can see things in a much more unambiguous way than sensors mounted in a vehicle three or four hundred feet away. The basic means by which railroads avoid collisions at grade crossings is that a track detector located half a mile or a mile from the grade crossing identifies a train, and then a signal is transmitted, which causes gates to swing down, blocking automobile traffic. In the case of high-speed trains, various additional refinements are necessary, but the basic principle remains the same.

Smart cars work fairly well on freeways. The lanes in opposite directions are isolated by barriers, and the intersections by overpasses. It is rare for the relative speed between vehicles in the same roadway to be more than 20 mph. Freeways are already much safer than ordinary streets.

What can work, given sufficient money, is “slow-fast”: slow off the freeway, and fast on the freeway. Set an off-freeway speed limit of, say, 20 mph for arterial streets; 10 mph for secondary streets; and 5 mph for parking lots, campuses, etc. However, on a controlled interstate, let the speed limit be 80-90 mph. Extend the freeways far enough that slow speed off the freeways becomes acceptable. You can build a 20-mph interstate, that is, a limited-access road with tighter curves, steeper ramp grades, and shorter merging zones, so that it fits within the space allocated for an arterial street. Because cars do not have to stop, the 20 mph freeway will have the same net speed as a 40 mph conventional road. Such a road would typically be dug into a trench below grade level, in much the same spirit that subways are put underground.

Alternatively, you can provide a car with an information system which permits it to “time” traffic lights, that is, to arrive at a light just when the light is turning green. A system like this would probably still require pedestrian overpasses, because people simply don’t move fast enough to clear an intersection within a couple of seconds.

The basic problem with self-driving cars is that they have been mistakenly fitted into the framework of what a Silicon Valley company can do. They are forced to pretend to a human-level artificial intelligence which they do not have. Even now, a computer with the power of a human brain would approximately fill a warehouse. The “brain” in a self-driving car is more on the level of a cockroach. Insects aren’t real bright. You can manipulate them with a flashlight.
