Report Showcases How Elon Musk Undermined His Own Engineers And Endangered Public Safety

from the first-do-no-harm dept

For a long time now, it’s been fairly clear that consumer safety was an afterthought for some of the better-known companies developing self-driving technology. That was made particularly clear a few years back by Uber’s fatality in Tempe, Arizona, which revealed that the company really hadn’t thought much at all about public safety. The car involved in the now-notorious fatality wasn’t even programmed to detect jaywalkers, and there was little or no structure at Uber for meaningfully dealing with public safety issues. The race to the pot of innovation gold was all-consuming, and all other considerations (including human lives) were afterthoughts.

That same cavalier disregard for public safety has been repeatedly obvious over at Tesla, where the company’s undercooked “autopilot” technology has resulted in a nasty series of mishaps and, despite years of empty promises, still doesn’t work as marketed. That’s, of course, not great for the public, who didn’t opt in to having their lives put at risk by 2,500-pound death machines for innovation’s sake. Every week brings new evidence and lawsuits showing this technology is undercooked and dangerous, and every week we seemingly find new ways to downplay it.

This week the scope of Elon Musk’s failures on this front became clearer thanks to a New York Times piece profiling how corner-cutting on the Autopilot project was an active choice by Musk at several points in the development cycle. The piece repeatedly and clearly shows that Musk overstated what the technology was capable of for the better part of the last decade:

“As the guiding force behind Autopilot, Mr. Musk pushed it in directions other automakers were unwilling to take this kind of technology, interviews with 19 people who worked on the project over the last decade show. Mr. Musk repeatedly misled buyers about the services’ abilities, many of those people say. All spoke on the condition of anonymity, fearing retaliation from Mr. Musk and Tesla.”

Musk’s bravado, and his exaggeration of Autopilot’s sophistication, encouraged some customers to place too much trust in the product or to actively misuse it. Constantly pushing undercooked software and firmware updates without proper review created additional safety challenges. But the article focuses heavily on how Musk repeatedly overrode his own engineers with stubborn decisions that compromised both overall safety and engineering expertise, like Musk’s unyielding belief that fully automated driving could be accomplished with cameras alone, rather than cameras plus radar (or other detection tech):

“Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Mr. Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.”

The article also makes it clear that employees who were overly eager to please Musk only tended to make the overall quality and safety issues worse. And when employees did challenge Musk in a bid to improve quality and safety, things very often didn’t go well:

“In mid-2015, Mr. Musk met with a group of Tesla engineering managers to discuss their plans for the second version of Autopilot. One manager, an auto industry veteran named Hal Ockerse, told Mr. Musk he wanted to include a computer chip and other hardware that could monitor the physical components of Autopilot and provide backup if parts of the system suddenly stopped working, according to two people with knowledge of the meeting.

But Mr. Musk slapped down the idea, they said, arguing it would slow the progress of the project as Tesla worked to build a system that could drive cars by themselves. Already angry after Autopilot malfunctioned on his morning drive that day, Mr. Musk berated Mr. Ockerse for even suggesting the idea. Mr. Ockerse soon left the company.”

None of this is particularly surprising for folks who have watched Musk objectively, but the piece catalogs his erratic bravado and risk-taking in a comprehensive way that makes all of it feel notably more concrete. For a man whose reputation is built on engineering savvy, the report repeatedly showcases how Musk refused to actually listen to his own engineers. There’s little doubt Musk has been innovative, but the report does a fairly solid job of showing that a substantial portion of his near-deified reputation is more than a little hollow.

Companies: tesla


Comments on “Report Showcases How Elon Musk Undermined His Own Engineers And Endangered Public Safety”


This comment has been flagged by the community.

trontimouse says:

Elon is right about everything

Hey, have you ever thought that the engineer was only giving the information that supports his own view of the situation? After all, it is only human nature to want to be right all the time. So just as Elon Musk is not always right, so too is this engineer not telling the whole story.

Elon has a vision and needs to make progress towards it now, not in 20 years’ time. Sure, things will need to be tweaked, but that vision needs to be implemented.

Elon is a software engineer, which means he gets the idea, then implements the idea, not the whole of everything. After the idea seems to work, it is tweaked and tweaked until it does the thing it needs to do. Only then is the whole rest of the program added, things like safety and security. That is the part where the software works as advertised but needs to be made safe.

As you can see, Elon’s software, as is everyone else’s, is not finished, and making it into a finished product is not yet being worked on. Elon is still iterating and so are the engineers. So eventually, when the autopilot is doing its job properly, the software will be added to make it safe and secure. That day is not yet here, so use caution with autopilot or any other self-driving software.

This comment has been deemed insightful by the community.
danderbandit (profile) says:

Re: Elon is right about everything

As you can see, Elon’s software, as is everyone else’s, is not finished, and making it into a finished product is not yet being worked on.

I was looking for the /s, but you appear to be serious. If the car is not a finished product, what are those I see driving around?

What you are saying, whether you intended to or not, is that the buyers are no more than beta testers. And paying handsomely for the privilege.

OldMugwump (profile) says:

Re: Re: Elon is right about everything

Imperfect as Autopilot is (and it is imperfect; I have two Teslas), it nonetheless appears to be safer than the average human driver.

The bottom line is the number of accidents per mile driven – Teslas generally, and Autopilot specifically, have fewer accidents than average cars.

Ref: https://www.tesla.com/VehicleSafetyReport

So the article and the attacks on Musk seem pretty unfair – sure, nothing’s perfect, but surely “better” is an improvement to be praised.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re: Re: Re: Elon is right about everything

Crashes are kinda fun, data-wise.

We used to have super-rigid car frames, like Tesla’s. Eventually we realized that building in flex and crumple zones reduced injuries. Societally, we chose to inflict more severe damage on the car rather than injuries on the passengers. The move back to a super-durable frame might radically shift us back to physical injuries as the norm.

The length of a yellow light affects the rate of crashes in city driving. Longer yellow lights reduce red-light running and sideswipes, but come with an increased risk of rear-end collisions. Car accidents go up… seriousness of the accidents goes down. We can trade fewer deadly accidents for more accidents with minor injuries and damage.
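
To put made-up numbers on that trade-off (purely hypothetical rates and severity weights, not real crash data), here’s a quick sketch of how total harm can move in the opposite direction from accident counts:

    # Hypothetical illustration: accident *counts* vs. accident *harm*.
    # All numbers are invented for the sake of the argument.

    def total_harm(crashes_per_100k_miles, avg_severity):
        # Expected harm = frequency x severity (arbitrary units).
        return crashes_per_100k_miles * avg_severity

    # Short yellows: fewer crashes overall, but more deadly sideswipes.
    short_yellow = total_harm(crashes_per_100k_miles=10, avg_severity=8.0)

    # Long yellows: more crashes (rear-endings), but far less severe ones.
    long_yellow = total_harm(crashes_per_100k_miles=14, avg_severity=2.0)

    print(short_yellow)  # 80.0 -> fewer accidents, more total harm
    print(long_yellow)   # 28.0 -> more accidents, less total harm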

This is to say that focusing on the number and frequency of accidents might be a bad measure. Autopilot’s tendency to hit parked emergency vehicles obstructing the road, for instance, might be a much more serious error than the error a human makes merging to avoid the vehicle.

Relying on a self-reported black-box accident report might not show the whole picture. For instance, I’d expect Autopilot to be used mostly on freeways, where traffic conditions are more stable and where the bulk of the distance is covered. But most driving time is spent on city streets, where all autonomous driving tech is struggling to move beyond a non-scalable system of heavily-mapped isolated regions. Most accidents happen on city streets. So the sample of all automobile crashes might not map well to the sample of Autopilot drivers. Indeed, the fact that Tesla drivers without Autopilot have fewer accidents than the national average likely reflects higher attention to the road by those drivers in general, and a strong possibility that drivers have been overriding Autopilot to avoid accidents. Absent data on overrides, I can take Tesla’s data and build a much less glowing perspective on safety.
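
As a purely hypothetical sketch of that sampling problem (invented rates, not Tesla’s actual figures), the same per-road crash rates produce very different per-mile averages once you account for where the miles are driven:

    # Hypothetical illustration of mileage-mix bias. All rates invented.
    # Assume crashes are intrinsically rarer per mile on freeways than on
    # city streets, for human and automated drivers alike.

    CRASH_RATE = {"freeway": 0.2, "city": 1.0}  # crashes per million miles

    def blended_rate(mile_share):
        # Overall crashes per million miles for a given mix of road types.
        return sum(CRASH_RATE[road] * share
                   for road, share in mile_share.items())

    # Drivers overall: assume most miles happen on city streets.
    all_drivers = blended_rate({"freeway": 0.3, "city": 0.7})

    # Autopilot engaged: mostly freeway miles, where crashes are rare anyway.
    autopilot = blended_rate({"freeway": 0.9, "city": 0.1})

    print(all_drivers)  # 0.76 crashes per million miles
    print(autopilot)    # 0.28 -- looks ~3x "safer" with identical per-road rates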

Without active safety features, Tesla drivers encounter accidents a third less often than the average driver, who may or may not have active safety features. This suggests that Tesla drivers are in general more attentive to the road. Given that many Autopilot errors are not ones an attentive human would make, this suggests that the accidents we do see likely involve less attentive drivers. That suggests Tesla’s active safety features only deliver the benefit they do when an attentive driver is available to override Autopilot.

Right now, most people don’t have and can’t afford a Tesla with Autopilot. Once that changes, the drivership of Tesla cars will likely become less attentive, not more, as it approaches the societal mean reflected in the national average. That means, as it stands right now, we might actually be seeing a less safe technology whose safety is being artificially boosted by current drivership.

Anonymous Coward says:

Re: Elon is right about everything

I’m trying to parse what you just said, and I cannot tell if you are being sarcastic or not…

Is what you are trying to say that the software is not yet ready to be released from beta testing to production? I personally do not want to be part of a beta test where, when something goes wrong, lives are on the line.

This comment has been flagged by the community.

Clandestine (profile) says:

Re: Re: Elon is right about everything

Well, then do not use the autopilot feature. Just drive it like the rest of us. You might feel unsafe using the autopilot, but I’d feel safer in my standard car if you were not driving yours manually, because it is much more likely that you will make a mistake and kill me than that your car will.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Elon is right about everything

After the idea seems to work, it is tweaked and tweaked until it does the thing it needs to do. Only then is the whole rest of the program added, things like safety and security. That is the part where the software works as advertised but needs to be made safe.

Autopilot doesn’t "work as advertised" until it’s safe to use. One doesn’t simply design an unsafe system and only later "add" security and safety. Safety isn’t a substance or object that can be taped onto a product in the final stages of development. To expect otherwise would be like trying to make ice cream healthier by adding vegetables after the ice cream has already been made. Security and safety should be continuously evaluated throughout the entire design process.

This comment has been flagged by the community.

OGquaker says:

Re: Re: Neal Boudette lives in Detroit, no suck-up there

Tesla’s “auto industry veteran”? Right; a Buick engineer lamented to me that they were ready with airbags 10 years before Detroit relented. My Mom’s 1962 Grand Prix had seat-belt anchors under the carpet (I crawled under and saw the welded nuts), but GM failed to sell belts (patented in 1885) until 1964. See ‘Tucker: The Man and His Dream’ (1988, produced by George Lucas). Plot point: Apr 21, 2008 – purposely did subpar work for Tesla and then stole trade secrets; Tesla sued last week in a California court.

“this beta test where when something goes wrong lives are on the line” – What planet are you from? Detroit, supermarkets, Pringles, fast food, NutraSweet, Dr. Foutche beta test on everyone. The hepatitis vaccine was “beta tested” on thousands of people (RIP) around the world from 1969 until a new version stopped killing people in 1983. The CDC sent my healthy Mother a letter suggesting a trial; she may have been exposed to hepatitis from a bad hotel sewer, the letter said… She was dead by 1981: “Acquired Immune Deficiency.”
December 8, 2021 16:50 GMT: https://www.reuters.com/business/autos-transportation/renault-zoe-goes-hero-zero-european-safety-agency-rating-2021-12-08/ – Biden’s PAYING Detroit to make electric cars in Mexico, but nothing for Tesla: they paid back their Federal loan WITH INTEREST many years ago.

This comment has been deemed insightful by the community.
Anonymous Coward says:

Re: Elon is right about everything

"Elon is a software engineer which means he gets the idea, then implements the idea,"

er, since when?

i think you mean ‘elon is an immature richboi, which means he gets an idea, then people try to fulfil his crazy idea while he sits back and acts like he does all the work’.

He ‘has ideas’ (or ‘visions’).
The only difference between him and the person gibbering outside the local methadone dispensary is that his dad was wealthy, and he leveraged that into having himself viewed as someone smart and productive, despite never doing much either productive or smart except taking credit. The only real difference between him and Trump? 1302 weeks.

Scary Devil Monastery (profile) says:

Re: Elon is right about everything

“Elon has a vision and needs to make progress towards it now, not in 20 years’ time. Sure, things will need to be tweaked, but that vision needs to be implemented.”

The issue being that Elon cutting corners on safety means that vision comes at the cost of investments – and possibly lives – which weren’t volunteered for that purpose.

“Elon’s software, as is everyone else’s”

It really isn’t. Software which may endanger lives is not to be issued in beta version.

At the end of the day, if the means produce casualties, the end goal is ruined. This is how you turn a laudable vision into a failed pie-in-the-sky project: by using means which ensure the end goal is never reached.

This comment has been deemed insightful by the community.
JoeCool (profile) says:

Re: Elon is right about everything

If it’s not safe and secure, it should be on a test track, not city streets. That’s why we test drivers and their cars – to keep the public safe. It’s why the cops look for unsafe driving or people breaking the rules (or at least it is SUPPOSED to be why they’re out on the roads). Until the automated cars are proven to be at least as safe as a human driver, they shouldn’t be on public roads.

This comment has been deemed insightful by the community.
Anonymous Coward says:

humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.

… and a quite remarkable analytic engine driving those eyes and those cars.

When Musk produces a car that can greet me by name, despite any changes I might make to my appearance, I might agree that his car MIGHT be able to drive with cameras alone. I might even look to buy a kit conversion for my car.

This comment has been deemed insightful by the community.
James Burkhardt (profile) says:

Re: Lack of research from Mr Bode showing through again.

Your fallacy is… ad hominem.

You’ve dismissed the actual discussion at hand by seizing on Mr. Bode’s use of an outdated industry average. I wanted to say you were committing the fallacy fallacy – that you picked up on a minor error which actually underplayed Karl’s point, and acted as if it renders the article meaningless.

But you didn’t even try to make that silly argument, instead just making an ad-hom attack. Which makes it easier to choose not to explain how both those errors came about, since you clearly don’t care.

K`Tetch (profile) says:

Re: Lack of research from Mr Bode showing through again.

2500lb is a well-accepted ‘gut value’ for a car. I think the actual average weight of new cars these days is 2800lb, but factoring in the many older cars brings the average down (my current car is 2350lb; I’ve had two Volvos that were 2180lb (the glorious 300 series), and a 140mph 4-seat car (MG Metro twin-turbo) that was 1850lb, including the extra engine stabilizers, intercooler, and a Citroen active anti-roll suspension).

So yes, as shorthand for a car, “2500lb death machine” is a well-accepted, common-use reference to both average weight and how easily a car can kill.

But if that’s the ONLY criticism you could manage to actually level (that you don’t understand idioms), it says a lot about how good the article was.

