Report Showcases How Elon Musk Undermined His Own Engineers And Endangered Public Safety
from the first-do-no-harm dept
For a long time now, it’s been fairly obvious that consumer safety was an afterthought for some of the better-known companies developing self-driving technology. That was made particularly clear a few years back by Uber’s fatality in Tempe, Arizona, which revealed that the company hadn’t thought much at all about public safety. The car involved in the now notorious fatality wasn’t even programmed to detect jaywalkers, and there was little to no structure at Uber for meaningfully dealing with public safety issues. The race to the pot of innovation gold was all consuming, and every other consideration (including human lives) was an afterthought.
That same cavalier disregard for public safety has been repeatedly obvious over at Tesla, where the company’s undercooked “autopilot” technology has resulted in a nasty series of mishaps and, despite years of empty promises, still doesn’t work as marketed. That’s, of course, not great for the public, who didn’t opt in to having their lives put at risk by 2,500 pound death machines for innovation’s sake. Every week brings new evidence and new lawsuits showing the technology is half-baked and dangerous, and every week we seemingly find new ways to downplay it.
This week the scope of Elon Musk’s failures on this front became clearer thanks to a New York Times piece profiling how corner cutting on the Autopilot project was an active choice by Musk at several points in the development cycle. The piece repeatedly and clearly shows that Musk overstated what the technology was capable of for the better part of the last decade:
“As the guiding force behind Autopilot, Mr. Musk pushed it in directions other automakers were unwilling to take this kind of technology, interviews with 19 people who worked on the project over the last decade show. Mr. Musk repeatedly misled buyers about the services’ abilities, many of those people say. All spoke on the condition of anonymity, fearing retaliation from Mr. Musk and Tesla.”
Musk’s bravado, and his exaggeration of Autopilot’s sophistication, encouraged some customers to place too much trust in the product or to actively misuse it. Constantly pushing undercooked software and firmware updates without proper review created additional safety challenges. But the article focuses most heavily on how Musk repeatedly overrode his own engineers with stubborn decisions that undermined both overall safety and engineering expertise, like his unyielding belief that fully automated driving could be accomplished with cameras alone, rather than cameras plus radar (or other detection tech):
“Within Tesla, some argued for pairing cameras with radar and other sensors that worked better in heavy rain and snow, bright sunshine and other difficult conditions. For several years, Autopilot incorporated radar, and for a time Tesla worked on developing its own radar technology. But three people who worked on the project said Mr. Musk had repeatedly told members of the Autopilot team that humans could drive with only two eyes and that this meant cars should be able to drive with cameras alone.”
The article also makes it clear that employees who were eager to please Musk’s whims only made the overall quality and safety issues worse. And when employees did challenge Musk in a bid to improve quality and safety, things very often didn’t go well:
“In mid-2015, Mr. Musk met with a group of Tesla engineering managers to discuss their plans for the second version of Autopilot. One manager, an auto industry veteran named Hal Ockerse, told Mr. Musk he wanted to include a computer chip and other hardware that could monitor the physical components of Autopilot and provide backup if parts of the system suddenly stopped working, according to two people with knowledge of the meeting.
But Mr. Musk slapped down the idea, they said, arguing it would slow the progress of the project as Tesla worked to build a system that could drive cars by themselves. Already angry after Autopilot malfunctioned on his morning drive that day, Mr. Musk berated Mr. Ockerse for even suggesting the idea. Mr. Ockerse soon left the company.”
None of this is particularly surprising for folks who have watched Musk objectively, but the piece catalogs his erratic bravado and risk taking in a comprehensive way that makes all of it feel notably more concrete. For a man whose reputation rests on engineering savvy, the report repeatedly showcases how Musk refused to actually listen to his own engineers. There’s little doubt Musk has been innovative, but the report does a fairly solid job of showing that a not insubstantial portion of his near-deified reputation is more than a little hollow.