Elon Says Teslas Drive Themselves. The Crash Data Tesla Tried To Hide From A Court Says Otherwise

from the doesn't-seem-particularly-trustworthy dept

Just days after a jury found Tesla partially liable in a fatal Autopilot crash and ordered the company to pay over $200 million, Elon Musk took to Twitter with a bold proclamation: “Teslas can drive themselves!”

The timing couldn’t be worse. Because thanks to a devastating article by Electrek’s Fred Lambert that digs deep into the trial transcripts, we now know just how far Tesla went to hide the truth about what happened in that crash. The company systematically withheld evidence, misled police investigators, and actively obstructed efforts to understand how its technology failed—behavior that looks suspiciously like criminal obstruction of justice, yet somehow apparently carries no criminal consequences.

This isn’t just about one lawsuit. It’s about how Tesla’s behavior threatens to undermine public trust in autonomous vehicle technology at precisely the moment when that trust is most crucial.

Let’s be clear: self-driving technology has enormous potential to save lives. Human drivers cause roughly 94% of serious traffic crashes, according to a decade-old study by the National Highway Traffic Safety Administration. Even imperfect autonomous systems could dramatically reduce that toll, and we shouldn’t hold them to an impossible standard of perfection.

But here’s the problem: overselling what these systems can actually do—and then covering up when they fail—threatens to poison public acceptance of the technology entirely. If people lose trust because companies like Tesla made promises they couldn’t keep, we could end up rejecting technology that might otherwise save thousands of lives.

The aviation industry figured this out decades ago. When planes crash, investigators swarm the scene, companies cooperate fully with authorities, and the entire industry learns from failures. That transparency has made flying extraordinarily safe. But Tesla’s approach in this Autopilot case shows the exact opposite mentality.

The Electrek story, based on trial transcripts from the recent case, reveals a pattern of deception that’s genuinely shocking. Here’s what Tesla did:

Within three minutes of the fatal crash, the Model S automatically uploaded a complete “collision snapshot”—video, sensor data, everything—to Tesla’s servers, then deleted the local copy. Tesla was the only entity with access to the critical evidence. As Electrek describes it:

Within about three minutes of the crash, the Model S uploaded a “collision snapshot”—video, CAN‑bus streams, EDR data, etc.—to Tesla’s servers, the “Mothership”, and received an acknowledgement. The vehicle then deleted its local copy, resulting in Tesla being the only entity having access.
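To make concrete what that flow looks like in practice, here is a minimal, purely hypothetical sketch (invented endpoint, file, and function names, not Tesla’s actual firmware) of an “upload, wait for acknowledgement, then delete the local copy” routine:

```python
# Purely illustrative sketch with hypothetical names and endpoint, not Tesla's actual code.
# It mirrors the behavior described above: upload the collision snapshot, wait for the
# server to acknowledge it, then delete the vehicle's local copy.

from pathlib import Path

import requests

UPLOAD_URL = "https://telemetry.example.com/collision-snapshots"  # stand-in for the "Mothership"


def handle_collision_snapshot(snapshot_path: Path, delete_after_ack: bool = True) -> bool:
    """Upload a snapshot bundle; optionally remove the local copy once the server confirms receipt."""
    payload = snapshot_path.read_bytes()
    response = requests.post(UPLOAD_URL, data=payload, timeout=30)
    acknowledged = response.status_code == 200

    if acknowledged and delete_after_ack:
        # This single branch is the whole difference between "only the manufacturer
        # holds the evidence" and "investigators can also pull it from the vehicle".
        snapshot_path.unlink()

    return acknowledged
```

The implementation details are beside the point; what matters is that keeping the local copy after a successful upload, the way an aviation-style recorder would, is a one-branch difference rather than an engineering hurdle.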

When police investigators tried to get the data, Tesla’s lawyer literally scripted their evidence request. As the homicide investigator testified:

“He said it’s not necessary. ‘Write me a letter and I’ll tell you what to put in the letter.'”

But the lawyer deliberately crafted the letter to avoid sending the actual crash data, instead providing infotainment logs and owner’s manuals. As Electrek explains:

McCarthy specifically crafted the letter to omit sharing the collision snapshot, which includes bundled video, EDR, CAN bus, and Autopilot data.

Instead, Tesla provided the police with infotainment data containing call logs and a copy of the Owner’s Manual, but not the actual crash telemetry from the Autopilot ECU.

Tesla never mentioned that it had already had this data for more than a month at that point.

When police brought the car’s computer to a Tesla service center for help extracting data, Tesla technicians falsely claimed the data was “corrupted”—even though they had the complete dataset sitting on their servers the entire time.

For years, Tesla told courts and plaintiffs that the crucial collision data “didn’t exist.” Only when forensic experts finally gained access to the car’s computer and found metadata proving Tesla had the data all along did the company finally admit what it had done.

As Electrek reports:

The automaker had to admit to having the data all along.

During the trial, Mr. Schreiber, attorney for the plaintiffs, claimed that Tesla used the data for its own internal analysis of the crash:

“They not only had the snapshot — they used it in their own analysis. It shows Autopilot was engaged. It shows the acceleration and speed. It shows McGhee’s hands off the wheel.”

Yet it gave access to neither the police nor the family of the victim, who had been trying to understand what happened to their daughter.

Just reading through the summary Electrek wrote about the timeline is horrifying and raises obvious questions about why there’s no criminal liability here:

  • Tesla had the data on its servers within minutes of the crash
  • When the police sought the data, Tesla redirected them toward other data
  • When the police sought Tesla’s help in extracting it from the computer, Tesla falsely claimed it was “corrupted”
  • Tesla invented an “auto-delete” feature that didn’t exist to try to explain why it couldn’t originally find the data in the computer
  • When the plaintiffs asked for the data, Tesla said that it didn’t exist
  • Tesla only admitted to the existence of the data once presented with forensic evidence that it had been created and transferred to its servers.

When the collision data finally came to light, it painted a damning picture. Electrek’s summary of the forensic analysis is quite something:

  • Autopilot was active
  • Autosteer was controlling the vehicle
  • No manual braking or steering override was detected from the driver
  • There was no record of a “Take Over Immediately” alert, despite approaching a T-intersection with a stationary vehicle in its path.
  • Moore, the plaintiffs’ forensic expert, found logs showing Tesla systems were capable of issuing such warnings, but did not in this case.
  • Map and vision data from the ECU revealed:
      • Map data from the Autopilot ECU included a flag that the area was a “restricted Autosteer zone.”
      • Despite this, the system allowed Autopilot to remain engaged at full speed.

That last point is crucial. Tesla knew this wasn’t an appropriate place for Autopilot to operate, but the system didn’t disengage or warn the driver. The NTSB had specifically warned Tesla to “incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.”

Tesla appeared to ignore that recommendation.

The jury found that the driver in this case bears primary responsibility—he admitted to being distracted and not using Autopilot properly. The jury assigned him 67% of the blame. But they also found Tesla 33% responsible, and that matters.

As Electrek notes:

However, there’s also no doubt that Autopilot was active, didn’t prevent the crash despite Tesla claiming it is safer than humans, and Tesla was warned to use better geo-fencing and driver monitoring to prevent abuse of the system like that.

This case (unlike some other stories about autonomous vehicles) isn’t about punishing innovation or holding technology to impossible standards. It’s about holding companies accountable when they oversell their capabilities and then actively obstruct efforts to learn from failures.

Tesla’s behavior in this case—the years of lies, the misdirection of police, the withholding of critical evidence—represents everything wrong with how some tech companies approach safety and accountability. It’s the opposite of what we need to build public trust in autonomous vehicles.

Self-driving technology can eventually make our roads safer. But getting there requires companies that are transparent about their systems’ limitations, cooperative with safety investigations, and committed to continuous improvement based on real-world data.

Tesla’s cover-up in this case shows a company more interested in protecting its stock price (the biggest source of Elon’s wealth) than protecting lives. And Musk’s tweet claiming “Teslas can drive themselves” just days after this devastating evidence came to light shows he’s learned nothing.

If we want autonomous vehicles to fulfill their life-saving potential, we need companies that act more like airlines after a crash investigation (full transparency, immediate cooperation, system-wide improvements) and less like Tesla in this case (cover-ups, obstruction, and doubling down on dangerous claims).

The technology itself isn’t the problem. The corporate culture that prioritizes PR over safety is.

Companies: tesla


Comments on “Elon Says Teslas Drive Themselves. The Crash Data Tesla Tried To Hide From A Court Says Otherwise”

37 Comments
This comment has been deemed insightful by the community.
This comment has been deemed funny by the community.
Slow Joe Crow says:

Full Self Crashing

I have shifted from calling Tesla’s SAE Level 2 ADAS system “Partial Self Driving” to “Full Self Crashing” due to its frequent failures. Similarly their Level 1 system is Semi-Autopilot, since a Level 1 ADAS is basically a fancy cruise control. Neither Level 1 nor Level 2 are in any way autonomous. That requires SAE Level 3 or higher, and AFAIK the only retail vehicles with Level 3 are Mercedes, since Waymo and Cruise do not sell their vehicles.

Anonymous Coward says:

So it seems that Teslas can also crash themselves.
To be fair, crashing a plane may result in more than 200 deaths, a national tragedy, not just a few (and “just” a $200M fine).
If Tesla operates buses one day, it would certainly be subject to much more scrutiny (depending on how big the bribe from his buddy Donald would be).

Anonymous Coward says:

Re:

To be fair, crashing a plane may result in more than 200 deaths, a national tragedy, not just a few […]

What happens when Tesla’s vehicles (or its C&C center) are hacked, and commands are sent out to every Tesla that has self-driving engaged?

The potential casualty count is several orders of magnitude higher than 200.

If anyone can take over these vehicles en masse, they’ll become a weapon far more widely distributed and far more deadly than a few airplanes were on 9/11/2001.

Anonymous Coward says:

Re: Re:

What happens when Tesla’s vehicles (or its C&C center) are hacked, and commands are sent out to every Tesla that has self-driving engaged?

It’ll be treated as unavoidable—the standard “no computer is completely secure” response (and don’t ask whether we really needed computers here, whether they needed to be networked, or whether the manufacturers needed remote access). And then everyone will just kind of forget about it, and learn nothing.

But why do you say “that has self-driving engaged”? I think some obvious first commands to send to all the cars would be “turn on, and engage self-driving mode”. The cars do support remote starting and locking/unlocking through the cellular network.

This comment has been deemed insightful by the community.
That One Guy (profile) says:

He's evil, not stupid, treat him as such

Tesla’s cover-up in this case shows a company more interested in protecting its stock price (the biggest source of Elon’s wealth) than protecting lives. And Musk’s tweet claiming “Teslas can drive themselves” just days after this devastating evidence came to light shows he’s learned nothing.

From earlier in the article…

The company systematically withheld evidence, misled police investigators, and actively obstructed efforts to understand how its technology failed—behavior that looks suspiciously like criminal obstruction of justice, yet somehow apparently carries no criminal consequences.

The problem isn’t that Elon hasn’t learned anything, rather the problem is that he has, namely that he can openly lie to the legal system, withhold or even destroy evidence and get caught and still get away with it.

Anonymous Coward says:

Re: No, he's rich and can buy a president...

Elon can lie to the legal system and withhold or destroy evidence because he’s the wealthiest person in the world and has demonstrated that he can buy the President of the U.S. It’s also clear that the current (In)Justice Department can also be purchased. However, payment must be made in some cybercoin controlled by the Trump family. Why should he bother adhering to the law when he is immune?

Anonymous Coward says:

a bold proclamation: “Teslas can drive themselves!”

That part’s not wrong. They can drive themselves, right into fire trucks, highway barriers, and the like. The appropriate question, then, is whether they should drive themselves. And I’m leaning toward “no”—but, actually, I still don’t know whether they’re better or worse than human drivers on average. So my “no” is based more on the company’s bad behavior than any assessment of their technology.

Anonymous Coward says:

Re:

…I still don’t know whether [Teslas are] better or worse than human drivers on average.

In the UK, Australia, and New Zealand collectively, there seem to be far fewer crashes than in the US alone. Could it be because they drive on the opposite side of the road and the majority of drivers around the world are right-eyed, meaning car drivers in the first three countries have the advantage of seeing the bus or truck or whatever nice and early?

Anonymous Coward says:

Re: Re:

Could it be because they drive on the opposite side of the road and the majority of drivers around the world are right-eyed

I guess it could be, but I’d be quite surprised. The effect on close-up vision is small enough that most people don’t ever know which of their eyes is dominant. It’d make less difference for far-away objects, and it’s not as if people generally need a lot of warning to avoid hitting a bus.

Internationally, teaching and testing for U.S. drivers are often considered lax (although Americans would know it varies by state). Many people get licenses without ever having skidded, steered with power-steering incapacitated, or even done an emergency stop. I don’t know about the specific countries you mentioned, but I’ll guess they have stricter standards—although I don’t know how much this affects actual collision rates.

Anonymous Coward says:

Re: Re: Re:2

It’s definitely an interesting point, but the evidence I’ve found mostly points to it being significant for “one-eyed” things such as archery and shooting. There are people saying it makes some difference to driving, but I’ve seen nothing to suggest it accounts for most of the difference between countries. I’d expect motor vehicle departments and insurers to be testing for that, were it so important.

Anyway, only about 70% of people are right-eye-dominant, so it’d be pretty easy to compare accident statistics across these two groups, without crossing country borders.

This comment has been flagged by the community.

Koby (profile) says:

Pay The Price

Claiming that these vehicles can drive themselves is a fraud. The only time they seem to work well is when they’re driven on straight roads in Nevada, with no traffic, no pedestrians, and ideal weather conditions. The engineers developing these systems realize that they don’t make correct decisions often enough when driven elsewhere, and are hopeful that we will allow their vehicles onto the road for an alpha test, even if it endangers everyone else.

Jeff Green (profile) says:

It could have been a queue waiting for a school bus that the Tesla hit. Teslas are big, heavy cars moving fast; they are perfectly capable of killing a dozen or more people. An aircraft crash where everyone walks away still gets investigated.
The car should definitely automatically upload all the data in the system in the event of a crash, to a publicly owned server which can be accessed by all interested parties.

Anonymous Coward says:

Re:

The car should definitely automatically upload all the data in the system in the event of a crash […]

Not a good idea. What if the car can’t upload the data, because the equipment has been destroyed or because there’s no data connection? What if the uploaded data is tampered with? (as seems highly likely given what we’ve seen here) What if the uploaded data is deleted? What if…and so on.

The way to handle this is the same way that airliner data is handled: an onboard, highly tamper-resistant black box that must not be removed by anyone except (non-Tesla) investigators. Tesla will resist this not only because it’ll impede their ability to modify/fabricate/destroy evidence, but because it’ll increase the cost of vehicles.

Anonymous Coward says:

Re: Re:

Not a good idea. What if the car can’t upload the data, because the equipment has been destroyed or because there’s no data connection?

Well, then it’ll have to be copied from the car. So what? It doesn’t affect the decision of what the car should do, only what it can do.

What if the uploaded data is tampered with? (as seems highly likely given what we’ve seen here) What if the uploaded data is deleted? What if…and so on.

Again, so what? What if the car’s copy is tampered with or deleted? How does an extra data channel make things worse?

Jeff’s idea of publishing is a bad one for driver privacy, especially considering how many microphones and cameras (including internal cameras) are present. But in terms of airliners, secure uploading is a good idea (that would not replace the black box and CVR). Already, published ADS-B data has helped in many investigations.

an onboard, highly tamper-resistant black box that must not be removed by anyone except (non-Tesla) investigators

That’s probably overkill. It’s likely that the current non-hardened systems will survive almost all road crashes, and they’re not gonna spend weeks in the ocean before being found.

Tanner Andrews (profile) says:

Re: Re: partly there

The way to handle this is the same way that airliner data is handled: an onboard, highly tamper-resistant black box

Evidently this is already mostly in place. The problem is that the Tesla “black box” sends the data to the Tesla server — then deletes it. A legitimate “black box” would not delete the data. This does not seem like a difficult change to what is already there.

This comment has been flagged by the community.

Anonymous Coward says:

Regular people can drive themselves, too.

And they crash quite a lot, also. Self-driving cars are already a lot safer than the average driver (they wouldn’t be allowed out of the prototype stage otherwise, precisely because any accident will be overhyped) and they soon will be so, so many times safer.

There will still be accidents. It doesn’t matter how safe they become; there will be accidents.

Companies will of course try to minimize bad PR. I don’t really believe your characterizations of a “cover-up”, particularly in any sense that would entail illegal withholding of evidence. You don’t provide any direct citations and the one article you link to doesn’t either. They seem primarily to be quoting dramatic statements made by opposing counsel, which y’know, is opposing.

This isn’t just about one lawsuit. It’s about how Tesla’s behavior threatens to undermine public trust in autonomous vehicle technology at precisely the moment when that trust is most crucial.

No, it’s about Elon Musk, and how he took away your preferred method of censorship (gov directed no less!) and ideological viewpoint discrimination. How dare someone prove you wrong! You will smear him and his companies at any point. I’m surprised the best smear for SpaceX you’ve found so far is “they put satellites in orbit, which sometimes blocks the view!”

Stephen T. Stone (profile) says:

Re:

They seem primarily to be quoting dramatic statements made by opposing counsel, which y’know, is opposing.

For what reason would the “opposing counsel” in this case lie about Tesla not turning over information that Tesla knew it possessed but refused to turn over⁠—especially when lying under oath in court could get “opposing counsel” sanctioned or even disbarred?

it’s about Elon Musk, and how he took away your preferred method of censorship … and ideological viewpoint discrimination

Are you sure you want to willingly associate yourself with a eugenics-obsessed bigot who’s worth more money than 99.9% of the world combined and the Nazi bar he owns and operates?

Ellie (profile) says:

Re: Elizabeth Holmes lies

Who says she killed 0 people with her lies?!

The State of Arizona approved her Theranos blood test facilities inside every location of a big pharmacy chain here. They were wildly inaccurate. I know because my doctor sent me there, and the results showed my cholesterol up 120 points so she prescribed medicine for it. I didn’t want to take it, asked her to re-order at a normal lab that didn’t do finger pricks like Theranos. Results came back: cholesterol and LDL were normal, same as they usually were.

I wouldn’t have died from taking the unnecessary meds. If I were old and fragile and got meds for wrong blood tests, it could have been very bad.

Anonymous Coward says:

Human drivers cause roughly 94% of serious traffic crashes, according to a decade-old study by the National Highway Traffic Safety Administration.

In other news, water is wet.

Obviously human drivers are the cause of problems largely caused by human drivers. I mean “serious traffic crashes” literally[0] means the car’s movement was the problem. Additionally, if things like “animals on the road” were the primary cause of traffic accidents… we would have done something(s) to keep animals off the roads. So unless we have tons of monkeys and kittens driving vehicles, it’s the human drivers at fault by definition.

[0] In this case, this word is also used literally, and not, as is often the case now, figuratively.

Anonymous Coward says:

If people lose trust because companies like Tesla made promises they couldn’t keep

Sorry Mike, but anyone paying attention has long since given up on that.

  • Full Self Driving available by 2018 (nope)
  • 1M robotaxis by 2020 (still 0)
  • The “2020” Tesla Roadster (doesn’t exist and probably never will despite taking preorders)
  • Tesla Semi by 2019 (they won’t sell them for some reason…lol)
  • Always free supercharging
  • Battery swaps (not from Tesla!)
