Tesla is refusing to produce Musk to answer questions about his recorded safety claims because, its lawyers contend, what if he didn’t actually make them?
Attorneys in a lawsuit against Tesla want to ask Elon Musk about key claims he made in the media about the product, specifically his past statements about the safety and reliability of its self-driving features.
Tesla lawyers argued that, because Musk is a public figure susceptible to the slings and arrows of the internet, the statements at issue might have been digitally altered deepfakes.
But…
Seriously, were they? Musk’s side doesn’t actually know for sure. Doesn’t this then seem like something the lawyers could hash out at, say, a deposition? California’s Judge Evette D. Pennypacker certainly thinks so, finding the only thing deep about this request is how it was “deeply troubling to the court.”
“Their position is that because Mr. Musk is famous and might be more of a target for deep fakes, his public statements are immune,” the judge wrote. “In other words, Mr. Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.”
Well, that went about as well as a SpaceX launch, huh?
Judge Pennypacker’s order is tentative. The parties will convene today, giving Tesla a chance to change her mind. If they’re smart, they’ll claim that their whole motion was a deepfake and try to get some credibility back by running away from this argument.
This is not to discount the seriousness of the issue. Deepfake technology is already a rising fraud threat and poses significant challenges to intellectual property rights regimes. But until now, few have focused on the danger it poses to the civil discovery process. To wit, with deepfakes growing more and more sophisticated, more and more lawyers might… make gobsmackingly stupid arguments like this one.
Deepfakes will become serious business in discovery. Litigating over the reliability of audio and video evidence will soon become a battleground for attorneys and digital forensic experts in some high-profile matter. But at its absolute best, “this is a deepfake” is not an excuse to avoid a deposition altogether.
And Tesla didn’t even bring that argument to the table! They didn’t say “this is a deepfake,” they came with, “I dunno… maybe these were deepfakes?”
Yeah, someone could have deepfaked Musk saying that stuff. If that’s your stance, let him say under oath that he didn’t say what the video shows, and then you go prove to a jury that they should trust the boy idiot over their own eyes.
In fact, that’s what makes this argument such an extra level of stupid: if Tesla wants to claim those clips are fake, it increases the relevance of Musk’s testimony under oath! Because the gateway argument Tesla would need to win is that Musk does not believe he made the recorded statements, and the trier of fact should get to evaluate the witness asserting that in sworn testimony.
Did anyone over there even game out what making this argument might mean?
Elon Musk Likely Must Give Deposition in Fatal Tesla Autopilot Crash Suit [Bloomberg]
Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you’re interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.