Video has long been considered one of the most reliable and effective forms of evidence in the courtroom. After all, there’s nothing more impactful to a jury than being able to see the car accident, brutal killing, or robbery with their own eyes. But what if the court can no longer trust what they see with their own eyes? What happens if video evidence can be radically altered to misrepresent and twist the truth rather than prove it?
At a tech conference in Los Angeles in 2016, Elon Musk told his interviewers, on camera: “A Model S and Model X at this point can drive autonomously with greater safety than a person. Right now.”
The video stayed up on YouTube for almost seven years until it was recently presented as evidence in a lawsuit brought by the estate of a man who died when his Tesla crashed while its self-driving feature was engaged. The family’s attorneys used Musk’s own words as evidence against him.
Tesla’s lawyers argued that the video of Musk at the 2016 conference may have been doctored, as Musk, like many public figures, may be the subject of “deep fake” videos that purport to show him saying and doing things he never actually said or did. Deep fake videos have become an increasing concern as technology makes it easier for anyone to create images and videos of people who don’t exist or events that never happened. The intelligence community and political scientists worry that such fake videos could be used to spread disinformation, impersonate politicians, scam people, and manipulate elections or entire populations.
How Courts Separate Fake Evidence from Real Evidence
Although deep fake videos are a valid concern in politics and national security, the judicial system has a certain degree of built-in protection through the rules of evidence and procedures that have been developed over centuries. In order for a jury to be fooled by a deep fake during a trial, the video would have to be deemed admissible evidence by a court over the objections of an opposing attorney who has every incentive to scrutinize each piece of evidence offered at trial.
The following procedures, in totality, would make it difficult for a fake video to make it all the way to a jury:
For evidence to be admissible, the party producing the video must establish a chain of custody. The party must describe every person who has had access to the video and when. The chain of custody would reveal whether anyone with a motive to tamper with the evidence had the opportunity to do so.
For evidence to be admissible, the party producing it must also establish a foundation. The person who made the video must state under oath that they made it, whether they altered it in any manner, and the context in which it was recorded.
Once the foundation for the video has been established, the burden of proof shifts to the opposing party to provide evidence that the video was doctored. The opposing party can:
Hire experts to explain the technical reasons that show how the video was falsified.
Call witnesses who saw events play out differently than what the video shows.
Produce contrary videos of the same events.
These procedures are not foolproof, but they will expose all but the most convincing fakes. Additionally, courts take false evidence seriously. Anyone who knowingly presents doctored videos to defraud the court may be subject to monetary fines and other penalties.
Could Deepfake Videos Erode Public Trust in Evidence?
A potential issue with deep fake videos is not whether the courts will be flooded with fake evidence, but whether accusations of fake evidence will drive up legal fees. Attorneys could spend considerable time and effort combating such claims as they become increasingly prevalent.
The broader societal impact, however, is the erosion of the line between reality and fiction in the American consciousness. Covid-19 and the 2020 elections have unleashed a torrent of conspiracy theories across the country. We may reach an inflection point where people simply discount any video they see as a “deep fake.” This, of course, may be the goal of men like Elon Musk or the defendants from the Capitol Insurrection. As the court noted, figures like Musk want to say whatever they want without consequence, and deep fake claims are the perfect shield against their own words.
However, Tesla’s attorneys overplayed their hand. They argued that because the video of the Los Angeles conference could be fake, Musk was not obligated to give testimony. This was the wrong way to go about it, as anyone who believes there is fake evidence against them would likely want the opportunity to dispute it. Instead, Tesla used the excuse of potential fake videos to try to shield their client. The court ordered Musk to testify anyway.
Still, the possibility remains that someone will eventually try to present false footage as evidence. Likewise, more defendants may argue that the video against them was somehow doctored. Although society as a whole may not be ready for this technology, for once the judiciary’s archaic procedures have put it ahead of the curve when it comes to deep fake videos.
Do I Need a Lawyer If I Think a Video Has Been Faked?
The rules of evidence are technical and complex, and video evidence is no exception. Evidence is one of the most important aspects of any trial. If you need help with evidence issues, it is in your best interest to hire a skilled criminal defense or civil trial lawyer who has mastery of the rules of evidence.
The post Claiming Deepfake Videos Is Not Enough To Throw Out Video Evidence in Court appeared first on Law Blog.