Deepfake audio evidence was used in a UK child custody battle in an effort to discredit the father, as reported by The National News. Byron James, a partner at the law firm Expatriate Law who was involved in the case, said a heavily doctored recording of his client had been presented in court as evidence in a family dispute.
In the edited version of the audio, the child’s father was heard making direct and “violent” threats towards his wife. However, when digital forensics experts examined the recording, they found it had been manipulated to include words not used by the client. Mr James stated,
“This is a case where the mother has denied the father access to the children and said he was dangerous because of his violent and threatening behaviour. She produced an audio file that she said proved he had explicitly threatened her. We were able to see it had been edited after the original phone call took place and we were also able to establish which parts of it had been edited. The mother used software and online tutorials to put together a plausible audio file.”
Manipulated video or audio recordings, sometimes referred to as deepfakes, risk becoming an increasing problem for police, the courts and other law enforcement agencies. Mr James said it was the first time he had encountered doctored audio evidence in his career, but that all courts needed to be vigilant.
Mr James outlined what had happened to his client, who lives in the Emirates.
“We were lucky to get the original audio file and be able to study the metadata on the recording. She [the wife] said [the doctored recording] justified her stance and that he [the husband] should not be allowed to see the children. If we hadn’t been able to challenge this piece of evidence, then it would have negatively affected him and portrayed him as a violent and aggressive man. It raises all sorts of questions about what sort of evidence you can rely on. Is there sufficient judicial training to identify digital evidence that has been manipulated in this manner?”
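The metadata examination Mr James describes can be illustrated at a very basic level. The sketch below, in Python using only the standard library, creates a stand-in WAV file and then reads back the kind of surface-level properties (duration, sample rate, filesystem modification time) that a forensic review might start from. The file name and all values are hypothetical, and real forensic analysis goes far deeper, using waveform and spectral comparison rather than header fields alone:

```python
import os
import struct
import time
import wave

# Create a short mono WAV file to stand in for the recording under review.
# (In the real case, examiners worked from the original call recording.)
path = "sample_call.wav"  # hypothetical file name
with wave.open(path, "wb") as w:
    w.setnchannels(1)      # mono
    w.setsampwidth(2)      # 16-bit samples
    w.setframerate(8000)   # 8 kHz, typical for phone audio
    w.writeframes(struct.pack("<1000h", *([0] * 1000)))  # 1000 silent samples

# Surface-level checks: does the duration and format match the claimed call?
with wave.open(path, "rb") as w:
    frames = w.getnframes()
    rate = w.getframerate()
    duration = frames / rate
    print(f"duration: {duration:.3f} s, sample rate: {rate} Hz")

# Filesystem timestamps: a file modified long after the claimed call time
# is one simple red flag (easily forged, so never conclusive on its own).
st = os.stat(path)
print("last modified:", time.strftime("%Y-%m-%d %H:%M:%S",
                                      time.localtime(st.st_mtime)))
```

Checks like these only flag inconsistencies worth investigating; establishing which parts of a recording were edited, as the experts did here, requires specialist audio forensics.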
Mr James went on to suggest that it would never occur to most judges that deepfake audio evidence could be submitted, so recordings might be taken at face value, unfairly influencing the outcome of trials.