
AI deepfakes could increase divorce costs, threatening evidence in court


Americans trying to finalize a divorce and win custody of their children could rack up unforeseen court costs disputing artificial intelligence (AI)-generated deepfake videos, photos and documents, according to a leading family law attorney.

Michelle O’Neill, co-founder of a law firm headquartered in Dallas, told FOX Business that courts are noticing a “real uptick” in fake evidence, often created with AI. The problem, she said, is becoming more frequent, and judges are being trained at schools and conferences to stay alert.

One type of fake evidence is AI-generated revenge porn, including fabricated images and videos of individuals engaged in intimate acts. O’Neill notes that although deepfakes have primarily made news when they target celebrities, the issue also affects ordinary citizens going through divorce or custody litigation in family court.


The use of artificial intelligence to generate fake images and videos could raise costs for clients going through a divorce. (Kirill Kudryavtsev/AFP via Getty Images)

O’Neill’s claim that these types of AI-generated content are “exploding onto the scene” is backed by statistics showing that the circulation of deepfake videos, not including photos, has grown roughly 900% annually since 2019.

“When a client brings me evidence, I have to question my clients more than I ever have about, where did you get it? How did you get it? You know, where did it come from?” O’Neill said.

The problem also overwhelmingly affects women. Research firm Sensity AI has consistently found that between 90% and 95% of all deepfakes online are nonconsensual porn, and roughly 90% of those depict women without their consent.

Despite the staggering numbers, O’Neill says social media platforms have been slow to act.

First lady Melania Trump spoke on Capitol Hill in early March for the first time since returning to the White House, participating in a roundtable with lawmakers and victims of revenge porn and AI-generated deepfakes.

Congress is currently zeroing in on punishing online abuse involving nonconsensual, explicit images.


A green wireframe model covers an actor’s lower face during the creation of a synthetic facial reanimation video, alternatively known as a deepfake, in London, Britain, Feb. 12, 2019. (Reuters TV/Reuters)

The TAKE IT DOWN Act is a bill introduced in the Senate by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn. It would make it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including AI-generated “digital forgeries.” The bill passed the Senate unanimously earlier in 2025, and Cruz said Monday he believed the House would take it up so it could become law.

While the government pushes for new laws, O’Neill says the use of AI to create fake and explicit content remains a “real threat” to the judicial system.

“The integrity of our very judicial system depends on the integrity of the evidence that can be admitted and presented. If you cannot rely on the integrity of the evidence presented to the judge, if the judge cannot even rely on the integrity of the evidence they receive, our judicial system may be completely undermined by the existence of artificial intelligence,” she said.

AI, O’Neill notes, also creates a financial burden for Americans who fall prey to fake court evidence. An individual who challenges the authenticity of admitted evidence may now have to pay a video forensics expert to examine and verify it.


Artificial intelligence deepfakes pose a serious risk to the judicial system, says family law attorney Michelle O’Neill. (Getty Images)

Fake evidence can even extend to videos purporting to show child abuse when two parties are fighting over custody. If a party lacks the financial resources to prove that the alleged evidence of abuse is AI-generated, judges must decide whether to take the alleged victim’s word or trust the recordings entered into court.

“What happens to people who don’t have the money [to disprove that]? So not only do we have a threat to the integrity of the judicial system, but we also have an access-to-justice problem,” O’Neill said.

The family law attorney noted that judges primarily see malicious use of AI in the creation of fake documents, such as falsified bank records or drug tests.

One judge also told O’Neill they had encountered a doctored audio recording that cast the other party in a negative light. The quality of the recording was not convincing enough; the judge reprimanded the individual, and the evidence was thrown out.


However, with this technology advancing rapidly, O’Neill worries that the line between what is real and what is AI-generated will continue to blur.

“I think this is an issue at many levels of our society. And, you know, drawing attention to it is something that is very important,” she said.


