“Imagine a girl waking up and finding an explicit photo or video that looks like her but it isn’t her,” Dorota Mani told The Post.
That’s what happened to Mani’s 14-year-old daughter, Francesca, who discovered in October that classmates at Westfield High School, in Union County, New Jersey, had been circulating a photo that appeared to depict her in the nude.
But the photo wasn’t real. It was generated by boys in her class using artificial intelligence “deepfake” technology.
All the perpetrators needed was an innocent photo of Francesca taken from her social media and one of dozens of apps or websites to turn it into a fake nude.
This week the ease with which vile pornographic "deepfakes" can be created exploded into public view as fake nude images of Taylor Swift flooded social media.
The twisted depiction of the world's biggest pop star was so ubiquitous that X, formerly Twitter, had to block searches of her name to stop innocent users from seeing the images.
And a new analysis for the Associated Press revealed that more deepfake content was posted online this year than in all prior years combined.
The issue was mentioned only once on Wednesday on Capitol Hill, when senators quizzed tech CEOs on children's safety for four hours, including Mark Zuckerberg, who runs Facebook and Instagram; Evan Spiegel, who leads Snapchat; and X's Linda Yaccarino.
But Francesca was, her mother said, one of dozens of victims of deepfake pornography at her high school alone, suggesting the problem could be affecting millions of teenage American girls.
And there is no federal law that prevents the images from being spread, because they are not currently categorized as child abuse images.
“It was a few bored, spoiled teenagers that decided to have some fun at the expense of their classmates,” her mother said. “There should be accountability, and the girls should know there is accountability.”
The school district said the number affected was “far fewer” than 30 but declined to disclose the exact number.
Mani, 44, thought she was doing everything possible to protect her daughter online but was blindsided by deepfake technology being used against her.
"I am an educated, informed, middle-aged woman. I spoke to my children about Snapchat and about TikTok," she said. "But to be honest with you, I didn't know that with one click, you can create such a thing."
Mani, a mom of two, learned her daughter was a victim when the school called to tell her, and she was then contacted by local police asking if she and Francesca wanted to submit a formal complaint.
But eventually police told the Manis and other victims’ families that they could find no New Jersey or federal law which made the deepfakes illegal, while the school adopted new rules to explicitly make sharing the images an offense.
School superintendent Dr. Raymond González told The Post that Westfield, one of New Jersey's most affluent towns, was not alone in dealing with the problem.
“All school districts are grappling with the challenges and impact of artificial intelligence,” he said.
“The Westfield Public School District has safeguards in place… [we] continue to strengthen our efforts by educating our students and establishing clear guidelines to ensure that these new technologies are used responsibly.”
Now Mani and Francesca have taken their plea for action to Congress and other political leaders — including an expected meeting at the White House next week.
“We found out that there are no laws to protect us from it, no school AI policies, no state laws, no federal laws,” Mani told The Post. “We were just appalled with how the situation was dealt with.
“Everybody wants to see us broken and cry on the floor. This is not who we are. Girls should stand up for themselves and not accept just being a victim.”
Both are calling for criminal penalties for the distribution of deepfakes and legal protection for victims.
The mother and daughter have collaborated with state lawmakers in New Jersey to craft a bill that would make sharing deepfake porn a finable offense — and could attach prison time to it, too.
The state would join Texas, Virginia, and New York, which have all criminalized nonconsensual deepfake porn.
They are backing bipartisan legislation in the Senate, the proposed DEFIANCE Act, led by Democratic Sen. Amy Klobuchar and Republican Sen. Josh Hawley, to make nonconsensual deepfakes illegal, as well as a similar effort in the House by Democratic Rep. Joe Morelle, who represents Rochester, NY, called the Preventing Deepfakes of Intimate Images Act.
"Nobody, neither celebrities nor ordinary Americans, should ever have to find themselves featured in AI pornography," Hawley said. "Innocent people have a right to defend their reputations and hold perpetrators accountable in court."
At a press conference with Rep. Morelle in Washington, D.C., in January, Francesca said: “Just because I’m a teenager doesn’t mean my voice isn’t powerful. What happened to me and my classmates was not cool, and there’s no way I’m just going to shrug and let it slide.”