
5 takeaways from CNBC’s investigation into ‘nudify’ apps and sites

Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.

Jordan Wyatt | CNBC

In the summer of 2024, a group of women in the Minneapolis area learned that a male friend had used their Facebook photos, combined with artificial intelligence, to create sexualized images and videos.

Using an AI site called DeepSwap, the man secretly created deepfakes of the friends and of more than 80 other women in the Twin Cities region. The discovery caused lasting emotional trauma and led the group to seek the help of a sympathetic state senator.

As a CNBC investigation shows, the rise of “nudify” apps and sites has made it easier than ever for people to create nonconsensual, explicit deepfakes. Experts said these services are all over the internet, with many promoted via Facebook ads, available for download on the Apple and Google app stores, and easily found through simple web searches.

“That’s the reality of where the technology is right now, and that means that any person can really be victimized,” said Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation.

CNBC’s reporting shines a light on the legal quagmire surrounding AI, and how a group of friends became key figures in the fight against nonconsensual, AI-generated porn.

Here are five takeaways from the investigation.

The women lack legal recourse

The harm is real

Deepfakes are easier to create than ever

Nudify service DeepSwap is hard to find

The site used to create the content is called DeepSwap, and there is little information about it online.

In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote from Penyne Wu, identified in the release as CEO and co-founder. The media contact on the release was Shawn Banks, listed as marketing manager.

CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.

DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”

However, in July, the same DeepSwap page made no mention of Mindspark, and where it now references Ireland it instead said Hong Kong.

AI’s collateral damage

Minnesota state Sen. Maye Quade’s bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for every nonconsensual, explicit deepfake they generate in the state of Minnesota.

Some experts are concerned, however, that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts.

In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.” 

Kelley hopes that any federal AI push doesn’t jeopardize the efforts of the Minnesota women.

“I’m concerned that we will continue to be left behind and sacrificed at the altar of trying to have some geopolitical race for powerful AI,” Kelley said.

WATCH: The alarming rise of AI ‘nudify’ apps that create explicit images of real people.


2025-09-28 07:00:01
