In June of last year, Jessica Guistolise received a text message that would change her life.
While the technology consultant was dining with colleagues on a work trip in Oregon, her phone alerted her to a text from an acquaintance named Jenny, who said she had urgent information to share about her estranged husband, Ben.
After an almost two-hour conversation with Jenny later that night, Guistolise recalled, she was dazed and in a state of panic. Jenny told her she’d found pictures on Ben’s computer of more than 80 women whose social media photos had been used to create deepfake pornography — videos and photos of sexual activities made using artificial intelligence to merge real photos with pornographic images. All the women in Ben’s images lived in the Minneapolis area.
Jenny used her phone to snap pictures of images on Ben’s computer, Guistolise said. The screenshots, some of which were viewed by CNBC, revealed that Ben used a site called DeepSwap to create the deepfakes. DeepSwap falls into a category of “nudify” sites that have proliferated since the emergence of generative AI less than three years ago.
CNBC decided not to use Jenny’s surname in order to protect her privacy and withheld Ben’s surname due to his assertion of mental health struggles. They are now divorced.
Guistolise said that after talking to Jenny, she was eager to cut her trip short and rush home.
In Minneapolis, the women’s experiences would soon spark a growing opposition to AI deepfake tools and those who use them.
One of the manipulated photos Guistolise saw upon her return was generated using a photo from a family vacation. Another was from her goddaughter’s college graduation. Both had been taken from her Facebook page.
“The first time I saw the actual images, I feel something inside me shifted, like fundamentally changed,” said Guistolise, 42.
CNBC interviewed more than two dozen people — including victims, their family members, attorneys, sexual-abuse experts, AI and cybersecurity researchers, trust and safety workers in the tech industry, and lawmakers — to learn how nudify websites and apps work and to understand their real-life impact on people.
“It is not something that I would wish on anybody,” Guistolise said.
Jessica Guistolise, Megan Hurley and Molly Kelley talk with CNBC in Minneapolis, Minnesota, on July 11, 2025, about fake pornographic images and videos depicting their faces made by their mutual friend Ben using AI site DeepSwap.
Jordan Wyatt | CNBC
Nudify apps represent a small but rapidly growing corner of the new AI universe, which exploded following the arrival of OpenAI’s ChatGPT in late 2022. Since then, Meta, Alphabet, Microsoft, Amazon and others have collectively spent hundreds of billions of dollars investing in AI and pursuing artificial general intelligence, or AGI — technology that could rival or even surpass the capabilities of humans.
For consumers, much of the excitement so far has been around chatbots and image generators that allow users to perform complex tasks with simple text prompts. There’s also the burgeoning market of AI companions, and a host of agents designed to enhance productivity.
But victims of nudify apps are experiencing the flip side of the AI boom. Due to generative AI, products such as DeepSwap are so easy to use — requiring no coding ability or technical expertise — that they can be accessed by nearly anyone. Guistolise and others said they worry that it’s only a matter of time before the technology spreads widely, leaving many more people to suffer the consequences.
Guistolise filed a police report about the case and obtained a restraining order against Ben. But she and her friends quickly realized there was a problem with that strategy.
Ben’s actions may have been legal.
The women involved weren’t underage. And as far as they were aware, the deepfakes hadn’t been distributed, existing only on Ben’s computer. While they feared that the videos and images were on a server somewhere and could end up in the hands of bad actors, there was nothing of that kind that they could pin on Ben.
One of the other women involved was Molly Kelley, a law student who would spend the next year helping the group navigate AI’s uncharted legal maze.
“He didn’t break any laws that we’re aware of,” Kelley said, referring to Ben’s behavior. “And that’s problematic.”
Ben admitted to creating the deepfakes, and told CNBC by email that he feels guilty and ashamed of his behavior.
Jenny described Ben’s actions as “horrific, inexcusable, and unforgivable,” in an emailed statement.
“From the moment I learned the truth, my loyalty has been with the women affected, and my focus remains on how best to support them as they navigate their new reality,” she wrote. “This is not a problem that will resolve itself. We need stronger laws to ensure accountability — not just for the individuals who misuse this technology, but also for the companies that enable its use on their platforms.”
Available
Like other new and easy-to-use AI tools, many apps that offer nudify services advertise on Facebook and are available to download from the Apple App Store and Google Play Store, experts say.
Haley McNamara, senior vice president at the National Center on Sexual Exploitation, said nudify apps and sites have made it “very easy to create realistic sexually explicit, deepfake imagery of a person based off of one photo in less time than it takes to brew a cup of coffee.”
Two photos of Molly Kelley’s face and one of Megan Hurley’s appear on a screenshot taken from a computer belonging to their mutual friend Ben, who used the women’s Facebook photos without their consent to make fake pornographic images and videos using the AI site DeepSwap, July 11, 2025.
A spokesperson from Meta, Facebook’s owner, said in a statement that the company has strict rules barring ads that contain nudity and sexual activity, and that it shares information it learns about nudify services with other companies through an industrywide child-safety initiative. Meta characterized the nudify ecosystem as an adversarial space and said it’s improving its technology to try to prevent bad actors from running ads.
Apple told CNBC that it regularly removes and rejects apps that violate its App Store guidelines related to content deemed offensive, misleading, and overtly sexual or pornographic.
Google declined to comment.
The issue extends well beyond the U.S.
In June 2024, around the same time the women in Minnesota discovered what was happening, an Australian man was sentenced to nine years in prison for creating deepfake content of 26 women. That same month, media reports detailed an investigation by Australian authorities into a school incident in which a teenager allegedly created and distributed deepfake content of nearly 50 female classmates.
“Whatever the worst potential of any technology is, it’s almost always exercised against women and girls first,” said Mary Anne Franks, a professor at the George Washington University Law School.
Security researchers from the University of Florida and Georgetown University wrote in a research paper presented in August that nudify tools are taking design cues from popular consumer apps and using familiar subscription models. DeepSwap charges users $19.99 a month to access “premium” benefits, which include credits that can be used for AI video generation, faster processing and higher-quality images.
The researchers said the “nudification platforms have gone fully mainstream” and are “advertised on Instagram and hosted in app stores.”
Guistolise said she knew that people could use AI to create nonconsensual porn, but she didn’t realize how easy and accessible the apps were until she saw an artificial version of herself participating in raunchy, explicit activity.
According to the screenshots of Ben’s DeepSwap page, the faces of Guistolise and the other Minnesota women sit neatly in rows of eight, like in a school yearbook. Clicking on the photos, Jenny’s pictures show, leads to a collection of computer-generated clones engaged in a variety of sexual acts. The women’s faces had been merged with the nude bodies of other women.
DeepSwap’s privacy policy states that users have seven days to look at the content from the time they upload it to the site, and that the data is stored for that period on servers in Ireland. DeepSwap’s site says it deletes the data at that point, but users can download it in the interim onto their own computers.
The site also has a terms of service page, which says users shouldn’t upload any content that “contains any private or personal information of a third party without such third party’s consent.” Based on the experiences of the Minnesota women, who provided no consent, it’s unclear whether DeepSwap has any enforcement mechanism.
DeepSwap provides little publicly by way of contact information and didn’t reply to multiple CNBC requests for comment.
CNBC reporting found the AI site DeepSwap, shown here, was used by a Minneapolis man to create fake pornographic images and videos depicting the faces of more than 80 of his friends and acquaintances.
In a press release published in July, DeepSwap used a Hong Kong dateline and included a quote attributed to a person the release identified as CEO and co-founder Penyne Wu. The media contact on the release was listed as marketing manager Shawn Banks.
CNBC was unable to find information online about Wu, and sent multiple emails to the address provided for Banks, but received no response.
DeepSwap’s website currently lists “MINDSPARK AI LIMITED” as its company name, provides an address in Dublin, and states that its terms of service are “governed by and construed in accordance with the laws of Ireland.”
However, in July, the same DeepSwap page had no mention of Mindspark, and references to Ireland instead said Hong Kong.
Psychological trauma
Kelley, 42, found out about her inclusion in Ben’s AI portfolio after receiving a text message from Jenny. She invited Jenny over that afternoon.
After learning what happened, Kelley, who was six months pregnant at the time, said it took her hours to muster the strength to view the photos captured on Jenny’s phone. Kelley said what she saw was her face “very realistically on someone else’s body, in images and videos.”
Kelley said her stress level spiked to such a degree that it soon began to affect her health. Her doctor warned her that too much cortisol, brought on by stress, would cause her body not “to make any insulin,” Kelley recalled.
“I was not enjoying life at all like this,” said Kelley, who, like Guistolise, filed a police report on the matter.
Kelley said that in Jenny’s photos she recognized some of her good friends, including many she knew from the service industry in Minneapolis. She said she then notified those women and purchased facial-recognition software to help identify the other victims so that they could be informed. About half a dozen victims have yet to be identified, she said.
“It was incredibly time consuming and really stressful because I was trying to work,” she said.
Victims of nudify tools can experience significant trauma, leading to suicidal thoughts, self-harm and a fear of trust, said Ari Ezra Waldman, a law professor at the University of California, Irvine, who testified at a 2024 House committee hearing on the harms of deepfakes.
Waldman said even if nudified images haven’t been posted publicly, subjects can fear that the images may eventually be shared, and “now someone has this dangling over their head like a sword of Damocles.”
“Everyone is subject to being objectified or pornographied by everyone else,” he said.
Three victims showed CNBC explicit, AI-created deepfake images depicting their faces as well as those of other women, during an interview in Minneapolis, Minnesota, on July 11, 2025.
Megan Hurley, 42, said she was trying to enjoy a cruise last summer off the western coast of Canada when she received an urgent text message from Kelley. Her vacation was ruined.
Hurley described immediate feelings of deep paranoia after returning home to Minneapolis. She said she had awkward conversations with an ex-boyfriend and other male friends, asking them to take screenshots if they ever saw AI-generated porn online that looked like her.
“I don’t know what your porn consumption is like, but if you ever see me, could you please screencap and let me know where it is?” Hurley said, describing the sorts of messages she sent at the time. “Because we might be able to prove dissemination at that point.”
Hurley said she contacted the FBI but never heard back. She also filled out an online FBI crime report, which she shared with CNBC. The FBI confirmed that it received CNBC’s request for comment, but didn’t provide a response.
The group of women began seeking help from lawmakers. They were led to Minnesota state Sen. Erin Maye Quade, a Democrat who had previously sponsored a bill that became a state statute criminalizing the “nonconsensual dissemination of a deep fake depicting intimate parts or sexual acts.”
Kelley landed a video call with the senator in early August 2024.
In the virtual meeting, several women from the group told their stories and explained their frustrations about the limited legal recourse available. Maye Quade went to work on a new bill, which she announced in February, that would compel AI companies to shut down apps using their technology to create nudify services.
The bill, which is still being considered, would fine tech companies that offer nudify services $500,000 for each nonconsensual, explicit deepfake that they generate in the state of Minnesota.
Maye Quade told CNBC in an interview that the bill is the modern equivalent of longstanding laws that make it illegal for a person to peep into someone else’s window and snap explicit photos without consent.
“We just haven’t grappled with the emergence of AI technology in the same way,” Maye Quade said.
Minnesota state Sen. Erin Maye Quade, at left, talks to CNBC’s Jonathan Vanian and Katie Tarasov in Minneapolis on July 11, 2025, about her efforts to pass state legislation that would fine tech companies that offer nudify services $500,000 for each nonconsensual, explicit deepfake image they generate in her state.
Jordan Wyatt | CNBC
But Maye Quade acknowledged that enforcing the law against companies based overseas presents a significant challenge.
“This is why I think a federal response is more appropriate,” she said. “Because actually having a federal government, a country could take far more actions with companies that are based in other countries.”
Kelley, who gave birth to her son in September 2024, characterized one of her late October meetings with Maye Quade and the group as a “blur,” because she said she was “mentally and physically unwell due to sleep deprivation and stress.”
She said she now avoids social media.
“I never announced the birth of my second child,” Kelley said. “There’s a lot of people out there who don’t know that I had a baby. I just didn’t want to put it online.”
The early days of deepfake pornography
The rise of deepfakes can be traced back to 2018. That’s when videos showing former President Barack Obama giving speeches that never existed and actor Jim Carrey, instead of Jack Nicholson, appearing in “The Shining” began going viral.
Lawmakers sounded the alarm. Sites such as Pornhub and Reddit responded by pledging to take down nonconsensual content from their platforms. Reddit said at the time that it removed a large deepfake-related subreddit as part of an enforcement of a policy banning “involuntary pornography.”
The community congregated elsewhere. One popular place was MrDeepFakes, which hosted explicit AI-generated videos and served as an online discussion forum.
By 2023, MrDeepFakes had become the top deepfake site on the web, hosting 43,000 sexualized videos containing nearly 4,000 individuals, according to a 2025 study of the site by researchers from Stanford University and the University of California San Diego.
MrDeepFakes claimed to host only “celebrity” deepfakes, but the researchers found “that hundreds of targeted individuals have little to no online or public presence.” The researchers also discovered a burgeoning economy, with some users agreeing to create custom deepfakes for others at an average cost of $87.50 per video, the paper said.
Some ads for nudify services have gone to more mainstream locations. Alexios Mantzarlis, an AI security expert at Cornell Tech, earlier this year discovered more than 8,000 ads in the Meta ad library across Facebook and Instagram for a nudify service called CrushAI.
AI apps and sites like Undress, DeepNude and CrushAI are among the “nudify” tools that can be used to create fake pornographic images and videos depicting real people’s faces pulled from innocuous online photos.
Emily Park | CNBC
At least one DeepSwap ad ran on Instagram in October, according to the social media company’s ad library. The account associated with running the ad doesn’t appear to be officially tied to DeepSwap, but Mantzarlis said he suspects the account may have been an affiliate partner of the nudify service.
Meta said it reviewed ads related to the Instagram account in question and didn’t find any violations.
Top nudify services are sometimes found on third-party affiliate sites such as ThePornDude that earn money by mentioning them, Mantzarlis said.
In July, Mantzarlis co-authored a report analyzing 85 nudify services. The report found that the services receive 18.6 million monthly unique visitors in aggregate, though Mantzarlis said that figure doesn’t take into account people who share the content in places such as Discord and Telegram.
As a business, nudify services are a small part of the generative AI market. Mantzarlis estimates annual revenue of about $36 million, but he said that’s a conservative estimate and includes only AI-generated content from sites that specifically promote nudify services.
MrDeepFakes abruptly shut down in May, shortly after its key operator was publicly identified in a joint investigative report from Canada’s CBC News, Danish news sites Politiken and Tjekdet, and online investigative outlet Bellingcat.
CNBC reached out by email to the address that was associated with the person named as the operator in some materials from the CBC report, but received no reply.
With MrDeepFakes going dark, Discord has emerged as an increasingly popular meeting spot, experts said. Known mostly for its use in the online gaming community, Discord has roughly 200 million global monthly active users who access its servers to discuss shared interests.
CNBC identified several public Discord servers, including one related to DeepSwap, where users appeared to be asking others in the forum to create sexualized deepfakes based on photos they shared.
Leigh Cassidy Gibson, a researcher at the University of Florida, co-authored the 2025 paper that looked at “20 popular and easy-to-find nudification websites.” She confirmed to CNBC that while DeepSwap wasn’t named, it was one of the sites she and her colleagues studied to understand the market. More recently, she said, they’ve turned their attention to numerous Discord servers where users seek tutorials and how-to guides on creating AI-generated, sexual content.
Discord declined to comment.
‘It’s insane to me that this is legal right now’
At the federal level, the government has at least taken note.
The Take It Down Act, a federal law enacted this year, criminalizes the online publication of nonconsensual, explicit deepfakes. “A person who violates one of the publication offenses pertaining to depictions of adults is subject to criminal fines, imprisonment of up to two years, or both,” according to the law’s text.
Experts told CNBC that the law still doesn’t address the central issue facing the Minnesota women, because there is no evidence that the material was distributed online.
Maye Quade’s bill in Minnesota emphasizes that the creation of the material is the core problem and requires a legal response.
Some experts are concerned that the Trump administration’s plans to bolster the AI sector will undercut states’ efforts. In late July, Trump signed executive orders as part of the White House’s AI Action Plan, underscoring AI development as a “national security imperative.”
As part of Trump’s proposed spending bill earlier this year, states would have been deterred from regulating AI for a 10-year period or risk losing certain government subsidies related to AI infrastructure. The Senate struck down that provision in July, keeping it out of the final bill Trump signed.
“I would not put it past them to try to resurrect the moratorium,” said Waldman, of UC Irvine, regarding the tech industry’s continued influence on AI policy.
A White House official told CNBC that the Take It Down Act, which was supported by the Trump administration and signed months prior to the AI Action Plan, criminalizes nonconsensual deepfakes. The official said the AI Action Plan encourages states to allow federal laws to override individual state laws.
In San Francisco, home to OpenAI and other highly valued AI startups, the city can pursue civil cases against nudify services due to California consumer protection laws. Last year, San Francisco sued 16 companies related to nudify apps.
The San Francisco City Attorney’s office said in June that an investigation related to the lawsuits had led to 10 of the most-visited nudify websites being taken offline or no longer being accessible in California. One of the companies that was sued, Briver LLC, settled with the city and has agreed to pay $100,000 in civil penalties. Additionally, Briver no longer operates websites that can create nonconsensual deepfake pornography, the city attorney’s office said.
Further south, in Silicon Valley, Meta in June sued Hong Kong-based Joy Timeline HK, the company behind CrushAI. Meta said that Joy Timeline attempted to “circumvent Meta’s ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules.”
Still, Mantzarlis, who has been publishing his research on Indicator, said he continues to find nudify-related ads on Meta’s platforms.
Mantzarlis and a colleague from the American Sunlight Project discovered 4,215 ads for 15 AI nudifier services that ran on Facebook and Instagram since June 11, they wrote in a joint report on Sept. 10. Mantzarlis said Meta eventually removed the ads, some of which were more subtle than others in implying nudifying capabilities.
Meta told CNBC earlier this month that it removed thousands of ads linked to companies offering nudify services and sent the entities cease-and-desist letters for violating the company’s ad guidelines.
In Minnesota, the group of friends are trying to get on with their lives while continuing to advocate for change.
Guistolise said she wants people to realize that AI is potentially being used to harm them in ways they never imagined.
“It’s so important that people know that this really is out there and it’s really accessible and it’s very easy to do, and it really needs to stop,” Guistolise said. “So here we are.”
Survivors of sexual violence can seek confidential support from the National Sexual Assault Hotline at 1-800-656-4673.






