child abuse

“CSAM generated by AI is still CSAM,” DOJ says after rare arrest

The US Department of Justice has started cracking down on the use of AI image generators to produce child sexual abuse materials (CSAM).

On Monday, the DOJ arrested Steven Anderegg, a 42-year-old “extremely technologically savvy” Wisconsin man who allegedly used Stable Diffusion to create “thousands of realistic images of prepubescent minors,” which were then distributed on Instagram and Telegram.

The cops were tipped off to Anderegg’s alleged activities after Instagram flagged direct messages sent from Anderegg’s account to a 15-year-old boy. Instagram reported the messages to the National Center for Missing and Exploited Children (NCMEC), which subsequently alerted law enforcement.

According to the DOJ, Anderegg sent sexually explicit AI images of minors during the Instagram exchange, soon after the teen made his age known. Prosecutors allege that “the only reasonable explanation for sending these images was to sexually entice the child.”

According to the DOJ’s indictment, Anderegg is a software engineer with “professional experience working with AI.” Because of his “special skill” in generative AI (GenAI), he was allegedly able to generate the CSAM using a version of Stable Diffusion, “along with a graphical user interface and special add-ons created by other Stable Diffusion users that specialized in producing genitalia.”

After Instagram reported the messages Anderegg sent to the minor, cops seized Anderegg’s laptop and found “over 13,000 GenAI images, with hundreds—if not thousands—of these images depicting nude or semi-clothed prepubescent minors lasciviously displaying or touching their genitals” or “engaging in sexual intercourse with men.”

In his messages to the teen, Anderegg seemingly “boasted” about his skill in generating CSAM, the indictment said. The DOJ alleged that evidence from his laptop showed that Anderegg “used extremely specific and explicit prompts to create these images,” including “specific ‘negative’ prompts—that is, prompts that direct the GenAI model on what not to include in generated content—to avoid creating images that depict adults.” These go-to prompts were stored on his computer, the DOJ alleged.

Anderegg is currently in federal custody and has been charged with production, distribution, and possession of AI-generated CSAM, as well as “transferring obscene material to a minor under the age of 16,” the indictment said.

Because the DOJ suspected that Anderegg intended to use the AI-generated CSAM to groom a minor, the DOJ is arguing that there are “no conditions of release” that could prevent him from posing a “significant danger” to his community while the court mulls his case. The DOJ warned the court that any future contact with minors would likely go unnoticed, as Anderegg is seemingly tech-savvy enough to hide any attempts to send minors AI-generated CSAM.

“He studied computer science and has decades of experience in software engineering,” the indictment said. “While computer monitoring may address the danger posed by less sophisticated offenders, the defendant’s background provides ample reason to conclude that he could sidestep such restrictions if he decided to. And if he did, any reoffending conduct would likely go undetected.”

If convicted on all four counts, he could face “a total statutory maximum penalty of 70 years in prison and a mandatory minimum of five years in prison,” the DOJ said. Partly because of his “special skill in GenAI,” the DOJ, which described its evidence against Anderegg as “strong,” suggested that it may recommend a sentencing range “as high as life imprisonment.”

Announcing Anderegg’s arrest, Deputy Attorney General Lisa Monaco made it clear that creating AI-generated CSAM is illegal in the US.

“Technology may change, but our commitment to protecting children will not,” Monaco said. “The Justice Department will aggressively pursue those who produce and distribute child sexual abuse material—or CSAM—no matter how that material was created. Put simply, CSAM generated by AI is still CSAM, and we will hold accountable those who exploit AI to create obscene, abusive, and increasingly photorealistic images of children.”

Child abusers are covering their tracks with better use of crypto

For those who trade in child sexual exploitation images and videos in the darkest recesses of the Internet, cryptocurrency has been both a powerful tool and a treacherous one. Bitcoin, for instance, has allowed denizens of that criminal underground to buy and sell their wares with no involvement from a bank or payment processor that might reveal their activities to law enforcement. But the public and surprisingly traceable transactions recorded in Bitcoin’s blockchain have sometimes led financial investigators directly to pedophiles’ doorsteps.
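
To make that traceability concrete: every transaction touching a Bitcoin address is recorded on a public ledger that anyone, including financial investigators, can query. Below is a minimal sketch in Python of that kind of lookup, assuming Blockstream’s public Esplora API and its usual response fields (txid, vout, scriptpubkey_address, status.block_time); the address is a hypothetical placeholder, not one from any real case.

```python
# Minimal sketch: listing the public transaction history of a Bitcoin
# address via Blockstream's Esplora API (assumed endpoint layout).
import requests

ESPLORA = "https://blockstream.info/api"
ADDRESS = "bc1q-example-placeholder"  # hypothetical address, not from any case

# Fetch the most recent transactions involving this address.
resp = requests.get(f"{ESPLORA}/address/{ADDRESS}/txs", timeout=10)
resp.raise_for_status()

for tx in resp.json():
    # Sum the outputs paid to this address (values are in satoshis).
    received = sum(
        out["value"]
        for out in tx["vout"]
        if out.get("scriptpubkey_address") == ADDRESS
    )
    block_time = tx["status"].get("block_time", "unconfirmed")
    print(tx["txid"], block_time, received, "sats received")
```

This openness is what lets blockchain-analysis firms like Chainalysis cluster addresses and follow funds, and it is exactly what the mixers and privacy coins discussed below are designed to defeat.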

Now, after years of evolution in that grim cat-and-mouse game, new evidence suggests that online vendors of what was once commonly called “child porn” are learning to use cryptocurrency with significantly more skill and stealth—and that it’s helping them survive longer in the Internet’s most abusive industry.

Today, as part of an annual crime report, cryptocurrency tracing firm Chainalysis revealed new research that analyzed blockchains to measure the changing scale and sophistication of the cryptocurrency-based sale of child sexual abuse materials, or CSAM, over the past four years. Total revenue from CSAM sold for cryptocurrency has actually gone down since 2021, Chainalysis found, along with the number of new CSAM sellers accepting crypto. But the sophistication of crypto-based CSAM sales has been increasing. More and more, Chainalysis discovered, sellers of CSAM are using privacy tools like “mixers” and “privacy coins” that obfuscate their money trails across blockchains.

Perhaps because of that increased savvy, the company found that CSAM vendors active in 2023 persisted online—and evaded law enforcement—for a longer time than in any previous year, and about 57 percent longer than even in 2022. “Growing sophistication makes identification harder. It makes tracing harder, it makes prosecution harder, and it makes rescuing victims harder,” says Eric Jardine, the researcher who led the Chainalysis study. “So that sophistication dimension is probably the worst one you could see increasing over time.”

Better stealth, longer criminal lifespans

Scouring blockchains, Chainalysis researchers analyzed around 400 cryptocurrency wallets of CSAM sellers and more than 10,000 buyers who sent funds to them over the past four years. Their most disturbing finding in that broad economic study was that crypto-based CSAM sellers seem to have a longer lifespan online than ever, suggesting a kind of relative impunity. On average, CSAM vendors who were active in 2023 remained online for 884 days, compared with 560 days for those active in 2022 and just 112 days in 2020.

To explain that new longevity for some of the most harmful actors on the Internet, Chainalysis points to how CSAM vendors are increasingly laundering their proceeds with cryptocurrency mixers—services that blend users’ funds to make tracing more difficult—such as ChipMixer and Sinbad. (US and German law enforcement shut down ChipMixer in March 2023, but Sinbad remains online despite facing US sanctions for money laundering.) In 2023, Chainalysis found that about 46 percent of CSAM vendors used mixers, up from around 22 percent in 2020.

Chainalysis also found that CSAM vendors are increasingly using “instant exchanger” services, which often collect little or no identifying information on traders and allow them to swap bitcoin for cryptocurrencies like Monero and Zcash—“privacy coins” designed to obfuscate or encrypt their blockchains, making it far more difficult to trace cash-outs of profits. Chainalysis’ Jardine says that Monero in particular seems to be gaining popularity among CSAM purveyors. In its investigations, Chainalysis has seen it used repeatedly by CSAM sellers laundering funds through instant exchangers, and in multiple cases it has also seen CSAM forums post Monero addresses to solicit donations. While the instant exchangers did offer other cryptocurrencies, including the privacy coin Zcash, Chainalysis’ report states that “we believe Monero to be the currency of choice for laundering via instant exchangers.”
