Take It Down Act


Trump to sign law forcing platforms to remove revenge porn in 48 hours

In a development likely wearisome for victims, the law won’t be widely enforced for about a year, while any revenge porn already online continues spreading. Perhaps most frustrating, once the law kicks in, victims will still need to police their own revenge porn online. And the 48-hour removal window leaves time for content to be downloaded and reposted, leaving victims vulnerable on any unmonitored platforms.

Some victims are already tired of fighting this fight. Last July, when Google started downranking deepfake porn apps to make AI-generated NCII less discoverable, one deepfake victim, Sabrina Javellana, told The New York Times that she spent months reporting harmful content on various platforms online. And that didn’t stop the fake images from spreading. Joe Morelle, a Democratic US representative who has talked to victims of deepfake porn and sponsored laws to help them, agreed that “these images live forever.”

“It just never ends,” Javellana said. “I just have to accept it.”

Andrea Powell—director of Alecto AI, an app founded by a revenge porn survivor that helps victims remove NCII online—warned on a 2024 panel that Ars attended that requiring victims to track down “their own imagery [and submit] multiple claims across different platforms [increases] their sense of isolation, shame, and fear.”

While the Take It Down Act seems flawed, passing a federal law imposing penalties for allowing deepfake porn posts could serve as a deterrent for bad actors or possibly spark a culture shift by making it clear that posting AI-generated NCII is harmful.

Victims have long suggested that consistency is key to keeping revenge porn offline, and the Take It Down Act certainly offers that, creating a moderately delayed delete button on every major platform.

Although the Take It Down Act will clearly make it easier than ever to report NCII, whether the law will effectively reduce the spread of NCII online remains unknown and will likely hinge on whether the 48-hour timeline can overcome critics' objections.



Trump’s hasty Take It Down Act has “gaping flaws” that threaten encryption


Legal challenges will likely immediately follow the law’s passage, experts said.

Everyone expects that the Take It Down Act—which requires platforms to remove both real and artificial intelligence-generated non-consensual intimate imagery (NCII) within 48 hours of victims’ reports—will likely pass a vote in the House of Representatives tonight.

After that, it goes to Donald Trump’s desk, where the president has confirmed that he will promptly sign it into law, having joined first lady Melania Trump in strongly campaigning for its swift passage. Victims-turned-advocates, many of them children, similarly pushed lawmakers to take urgent action to protect a growing number of victims from the increasing risk of being repeatedly targeted in fake sexualized images or revenge porn, which experts say can quickly spread widely online.

Digital privacy experts tried to raise some concerns, warning that the law seemed overly broad and could trigger widespread censorship online. Given such a short window to comply, platforms will likely remove some content that may not be NCII, the Electronic Frontier Foundation (EFF) warned. And even more troublingly, the law does not explicitly exempt encrypted messages, which could potentially encourage platforms to one day break encryption due to the liability threat. Also, it seemed likely that the removal process could be abused by people who hope platforms will automatically remove any reported content, especially after Trump admitted that he would use the law to censor his enemies.

None of that feedback mattered, the EFF’s assistant director of federal affairs, Maddie Daly, told Ars. Lawmakers accepted no amendments in their rush to get the bill to Trump’s desk. There was “immense pressure,” Daly said, “to quickly pass this bill without full consideration.” Because of the rush, Daly suggested that the Take It Down Act still has “gaping flaws.”

While the tech law is expected to achieve the rare feat of getting through Congress at what experts told Ars was a record pace, both supporters and critics also expect that the law will just as promptly be challenged in courts.

Supporters have suggested that any litigation exposing flaws could result in amendments. They’re simultaneously bracing for that backlash, while preparing for the win ahead of the vote tonight and hoping that the law can survive any subsequent legal attacks mostly intact.

Experts disagree on encryption threats

In a press conference hosted by the nonprofit Americans for Responsible Innovation, Slade Bond—who serves as chair of public policy for the law firm Cuneo Gilbert & LaDuca, LLP—advocated for the law’s passage, warning, “we should not let caution be the enemy of progress.”

Bond joined other supporters in suggesting that apparent threats to encryption or online speech are “far-fetched.”

On his side was Encode’s vice president of public policy, Adam Billen, who pushed back on the claim that companies might break encryption due to the law’s vague text.

Billen predicted that “most encrypted content,” supposedly including private or direct messages, wouldn’t be threatened with takedowns, arguing that the law explicitly covers content that is published (and, importantly, not just distributed) on services that provide a “forum for specifically user generated content.”

“In our mind, encryption simply just is not a question under this bill, and we have explicitly opposed other legislation that would explicitly break encryption,” Billen said.

That may be one way of reading the law, but Daly told Ars that the EFF’s lawyers had a different take.

“We just don’t agree with that reading,” she said. “As drafted, what will likely pass the floor tonight is absolutely a threat to encryption. There are exemptions for email services, but direct messages, cloud storage, these are not exempted.”

Instead, she suggested that lawmakers jammed the law through without weighing amendments that might have explicitly shielded encryption or prevented politicized censorship.

At the supporters’ press conference, Columbia Law School professor Tim Wu suggested that, for lawmakers facing a public vote, opposing the bill became “totally untenable” because “there’s such obvious harm” and “such a visceral problem with fake porn, particularly of minors.”

Supporter calls privacy fears “hypothetical”

Stefan Turkheimer, vice president of public policy for the anti-sexual abuse organization RAINN, agreed with Wu that the growing problem required immediate regulatory action. While various reports over the past year have indicated that the amount of AI-generated NCII is rising, Turkheimer suggested that all such statistics are severe undercounts and already outdated, noting that RAINN’s hotline reports of this kind of abuse are “doubling” monthly.

Coming up for a final vote amid an uptick in abuse reports, the Take It Down Act seeks to address harms that most people find “patently offensive,” Turkheimer said, suggesting it was the kind of bill that “can only get killed in the dark.”

However, Turkheimer was the only supporter at the press conference to indicate that texting may be part of the problem the law could address, perhaps lending weight to critics’ concerns. He argued that deterring harm to victims is more important than critics’ fears of censorship or other privacy risks.

“This is a real harm that a lot of people are experiencing, that every single time that they get a text message or they go on the Internet, they may see themselves in a non-consensual image,” Turkheimer said. “That is the real problem, and we’re balancing” that against “sort of a hypothetical problem on the other end, which is that some people’s speech might be affected.”

Remedying text-based abuse could become a privacy problem, an EFF blog suggested, since communications providers “may be served with notices they simply cannot comply with, given the fact that these providers cannot view the contents of messages on their platforms. Platforms may respond by abandoning encryption entirely in order to be able to monitor content—turning private conversations into surveilled spaces.”
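The EFF's technical objection rests on how end-to-end encryption works: the provider relays and stores only ciphertext, so it has nothing it can inspect when served a removal notice. A toy sketch (deliberately NOT real cryptography; a one-time-pad XOR stands in for a real cipher) illustrates the point:

```python
# Toy illustration only: with end-to-end encryption, the platform stores
# opaque ciphertext, so it cannot examine a reported message to verify
# or remove specific content. Do not use XOR pads as real cryptography.
import secrets

def e2e_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with a key byte (toy stand-in for a cipher).
    return bytes(p ^ k for p, k in zip(plaintext, key))

def e2e_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption reuses the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"private photo bytes"
key = secrets.token_bytes(len(message))  # shared only between the two users

stored_on_platform = e2e_encrypt(message, key)

# Only the keyholders can recover the content; the platform holds
# bytes it cannot moderate, match against reports, or selectively remove.
assert e2e_decrypt(stored_on_platform, key) == message
```

On this view, a provider's only paths to compliance are removing entire conversations blind or weakening the encryption itself, which is the outcome the EFF warns about.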

That’s why Daly told Ars that the EFF “is very concerned about the effects of Take It Down,” viewing it as a “massive privacy violation.”

“Congress should protect victims of NCII, but we don’t think that Take It Down is the way to do this or that it will actually protect victims,” Daly said.

Further, the potential for politicians to weaponize the takedown system to censor criticism should not be ignored, the EFF warned in another blog. “There are no penalties whatsoever to dissuade a requester from simply insisting that content is NCII,” the blog noted, urging Congress to instead “focus on enforcing and improving the many existing civil and criminal laws that address NCII, rather than opting for a broad takedown regime.”

“Non-consensual intimate imagery is a serious problem that deserves serious consideration, not a hastily drafted, overbroad bill that sweeps in legal, protected speech,” the EFF said.

That call largely fell on deaf ears. Once the law passes, the EFF will continue recommending encrypted services as a reliable means to protect user privacy, Daly said, but remains concerned about the unintended consequences of the law’s vague encryption language.

Although Bond said that precedent is on supporters’ side—arguing “the Supreme Court has been abundantly clear for decades that the First Amendment is not a shield for the type of content that the Take It Down Act is designed to address,” like sharing child sexual abuse materials or engaging in sextortion—Daly said that the EFF remains optimistic that courts will intervene to prevent critics’ worst fears.

“We expect to see challenges to this,” Daly said. “I don’t think this will pass muster.”


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.



Take It Down Act nears passage; critics warn Trump could use it against enemies


Anti-deepfake bill raises concerns about censorship and breaking encryption.

The helicopter with outgoing US President Joe Biden and first lady Dr. Jill Biden departs from the East Front of the United States Capitol after the inauguration of Donald Trump on January 20, 2025 in Washington, DC. Credit: Getty Images

An anti-deepfake bill is on the verge of becoming US law despite concerns from civil liberties groups that it could be used by President Trump and others to censor speech that has nothing to do with the intent of the bill.

The bill is called the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes On Websites and Networks Act, or Take It Down Act. The Senate version co-sponsored by Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.) was approved in the Senate by unanimous consent in February and is nearing passage in the House. The House Committee on Energy and Commerce approved the bill in a 49-1 vote yesterday, sending it to the House floor.

The bill pertains to “nonconsensual intimate visual depictions,” including both authentic photos shared without consent and forgeries produced by artificial intelligence or other technological means. Publishing intimate images of adults without consent could be punished by a fine and up to two years in prison. Publishing intimate images of minors could be punished with a fine or up to three years in prison.

Online platforms would have 48 hours to remove such images after “receiving a valid removal request from an identifiable individual (or an authorized person acting on behalf of such individual).”
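The statute specifies the obligation but no implementation, so as a purely hypothetical sketch, a platform's compliance tooling might track each valid report against the 48-hour deadline roughly like this (all names and structure are invented for illustration):

```python
# Hypothetical sketch of tracking the Take It Down Act's 48-hour removal
# window. The law mandates the deadline, not this design; every name here
# is an assumption made for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    content_id: str
    reporter: str  # the identifiable individual, or an authorized agent
    received_at: datetime

    def deadline(self) -> datetime:
        # Clock starts when a valid removal request is received.
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime) -> bool:
        return now > self.deadline()

now = datetime(2025, 5, 19, 12, 0, tzinfo=timezone.utc)
stale = RemovalRequest("img-123", "victim", now - timedelta(hours=50))
fresh = RemovalRequest("img-456", "victim", now - timedelta(hours=1))

assert stale.is_overdue(now)       # past the 48-hour window
assert not fresh.is_overdue(now)   # still within the window
```

Note that nothing in such a scheme verifies that reported content actually is NCII, which is precisely the gap critics say invites abusive takedown requests.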

“No man, woman, or child should be subjected to the spread of explicit AI images meant to target and harass innocent victims,” House Commerce Committee Chairman Brett Guthrie (R-Ky.) said in a press release. Guthrie’s press release included quotes supporting the bill from first lady Melania Trump, two teen girls who were victimized with deepfake nudes, and the mother of a boy whose death led to an investigation into a possible sextortion scheme.

Free speech concerns

The Electronic Frontier Foundation has been speaking out against the bill, saying “it could be easily manipulated to take down lawful content that powerful people simply don’t like.” The EFF pointed to Trump’s comments in an address to a joint session of Congress last month, in which he suggested he would use the bill for his own ends.

“Once it passes the House, I look forward to signing that bill into law. And I’m going to use that bill for myself too if you don’t mind, because nobody gets treated worse than I do online, nobody,” Trump said, drawing laughs from the crowd at Congress.

The EFF said, “Congress should believe Trump when he says he would use the Take It Down Act simply because he’s ‘treated badly,’ despite the fact that this is not the intention of the bill. There is nothing in the law, as written, to stop anyone—especially those with significant resources—from misusing the notice-and-takedown system to remove speech that criticizes them or that they disagree with.”

Free speech concerns were raised in a February letter to lawmakers sent by the Center for Democracy & Technology, the Authors Guild, Demand Progress Action, the EFF, Fight for the Future, the Freedom of the Press Foundation, New America’s Open Technology Institute, Public Knowledge, and TechFreedom.

The bill’s notice and takedown system “would result in the removal of not just nonconsensual intimate imagery but also speech that is neither illegal nor actually NDII [nonconsensual distribution of intimate imagery]… While the criminal provisions of the bill include appropriate exceptions for consensual commercial pornography and matters of public concern, those exceptions are not included in the bill’s takedown system,” the letter said.

The letter also said the bill could incentivize online platforms to use “content filtering that would break encryption.” The bill “excludes email and other services that do not primarily consist of user-generated content from the NTD [notice and takedown] system,” but “direct messaging services, cloud storage systems, and other similar services for private communication and storage, however, could be required to comply with the NTD,” the letter said.

The bill “contains serious threats to private messaging and free speech online—including requirements that would force companies to abandon end-to-end encryption so they can read and moderate your DMs,” Public Knowledge said today.

Democratic amendments voted down

Rep. Yvette Clarke (D-N.Y.) cast the only vote against the bill in yesterday’s House Commerce Committee hearing. But there were also several party-line votes against amendments submitted by Democrats.

Democrats raised concerns both about the bill not being enforced strictly enough and that bad actors could abuse the takedown process. The first concern is related to Trump firing both Democratic members of the Federal Trade Commission.

Rep. Kim Schrier (D-Wash.) called the Take It Down Act an “excellent law” but said, “right now it’s feeling like empty words because my Republican colleagues just stood by while the administration fired FTC commissioners, the exact people who enforce this law… it feels almost like my Republican colleagues are just giving a wink and a nod to the predators out there who are waiting to exploit kids and other innocent victims.”

Rep. Darren Soto (D-Fla.) offered an amendment to delay the bill’s effective date until the Democratic commissioners are restored to their positions. Ranking Member Frank Pallone, Jr. (D-N.J.) said that with a shorthanded FTC, “there’s going to be no enforcement of the Take It Down Act. There will be no enforcement of anything related to kids’ privacy.”

Rep. John James (R-Mich.) called the amendment a “thinly veiled delay tactic” and “nothing less than an attempt to derail this very important bill.” The amendment was defeated in a 28-22 vote.

Democrats support bill despite losing amendment votes

Rep. Debbie Dingell (D-Mich.) said she strongly supports the bill but offered an amendment that she said would tighten up the text and close loopholes. She said her amendment “ensures constitutionally protected speech is preserved by incorporating essential provisions for consensual content and matters of public concern. My goal is to protect survivors of abuse, not suppress lawful expression or shield misconduct from public accountability.”

Dingell’s amendment was also defeated in a 28-22 vote.

Pallone pitched an amendment that he said would “prevent bad actors from falsely claiming to be authorized to make takedown requests on behalf of someone else.” He called it a “common sense guardrail to protect against weaponization of this bill to take down images that are published with the consent of the subject matter of the images.” The amendment was rejected in a voice vote.

The bill was backed by RAINN (Rape, Abuse & Incest National Network), which praised the committee vote in a statement yesterday. “We’ve worked with fierce determination for the past year to bring Take It Down forward because we know—and survivors know—that AI-assisted sexual abuse is sexual abuse and real harm is being done; real pain is caused,” said Stefan Turkheimer, RAINN’s VP of public policy.

Cruz touted support for the bill from over 120 organizations and companies. The list includes groups like NCMEC (National Center for Missing & Exploited Children) and the National Center on Sexual Exploitation (NCOSE), along with various types of advocacy groups and tech companies Microsoft, Google, Meta, IBM, Amazon, and X Corp.

“As bad actors continue to exploit new technologies like generative artificial intelligence, the Take It Down Act is crucial for ending the spread of exploitative sexual material online, holding Big Tech accountable, and empowering victims of revenge and deepfake pornography,” Cruz said yesterday.


Jon is a Senior IT Reporter for Ars Technica. He covers the telecom industry, Federal Communications Commission rulemakings, broadband consumer affairs, court cases, and government regulation of the tech industry.
