First Amendment

TikTok loses Supreme Court fight, prepares to shut down Sunday


TikTok has said it’s preparing to shut down Sunday.

A TikTok influencer holds a sign that reads “Keep TikTok” outside the US Supreme Court Building as the court hears oral arguments on whether to overturn or delay a law that could lead to a ban of TikTok in the US, on January 10, 2025, in Washington, DC. Credit: Kayla Bartkowski / Stringer | Getty Images News

TikTok has lost its Supreme Court appeal in a 9–0 decision and will likely shut down on January 19, a day before Donald Trump’s inauguration, unless the app can be sold before the deadline, which TikTok has said is impossible.

During oral arguments last Friday, TikTok lawyer Noel Francisco warned SCOTUS that upholding the Biden administration’s divest-or-ban law would likely cause TikTok to “go dark—essentially the platform shuts down” and “essentially… stop operating.” On Wednesday, TikTok reportedly began preparing to shut down the app for all US users, anticipating the loss.

But TikTok’s claim that the divest-or-ban law violated Americans’ free speech rights did not outweigh the government’s compelling national security interest in blocking a foreign adversary like China from potentially using the app to spy on or influence Americans, SCOTUS ruled.

“We conclude that the challenged provisions do not violate petitioners’ First Amendment rights,” the SCOTUS opinion said, while acknowledging that “there is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community.”

Late last year, TikTok and its Chinese parent company, ByteDance, urgently pushed SCOTUS to intervene before the law’s January 19 enforcement date. Ahead of SCOTUS’ decision, TikTok warned it would have no choice but to abruptly shut down a thriving platform where many Americans get their news, express their views, and make a living.

The US had argued the law was necessary to protect national security interests as the US-China trade war intensifies, alleging that China could use the app to track and influence TikTok’s 170 million American users. A lower court had agreed that the US had a compelling national security interest and rejected arguments that the law violated the First Amendment, triggering TikTok’s appeal to SCOTUS. Today, the Supreme Court upheld that ruling.

According to SCOTUS, the divest-or-ban law is “content-neutral” and therefore triggers only intermediate scrutiny. That standard requires that the law not burden “substantially more speech than necessary” to serve the government’s national security interests; strict scrutiny, by contrast, would have forced the government to protect those interests through the least restrictive means.

Further, the government was right to single TikTok out, SCOTUS wrote, due to its “scale and susceptibility to foreign adversary control, together with the vast swaths of sensitive data the platform collects.”

“Preventing China from collecting vast amounts of sensitive data from 170 million US TikTok users” is a “decidedly content agnostic” rationale, justices wrote.

“The Government had good reason to single out TikTok for special treatment,” the opinion said.

TikTok CEO Shou Zi Chew posted a statement on TikTok reacting to the ruling, thanking Trump for committing to “work with TikTok” to avoid a shutdown and telling users to “rest assured, we will do everything in our power to ensure our platform thrives” in the US.

Momentum to ban TikTok has shifted

First Amendment advocates condemned the SCOTUS ruling. The American Civil Liberties Union called it a “major blow to freedom of expression online,” and the Electronic Frontier Foundation’s civil liberties director David Greene accused justices of sweeping “past the undisputed content-based justification for the law” to “rule only based on the shaky data privacy concerns.”

While the SCOTUS ruling was unanimous, Justice Sonia Sotomayor said that “precedent leaves no doubt” that the law implicated the First Amendment and “plainly” imposed a burden on any US company that distributes TikTok’s speech and any content creator who preferred TikTok as a publisher of their speech.

Similarly concerned was Justice Neil Gorsuch, who wrote in his concurring opinion that he harbors “serious reservations about whether the law before us is ‘content neutral’ and thus escapes ‘strict scrutiny.’” Gorsuch also said he didn’t know “whether this law will succeed in achieving its ends.”

“But the question we face today is not the law’s wisdom, only its constitutionality,” Gorsuch wrote. “Given just a handful of days after oral argument to issue an opinion, I cannot profess the kind of certainty I would like to have about the arguments and record before us. All I can say is that, at this time and under these constraints, the problem appears real and the response to it not unconstitutional.”

For TikTok and content creators defending the app, the stakes were incredibly high. TikTok repeatedly denied there was any evidence of spying and warned that enforcing the law would allow the government to unlawfully impose “a massive and unprecedented speech restriction.”

But the Supreme Court declined to order a preliminary injunction to block the law until Trump took office, instead rushing through oral arguments to reach a decision before the law’s enforcement deadline. Now TikTok has little recourse if it wishes to maintain US operations, as justices suggested during oral arguments that even if a president chose not to enforce the law, providing access to TikTok or enabling updates could be viewed as too risky for app stores or other distributors.

The law at the center of the case—the Protecting Americans from Foreign Adversary Controlled Applications Act—had strong bipartisan support under the Biden administration.

But President-elect Donald Trump said he opposed a TikTok ban, despite agreeing that US national security interests in preventing TikTok from spying on or manipulating Americans were compelling. This week, Senator Ed Markey (D-Mass.) introduced a bill to extend the deadline ahead of a potential TikTok ban, and a top Trump adviser, Congressman Mike Waltz, said that Trump plans to stop the ban and “keep TikTok from going dark,” the BBC reported. Even the Biden administration, whose Justice Department just finished arguing to SCOTUS why the US needed to enforce the law, “is considering ways to keep TikTok available,” sources told NBC News.

“What might happen next to TikTok remains unclear,” Gorsuch noted in the opinion.

Will Trump save TikTok?

It will likely soon be clear whether Trump will intervene. Trump filed a brief in December, requesting that the Supreme Court stay enforcement of the law until after he takes office because allegedly only he could make a deal to save TikTok. He criticized SCOTUS for rushing the decision and suggested that Congress’ passage of the law may have been “legislative encroachment” that potentially “binds his hands” as president.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.

TikTok CEO Chew signaled to users that Trump is expected to step in.

“On behalf of everyone at TikTok and all our users across the country, I want to thank President Trump for his commitment to work with us to find a solution that keeps TikTok available in the United States,” Chew’s statement said.

Chew also reminded Trump that his content has drawn 60 billion views on TikTok, suggesting the president-elect perhaps stands to lose a major platform through the ban.

“We are grateful and pleased to have the support of a president who truly understands our platform, one who has used TikTok to express his own thoughts and perspectives,” Chew said.

Trump seemingly has limited options to save TikTok, Forbes suggested. During oral arguments, justices disagreed on whether Trump could legally decide simply not to enforce the law. Efforts to pause enforcement or to claim compliance without evidence that ByteDance is working on selling off TikTok could be blocked by the court, analysts said. And while ByteDance has repeatedly said it’s unwilling to sell TikTok US, one analyst suggested to Forbes that ByteDance might be more willing to divest “in exchange for Trump backing off his threat of high tariffs on Chinese imports.”

On Tuesday, a Bloomberg report suggested that China was considering whether selling TikTok to Elon Musk might be a good bargaining chip to de-escalate Trump’s attacks in the US-China trade war.

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.

Texas defends requiring ID for porn to SCOTUS: “We’ve done this forever”

“You can use VPNs, the click of a button, to make it seem like you’re not in Texas,” argued Derek Shaffer, the attorney representing the adult industry groups challenging Texas’ age-verification law. “You can go through the search engines, you can go through social media, you can access the same content in the ways that kids are likeliest to do.”

Texas Solicitor General Aaron Nielson argued that the problem of kids accessing porn online has only gotten “worse” in the decades since Texas began trying less restrictive and allegedly less effective means, like content filtering. Now age verification is Texas’ preferred solution, and strict scrutiny shouldn’t apply to a law that just asks someone to show ID to see adult content, Nielson argued.

“In our history we have always said kids can’t come and look at this stuff,” Nielson argued. “So it seems not correct to me as a historical matter to say, well actually it’s always been presumptively unconstitutional. … But we’ve done it forever. Strict scrutiny somehow has always been satisfied.”

Like the groups suing, Texas asked the Supreme Court to be very clear when writing guidance for the 5th Circuit should the court vacate and remand the case. But Texas wants the justices to make clear that a remand would not bar the 5th Circuit from reinstating the stay of the preliminary injunction that it ordered after its prior review.

On rebuttal, Shaffer told SCOTUS that out of “about 20 other laws that by some views may look a lot like Texas'” law, “this is the worst of them.” He described Texas Attorney General Ken Paxton as a “hostile regulator who’s saying to adults, you should not be here.”

“I strongly urge this court to stick with strict scrutiny as the applicable standard of review when we’re talking about content-based burdens on speakers,” Shaffer said.

In a press release, Vera Eidelman, a senior staff attorney with the ACLU Speech, Privacy, and Technology Project, said that “efforts to childproof the Internet not only hurt everyone’s ability to access information, but often give the government far too much leeway to go after speech it doesn’t like—all while failing to actually protect children.”

Trump told SCOTUS he plans to make a deal to save TikTok

Several members of Congress—Senator Edward J. Markey (D-Mass.), Senator Rand Paul (R-Ky.), and Representative Ro Khanna (D-Calif.)—filed a brief agreeing that “the TikTok ban does not survive First Amendment scrutiny.” They agreed with TikTok that the law is “illegitimate.”

Lawmakers’ “principal justification” for the ban—“preventing covert content manipulation by the Chinese government”—masked a “desire” to control TikTok content, they said. Further, that goal could be achieved through a less restrictive alternative, they said, a stance TikTok has long argued.

Attorney General Merrick Garland defended the Act, though, urging SCOTUS to remain focused on the question of whether a forced sale of TikTok, which would seemingly allow the app to continue operating without impacting American free speech, violates the First Amendment. If the court agrees that the law survives strict scrutiny, TikTok could still face an abrupt shutdown in January.

The Supreme Court has scheduled oral arguments to begin on January 10. TikTok and content creators who separately sued to block the law have asked for their arguments to be divided, so that the court can separately weigh “different perspectives” when deciding how to approach the First Amendment question.

In its own brief, TikTok has asked SCOTUS to strike the portions of the law singling out TikTok or “at the very least” explain to Congress that “it needed to do far better work either tailoring the Act’s restrictions or justifying why the only viable remedy was to prohibit Petitioners from operating TikTok.”

But that may not be necessary if Trump prevails. Trump told the court that TikTok was an important platform for his presidential campaign and that he should be the one to make the call on whether TikTok should remain in the US—not the Supreme Court.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.

Supreme Court to decide if TikTok should be banned or sold

While the controversial US law doesn’t necessarily ban TikTok, it does seem designed to make TikTok “go away,” said David Greene, the EFF’s civil liberties director, and such a move to interfere with a widely used communications platform seems “unprecedented.”

“The TikTok ban itself and the DC Circuit’s approval of it should be of great concern even to those who find TikTok undesirable or scary,” Greene said in a statement. “Shutting down communications platforms or forcing their reorganization based on concerns of foreign propaganda and anti-national manipulation is an eminently anti-democratic tactic, one that the US has previously condemned globally.”

Greene further warned that the US “cutting off a tool used by 170 million Americans to receive information and communicate with the world, without proving with evidence that the tools are presently seriously harmful” would “greatly” lower “well-established standards for restricting freedom of speech in the US.”

TikTok appears to be hoping, in part, that President-elect Donald Trump will disrupt enforcement of the law, but Greene said it remains unclear whether Trump’s plan to “save TikTok” might just be a plan to support a sale to a US buyer. At least one former Trump ally, Steven Mnuchin, has reportedly expressed interest in buying the app.

For TikTok, putting pressure on Trump will likely be the next step, “if the Supreme Court ever says, ‘we agree the law is valid,'” Greene suggested.

“Then that’s it,” Greene said. “There’s no other legal recourse. You only have political recourses.”

Like other civil rights groups, the EFF plans to remain on TikTok’s side as the SCOTUS battle starts.

“We are pleased that the Supreme Court will take the case and will urge the justices to apply the appropriately demanding First Amendment scrutiny,” Greene said.

Facing ban next month, TikTok begs SCOTUS for help

TikTok: Ban is slippery slope to broad US censorship

According to TikTok, the government’s defense of the ban to prevent China from wielding a “covert” influence over Americans is a farce invented by lawyers to cover up the true mission of censorship. If the lower court’s verdict stands, TikTok alleged, “then Congress will have free rein to ban any American from speaking simply by identifying some risk that the speech is influenced by a foreign entity.”

TikTok doesn’t want to post big disclaimers on the app warning of “covert” influence, claiming that the government relied on “secret evidence” to prove this influence occurs on TikTok. But if the Supreme Court agrees that the government needed to show more than “bare factual assertions” to back national security claims the lower court said justified any potential speech restrictions, then the court will also likely agree to reverse the lower court’s decision, TikTok suggested.

It will become much clearer by January 6 whether the January 19 ban will take effect; if it does, TikTok would shut down, booting all US users from the app. TikTok urged the Supreme Court to agree that it is in the public interest to delay the ban and review the constitutional claims to prevent any “extreme” harms to both TikTok and US users who depend on the app for news, community, and income.

If SCOTUS doesn’t intervene, TikTok said that the lower court’s “flawed legal rationales would open the door to upholding content-based speech bans in contexts far different than this one.”

“Fearmongering about national security cannot obscure the threat that the Act itself poses to all Americans,” TikTok alleged, while suggesting that even Congress would agree that a “modest delay” in enforcing the law wouldn’t pose any immediate risk to US national security. Congress is also aware that a sale would not be technically, commercially, or legally possible in the timeframe provided, TikTok said. A temporary injunction would prevent irreparable harms, TikTok said, including the irreparable harm courts have long held is caused by restricting speech of Americans for any amount of time.

“An interim injunction is also appropriate because it will give the incoming Administration time to determine its position, as the President-elect and his advisors have voiced support for saving TikTok,” TikTok argued.

Ars could not immediately reach TikTok for comment.

Judge slams Florida for censoring political ad: “It’s the First Amendment, stupid”


Florida threatened TV stations over ad that criticized state’s abortion law.

Screenshot of political advertisement featuring a woman describing her experience having an abortion after being diagnosed with brain cancer. Credit: Floridians Protecting Freedom

US District Judge Mark Walker had a blunt message for the Florida surgeon general in an order halting the government official’s attempt to censor a political ad that opposes restrictions on abortion.

“To keep it simple for the State of Florida: it’s the First Amendment, stupid,” Walker, an Obama appointee who is chief judge in US District Court for the Northern District of Florida, wrote yesterday in a ruling that granted a temporary restraining order.

“Whether it’s a woman’s right to choose, or the right to talk about it, Plaintiff’s position is the same—’don’t tread on me,'” Walker wrote later in the ruling. “Under the facts of this case, the First Amendment prohibits the State of Florida from trampling on Plaintiff’s free speech.”

The Florida Department of Health recently sent a legal threat to broadcast TV stations over the airing of a political ad that criticized abortion restrictions in Florida’s Heartbeat Protection Act. The department in Gov. Ron DeSantis’ administration claimed the ad falsely described the abortion law, which could be weakened by a pending ballot question.

Floridians Protecting Freedom, the group that launched the TV ad and is sponsoring a ballot question to lift restrictions on abortion, sued Surgeon General Joseph Ladapo and Department of Health general counsel John Wilson. Wilson has resigned.

Surgeon general blocked from further action

Walker’s order granting the group’s motion states that “Defendant Ladapo is temporarily enjoined from taking any further actions to coerce, threaten, or intimate repercussions directly or indirectly to television stations, broadcasters, or other parties for airing Plaintiff’s speech, or undertaking enforcement action against Plaintiff for running political advertisements or engaging in other speech protected under the First Amendment.”

The order expires on October 29 but could be replaced by a preliminary injunction that would remain in effect while litigation continues. A hearing on the motion for a preliminary injunction is scheduled for the morning of October 29.

The pending ballot question would amend the state Constitution to say, “No law shall prohibit, penalize, delay, or restrict abortion before viability or when necessary to protect the patient’s health, as determined by the patient’s healthcare provider. This amendment does not change the Legislature’s constitutional authority to require notification to a parent or guardian before a minor has an abortion.”

Walker’s ruling said that Ladapo “has the right to advocate for his own position on a ballot measure. But it would subvert the rule of law to permit the State to transform its own advocacy into the direct suppression of protected political speech.”

Federal Communications Commission Chairwoman Jessica Rosenworcel recently criticized state officials, writing that “threats against broadcast stations for airing content that conflicts with the government’s views are dangerous and undermine the fundamental principle of free speech.”

State threatened criminal proceedings

The Floridians Protecting Freedom advertisement features a woman who “recalls her decision to have an abortion in Florida in 2022,” and “states that she would not be able to have an abortion for the same reason under the current law,” Walker’s ruling said.

Caroline, the woman in the ad, states that “the doctors knew if I did not end my pregnancy, I would lose my baby, I would lose my life, and my daughter would lose her mom. Florida has now banned abortion even in cases like mine. Amendment 4 is going to protect women like me; we have to vote yes.”

The ruling described the state government response:

Shortly after the ad began running, John Wilson, then general counsel for the Florida Department of Health, sent letters on the Department’s letterhead to Florida TV stations. The letters assert that Plaintiff’s political advertisement is false, dangerous, and constitutes a “sanitary nuisance” under Florida law. The letter informed the TV stations that the Department of Health must notify the person found to be committing the nuisance to remove it within 24 hours pursuant to section 386.03(1), Florida Statutes. The letter further warned that the Department could institute legal proceedings if the nuisance were not timely removed, including criminal proceedings pursuant to section 386.03(2)(b), Florida Statutes. Finally, the letter acknowledged that the TV stations have a constitutional right to “broadcast political advertisements,” but asserted this does not include “false advertisements which, if believed, would likely have a detrimental effect on the lives and health of pregnant women in Florida.” At least one of the TV stations that had been running Plaintiff’s advertisement stopped doing so after receiving this letter from the Department of Health.

The Department of Health claimed the ad “is categorically false” because “Florida’s Heartbeat Protection Act does not prohibit abortion if a physician determines the gestational age of the fetus is less than 6 weeks.”

Floridians Protecting Freedom responded that the woman in the ad made true statements, saying that “Caroline was diagnosed with stage four brain cancer when she was 20 weeks pregnant; the diagnosis was terminal. Under Florida law, abortions may only be performed after six weeks gestation if ‘[t]wo physicians certify in writing that, in reasonable medical judgment, the termination of the pregnancy is necessary to save the pregnant woman’s life or avert a serious risk of substantial and irreversible physical impairment of a major bodily function of the pregnant woman other than a psychological condition.'”

Because “Caroline’s diagnosis was terminal… an abortion would not have saved her life, only extended it. Florida law would not allow an abortion in this instance because the abortion would not have ‘save[d] the pregnant woman’s life,’ only extended her life,” the group said.

Judge: State should counter with its own speech

Walker’s ruling said the government can’t censor the ad by claiming it is false:

Plaintiff’s argument is correct. While Defendant Ladapo refuses to even agree with this simple fact, Plaintiff’s political advertisement is political speech—speech at the core of the First Amendment. And just this year, the United States Supreme Court reaffirmed the bedrock principle that the government cannot do indirectly what it cannot do directly by threatening third parties with legal sanctions to censor speech it disfavors. The government cannot excuse its indirect censorship of political speech simply by declaring the disfavored speech is “false.”

State officials must show that their actions “were narrowly tailored to serve a compelling government interest,” Walker wrote. A “narrowly tailored solution” in this case would be counterspeech, not censorship, he wrote.

“For all these reasons, Plaintiff has demonstrated a substantial likelihood of success on the merits,” the ruling said. Walker wrote that a ruling in favor of the state would open the door to more censorship:

This case pits the right to engage in political speech against the State’s purported interest in protecting the health and safety of Floridians from “false advertising.” It is no answer to suggest that the Department of Health is merely flexing its traditional police powers to protect health and safety by prosecuting “false advertising”—if the State can rebrand rank viewpoint discriminatory suppression of political speech as a “sanitary nuisance,” then any political viewpoint with which the State disagrees is fair game for censorship.

Walker then noted that Ladapo “has ample, constitutional alternatives to mitigate any harm caused by an injunction in this case.” The state is already running “its own anti-Amendment 4 campaign to educate the public about its view of Florida’s abortion laws and to correct the record, as it sees fit, concerning pro-Amendment 4 speech,” Walker wrote. “The State can continue to combat what it believes to be ‘false advertising’ by meeting Plaintiff’s speech with its own.”

Jon is a Senior IT Reporter for Ars Technica. He covers the telecom industry, Federal Communications Commission rulemakings, broadband consumer affairs, court cases, and government regulation of the tech industry.

RFK Jr’s anti-vaccine group can’t sue Meta for agreeing with CDC, judge rules

Independent presidential candidate Robert F. Kennedy Jr.

The Children’s Health Defense (CHD), an anti-vaccine group founded by Robert F. Kennedy Jr., has once again failed to convince a court that Meta acted as a state agent when censoring the group’s posts and ads on Facebook and Instagram.

In his opinion affirming a lower court’s dismissal, US Ninth Circuit Court of Appeals Judge Eric Miller wrote that CHD failed to prove that Meta acted as an arm of the government in censoring posts. Concluding that Meta’s right to censor views that the platforms find “distasteful” is protected by the First Amendment, Miller denied CHD’s requested relief, which had included an injunction and civil monetary damages.

“Meta evidently believes that vaccines are safe and effective and that their use should be encouraged,” Miller wrote. “It does not lose the right to promote those views simply because they happen to be shared by the government.”

CHD told Reuters that the group “was disappointed with the decision and considering its legal options.”

The group first filed the complaint in 2020, arguing that Meta colluded with government officials to censor protected speech by labeling anti-vaccine posts as misleading or removing and shadowbanning CHD posts. This caused CHD’s traffic on the platforms to plummet, CHD claimed, and ultimately, its pages were removed from both platforms.

However, critically, Miller wrote, CHD did not allege that “the government was actually involved in the decisions to label CHD’s posts as ‘false’ or ‘misleading,’ the decision to put the warning label on CHD’s Facebook page, or the decisions to ‘demonetize’ or ‘shadow-ban.'”

“CHD has not alleged facts that allow us to infer that the government coerced Meta into implementing a specific policy,” Miller wrote.

Instead, Meta “was entitled to encourage” various “input from the government,” justifiably seeking vaccine-related information provided by the World Health Organization (WHO) and the US Centers for Disease Control and Prevention (CDC) as it navigated complex content moderation decisions throughout the pandemic, Miller wrote.

Therefore, Meta’s actions against CHD were due to “Meta’s own ‘policy of censoring,’ not any provision of federal law,” Miller concluded. “The evidence suggested that Meta had independent incentives to moderate content and exercised its own judgment in so doing.”

None of CHD’s theories that Meta coordinated with officials to deprive “CHD of its constitutional rights” were plausible, Miller wrote, whereas the “innocent alternative”—”that Meta adopted the policy it did simply because” CEO Mark Zuckerberg and Meta “share the government’s view that vaccines are safe and effective”—appeared “more plausible.”

Meta “does not become an agent of the government just because it decides that the CDC sometimes has a point,” Miller wrote.

Equally unpersuasive were CHD’s arguments that Section 230 immunity—which shields platforms from liability for third-party content—“‘removed all legal barriers’ to the censorship of vaccine-related speech,” such that “Meta’s restriction of that content should be considered state action.”

“That Section 230 operates in the background to immunize Meta if it chooses to suppress vaccine misinformation—whether because it shares the government’s health concerns or for independent commercial reasons—does not transform Meta’s choice into state action,” Miller wrote.

One judge dissented over Section 230 concerns

In his dissenting opinion, however, Judge Daniel Collins defended CHD’s Section 230 claim, suggesting that the appeals court erred and should have granted CHD injunctive and declaratory relief from the alleged censorship. CHD CEO Mary Holland told The Defender that the group was pleased the decision was not unanimous.

According to Collins, who like Miller is a Trump appointee, Meta could never have built its massive social platforms without Section 230 immunity, which grants platforms the ability to broadly censor viewpoints they disfavor.

It was “important to keep in mind” that “the vast practical power that Meta exercises over the speech of millions of others ultimately rests on a government-granted privilege to which Meta is not constitutionally entitled,” Collins wrote. And this power “makes a crucial difference in the state-action analysis.”

As Collins sees it, CHD could plausibly allege that Meta’s communications with government officials about vaccine-related misinformation targeted specific users, like the “disinformation dozen” that includes both CHD and Kennedy. In that case, it appears possible to Collins that Section 230 provides a potential opportunity for government to target speech that it disfavors through mechanisms provided by the platforms.

“Having specifically and purposefully created an immunized power for mega-platform operators to freely censor the speech of millions of persons on those platforms, the Government is perhaps unsurprisingly tempted to then try to influence particular uses of such dangerous levers against protected speech expressing viewpoints the Government does not like,” Collins warned.

He further argued that “Meta’s relevant First Amendment rights” do not “give Meta an unbounded freedom to work with the Government in suppressing speech on its platforms.” Disagreeing with the majority, he wrote that “in this distinctive scenario, applying the state-action doctrine promotes individual liberty by keeping the Government’s hands away from the tempting levers of censorship on these vast platforms.”

The majority agreed, however, that while Section 230 immunity “is undoubtedly a significant benefit to companies like Meta,” lawmakers’ threats to weaken Section 230 did not suggest that Meta’s anti-vaccine policy was coerced state action.

“Many companies rely, in one way or another, on a favorable regulatory environment or the goodwill of the government,” Miller wrote. “If that were enough for state action, every large government contractor would be a state actor. But that is not the law.”

Kids Online Safety Act passes Senate despite concerns it will harm kids

The Kids Online Safety Act (KOSA) easily passed the Senate today despite critics’ concerns that the bill may risk creating more harm than good for kids and perhaps censor speech for online users of all ages if it’s signed into law.

KOSA received broad bipartisan support in the Senate, passing with a 91–3 vote alongside the Children’s Online Privacy Protection Act (COPPA) 2.0. Both bills seek to limit how much data can be collected from minors, as well as to regulate the platform features that could harm children’s mental health.

Only Senators Ron Wyden (D-Ore.), Rand Paul (R-Ky.), and Mike Lee (R-Utah) opposed the bills.

In an op-ed for The Courier-Journal, Paul argued that KOSA imposes on platforms a “duty of care” to mitigate harms to minors that “will not only stifle free speech, but it will deprive Americans of the benefits of our technological advancements.”

“With the Internet, today’s children have the world at their fingertips,” Paul wrote, but if KOSA passes, even allegedly benign content like “pro-life messages” or discussion of a teen overcoming an eating disorder could be censored if platforms fear compliance issues.

“While doctors’ and therapists’ offices close at night and on weekends, support groups are available 24 hours a day, seven days a week for people who share similar concerns or have the same health problems. Any solution to protect kids online must ensure the positive aspects of the Internet are preserved,” Paul wrote.

During a KOSA critics’ press conference today, Dara Adkison—the executive director of TransOhio, a group providing resources for transgender youths—expressed concerns that lawmakers would target sites like TransOhio’s if the law also passed in the House, where the bill heads next.

“I’ve literally had legislators tell me to my face that they would love to see our website taken off the Internet because they don’t want people to have the kinds of vital community resources that we provide,” Adkison said.

Paul argued that what was considered harmful to kids was subjective, noting that a key flaw with KOSA was that “KOSA does not explicitly define the term ‘mental health disorder.'” Instead, platforms are to refer to the definition in “the fifth edition of the Diagnostic and Statistical Manual of Mental Health Disorders” or “the most current successor edition.”

“That means the scope of the bill could change overnight without any action from America’s elected representatives,” Paul warned, suggesting that “KOSA opens the door to nearly limitless content regulation because platforms will censor users rather than risk liability.”

Ahead of the vote, Senator Richard Blumenthal (D-Conn.)—who co-sponsored KOSA—denied that the bill strove to regulate content, The Hill reported. To Blumenthal and other KOSA supporters, its aim instead is to ensure that social media is “safe by design” for young users.

According to The Washington Post, KOSA and COPPA 2.0 passing “represent the most significant restrictions on tech platforms to clear a chamber of Congress in decades.” However, while President Joe Biden has indicated he would be willing to sign the bill into law, most seem to agree that KOSA will struggle to pass in the House of Representatives.

Todd O’Boyle, a senior tech policy director for Chamber of Progress—a progressive tech industry policy coalition—said that there is currently “substantial opposition” in the House. O’Boyle said he expects the political divide will be enough to block KOSA’s passage, preventing “the power” from being handed to the Federal Trade Commission (FTC) or “the next president” to “crack down on online speech” or otherwise pose “a massive threat to our constitutional rights.”

“If there’s one thing the far-left and far-right agree on, it’s that the next chair of the FTC shouldn’t get to decide what online posts are harmful,” O’Boyle said.

Elon Musk’s X may succeed in blocking Calif. content moderation law on appeal

Elon Musk’s X previously failed to block the law on First Amendment grounds.

Elon Musk’s fight defending X’s content moderation decisions isn’t just with hate speech researchers and advertisers. He has also long been battling regulators, and this week, he seemed positioned to secure a potentially big win in California, where he’s hoping to permanently block a law that he claims unconstitutionally forces his platform to justify its judgment calls.

At a hearing Wednesday, three judges in the 9th US Circuit Court of Appeals seemed inclined to agree with Musk that a California law requiring disclosures from social media companies that clearly explain their content moderation choices likely violates the First Amendment.

Passed in 2022, AB-587 forces platforms like X to submit a “terms of service report” detailing how they moderate several categories of controversial content. Those categories include hate speech or racism, extremism or radicalization, disinformation or misinformation, harassment, and foreign political interference, which X’s lawyer, Joel Kurtzberg, told judges yesterday “are the most controversial categories of so-called awful but lawful speech.”

The law would seemingly require more transparency than ever from X, making it easy for users to track exactly how much controversial content X flags and removes—and perhaps most notably for advertisers, how many users viewed concerning content.

To block the law, X sued in 2023, arguing that California was trying to dictate its terms of service and force the company to make statements on content moderation that could generate backlash. X worried that the law “impermissibly” interfered with both “the constitutionally protected editorial judgments” of social media companies, as well as impacted users’ speech by requiring companies “to remove, demonetize, or deprioritize constitutionally protected speech that the state deems undesirable or harmful.”

Any companies found to be non-compliant could face stiff fines of up to $15,000 per violation per day, which X considered “draconian.” But last year, a lower court declined to block the law, prompting X to appeal, and yesterday, the appeals court seemed more sympathetic to X’s case.

At the hearing, Kurtzberg told judges that the law was “deeply threatening to the well-established First Amendment interests” of an “extraordinary diversity of” people, which is why X’s complaint was supported by briefs from reporters, freedom of the press advocates, First Amendment scholars, “conservative entities,” and people across the political spectrum.

All share “a deep concern about a statute that, on its face, is aimed at pressuring social media companies to change their content moderation policies, so as to carry less or even no expression that’s viewed by the state as injurious to its people,” Kurtzberg told judges.

When the court pointed out that seemingly the law simply required X to abide by content moderation policies for each category defined in its own terms of service—and did not compel X to adopt any policy or position that it did not choose—Kurtzberg pushed back.

“They don’t mandate us to define the categories in a specific way, but they mandate us to take a position on what the legislature makes clear are the most controversial categories to moderate and define,” Kurtzberg said. “We are entitled to respond to the statute by saying we don’t define hate speech or racism. But the report also asks about policies that are supposedly, quote, ‘intended’ to address those categories, which is a judgment call.”

“This is very helpful,” Judge Anthony Johnstone responded. “Even if you don’t yourself define those categories in the terms of service, you read the law as requiring you to opine or discuss those categories, even if they’re not part of your own terms,” and “you are required to tell California essentially your views on hate speech, extremism, harassment, foreign political interference, how you define them or don’t define them, and what you choose to do about them?”

“That is correct,” Kurtzberg responded, noting that X considered those categories the most “fraught” and “difficult to define.”

Meta tells court it won’t sue over Facebook feed-killing tool—yet

This week, Meta asked a US district court in California to toss a lawsuit filed by a professor, Ethan Zuckerman, who fears that Meta will sue him if he releases a tool that would give Facebook users an automated way to easily remove all content from their feeds.

Zuckerman has alleged that the imminent threat of a lawsuit from Meta has prevented him from releasing Unfollow Everything 2.0, suggesting that a cease-and-desist letter sent to the creator of the original Unfollow Everything substantiates his fears.

He’s hoping the court will find that either releasing his tool would not breach Facebook’s terms of use—which prevent “accessing or collecting data from Facebook ‘using automated means’”—or that those terms conflict with public policy. Among the laws that Facebook’s terms allegedly conflict with are the First Amendment, Section 230 of the Communications Decency Act, and the Computer Fraud and Abuse Act (CFAA), as well as California’s Computer Data Access and Fraud Act (CDAFA) and state privacy laws.

But Meta claimed in its motion to dismiss that Zuckerman’s suit is too premature, mostly because the tool has not yet been built and Meta has not had a chance to review the “non-existent tool” to determine how Unfollow Everything 2.0 might impact its platform or its users.

“Besides bald assertions about how Plaintiff intends Unfollow Everything 2.0 to work and what he plans to do with it, there are no concrete facts that would enable this Court to adjudicate potential legal claims regarding this tool—which, at present, does not even operate in the real world,” Meta argued.

Meta wants all of Zuckerman’s claims to be dismissed, arguing that “adjudicating Plaintiff’s claims would require needless rulings on hypothetical applications of California law, would likely result in duplicative litigation, and would encourage forum shopping.”

At the heart of Meta’s defense is a claim that there’s no telling yet whether Zuckerman will ever be able to release the tool, although Zuckerman said he was prepared to finish the build within six weeks of a court win. Last May, Zuckerman told Ars that it’s better to wait until the lawsuit is settled to finish building the tool because Facebook’s design is always changing.

Meta claimed that Zuckerman can’t confirm if Unfollow Everything 2.0 would work as described in his suit precisely because his findings are based on Facebook’s current interface, and the “process for unfollowing has changed over time and will likely continue to change.”

Further, Meta argued that the original Unfollow Everything performed in a different way—by logging in on behalf of users and automatically unfollowing everything, rather than performing the automated unfollowing when the users themselves log in. Because of that, Meta argued that the new tool may not prompt the same response from Meta.

However, Ramya Krishnan, a senior staff attorney at the Knight Institute who helped draft Zuckerman’s complaint, told Ars that the two tools operate nearly identically.

“Professor Zuckerman’s tool and the original Unfollow Everything work in essentially the same way,” Krishnan told Ars. “They automatically unfollow all of a user’s friends, groups, and pages after the user installs the tool and logs in to Facebook using their web browser.”

Ultimately, Meta claimed that there’s no telling if Meta would even sue over the tool’s automated access to user data, dismissing Zuckerman’s fears as unsubstantiated.

Only when the tool is out in the wild and Facebook is able to determine “actual, concrete facts about how it works in practice” that “may prove problematic” will Meta know if a legal response is needed, Meta claimed. Without reviewing the technical specs, Meta argued, Meta has no way to assess the damages or know if it would sue over a breach of contract, as alleged, or perhaps over other claims not alleged, such as trademark infringement.

SCOTUS nixes injunction that limited Biden admin contacts with social networks

On Wednesday, the Supreme Court tossed out claims that the Biden administration coerced social media platforms into censoring users by removing COVID-19 and election-related content.

Complaints alleging that high-ranking government officials were censoring conservatives had previously convinced a lower court to order an injunction limiting the Biden administration’s contacts with platforms. But now that injunction has been overturned, re-opening lines of communication just ahead of the 2024 elections—when officials will once again be closely monitoring the spread of misinformation online targeted at voters.

In a 6–3 vote, the majority ruled that none of the plaintiffs suing—including five social media users and Republican attorneys general in Louisiana and Missouri—had standing. They had alleged that the government had “pressured the platforms to censor their speech in violation of the First Amendment,” demanding an injunction to stop any future censorship.

Plaintiffs may have succeeded if they were instead seeking damages for past harms. But in her opinion, Justice Amy Coney Barrett wrote that partly because the Biden administration seemingly stopped influencing platforms’ content policies in 2022, none of the plaintiffs could show evidence of a “substantial risk that, in the near future, they will suffer an injury that is traceable” to any government official. Thus, they did not seem to face “a real and immediate threat of repeated injury,” Barrett wrote.

“Without proof of an ongoing pressure campaign, it is entirely speculative that the platforms’ future moderation decisions will be attributable, even in part,” to government officials, Barrett wrote, finding that an injunction would do little to prevent future censorship.

Instead, plaintiffs’ claims “depend on the platforms’ actions,” Barrett emphasized, “yet the plaintiffs do not seek to enjoin the platforms from restricting any posts or accounts.”

“It is a bedrock principle that a federal court cannot redress ‘injury that results from the independent action of some third party not before the court,'” Barrett wrote.

Barrett repeatedly noted “weak” arguments raised by plaintiffs, none of which could directly link their specific content removals with the Biden administration’s pressure campaign urging platforms to remove vaccine or election misinformation.

According to Barrett, the lower court initially granting the injunction “glossed over complexities in the evidence,” including the fact that “platforms began to suppress the plaintiffs’ COVID-19 content” before the government pressure campaign began. That’s an issue, Barrett said, because standing to sue “requires a threshold showing that a particular defendant pressured a particular platform to censor a particular topic before that platform suppressed a particular plaintiff’s speech on that topic.”

“While the record reflects that the Government defendants played a role in at least some of the platforms’ moderation choices, the evidence indicates that the platforms had independent incentives to moderate content and often exercised their own judgment,” Barrett wrote.

Barrett was similarly unconvinced by arguments that plaintiffs risk platforms removing future content based on stricter moderation policies that were previously coerced by officials.

“Without evidence of continued pressure from the defendants, the platforms remain free to enforce, or not to enforce, their policies—even those tainted by initial governmental coercion,” Barrett wrote.

Alito: SCOTUS “shirks duty” to defend free speech

Justices Clarence Thomas and Neil Gorsuch joined Samuel Alito in dissenting, arguing that “this is one of the most important free speech cases to reach this Court in years” and that the Supreme Court had an “obligation” to “tackle the free speech issue that the case presents.”

“The Court, however, shirks that duty and thus permits the successful campaign of coercion in this case to stand as an attractive model for future officials who want to control what the people say, hear, and think,” Alito wrote.

Alito argued that the evidence showed that while “downright dangerous” speech was suppressed, so was “valuable speech.” He agreed with the lower court that “a far-reaching and widespread censorship campaign” had been “conducted by high-ranking federal officials against Americans who expressed certain disfavored views about COVID-19 on social media.”

“For months, high-ranking Government officials placed unrelenting pressure on Facebook to suppress Americans’ free speech,” Alito wrote. “Because the Court unjustifiably refuses to address this serious threat to the First Amendment, I respectfully dissent.”

At least one plaintiff who opposed masking and vaccines, Jill Hines, was “indisputably injured,” Alito wrote, arguing that evidence showed she was censored more frequently after officials pressured Facebook into changing its policies.

“Top federal officials continuously and persistently hectored Facebook to crack down on what the officials saw as unhelpful social media posts, including not only posts that they thought were false or misleading but also stories that they did not claim to be literally false but nevertheless wanted obscured,” Alito wrote.

While Barrett and the majority found that platforms were more likely responsible for injury, Alito disagreed, writing that with the threat of antitrust probes or Section 230 amendments, Facebook acted like “a subservient entity determined to stay in the good graces of a powerful taskmaster.”

Alito wrote that the majority was “applying a new and heightened standard” by requiring plaintiffs to “untangle Government-caused censorship from censorship that Facebook might have undertaken anyway.” In his view, it was enough that Hines showed that “one predictable effect of the officials’ action was that Facebook would modify its censorship policies in a way that affected her.”

“When the White House pressured Facebook to amend some of the policies related to speech in which Hines engaged, those amendments necessarily impacted some of Facebook’s censorship decisions,” Alito wrote. “Nothing more is needed. What the Court seems to want are a series of ironclad links.”

“That is regrettable,” Alito said.

Elon Musk’s X defeats Australia’s global takedown order of stabbing video

Australia’s safety regulator has ended a legal battle with X (formerly Twitter) after threatening approximately $500,000 in daily fines over X’s failure to remove 65 instances of a religiously motivated stabbing video from the platform globally.

Enforcing Australia’s Online Safety Act, eSafety commissioner Julie Inman-Grant had argued it would be dangerous for the videos to keep spreading on X, potentially inciting other acts of terror in Australia.

But X owner Elon Musk refused to comply with the global takedown order, arguing that it would be “unlawful and dangerous” to allow one country to control the global Internet. And Musk was not alone in this fight. The legal director of a nonprofit digital rights group called the Electronic Frontier Foundation (EFF), Corynne McSherry, backed up Musk, urging the court to agree that “no single country should be able to restrict speech across the entire Internet.”

“We welcome the news that the eSafety Commissioner is no longer pursuing legal action against X seeking the global removal of content that does not violate X’s rules,” X’s Global Government Affairs account posted late Tuesday night. “This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed.”

Inman-Grant was formerly Twitter’s director of public policy in Australia and used that experience to land what she told The Courier-Mail was her “dream role” as Australia’s eSafety commissioner in 2017. Since issuing the order to remove the video globally on X, Inman-Grant had traded barbs with Musk (along with other Australian lawmakers), responding to Musk labeling her a “censorship commissar” by calling him an “arrogant billionaire” for fighting the order.

On X, Musk arguably got the last word, posting, “Freedom of speech is worth fighting for.”

Safety regulator still defends takedown order

In a statement, Inman-Grant said early Wednesday that her decision to discontinue proceedings against X was part of an effort to “consolidate actions,” including “litigation across multiple cases.” She ultimately determined that dropping the case against X would be the “option likely to achieve the most positive outcome for the online safety of all Australians, especially children.”

“Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community,” Inman-Grant said, still defending the order despite dropping it.

In court, X’s lawyer Marcus Hoyne had pushed back on such logic, arguing that the eSafety regulator’s mission was “pointless” because “footage of the attack had now spread far beyond the few dozen URLs originally identified,” the Australian Broadcasting Corporation reported.

“I stand by my investigators and the decisions eSafety made,” Inman-Grant said.

Other Australian lawmakers agree the order was not out of line. According to AP News, Australian Minister for Communications Michelle Rowland shared a similar statement in parliament today, backing up the safety regulator while scolding X users who allegedly took up Musk’s fight by threatening Inman-Grant and her family. The safety regulator has said that Musk’s X posts incited a “pile-on” from his followers who allegedly sent death threats and exposed her children’s personal information, the BBC reported.

“The government backs our regulators and we back the eSafety Commissioner, particularly in light of the reprehensible threats to her physical safety and the threats to her family in the course of doing her job,” Rowland said.
