free speech


TikTok loses Supreme Court fight, prepares to shut down Sunday


TikTok has said it’s preparing to shut down Sunday.

A TikTok influencer holds a sign that reads “Keep TikTok” outside the US Supreme Court Building as the court hears oral arguments on whether to overturn or delay a law that could lead to a ban of TikTok in the U.S., on January 10, 2025 in Washington, DC. Credit: Kayla Bartkowski / Stringer | Getty Images News

TikTok has lost its Supreme Court appeal in a 9–0 decision and will likely shut down on January 19, a day before Donald Trump’s inauguration, unless the app can be sold before the deadline, which TikTok has said is impossible.

During oral arguments last Friday, TikTok lawyer Noel Francisco warned SCOTUS that upholding the Biden administration’s divest-or-sell law would likely cause TikTok to “go dark—essentially the platform shuts down” and “essentially… stop operating.” On Wednesday, TikTok reportedly began preparing to shut down the app for all US users, anticipating the loss.

But TikTok’s claims that the divest-or-sell law violated Americans’ free speech rights did not supersede the government’s compelling national security interest in blocking a foreign adversary like China from potentially using the app to spy on or influence Americans, SCOTUS ruled.

“We conclude that the challenged provisions do not violate petitioners’ First Amendment rights,” the SCOTUS opinion said, while acknowledging that “there is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community.”

Late last year, TikTok and its Chinese owner, ByteDance, urgently pushed SCOTUS to intervene before the law’s January 19 enforcement date. Ahead of SCOTUS’ decision, TikTok warned it would have no choice but to abruptly shut down a thriving platform where many Americans get their news, express their views, and make a living.

The US had argued the law was necessary to protect national security interests as the US-China trade war intensifies, alleging that China could use the app to track and influence TikTok’s 170 million American users. A lower court had agreed that the US had a compelling national security interest and rejected arguments that the law violated the First Amendment, triggering TikTok’s appeal to SCOTUS. Today, the Supreme Court upheld that ruling.

According to SCOTUS, the divest-or-sell law is “content-neutral” and therefore triggers only intermediate scrutiny, which requires that the law not burden “substantially more speech than necessary” to serve the government’s national security interests. Strict scrutiny, by contrast, would have forced the government to protect those interests through the least restrictive means.

Further, the government was right to single TikTok out, SCOTUS wrote, due to its “scale and susceptibility to foreign adversary control, together with the vast swaths of sensitive data the platform collects.”

“Preventing China from collecting vast amounts of sensitive data from 170 million US TikTok users” is a “decidedly content agnostic” rationale, justices wrote.

“The Government had good reason to single out TikTok for special treatment,” the opinion said.

TikTok CEO Shou Zi Chew posted a statement on TikTok reacting to the ruling, thanking Trump for committing to “work with TikTok” to avoid a shutdown and telling users to “rest assured, we will do everything in our power to ensure our platform thrives” in the US.

Momentum to ban TikTok has shifted

First Amendment advocates condemned the SCOTUS ruling. The American Civil Liberties Union called it a “major blow to freedom of expression online,” and the Electronic Frontier Foundation’s civil liberties director David Greene accused justices of sweeping “past the undisputed content-based justification for the law” to “rule only based on the shaky data privacy concerns.”

While the SCOTUS ruling was unanimous, Justice Sonia Sotomayor said that “precedent leaves no doubt” that the law implicated the First Amendment and “plainly” imposed a burden on any US company that distributes TikTok’s speech and any content creator who preferred TikTok as a publisher of their speech.

Similarly concerned was Justice Neil Gorsuch, who wrote in his concurring opinion that he harbors “serious reservations about whether the law before us is ‘content neutral’ and thus escapes ‘strict scrutiny.’” Gorsuch also said he didn’t know “whether this law will succeed in achieving its ends.”

“But the question we face today is not the law’s wisdom, only its constitutionality,” Gorsuch wrote. “Given just a handful of days after oral argument to issue an opinion, I cannot profess the kind of certainty I would like to have about the arguments and record before us. All I can say is that, at this time and under these constraints, the problem appears real and the response to it not unconstitutional.”

For TikTok and content creators defending the app, the stakes were incredibly high. TikTok repeatedly denied there was any evidence of spying and warned that enforcing the law would allow the government to unlawfully impose “a massive and unprecedented speech restriction.”

But the Supreme Court declined to order a preliminary injunction to block the law until Trump took office, instead deciding to rush through oral arguments and reach a decision before the law’s enforcement deadline. Now TikTok has little recourse if it wishes to maintain US operations, as justices suggested during oral arguments that even if a president chose not to enforce the law, providing access to TikTok or enabling updates could be viewed as too risky for app stores or other distributors.

The law at the center of the case—the Protecting Americans from Foreign Adversary Controlled Applications Act—had strong bipartisan support under the Biden administration.

But President-elect Donald Trump said he opposed a TikTok ban, despite agreeing that US national security interests in preventing TikTok from spying on or manipulating Americans were compelling. This week, Senator Ed Markey (D-Mass.) introduced a bill to extend the deadline ahead of a potential TikTok ban, and a top Trump adviser, Congressman Mike Waltz, said that Trump plans to stop the ban and “keep TikTok from going dark,” the BBC reported. Even the Biden administration, whose Justice Department had just finished arguing to SCOTUS why the US needed to enforce the law, “is considering ways to keep TikTok available,” sources told NBC News.

“What might happen next to TikTok remains unclear,” Gorsuch noted in the opinion.

Will Trump save TikTok?

It will likely soon be clear whether Trump will intervene. Trump filed a brief in December, requesting that the Supreme Court stay enforcement of the law until after he takes office because allegedly only he could make a deal to save TikTok. He criticized SCOTUS for rushing the decision and suggested that Congress’ passage of the law may have been “legislative encroachment” that potentially “binds his hands” as president.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.

TikTok’s CEO Chew signaled to users that Trump is expected to step in.

“On behalf of everyone at TikTok and all our users across the country, I want to thank President Trump for his commitment to work with us to find a solution that keeps TikTok available in the United States,” Chew’s statement said.

Chew also reminded Trump that his content has drawn 60 billion views on TikTok, suggesting the president-elect perhaps stands to lose a major platform through the ban.

“We are grateful and pleased to have the support of a president who truly understands our platform, one who has used TikTok to express his own thoughts and perspectives,” Chew said.

Trump seemingly has limited options to save TikTok, Forbes suggested. At oral argument, justices disagreed on whether Trump could legally decide simply not to enforce the law. Analysts said courts could block efforts to pause enforcement or to claim compliance absent evidence that ByteDance is working on selling off TikTok. And while ByteDance has repeatedly said it’s unwilling to sell TikTok US, one analyst suggested to Forbes that ByteDance might be more willing to divest “in exchange for Trump backing off his threat of high tariffs on Chinese imports.”

On Tuesday, a Bloomberg report suggested that China was considering whether selling TikTok to Elon Musk might be a good bargaining chip to de-escalate Trump’s attacks in the US-China trade war.


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.


Trump told SCOTUS he plans to make a deal to save TikTok

Several members of Congress—Senator Edward J. Markey (D-Mass.), Senator Rand Paul (R-Ky.), and Representative Ro Khanna (D-Calif.)—filed a brief agreeing that “the TikTok ban does not survive First Amendment scrutiny.” They agreed with TikTok that the law is “illegitimate.”

Lawmakers’ “principal justification” for the ban—“preventing covert content manipulation by the Chinese government”—masked a “desire” to control TikTok content, they said. Further, that goal could be achieved by a less restrictive alternative, they said, a stance TikTok has long taken.

Attorney General Merrick Garland defended the Act, though, urging SCOTUS to stay focused on the question of whether a forced sale of TikTok, which would seemingly allow the app to continue operating without impacting Americans’ free speech, violates the First Amendment. If the court agrees that the law survives strict scrutiny, TikTok could still be facing an abrupt shutdown in January.

The Supreme Court has scheduled oral arguments to begin on January 10. TikTok and content creators who separately sued to block the law have asked for their arguments to be divided, so that the court can separately weigh “different perspectives” when deciding how to approach the First Amendment question.

In its own brief, TikTok has asked SCOTUS to strike the portions of the law singling out TikTok or “at the very least” explain to Congress that “it needed to do far better work either tailoring the Act’s restrictions or justifying why the only viable remedy was to prohibit Petitioners from operating TikTok.”

But that may not be necessary if Trump prevails. Trump told the court that TikTok was an important platform for his presidential campaign and that he should be the one to make the call on whether TikTok should remain in the US—not the Supreme Court.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.


Supreme Court to decide if TikTok should be banned or sold

While the controversial US law doesn’t necessarily ban TikTok, it does seem designed to make TikTok “go away,” Greene said, and such a move to interfere with a widely used communications platform seems “unprecedented.”

“The TikTok ban itself and the DC Circuit’s approval of it should be of great concern even to those who find TikTok undesirable or scary,” Greene said in a statement. “Shutting down communications platforms or forcing their reorganization based on concerns of foreign propaganda and anti-national manipulation is an eminently anti-democratic tactic, one that the US has previously condemned globally.”

Greene further warned that the US “cutting off a tool used by 170 million Americans to receive information and communicate with the world, without proving with evidence that the tools are presently seriously harmful” would “greatly” lower “well-established standards for restricting freedom of speech in the US.”

TikTok partly appears to be hoping that President-elect Donald Trump will disrupt enforcement of the law, but Greene said it remains unclear if Trump’s plan to “save TikTok” might just be a plan to support a sale to a US buyer. At least one former Trump ally, Steven Mnuchin, has reportedly expressed interest in buying the app.

For TikTok, putting pressure on Trump will likely be the next step, “if the Supreme Court ever says, ‘we agree the law is valid,'” Greene suggested.

“Then that’s it,” Greene said. “There’s no other legal recourse. You only have political recourses.”

Like other civil rights groups, the EFF plans to remain on TikTok’s side as the SCOTUS battle starts.

“We are pleased that the Supreme Court will take the case and will urge the justices to apply the appropriately demanding First Amendment scrutiny,” Greene said.


Facing ban next month, TikTok begs SCOTUS for help

TikTok: Ban is slippery slope to broad US censorship

According to TikTok, the government’s defense of the ban to prevent China from wielding a “covert” influence over Americans is a farce invented by lawyers to cover up the true mission of censorship. If the lower court’s verdict stands, TikTok alleged, “then Congress will have free rein to ban any American from speaking simply by identifying some risk that the speech is influenced by a foreign entity.”

TikTok doesn’t want to post big disclaimers on the app warning of “covert” influence, claiming that the government relied on “secret evidence” to prove this influence occurs on TikTok. But if the Supreme Court agrees that the government needed to show more than “bare factual assertions” to back national security claims the lower court said justified any potential speech restrictions, then the court will also likely agree to reverse the lower court’s decision, TikTok suggested.

It will become much clearer by January 6 whether the January 19 ban will take effect; if it does, TikTok would shut down, booting all US users from the app. TikTok urged the Supreme Court to agree it is in the public interest to delay the ban and review the constitutional claims to prevent any “extreme” harms to both TikTok and US users who depend on the app for news, community, and income.

If SCOTUS doesn’t intervene, TikTok said that the lower court’s “flawed legal rationales would open the door to upholding content-based speech bans in contexts far different than this one.”

“Fearmongering about national security cannot obscure the threat that the Act itself poses to all Americans,” TikTok alleged, while suggesting that even Congress would agree that a “modest delay” in enforcing the law wouldn’t pose any immediate risk to US national security. Congress is also aware that a sale would not be technically, commercially, or legally possible in the timeframe provided, TikTok said. A temporary injunction would prevent irreparable harms, TikTok argued, including the harm courts have long held is caused by restricting Americans’ speech for any amount of time.

“An interim injunction is also appropriate because it will give the incoming Administration time to determine its position, as the President-elect and his advisors have voiced support for saving TikTok,” TikTok argued.

Ars could not immediately reach TikTok for comment.


RFK Jr’s anti-vaccine group can’t sue Meta for agreeing with CDC, judge rules

Independent presidential candidate Robert F. Kennedy Jr.

The Children’s Health Defense (CHD), an anti-vaccine group founded by Robert F. Kennedy Jr., has once again failed to convince a court that Meta acted as a state agent when censoring the group’s posts and ads on Facebook and Instagram.

In his opinion affirming a lower court’s dismissal, US Ninth Circuit Court of Appeals Judge Eric Miller wrote that CHD failed to prove that Meta acted as an arm of the government in censoring posts. Concluding that Meta’s right to censor views that the platforms find “distasteful” is protected by the First Amendment, Miller denied CHD’s requested relief, which had included an injunction and civil monetary damages.

“Meta evidently believes that vaccines are safe and effective and that their use should be encouraged,” Miller wrote. “It does not lose the right to promote those views simply because they happen to be shared by the government.”

CHD told Reuters that the group “was disappointed with the decision and considering its legal options.”

The group first filed the complaint in 2020, arguing that Meta colluded with government officials to censor protected speech by labeling anti-vaccine posts as misleading or removing and shadowbanning CHD posts. This caused CHD’s traffic on the platforms to plummet, CHD claimed, and ultimately, its pages were removed from both platforms.

However, critically, Miller wrote, CHD did not allege that “the government was actually involved in the decisions to label CHD’s posts as ‘false’ or ‘misleading,’ the decision to put the warning label on CHD’s Facebook page, or the decisions to ‘demonetize’ or ‘shadow-ban.'”

“CHD has not alleged facts that allow us to infer that the government coerced Meta into implementing a specific policy,” Miller wrote.

Instead, Meta “was entitled to encourage” various “input from the government,” justifiably seeking vaccine-related information provided by the World Health Organization (WHO) and the US Centers for Disease Control and Prevention (CDC) as it navigated complex content moderation decisions throughout the pandemic, Miller wrote.

Therefore, Meta’s actions against CHD were due to “Meta’s own ‘policy of censoring,’ not any provision of federal law,” Miller concluded. “The evidence suggested that Meta had independent incentives to moderate content and exercised its own judgment in so doing.”

None of CHD’s theories that Meta coordinated with officials to deprive “CHD of its constitutional rights” were plausible, Miller wrote, whereas the “innocent alternative”—”that Meta adopted the policy it did simply because” CEO Mark Zuckerberg and Meta “share the government’s view that vaccines are safe and effective”—appeared “more plausible.”

Meta “does not become an agent of the government just because it decides that the CDC sometimes has a point,” Miller wrote.

Equally unpersuasive were CHD’s claims that Section 230 immunity—which shields platforms from liability for third-party content—“‘removed all legal barriers’ to the censorship of vaccine-related speech,” such that “Meta’s restriction of that content should be considered state action.”

“That Section 230 operates in the background to immunize Meta if it chooses to suppress vaccine misinformation—whether because it shares the government’s health concerns or for independent commercial reasons—does not transform Meta’s choice into state action,” Miller wrote.

One judge dissented over Section 230 concerns

In his dissenting opinion, however, Judge Daniel Collins defended CHD’s Section 230 claim, suggesting that the appeals court erred and should have granted CHD injunctive and declaratory relief from the alleged censorship. CHD CEO Mary Holland told The Defender that the group was pleased the decision was not unanimous.

According to Collins, who like Miller is a Trump appointee, Meta could never have built its massive social platforms without Section 230 immunity, which grants platforms the ability to broadly censor viewpoints they disfavor.

It was “important to keep in mind” that “the vast practical power that Meta exercises over the speech of millions of others ultimately rests on a government-granted privilege to which Meta is not constitutionally entitled,” Collins wrote. And this power “makes a crucial difference in the state-action analysis.”

As Collins sees it, CHD could plausibly allege that Meta’s communications with government officials about vaccine-related misinformation targeted specific users, like the “disinformation dozen” that includes both CHD and Kennedy. In that case, Collins suggested, Section 230 could give the government an opening to target speech it disfavors through mechanisms the platforms provide.

“Having specifically and purposefully created an immunized power for mega-platform operators to freely censor the speech of millions of persons on those platforms, the Government is perhaps unsurprisingly tempted to then try to influence particular uses of such dangerous levers against protected speech expressing viewpoints the Government does not like,” Collins warned.

He further argued that “Meta’s relevant First Amendment rights” do not “give Meta an unbounded freedom to work with the Government in suppressing speech on its platforms.” Disagreeing with the majority, he wrote that “in this distinctive scenario, applying the state-action doctrine promotes individual liberty by keeping the Government’s hands away from the tempting levers of censorship on these vast platforms.”

The majority acknowledged that, while Section 230 immunity “is undoubtedly a significant benefit to companies like Meta,” lawmakers’ threats to weaken Section 230 did not suggest that Meta’s anti-vaccine policy was coerced state action.

“Many companies rely, in one way or another, on a favorable regulatory environment or the goodwill of the government,” Miller wrote. “If that were enough for state action, every large government contractor would be a state actor. But that is not the law.”


Kids Online Safety Act passes Senate despite concerns it will harm kids


The Kids Online Safety Act (KOSA) easily passed the Senate today despite critics’ concerns that the bill risks doing kids more harm than good and could censor speech for online users of all ages if it’s signed into law.

KOSA received broad bipartisan support in the Senate, passing with a 91–3 vote alongside the Children’s Online Privacy Protection Act (COPPA) 2.0. Both bills seek to control how much data can be collected from minors, as well as to regulate the platform features that could harm children’s mental health.

Only Senators Ron Wyden (D-Ore.), Rand Paul (R-Ky.), and Mike Lee (R-Utah) opposed the bills.

In an op-ed for The Courier-Journal, Paul argued that KOSA imposes on platforms a “duty of care” to mitigate harms to minors that “will not only stifle free speech, but it will deprive Americans of the benefits of our technological advancements.”

“With the Internet, today’s children have the world at their fingertips,” Paul wrote, but if KOSA passes, even allegedly benign content like “pro-life messages” or discussion of a teen overcoming an eating disorder could be censored if platforms fear compliance issues.

“While doctors’ and therapists’ offices close at night and on weekends, support groups are available 24 hours a day, seven days a week for people who share similar concerns or have the same health problems. Any solution to protect kids online must ensure the positive aspects of the Internet are preserved,” Paul wrote.

During a press conference held by KOSA critics today, Dara Adkison, the executive director of TransOhio (a group providing resources for transgender youths), expressed concerns that lawmakers would target sites like TransOhio’s if the bill also passes in the House, where it heads next.

“I’ve literally had legislators tell me to my face that they would love to see our website taken off the Internet because they don’t want people to have the kinds of vital community resources that we provide,” Adkison said.

Paul argued that what was considered harmful to kids was subjective, noting that a key flaw with KOSA was that “KOSA does not explicitly define the term ‘mental health disorder.'” Instead, platforms are to refer to the definition in “the fifth edition of the Diagnostic and Statistical Manual of Mental Health Disorders” or “the most current successor edition.”

“That means the scope of the bill could change overnight without any action from America’s elected representatives,” Paul warned, suggesting that “KOSA opens the door to nearly limitless content regulation because platforms will censor users rather than risk liability.”

Ahead of the vote, Senator Richard Blumenthal (D-Conn.)—who co-sponsored KOSA—denied that the bill strove to regulate content, The Hill reported. To Blumenthal and other KOSA supporters, its aim instead is to ensure that social media is “safe by design” for young users.

According to The Washington Post, KOSA and COPPA 2.0 passing “represent the most significant restrictions on tech platforms to clear a chamber of Congress in decades.” However, while President Joe Biden has indicated he would be willing to sign the bill into law, most seem to agree that KOSA will struggle to pass in the House of Representatives.

Todd O’Boyle, a senior tech policy director for Chamber of Progress (a progressive tech industry policy coalition), said there is currently “substantial opposition” in the House. O’Boyle said he expects the political divide to be enough to block KOSA’s passage and to prevent giving “the power” to the Federal Trade Commission (FTC) or “the next president” to “crack down on online speech” or otherwise pose “a massive threat to our constitutional rights.”

“If there’s one thing the far-left and far-right agree on, it’s that the next chair of the FTC shouldn’t get to decide what online posts are harmful,” O’Boyle said.


SCOTUS nixes injunction that limited Biden admin contacts with social networks


On Wednesday, the Supreme Court tossed out claims that the Biden administration coerced social media platforms into censoring users by removing COVID-19 and election-related content.

Complaints alleging that high-ranking government officials were censoring conservatives had previously convinced a lower court to order an injunction limiting the Biden administration’s contacts with platforms. But now that injunction has been overturned, reopening lines of communication just ahead of the 2024 elections—when officials will once again be closely monitoring the spread of misinformation online targeted at voters.

In a 6–3 vote, the majority ruled that none of the plaintiffs suing—including five social media users and Republican attorneys general in Louisiana and Missouri—had standing. They had alleged that the government had “pressured the platforms to censor their speech in violation of the First Amendment,” demanding an injunction to stop any future censorship.

Plaintiffs might have succeeded had they instead sought damages for past harms. But in her opinion, Justice Amy Coney Barrett wrote that, partly because the Biden administration seemingly stopped influencing platforms’ content policies in 2022, none of the plaintiffs could show evidence of a “substantial risk that, in the near future, they will suffer an injury that is traceable” to any government official. Thus, they did not seem to face “a real and immediate threat of repeated injury,” Barrett wrote.

“Without proof of an ongoing pressure campaign, it is entirely speculative that the platforms’ future moderation decisions will be attributable, even in part,” to government officials, Barrett wrote, finding that an injunction would do little to prevent future censorship.

Instead, plaintiffs’ claims “depend on the platforms’ actions,” Barrett emphasized, “yet the plaintiffs do not seek to enjoin the platforms from restricting any posts or accounts.”

“It is a bedrock principle that a federal court cannot redress ‘injury that results from the independent action of some third party not before the court,'” Barrett wrote.

Barrett repeatedly noted “weak” arguments raised by plaintiffs, none of which could directly link their specific content removals with the Biden administration’s pressure campaign urging platforms to remove vaccine or election misinformation.

According to Barrett, the lower court initially granting the injunction “glossed over complexities in the evidence,” including the fact that “platforms began to suppress the plaintiffs’ COVID-19 content” before the government pressure campaign began. That’s an issue, Barrett said, because standing to sue “requires a threshold showing that a particular defendant pressured a particular platform to censor a particular topic before that platform suppressed a particular plaintiff’s speech on that topic.”

“While the record reflects that the Government defendants played a role in at least some of the platforms’ moderation choices, the evidence indicates that the platforms had independent incentives to moderate content and often exercised their own judgment,” Barrett wrote.

Barrett was similarly unconvinced by arguments that plaintiffs risk platforms removing future content based on stricter moderation policies that were previously coerced by officials.

“Without evidence of continued pressure from the defendants, the platforms remain free to enforce, or not to enforce, their policies—even those tainted by initial governmental coercion,” Barrett wrote.

Alito: SCOTUS “shirks duty” to defend free speech

Justices Clarence Thomas and Neil Gorsuch joined Samuel Alito in dissenting, arguing that “this is one of the most important free speech cases to reach this Court in years” and that the Supreme Court had an “obligation” to “tackle the free speech issue that the case presents.”

“The Court, however, shirks that duty and thus permits the successful campaign of coercion in this case to stand as an attractive model for future officials who want to control what the people say, hear, and think,” Alito wrote.

Alito argued that the evidence showed that while “downright dangerous” speech was suppressed, so was “valuable speech.” He agreed with the lower court that “a far-reaching and widespread censorship campaign” had been “conducted by high-ranking federal officials against Americans who expressed certain disfavored views about COVID-19 on social media.”

“For months, high-ranking Government officials placed unrelenting pressure on Facebook to suppress Americans’ free speech,” Alito wrote. “Because the Court unjustifiably refuses to address this serious threat to the First Amendment, I respectfully dissent.”

At least one plaintiff who opposed masking and vaccines, Jill Hines, was “indisputably injured,” Alito wrote, arguing that evidence showed she was censored more frequently after officials pressured Facebook into changing its policies.

“Top federal officials continuously and persistently hectored Facebook to crack down on what the officials saw as unhelpful social media posts, including not only posts that they thought were false or misleading but also stories that they did not claim to be literally false but nevertheless wanted obscured,” Alito wrote.

While Barrett and the majority found that platforms were more likely responsible for injury, Alito disagreed, writing that with the threat of antitrust probes or Section 230 amendments, Facebook acted like “a subservient entity determined to stay in the good graces of a powerful taskmaster.”

Alito wrote that the majority was “applying a new and heightened standard” by requiring plaintiffs to “untangle Government-caused censorship from censorship that Facebook might have undertaken anyway.” In his view, it was enough that Hines showed that “one predictable effect of the officials’ action was that Facebook would modify its censorship policies in a way that affected her.”

“When the White House pressured Facebook to amend some of the policies related to speech in which Hines engaged, those amendments necessarily impacted some of Facebook’s censorship decisions,” Alito wrote. “Nothing more is needed. What the Court seems to want are a series of ironclad links.”

“That is regrettable,” Alito said.


Elon Musk’s X defeats Australia’s global takedown order of stabbing video


Australia’s safety regulator has ended a legal battle with X (formerly Twitter) after threatening daily fines of approximately $500,000 over X’s failure to remove 65 instances of a religiously motivated stabbing video from the platform globally.

Enforcing Australia’s Online Safety Act, eSafety commissioner Julie Inman-Grant had argued it would be dangerous for the videos to keep spreading on X, potentially inciting other acts of terror in Australia.

But X owner Elon Musk refused to comply with the global takedown order, arguing that it would be “unlawful and dangerous” to allow one country to control the global Internet. And Musk was not alone in this fight. The legal director of a nonprofit digital rights group called the Electronic Frontier Foundation (EFF), Corynne McSherry, backed up Musk, urging the court to agree that “no single country should be able to restrict speech across the entire Internet.”

“We welcome the news that the eSafety Commissioner is no longer pursuing legal action against X seeking the global removal of content that does not violate X’s rules,” X’s Global Government Affairs account posted late Tuesday night. “This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed.”

Inman-Grant was formerly Twitter’s director of public policy in Australia and used that experience to land what she told The Courier-Mail was her “dream role” as Australia’s eSafety commissioner in 2017. Since issuing the order to remove the video globally on X, Inman-Grant had traded barbs with Musk (along with other Australian lawmakers), responding to Musk labeling her a “censorship commissar” by calling him an “arrogant billionaire” for fighting the order.

On X, Musk arguably got the last word, posting, “Freedom of speech is worth fighting for.”

Safety regulator still defends takedown order

In a statement, Inman-Grant said early Wednesday that her decision to discontinue proceedings against X was part of an effort to “consolidate actions,” including “litigation across multiple cases.” She ultimately determined that dropping the case against X would be the “option likely to achieve the most positive outcome for the online safety of all Australians, especially children.”

“Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community,” Inman-Grant said, still defending the order despite dropping it.

In court, X’s lawyer Marcus Hoyne had pushed back on such logic, arguing that the eSafety regulator’s mission was “pointless” because “footage of the attack had now spread far beyond the few dozen URLs originally identified,” the Australian Broadcasting Corporation reported.

“I stand by my investigators and the decisions eSafety made,” Inman-Grant said.

Other Australian lawmakers agree the order was not out of line. According to AP News, Australian Minister for Communications Michelle Rowland shared a similar statement in parliament today, backing up the safety regulator while scolding X users who allegedly took up Musk’s fight by threatening Inman-Grant and her family. The safety regulator has said that Musk’s X posts incited a “pile-on” from his followers who allegedly sent death threats and exposed her children’s personal information, the BBC reported.

“The government backs our regulators and we back the eSafety Commissioner, particularly in light of the reprehensible threats to her physical safety and the threats to her family in the course of doing her job,” Rowland said.


Judge halts Texas probe into Media Matters’ reporting on X

Texas Attorney General Ken Paxton speaks during the annual Conservative Political Action Conference (CPAC) meeting on February 23, 2024.

A judge has preliminarily blocked what Media Matters for America (MMFA) described as Texas Attorney General Ken Paxton’s attempt to “rifle through” confidential documents to prove that MMFA fraudulently manipulated X (formerly Twitter) data to ruin X’s advertising business, as Elon Musk has alleged.

After Musk accused MMFA of publishing reports that he claimed were designed to scare advertisers off X, Paxton promptly launched his own investigation into MMFA last November.

Investigating MMFA for alleged violations of Texas’ Deceptive Trade Practices Act—which prohibits “disparaging the goods, services, or business of another by false or misleading representation of facts”—Paxton sought a wide range of MMFA documents through a civil investigative demand (CID). MMFA filed a motion to block the CID, telling the court that it violated the media organization’s First Amendment rights and providing evidence that Paxton’s investigation and CID had chilled MMFA’s speech.

Paxton had requested Media Matters’ financial records—including “direct and indirect sources of funding for all Media Matters operations involving X research or publications”—as well as “internal and external communications” on “Musk’s purchase of X” and X’s current CEO Linda Yaccarino. He also asked for all of Media Matters’ communications with X representatives and X advertisers.

But perhaps most invasive of all, Paxton wanted to see all the communications about Media Matters’ X reporting that triggered the lawsuits, a compelled disclosure that, as US District Judge Amit Mehta wrote in an opinion published Friday, “poses a serious threat to the vitality of the newsgathering process.”

Mehta was concerned that MMFA showed that “Media Matters’ editorial leaders have pared back reporting and publishing, particularly on any topics that could be perceived as relating to the Paxton investigation”—including two follow-ups on its X reporting. Because of Paxton’s alleged First Amendment retaliation, MMFA said it did not publish “two pieces concerning X’s placement of advertising alongside antisemitic, pro-Nazi accounts”—”not out of legitimate concerns about fairness or accuracy,” but “out of fear of harassment, threats, and retaliation.”

According to Mehta’s order, Paxton did not contest that Texas’ lawsuit had chilled MMFA’s speech. Further, Paxton had given at least one podcast interview where he called upon other state attorneys general to join him in investigating MMFA.

Because Paxton “projected himself across state lines and asserted a pseudo-national executive authority,” Mehta wrote, and repeatedly described MMFA as a “radical anti-free speech” or “radical left-wing organization,” the court had seen sufficient “evidence of retaliatory intent.”

“Notably,” Mehta wrote, Paxton remained “silent” and never “submitted a sworn declaration that explains his reasons for opening the investigation.”

In his press release, Paxton justified the investigation by saying, “We are examining the issue closely to ensure that the public has not been deceived by the schemes of radical left-wing organizations who would like nothing more than to limit freedom by reducing participation in the public square.”

Ultimately, Mehta granted MMFA’s request for a preliminary injunction to block Paxton’s CID because the judge found that the investigation and the CID have caused MMFA “to self-censor when making research and publication decisions, adversely affected the relationships between editors and reporters, and restricted communications with sources and journalists.”

“Only injunctive relief will ‘prevent the [ongoing] deprivation of free speech rights,’” Mehta’s opinion said, deeming MMFA’s reporting “core First Amendment activities.”

Mehta’s order also banned Paxton from taking any steps to further his investigation until the lawsuit is decided.

In a statement Friday, MMFA President and CEO Angelo Carusone celebrated the win as not just against Paxton but also against Musk.

“Elon Musk encouraged Republican state attorneys general to use their power to harass their critics and stifle reporting about X,” Carusone said. “Ken Paxton was one of those AGs that took up the call and he was defeated. Today’s decision is a victory for free speech.”

Paxton has not yet responded to the preliminary injunction, and his office did not respond to Ars’ request for comment.

Media Matters’ lawyer, Aria C. Branch, a partner at Elias Law Group, told Ars that “while Attorney General Paxton’s office has not yet responded to Friday’s ruling, the preliminary injunction should certainly put an end to these kind of lawless, politically motivated attempts to muzzle the press.”


X filing “thermonuclear lawsuit” in Texas should be “fatal,” Media Matters says


Ever since Elon Musk’s X Corp sued Media Matters for America (MMFA) over a pair of reports that X (formerly Twitter) claims caused an advertiser exodus in 2023, one big question has remained for onlookers: Why is this fight happening in Texas?

In a motion to dismiss filed in Texas’ northern district last month, MMFA argued that X’s lawsuit should be dismissed not just because of a “fatal jurisdictional defect,” but “dismissal is also required for lack of venue.”

Notably, MMFA is based in Washington, DC, while “X is organized under Nevada law and maintains its principal place of business in San Francisco, California, where its own terms of service require users of its platform to litigate any disputes.”

“Texas is not a fair or reasonable forum for this lawsuit,” MMFA argued, suggesting that “the case must be dismissed or transferred” because “neither the parties nor the cause of action has any connection to Texas.”

Last Friday, X responded to the motion to dismiss, claiming that the lawsuit—which Musk has described as “thermonuclear”—was appropriately filed in Texas because MMFA “intentionally” targeted readers and at least two X advertisers located in Texas, Oracle and AT&T. According to X, because MMFA “identified Oracle, a Texas-based corporation, by name in its coverage,” MMFA “cannot claim surprise at being held to answer for its conduct in Texas.” X also claimed that Texas has jurisdiction because Musk resides in Texas and “makes numerous critical business decisions about X while in Texas.”

This so-called targeting of Texans caused a “substantial part” of alleged financial harms that X attributes to MMFA’s reporting, X alleged.

According to X, MMFA specifically targeted X in Texas by sending newsletters sharing its reports with “hundreds or thousands” of Texas readers and by allegedly soliciting donations from Texans to support MMFA’s reporting.

But MMFA pushed back, saying that “Texas subscribers comprise a disproportionately small percentage of Media Matters’ newsletter recipients” and that MMFA did “not solicit Texas donors to fund Media Matters’s journalism concerning X.” Because of this, X’s “efforts to concoct claim-related Texas contacts amount to a series of shots in the dark, uninformed guesses, and irrelevant tangents,” MMFA argued.

On top of that, MMFA argued that X could not tie any of the financial harms allegedly caused by MMFA’s reports to either of the two Texas-based advertisers named in X’s court filings. Oracle, MMFA said, “by X’s own admission,… did not withdraw its ads” from X. And AT&T was not named in MMFA’s reporting, so “any investigation AT&T did into its ad placement on X was of its own volition and is not plausibly connected to Media Matters.” MMFA has argued that advertisers, particularly sophisticated Fortune 500 companies, made their own decisions to stop advertising on X, perhaps due to widely reported increases in hate speech on X or even Musk’s own seemingly antisemitic posting.

Ars could not immediately reach X, Oracle, or AT&T for comment.

X’s suit allegedly designed to break MMFA

MMFA President Angelo Carusone, who is a defendant in X’s lawsuit, told Ars that X’s recent filing has continued to “expose” the lawsuit as a “meritless and vexatious effort to inflict maximum damage on critical research and reporting about the platform.”

“It’s solely designed to basically break us or stop us from doing the work that we were doing originally,” Carusone said, confirming that the lawsuit has negatively impacted MMFA’s hate speech research on X.

MMFA argued that Musk could have sued in other jurisdictions, such as Maryland, DC, or California, and MMFA would not have disputed the venue, but Carusone suggested that Musk sued in Texas in hopes that it would be “a more friendly jurisdiction.”


AMC to pay $8M for allegedly violating 1988 law with use of Meta Pixel


Proposed settlement impacts millions using AMC apps like Shudder and AMC+.


On Thursday, AMC notified subscribers of a proposed $8.3 million settlement that provides awards to an estimated 6 million subscribers of its six streaming services: AMC+, Shudder, Acorn TV, ALLBLK, SundanceNow, and HIDIVE.

The settlement comes in response to allegations that AMC illegally shared subscribers’ viewing history with tech companies like Google, Facebook, and X (aka Twitter) in violation of the Video Privacy Protection Act (VPPA).

Passed in 1988, the VPPA prohibits AMC and other video service providers from sharing “information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.” It was originally passed to protect individuals’ right to keep their viewing habits private after a journalist published the mostly unrevealing video rental history of Robert Bork, a judge whom Ronald Reagan had nominated to the Supreme Court.

The so-called “Bork Tapes” revealed little—other than that the judge frequently rented spy thrillers and British costume dramas—but lawmakers recognized that speech could be chilled by monitoring anyone’s viewing habits. While the law was born in the era of Blockbuster Video, subscribers suing AMC wrote in their amended complaint that “the importance of legislation like the VPPA in the modern era of datamining is more pronounced than ever before.”

According to subscribers suing, AMC allegedly installed tracking technologies—including the Meta Pixel, the X Tracking Pixel, and Google Tracking Technology—on its website, allowing their personally identifying information to be connected with their viewing history.

Some trackers, like the Meta Pixel, required AMC to choose what kinds of activity could be tracked, and subscribers claimed that AMC had willingly opted into sharing video names and URLs with Meta, along with a Facebook ID. “Anyone” could use the Facebook ID, subscribers said, to identify an AMC subscriber “simply by entering https://www.facebook.com/[unencrypted FID]/” into a browser.

X’s ID could similarly be de-anonymized, subscribers alleged, by using tweeterid.com.
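To make the mechanism concrete, here is a minimal, hypothetical sketch of the kind of Meta Pixel call the complaint describes. The pixel ID, video title, and URL below are invented for illustration, and the snippet assumes only the standard fbq function that Meta’s Pixel script installs on a page:

    // Hypothetical sketch of a Meta Pixel "ViewContent" event, the general
    // mechanism the subscribers' complaint describes. All values are invented.
    declare const fbq: (command: string, ...args: unknown[]) => void;

    // The site operator initializes the pixel with an account-specific ID.
    fbq('init', '1234567890');

    // A standard event fired when a user opens a video page. If the payload
    // includes the video's title or URL, Meta receives that viewing data
    // alongside the identifying cookies sent with every pixel request.
    fbq('track', 'ViewContent', {
      content_type: 'video',
      content_name: 'Example Horror Movie', // hypothetical title
      content_ids: ['https://example.com/watch/123'], // hypothetical URL
    });

Because each pixel request also carries Meta’s identifying cookies (and, per the complaint, an unencrypted Facebook ID), the viewing data and the viewer’s identity arrive at Meta together, which is precisely the pairing the VPPA was written to prevent.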

AMC “could easily program its AMC Services websites so that this information is not disclosed” to tech companies, subscribers alleged.

Denying wrongdoing, AMC has defended its use of tracking technologies but is proposing to settle with subscribers to avoid uncertain outcomes from litigation, the proposed settlement said.

A hearing to approve the proposed settlement has been scheduled for May 16.

If it’s approved, AMC has agreed to “suspend, remove, or modify operation of the Meta Pixel and other Third-Party Tracking Technologies so that use of such technologies on AMC Services will not result in AMC’s disclosure to the third-party technology companies of the specific video content requested or obtained by a specific individual.”

Google and X did not immediately respond to Ars’ request to comment. Meta declined to comment.

All registered users of AMC services who “requested or obtained video content on at least one of the six AMC services” between January 18, 2021, and January 10, 2024, are currently eligible to submit claims under the proposed settlement. The deadline to submit is April 9.

In addition to distributing the $8.3 million settlement fund among class members, subscribers will receive a free one-week digital subscription.

According to AMC’s notice to subscribers (full disclosure, I am one), AMC’s agreement to avoid sharing subscribers’ viewing histories may change if the VPPA is amended, repealed, or invalidated. If the law changes to permit sharing viewing data at the core of subscribers’ claim, AMC may resume sharing that information with tech companies.

That day could come soon if Patreon has its way. Recently, Patreon asked a federal judge to rule that the VPPA is unconstitutional.

The lawsuit Patreon faces is similar, centering on its use of the Meta Pixel, which allegedly violated the VPPA by sharing users’ video views on Patreon’s platform with Meta.

Patreon has argued that the VPPA is unconstitutional because it chills speech. Patreon said that the law was enacted “for the express purpose of silencing disclosures about political figures and their video-watching, an issue of undisputed continuing public interest and concern.”

According to Patreon, the VPPA narrowly prohibits video service providers from sharing video titles, but not from sharing information that people may wish to keep private, such as “the genres, performers, directors, political views, sexual content, and every other detail of pre-recorded video that those consumers watch.”

Therefore, Patreon argued, the VPPA “restrains speech” while “doing little if anything to protect privacy” and never protecting privacy “by the least restrictive means.”

That lawsuit remains ongoing, but Patreon’s position is likely to meet opposition from experts who typically also defend freedom of speech. Experts at the Electronic Privacy Information Center, like the AMC subscribers suing, consider the VPPA one of America’s “strongest protections of consumer privacy against a specific form of data collection.” And the Electronic Frontier Foundation (EFF) has already moved to convince the court to reject Patreon’s claim, describing the VPPA in a blog post as an “essential” privacy protection.

“EFF is second to none in fighting for everyone’s First Amendment rights in court,” EFF’s blog said. “But Patreon’s First Amendment argument is wrong and misguided. The company seeks to elevate its speech interests over those of Internet users who benefit from the VPPA’s protections.”
