

He got sued for sharing public YouTube videos; nightmare ended in settlement


Librarian vows to stop invasive ed tech after ending lawsuit with Proctorio.

Librarian Ian Linkletter remains one of Proctorio’s biggest critics after 5-year legal battle. Credit: Ashley Linkletter

Nobody expects to get sued for re-posting a YouTube video on social media by using the “share” button, but librarian Ian Linkletter spent the past five years embroiled in a copyright fight after doing just that.

Now that a settlement has been reached, Linkletter told Ars why he thinks his 2020 tweets sharing public YouTube videos put a target on his back.

Linkletter’s legal nightmare started in 2020 after an education technology company, Proctorio, began monitoring student backlash on Reddit over its AI tool used to remotely scan rooms, identify students, and prevent cheating on exams. On Reddit, students echoed serious concerns raised by researchers, warning of privacy issues, racist and sexist biases, and barriers to students with disabilities.

At that time, Linkletter was a learning technology specialist at the University of British Columbia. He had been aware of Proctorio as a tool some professors used, but he ultimately joined UBC students in criticizing it after, practically overnight, it became a default tool that teachers relied on during the early stages of the pandemic.

To Linkletter, the AI tool not only seemed flawed, but it also seemingly made students more anxious about exams. However, he didn’t post any tweets criticizing the tech—until he grew particularly disturbed to see Proctorio’s CEO, Mike Olsen, “showing up in the comments” on Reddit to fire back at one of his university’s loudest student critics. Defending Proctorio, Olsen roused even more backlash by posting the student’s private chat logs publicly to prove the student “lied” about a support interaction, The Guardian reported.

“If you’re gonna lie bro … don’t do it when the company clearly has an entire transcript of your conversation,” Olsen wrote, later apologizing for the now-deleted post.

“That set me off, and I was just like, this is completely unacceptable for a CEO to be going after our students like this,” Linkletter told Ars.

The more that Linkletter researched Proctorio, the more concerned he became. Taking to then-Twitter, he posted a series of seven tweets over a couple of days that linked to YouTube videos that Proctorio hosted in its help center. He felt the videos—which showed how Proctorio flagged certain behaviors, tracked “abnormal” eye and head movements, and scanned rooms—helped demonstrate why students were so upset. And while he had fewer than 1,000 followers, he hoped that the influential higher education administrators who followed him would see his posts and consider dropping the tech.

Rather than request that Linkletter remove the tweets—which was the company’s standard practice—Proctorio moved quickly to delete the videos. Proctorio supposedly expected that the removals would put Linkletter on notice to stop tweeting out help center videos. Instead, Linkletter posted a screenshot of the help center showing all the disabled videos, while suggesting that Proctorio seemed so invested in secrecy that it was willing to gut its own support resources to censor criticism of its tools.

Together, the videos, the help center screenshot, and another screenshot showing course material describing how Proctorio works were enough for Proctorio to take Linkletter to court.

The ed tech company promptly filed a lawsuit and obtained a temporary injunction by spuriously claiming that Linkletter shared private YouTube videos containing confidential information. Because the YouTube videos—which were public but “unlisted” when Linkletter shared them—had been removed, Linkletter did not have to delete the seven tweets that initially caught Proctorio’s attention, but the injunction required that he remove two tweets, including the screenshots.

In the five years since, the legal fight dragged on with no end in sight until last week, as Canadian courts tangled with copyright allegations that tested a recently passed law intended to shield Canadians’ right to free expression, the Protection of Public Participation Act.

To fund his defense, Linkletter said in a blog announcing the settlement that he invested his life savings “ten times over.” Additionally, about 900 GoFundMe supporters and thousands of members of the Association of Administrative and Professional Staff at UBC contributed tens of thousands more. For the last year of the battle, a law firm, Norton Rose Fulbright, agreed to represent him on a pro bono basis, which Linkletter said “was a huge relief to me, as it meant I could defend myself all the way if Proctorio chose to proceed with the litigation.”

The terms of the settlement remain confidential, but both Linkletter and Proctorio confirmed that no money was exchanged.

For Proctorio, the settlement made permanent the injunction that restricted Linkletter from posting the company’s help center content or instructional materials. But it doesn’t stop Linkletter from remaining the company’s biggest critic, as “there are no other restrictions on my freedom of expression,” Linkletter’s blog noted.

“I’ve won my life back!” Linkletter wrote, while reassuring his supporters that he’s “fine” with how things ended.

“It doesn’t take much imagination to understand why Proctorio is a nightmare for students,” Linkletter wrote. “I can say everything that matters about Proctorio using public information.”

Proctorio’s YouTube “mistake” triggered injunction

In a statement to Ars, Kevin Rockmael, Proctorio’s head of marketing, suggested that the ed tech company sees the settlement as a win.

“After years of successful litigation, we are pleased that this settlement (which did not include any monetary compensation) protects our interests by making our initial restraining order permanent,” Rockmael said. “Most importantly, we are glad to close this chapter and focus our efforts on helping teachers and educational institutions deliver valuable and secure assessments.”

Responding to Rockmael, Linkletter clarified that the settlement upholds a modified injunction, noting that Proctorio’s initial injunction was significantly narrowed after a court ruled it overly broad. Linkletter also pointed to testimony from Proctorio’s former head of marketing, John Devoy, whose affidavit “mistakenly” swearing that Linkletter had shared private YouTube videos was the sole basis for the court approving the injunction. That testimony, Linkletter told Ars, suggested that Proctorio knew the librarian had shared videos the company had accidentally made public and used the mix-up as “some sort of excuse to pull the trigger” on a lawsuit after Linkletter commented on the subreddit incident.

“Even a child understands how YouTube works, so how are we supposed to trust a surveillance company that doesn’t?” Linkletter wrote in his blog.

Grilled by Linkletter’s lawyer, Devoy insisted that he was not “lying” when he claimed the videos Linkletter shared came from a private channel. Instead—even though he knew the difference between a private and public channel—Devoy claimed that he made a simple mistake, even suggesting that the inaccurate claim was a “typo.”

Linkletter maintains that Proctorio’s lawsuit had nothing to do with the videos he shared—which his legal team discovered had been shared publicly by many parties, including UBC, none of which Proctorio decided to sue. Instead, he felt targeted to silence his criticism of the company, and he successfully fought to keep Proctorio from accessing his private communications, which seemed to be a fishing expedition to find other critics to monitor.

“In my opinion, and this is just my opinion, one of the purposes of the lawsuit was to have a chilling effect on public discourse around proctoring,” Linkletter told Ars. “And it worked. I mean, a lot of people were scared to use the word Proctorio, especially in writing.”

Joe Mullin, a senior policy analyst who monitored Linkletter’s case for the nonprofit digital rights group the Electronic Frontier Foundation, agreed that Proctorio’s lawsuit risked chilling speech.

“We’re glad to see this lawsuit finally resolved in a way that protects Ian Linkletter’s freedom to speak out,” Mullin told Ars, noting that Linkletter “raised serious concerns about proctoring software at a time when students were subjected to unprecedented monitoring.”

“This case should never have dragged on for five years,” Mullin said. “Using copyright claims to retaliate against critics is wrong, and it chills public debate about surveillance technology.”

Preventing the “next” Proctorio

Linkletter is not the only critic to be targeted by Proctorio, Lia Holland, campaigns and communications director for a nonprofit digital rights group called Fight for the Future, told Ars.

Holland’s group was subpoenaed in a US fight after Proctorio sent a copyright infringement notice to Erik Johnson, a then-18-year-old college freshman who shared one of Linkletter’s screenshots. The ensuing litigation was similarly settled after Proctorio “threw every semi-plausible legal weapon at Johnson full force,” Holland told Ars. The pressure forced Johnson to choose between “living his life and his life being this suit from Proctorio,” Holland said.

Linkletter suspected that he and Johnson were added to a “list” of critics that Proctorio closely monitored online, but Proctorio has denied that such a list exists. Holland pushed back, though, telling Ars that Proctorio has “an incredibly long history of fudging the truth in the interest of profit.”

“We’re no strangers to Proctorio’s shady practices when it comes to oppressing dissent or criticism of their technologies,” Holland said. “I am utterly not shocked that they would employ tactics that appear to be doing the same thing when it comes to Ian Linkletter’s case.”

Regardless of Proctorio’s tactics for brand management, it seems clear that public criticism has impacted Proctorio’s sales. In 2021, Vice reported that student backlash led some schools to quickly abandon the software. UBC dropped Proctorio in 2021, too, citing “ethical concerns.”

Today, Linkletter works as an emerging technology and open education librarian at the British Columbia Institute of Technology (BCIT). While he considers himself an expert on Proctorio and continues to give lectures discussing harms of academic surveillance software, he’s ready to get away from discussing Proctorio now that the lawsuit has ended.

“I think I will continue to pay attention to what they do and say, and if there’s any new reports of harm that I can elevate,” Linkletter told Ars. “But I have definitely made my points in terms of my specific concerns, and I feel less obliged to spend more and more and more time repeating myself.”

Instead, Linkletter is determined to “prevent the next Proctorio” from potentially blindsiding students on his campus. In his role as vice chair of BCIT’s educational technology and learning design committee, he’s establishing “checks and balances” to ensure that if another pandemic-like situation arises forcing every student to work from home, he can stop “a bunch of creepy stuff” from being rolled out.

“I spent the last year advocating for and implementing algorithmic impact assessments as a mandatory thing that the institute has to do, including identifying how risk is going to be mitigated before we approve any new ed tech ever again,” Linkletter explained.

He also created the Canadian Privacy Library, where he posts privacy impact assessments that he collects by sending freedom-of-information requests to higher education institutions in British Columbia. That’s one way local students could monitor privacy concerns as AI use expands across campuses, increasingly impacting not just how exams are proctored, but how assignments are graded.

Holland told Ars that students concerned about ed tech surveillance “are most powerful when they act in solidarity with each other.” While the pandemic was widely forcing remote learning, student groups were able to successfully remove harmful proctoring tech by “working together so that there was not one single scapegoat or one single face that the ed tech company could go after,” she suggested. Those movements typically start with one or two students learning how the technology works, so that they can educate others about top concerns, Holland said.

Since Linkletter’s lawsuit started, Proctorio has stopped fighting with students on Reddit and suing critics over tweets, Holland said. But Linkletter told Ars that the company still seems to leave students in the dark about how its software works. That lack of transparency, his earliest court filing defending his tweets warned, “could lead to academic discipline for honest students, and unnecessary stress for everyone.”

“I was and am gravely concerned about Proctorio’s lack of transparency about how its algorithms work, and how it labels student behaviours as ‘suspicious,’” Linkletter swore in the filing. One of his deleted tweets urged all schools to demand transparency and ask why Proctorio was “hiding” information about how its software worked. But in the end, Linkletter saw no point in continuing to argue over whether two deleted tweets re-posting Proctorio’s videos using YouTube’s sharing tool violated Proctorio’s copyrights.

“I didn’t feel too censored,” Linkletter told Ars. “But yeah, I guess it’s censorship, and I do believe they filed it to try and censor me. But as you can see, I just refused to go down, and I remained their biggest critic.”

As universities prepare to break for the winter holidays, Linkletter told Ars that he’s looking forward to a change in dinner table conversation topics.

“It’s one of those things where I’m 41 and I have aging parents, and I’ve had to waste the last five Christmases talking to them about the lawsuit and their concerns about me,” Linkletter said. “So I’m really looking forward to this Thanksgiving, this Christmas, with this all behind me and the ability to just focus with my parents and my family.”


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.



Big Tech sues Texas, says age-verification law is “broad censorship regime”

Texas minors also challenge law

The Texas App Store Accountability Act is similar to laws enacted by Utah and Louisiana. The Texas law is scheduled to take effect on January 1, 2026, while the Utah and Louisiana laws are set to be enforced starting in May and July, respectively.

The Texas law is also being challenged in a different lawsuit filed by a student advocacy group and two Texas minors.

“The First Amendment does not permit the government to require teenagers to get their parents’ permission before accessing information, except in discrete categories like obscenity,” attorney Ambika Kumar of Davis Wright Tremaine LLP said in an announcement of the lawsuit. “The Constitution also forbids restricting adults’ access to speech in the name of protecting children. This law imposes a system of prior restraint on protected expression that is presumptively unconstitutional.”

Davis Wright Tremaine LLP said the law “extends far beyond social media to mainstream educational, news, and creative applications, including Wikipedia, search apps, and internet browsers; messaging services like WhatsApp and Slack; content libraries like Audible, Kindle, Netflix, Spotify, and YouTube; educational platforms like Coursera, Codecademy, and Duolingo; news apps from The New York Times, The Wall Street Journal, ESPN, and The Atlantic; and publishing tools like Substack, Medium, and CapCut.”

Both lawsuits against Texas argue that the law is preempted by the Supreme Court’s 2011 decision in Brown v. Entertainment Merchants Association, which struck down a California law restricting the sale of violent video games to children. The Supreme Court said in Brown that a state’s power to protect children from harm “does not include a free-floating power to restrict the ideas to which children may be exposed.”

The tech industry has sued Texas over multiple laws related to content moderation. In 2022, the Supreme Court blocked a Texas law that prohibits large social media companies from moderating posts based on a user’s viewpoint. Litigation in that case is ongoing. In a separate case decided in June 2025, the Supreme Court upheld a Texas law that requires age verification on porn sites.



4chan fined $26K for refusing to assess risks under UK Online Safety Act

The risk assessments also seem to unconstitutionally compel speech, they argued, forcing them to share information and “potentially incriminate themselves on demand.” That conflicts with 4chan and Kiwi Farms’ Fourth Amendment rights, as well as “the right against self-incrimination and the due process clause of the Fifth Amendment of the US Constitution,” the suit says.

Additionally, “the First Amendment protects Plaintiffs’ right to permit anonymous use of their platforms,” 4chan and Kiwi Farms argued, opposing Ofcom’s requirements to verify ages of users. (This may be their weakest argument as the US increasingly moves to embrace age gates.)

4chan is hoping a US district court will intervene and ban enforcement of the OSA, arguing that the US must act now to protect all US companies. Failing to act now could be a slippery slope, as the UK is supposedly targeting “the most well-known, but small and, financially speaking, defenseless platforms” in the US before mounting attacks to censor “larger American companies,” 4chan and Kiwi Farms argued.

Ofcom has until November 25 to respond to the lawsuit and has maintained that the OSA is not a censorship law.

On Monday, Britain’s technology secretary, Liz Kendall, called the OSA a “lifeline” meant to protect people across the UK “from the darkest corners of the Internet,” the Record reported.

“Services can no longer ignore illegal content, like encouraging self-harm or suicide, circulating online which can devastate young lives and leaves families shattered,” Kendall said. “This fine is a clear warning to those who fail to remove illegal content or protect children from harmful material.”

Whether 4chan and Kiwi Farms can win their fight to create a carveout in the OSA for American companies remains unclear, but the Federal Trade Commission agrees that the UK law is an overreach. In August, FTC Chair Andrew Ferguson warned US tech companies against complying with the OSA, claiming that censoring Americans to comply with UK law is a violation of the FTC Act, the Record reported.

“American consumers do not reasonably expect to be censored to appease a foreign power and may be deceived by such actions,” Ferguson told tech executives in a letter.

Another lawyer backing 4chan, Preston Byrne, seemed to echo Ferguson, telling the BBC, “American citizens do not surrender our constitutional rights just because Ofcom sends us an e-mail.”



TikTok loses Supreme Court fight, prepares to shut down Sunday


TikTok has said it’s preparing to shut down Sunday.

A TikTok influencer holds a sign that reads “Keep TikTok” outside the US Supreme Court Building as the court hears oral arguments on whether to overturn or delay a law that could lead to a ban of TikTok in the U.S., on January 10, 2025 in Washington, DC. Credit: Kayla Bartkowski / Stringer | Getty Images News

TikTok has lost its Supreme Court appeal in a 9–0 decision and will likely shut down on January 19, a day before Donald Trump’s inauguration, unless the app can be sold before the deadline, which TikTok has said is impossible.

During oral arguments last Friday, TikTok lawyer Noel Francisco warned SCOTUS that upholding the Biden administration’s divest-or-sell law would likely cause TikTok to “go dark—essentially the platform shuts down” and “essentially… stop operating.” On Wednesday, TikTok reportedly began preparing to shut down the app for all US users, anticipating the loss.

But TikTok’s claims that the divest-or-sell law violated Americans’ free speech rights did not supersede the government’s compelling national security interest in blocking a foreign adversary like China from potentially using the app to spy on or influence Americans, SCOTUS ruled.

“We conclude that the challenged provisions do not violate petitioners’ First Amendment rights,” the SCOTUS opinion said, while acknowledging that “there is no doubt that, for more than 170 million Americans, TikTok offers a distinctive and expansive outlet for expression, means of engagement, and source of community.”

Late last year, TikTok and its Chinese owner, ByteDance, urgently pushed SCOTUS to intervene before the law’s January 19 enforcement date. Ahead of SCOTUS’ decision, TikTok warned it would have no choice but to abruptly shut down a thriving platform where many Americans get their news, express their views, and make a living.

The US had argued the law was necessary to protect national security interests as the US-China trade war intensifies, alleging that China could use the app to track and influence TikTok’s 170 million American users. A lower court had agreed that the US had a compelling national security interest and rejected arguments that the law violated the First Amendment, triggering TikTok’s appeal to SCOTUS. Today, the Supreme Court upheld that ruling.

According to SCOTUS, the divest-or-sell law is “content-neutral” and only triggers intermediate scrutiny. That standard requires only that the law not burden “substantially more speech than necessary” to serve the government’s national security interests; strict scrutiny, by contrast, would force the government to protect those interests through the least restrictive means.

Further, the government was right to single TikTok out, SCOTUS wrote, due to its “scale and susceptibility to foreign adversary control, together with the vast swaths of sensitive data the platform collects.”

“Preventing China from collecting vast amounts of sensitive data from 170 million US TikTok users” is a “decidedly content agnostic” rationale, justices wrote.

“The Government had good reason to single out TikTok for special treatment,” the opinion said.

TikTok CEO Shou Zi Chew posted a statement on TikTok reacting to the ruling, thanking Trump for committing to “work with TikTok” to avoid a shutdown and telling users to “rest assured, we will do everything in our power to ensure our platform thrives” in the US.

Momentum to ban TikTok has shifted

First Amendment advocates condemned the SCOTUS ruling. The American Civil Liberties Union called it a “major blow to freedom of expression online,” and the Electronic Frontier Foundation’s civil liberties director David Greene accused justices of sweeping “past the undisputed content-based justification for the law” to “rule only based on the shaky data privacy concerns.”

While the SCOTUS ruling was unanimous, Justice Sonia Sotomayor said that “precedent leaves no doubt” that the law implicated the First Amendment and “plainly” imposed a burden on any US company that distributes TikTok’s speech and any content creator who preferred TikTok as a publisher of their speech.

Similarly concerned was Justice Neil Gorsuch, who wrote in his concurring opinion that he harbors “serious reservations about whether the law before us is ‘content neutral’ and thus escapes ‘strict scrutiny.'” Gorsuch also said he didn’t know “whether this law will succeed in achieving its ends.”

“But the question we face today is not the law’s wisdom, only its constitutionality,” Gorsuch wrote. “Given just a handful of days after oral argument to issue an opinion, I cannot profess the kind of certainty I would like to have about the arguments and record before us. All I can say is that, at this time and under these constraints, the problem appears real and the response to it not unconstitutional.”

For TikTok and content creators defending the app, the stakes were incredibly high. TikTok repeatedly denied there was any evidence of spying and warned that enforcing the law would allow the government to unlawfully impose “a massive and unprecedented speech restriction.”

But the Supreme Court declined to order a preliminary injunction to block the law until Trump took office, instead deciding to rush through oral arguments and reach a decision prior to the law’s enforcement deadline. Now TikTok has little recourse if it wishes to maintain US operations, as justices suggested during oral arguments that even if a president chose not to enforce the law, providing access to TikTok or enabling updates could be viewed as too risky for app stores or other distributors.

The law at the center of the case—the Protecting Americans from Foreign Adversary Controlled Applications Act—had strong bipartisan support under the Biden administration.

But President-elect Donald Trump said he opposed a TikTok ban, despite agreeing that US national security interests in preventing TikTok from spying on or manipulating Americans were compelling. This week, Senator Ed Markey (D-Mass.) introduced a bill to extend the deadline ahead of a potential TikTok ban, while a top Trump adviser, Congressman Mike Waltz, said that Trump plans to stop the ban and “keep TikTok from going dark,” the BBC reported. Even the Biden administration, whose Justice Department just finished arguing to SCOTUS why the US needed to enforce the law, “is considering ways to keep TikTok available,” sources told NBC News.

“What might happen next to TikTok remains unclear,” Gorsuch noted in the opinion.

Will Trump save TikTok?

It will likely soon be clear whether Trump will intervene. Trump filed a brief in December, requesting that the Supreme Court stay enforcement of the law until after he takes office because allegedly only he could make a deal to save TikTok. He criticized SCOTUS for rushing the decision and suggested that Congress’ passage of the law may have been “legislative encroachment” that potentially “binds his hands” as president.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.

TikTok’s CEO Chew signaled to users that Trump is expected to step in.

“On behalf of everyone at TikTok and all our users across the country, I want to thank President Trump for his commitment to work with us to find a solution that keeps TikTok available in the United States,” Chew’s statement said.

Chew also reminded Trump that his content has amassed 60 billion views on TikTok, meaning he perhaps stands to lose a major platform through the ban.

“We are grateful and pleased to have the support of a president who truly understands our platform, one who has used TikTok to express his own thoughts and perspectives,” Chew said.

Trump seemingly has limited options to save TikTok, Forbes suggested. At oral arguments, justices disagreed on whether Trump could legally decide simply not to enforce the law. Efforts to pause enforcement, or to claim compliance without evidence that ByteDance is working on selling off TikTok, could be blocked by the court, analysts said. And while ByteDance has repeatedly said it’s unwilling to sell TikTok US, one analyst suggested to Forbes that ByteDance might be more willing to divest “in exchange for Trump backing off his threat of high tariffs on Chinese imports.”

On Tuesday, a Bloomberg report suggested that China was considering whether selling TikTok to Elon Musk might be a good bargaining chip to de-escalate Trump’s attacks in the US-China trade war.





Trump told SCOTUS he plans to make a deal to save TikTok

Several members of Congress—Senator Edward J. Markey (D-Mass.), Senator Rand Paul (R-Ky.), and Representative Ro Khanna (D-Calif.)—filed a brief agreeing that “the TikTok ban does not survive First Amendment scrutiny.” They agreed with TikTok that the law is “illegitimate.”

Lawmakers’ “principal justification” for the ban—”preventing covert content manipulation by the Chinese government”—masked a “desire” to control TikTok content, they said. Further, it could be achieved by a less-restrictive alternative, they said, a stance TikTok has long taken.

Attorney General Merrick Garland defended the Act, though, urging SCOTUS to remain laser-focused on a narrower question: whether a forced sale of TikTok, which would seemingly allow the app to continue operating without impacting American free speech, violates the First Amendment. If the court agrees that the law survives strict scrutiny, TikTok could still be facing an abrupt shutdown in January.

The Supreme Court has scheduled oral arguments to begin on January 10. TikTok and content creators who separately sued to block the law have asked for their arguments to be divided, so that the court can separately weigh “different perspectives” when deciding how to approach the First Amendment question.

In its own brief, TikTok has asked SCOTUS to strike the portions of the law singling out TikTok or “at the very least” explain to Congress that “it needed to do far better work either tailoring the Act’s restrictions or justifying why the only viable remedy was to prohibit Petitioners from operating TikTok.”

But that may not be necessary if Trump prevails. Trump told the court that TikTok was an important platform for his presidential campaign and that he should be the one to make the call on whether TikTok should remain in the US—not the Supreme Court.

“As the incoming Chief Executive, President Trump has a particularly powerful interest in and responsibility for those national-security and foreign-policy questions, and he is the right constitutional actor to resolve the dispute through political means,” Trump’s brief said.



Supreme Court to decide if TikTok should be banned or sold

While the controversial US law doesn’t necessarily ban TikTok, it does seem designed to make TikTok “go away,” Greene said, and such a move to interfere with a widely used communications platform seems “unprecedented.”

“The TikTok ban itself and the DC Circuit’s approval of it should be of great concern even to those who find TikTok undesirable or scary,” Greene said in a statement. “Shutting down communications platforms or forcing their reorganization based on concerns of foreign propaganda and anti-national manipulation is an eminently anti-democratic tactic, one that the US has previously condemned globally.”

Greene further warned that the US “cutting off a tool used by 170 million Americans to receive information and communicate with the world, without proving with evidence that the tools are presently seriously harmful” would “greatly” lower “well-established standards for restricting freedom of speech in the US.”

TikTok partly appears to be hoping that President-elect Donald Trump will disrupt enforcement of the law, but Greene said it remains unclear if Trump’s plan to “save TikTok” might just be a plan to support a sale to a US buyer. At least one former Trump ally, Steven Mnuchin, has reportedly expressed interest in buying the app.

For TikTok, putting pressure on Trump will likely be the next step, “if the Supreme Court ever says, ‘we agree the law is valid,'” Greene suggested.

“Then that’s it,” Greene said. “There’s no other legal recourse. You only have political recourses.”

Like other civil rights groups, the EFF plans to remain on TikTok’s side as the SCOTUS battle starts.

“We are pleased that the Supreme Court will take the case and will urge the justices to apply the appropriately demanding First Amendment scrutiny,” Greene said.

Facing ban next month, TikTok begs SCOTUS for help

TikTok: Ban is slippery slope to broad US censorship

According to TikTok, the government’s defense of the ban to prevent China from wielding a “covert” influence over Americans is a farce invented by lawyers to cover up the true mission of censorship. If the lower court’s verdict stands, TikTok alleged, “then Congress will have free rein to ban any American from speaking simply by identifying some risk that the speech is influenced by a foreign entity.”

TikTok doesn’t want to post big disclaimers on the app warning of “covert” influence, claiming that the government relied on “secret evidence” to prove this influence occurs on TikTok. But if the Supreme Court agrees that the government needed to show more than “bare factual assertions” to back national security claims the lower court said justified any potential speech restrictions, then the court will also likely agree to reverse the lower court’s decision, TikTok suggested.

It will become much clearer by January 6 whether the January 19 ban will take effect; if it does, TikTok would shut down, booting all US users from the app. TikTok urged the Supreme Court to agree it is in the public interest to delay the ban and review the constitutional claims to prevent any “extreme” harms to both TikTok and US users who depend on the app for news, community, and income.

If SCOTUS doesn’t intervene, TikTok said that the lower court’s “flawed legal rationales would open the door to upholding content-based speech bans in contexts far different than this one.”

“Fearmongering about national security cannot obscure the threat that the Act itself poses to all Americans,” TikTok alleged, while suggesting that even Congress would agree that a “modest delay” in enforcing the law wouldn’t pose any immediate risk to US national security. Congress is also aware that a sale would not be technically, commercially, or legally possible in the timeframe provided, TikTok said. A temporary injunction would prevent irreparable harms, TikTok said, including the irreparable harm courts have long held is caused by restricting speech of Americans for any amount of time.

“An interim injunction is also appropriate because it will give the incoming Administration time to determine its position, as the President-elect and his advisors have voiced support for saving TikTok,” TikTok argued.

Ars could not immediately reach TikTok for comment.

RFK Jr’s anti-vaccine group can’t sue Meta for agreeing with CDC, judge rules

Independent presidential candidate Robert F. Kennedy Jr.

The Children’s Health Defense (CHD), an anti-vaccine group founded by Robert F. Kennedy Jr., has once again failed to convince a court that Meta acted as a state agent when censoring the group’s posts and ads on Facebook and Instagram.

In his opinion affirming a lower court’s dismissal, US Ninth Circuit Court of Appeals Judge Eric Miller wrote that CHD failed to prove that Meta acted as an arm of the government in censoring posts. Concluding that Meta’s right to censor views that the platforms find “distasteful” is protected by the First Amendment, Miller denied CHD’s requested relief, which had included an injunction and civil monetary damages.

“Meta evidently believes that vaccines are safe and effective and that their use should be encouraged,” Miller wrote. “It does not lose the right to promote those views simply because they happen to be shared by the government.”

CHD told Reuters that the group “was disappointed with the decision and considering its legal options.”

The group first filed the complaint in 2020, arguing that Meta colluded with government officials to censor protected speech by labeling anti-vaccine posts as misleading or removing and shadowbanning CHD posts. This caused CHD’s traffic on the platforms to plummet, CHD claimed, and ultimately, its pages were removed from both platforms.

However, critically, Miller wrote, CHD did not allege that “the government was actually involved in the decisions to label CHD’s posts as ‘false’ or ‘misleading,’ the decision to put the warning label on CHD’s Facebook page, or the decisions to ‘demonetize’ or ‘shadow-ban.'”

“CHD has not alleged facts that allow us to infer that the government coerced Meta into implementing a specific policy,” Miller wrote.

Instead, Meta “was entitled to encourage” various “input from the government,” justifiably seeking vaccine-related information provided by the World Health Organization (WHO) and the US Centers for Disease Control and Prevention (CDC) as it navigated complex content moderation decisions throughout the pandemic, Miller wrote.

Therefore, Meta’s actions against CHD were due to “Meta’s own ‘policy of censoring,’ not any provision of federal law,” Miller concluded. “The evidence suggested that Meta had independent incentives to moderate content and exercised its own judgment in so doing.”

None of CHD’s theories that Meta coordinated with officials to deprive “CHD of its constitutional rights” were plausible, Miller wrote, whereas the “innocent alternative”—”that Meta adopted the policy it did simply because” CEO Mark Zuckerberg and Meta “share the government’s view that vaccines are safe and effective”—appeared “more plausible.”

Meta “does not become an agent of the government just because it decides that the CDC sometimes has a point,” Miller wrote.

Equally unpersuasive were CHD’s arguments that Section 230 immunity—which shields platforms from liability for third-party content—”‘removed all legal barriers’ to the censorship of vaccine-related speech,” such that “Meta’s restriction of that content should be considered state action.”

“That Section 230 operates in the background to immunize Meta if it chooses to suppress vaccine misinformation—whether because it shares the government’s health concerns or for independent commercial reasons—does not transform Meta’s choice into state action,” Miller wrote.

One judge dissented over Section 230 concerns

In his dissenting opinion, Judge Daniel Collins defended CHD’s Section 230 claim, however, suggesting that the appeals court erred and should have granted CHD injunctive and declaratory relief from alleged censorship. CHD CEO Mary Holland told The Defender that the group was pleased the decision was not unanimous.

According to Collins, who like Miller is a Trump appointee, Meta could never have built its massive social platforms without Section 230 immunity, which grants platforms the ability to broadly censor viewpoints they disfavor.

It was “important to keep in mind” that “the vast practical power that Meta exercises over the speech of millions of others ultimately rests on a government-granted privilege to which Meta is not constitutionally entitled,” Collins wrote. And this power “makes a crucial difference in the state-action analysis.”

As Collins sees it, CHD could plausibly allege that Meta’s communications with government officials about vaccine-related misinformation targeted specific users, like the “disinformation dozen” that includes both CHD and Kennedy. In that case, Collins suggested, Section 230 could give the government an opening to target speech it disfavors through the moderation mechanisms the platforms provide.

“Having specifically and purposefully created an immunized power for mega-platform operators to freely censor the speech of millions of persons on those platforms, the Government is perhaps unsurprisingly tempted to then try to influence particular uses of such dangerous levers against protected speech expressing viewpoints the Government does not like,” Collins warned.

He further argued that “Meta’s relevant First Amendment rights” do not “give Meta an unbounded freedom to work with the Government in suppressing speech on its platforms.” Disagreeing with the majority, he wrote that “in this distinctive scenario, applying the state-action doctrine promotes individual liberty by keeping the Government’s hands away from the tempting levers of censorship on these vast platforms.”

The majority agreed, however, that while Section 230 immunity “is undoubtedly a significant benefit to companies like Meta,” lawmakers’ threats to weaken Section 230 did not suggest that Meta’s anti-vaccine policy was coerced state action.

“Many companies rely, in one way or another, on a favorable regulatory environment or the goodwill of the government,” Miller wrote. “If that were enough for state action, every large government contractor would be a state actor. But that is not the law.”

Kids Online Safety Act passes Senate despite concerns it will harm kids

The Kids Online Safety Act (KOSA) easily passed the Senate today despite critics’ concerns that the bill may risk creating more harm than good for kids and perhaps censor speech for online users of all ages if it’s signed into law.

KOSA received broad bipartisan support in the Senate, passing with a 91–3 vote alongside the Children’s Online Privacy Protection Act (COPPA) 2.0. Both bills seek to control how much data can be collected from minors, as well as regulate the platform features that could harm children’s mental health.

Only Senators Ron Wyden (D-Ore.), Rand Paul (R-Ky.), and Mike Lee (R-Utah) opposed the bills.

In an op-ed for The Courier-Journal, Paul argued that KOSA imposes on platforms a “duty of care” to mitigate harms to minors that “will not only stifle free speech, but it will deprive Americans of the benefits of our technological advancements.”

“With the Internet, today’s children have the world at their fingertips,” Paul wrote, but if KOSA passes, even allegedly benign content like “pro-life messages” or discussion of a teen overcoming an eating disorder could be censored if platforms fear compliance issues.

“While doctors’ and therapists’ offices close at night and on weekends, support groups are available 24 hours a day, seven days a week for people who share similar concerns or have the same health problems. Any solution to protect kids online must ensure the positive aspects of the Internet are preserved,” Paul wrote.

During a KOSA critics’ press conference today, Dara Adkison—the executive director of TransOhio, a group providing resources for transgender youths—expressed concerns that lawmakers would target sites like TransOhio if the law also passed in the House, where the bill heads next.

“I’ve literally had legislators tell me to my face that they would love to see our website taken off the Internet because they don’t want people to have the kinds of vital community resources that we provide,” Adkison said.

Paul argued that what was considered harmful to kids was subjective, noting that a key flaw with KOSA was that “KOSA does not explicitly define the term ‘mental health disorder.'” Instead, platforms are to refer to the definition in “the fifth edition of the Diagnostic and Statistical Manual of Mental Health Disorders” or “the most current successor edition.”

“That means the scope of the bill could change overnight without any action from America’s elected representatives,” Paul warned, suggesting that “KOSA opens the door to nearly limitless content regulation because platforms will censor users rather than risk liability.”

Ahead of the vote, Senator Richard Blumenthal (D-Conn.)—who co-sponsored KOSA—denied that the bill strove to regulate content, The Hill reported. To Blumenthal and other KOSA supporters, its aim instead is to ensure that social media is “safe by design” for young users.

According to The Washington Post, KOSA and COPPA 2.0 passing “represent the most significant restrictions on tech platforms to clear a chamber of Congress in decades.” However, while President Joe Biden has indicated he would be willing to sign the bill into law, most seem to agree that KOSA will struggle to pass in the House of Representatives.

Todd O’Boyle, a senior tech policy director for Chamber of Progress—a progressive tech industry policy coalition—has said that there is currently “substantial opposition” in the House. O’Boyle said he expects the political divide will be enough to block KOSA’s passage and prevent giving “the power” to the Federal Trade Commission (FTC) or “the next president” to “crack down on online speech” or otherwise pose “a massive threat to our constitutional rights.”

“If there’s one thing the far-left and far-right agree on, it’s that the next chair of the FTC shouldn’t get to decide what online posts are harmful,” O’Boyle said.

SCOTUS nixes injunction that limited Biden admin contacts with social networks

On Wednesday, the Supreme Court tossed out claims that the Biden administration coerced social media platforms into censoring users by removing COVID-19 and election-related content.

Complaints alleging that high-ranking government officials were censoring conservatives had previously convinced a lower court to order an injunction limiting the Biden administration’s contacts with platforms. But now that injunction has been overturned, reopening lines of communication just ahead of the 2024 elections—when officials will once again be closely monitoring the spread of online misinformation targeted at voters.

In a 6–3 vote, the majority ruled that none of the plaintiffs suing—including five social media users and Republican attorneys general in Louisiana and Missouri—had standing. They had alleged that the government had “pressured the platforms to censor their speech in violation of the First Amendment,” demanding an injunction to stop any future censorship.

Plaintiffs may have succeeded if they were instead seeking damages for past harms. But in her opinion, Justice Amy Coney Barrett wrote that partly because the Biden administration seemingly stopped influencing platforms’ content policies in 2022, none of the plaintiffs could show evidence of a “substantial risk that, in the near future, they will suffer an injury that is traceable” to any government official. Thus, they did not seem to face “a real and immediate threat of repeated injury,” Barrett wrote.

“Without proof of an ongoing pressure campaign, it is entirely speculative that the platforms’ future moderation decisions will be attributable, even in part,” to government officials, Barrett wrote, finding that an injunction would do little to prevent future censorship.

Instead, plaintiffs’ claims “depend on the platforms’ actions,” Barrett emphasized, “yet the plaintiffs do not seek to enjoin the platforms from restricting any posts or accounts.”

“It is a bedrock principle that a federal court cannot redress ‘injury that results from the independent action of some third party not before the court,'” Barrett wrote.

Barrett repeatedly noted “weak” arguments raised by plaintiffs, none of which could directly link their specific content removals with the Biden administration’s pressure campaign urging platforms to remove vaccine or election misinformation.

According to Barrett, the lower court initially granting the injunction “glossed over complexities in the evidence,” including the fact that “platforms began to suppress the plaintiffs’ COVID-19 content” before the government pressure campaign began. That’s an issue, Barrett said, because standing to sue “requires a threshold showing that a particular defendant pressured a particular platform to censor a particular topic before that platform suppressed a particular plaintiff’s speech on that topic.”

“While the record reflects that the Government defendants played a role in at least some of the platforms’ moderation choices, the evidence indicates that the platforms had independent incentives to moderate content and often exercised their own judgment,” Barrett wrote.

Barrett was similarly unconvinced by arguments that plaintiffs risk platforms removing future content based on stricter moderation policies that were previously coerced by officials.

“Without evidence of continued pressure from the defendants, the platforms remain free to enforce, or not to enforce, their policies—even those tainted by initial governmental coercion,” Barrett wrote.

Justice: SCOTUS “shirks duty” to defend free speech

Justices Clarence Thomas and Neil Gorsuch joined Samuel Alito in dissenting, arguing that “this is one of the most important free speech cases to reach this Court in years” and that the Supreme Court had an “obligation” to “tackle the free speech issue that the case presents.”

“The Court, however, shirks that duty and thus permits the successful campaign of coercion in this case to stand as an attractive model for future officials who want to control what the people say, hear, and think,” Alito wrote.

Alito argued that the evidence showed that while “downright dangerous” speech was suppressed, so was “valuable speech.” He agreed with the lower court that “a far-reaching and widespread censorship campaign” had been “conducted by high-ranking federal officials against Americans who expressed certain disfavored views about COVID-19 on social media.”

“For months, high-ranking Government officials placed unrelenting pressure on Facebook to suppress Americans’ free speech,” Alito wrote. “Because the Court unjustifiably refuses to address this serious threat to the First Amendment, I respectfully dissent.”

At least one plaintiff who opposed masking and vaccines, Jill Hines, was “indisputably injured,” Alito wrote, arguing that evidence showed that she was censored more frequently after officials pressured Facebook into changing its policies.

“Top federal officials continuously and persistently hectored Facebook to crack down on what the officials saw as unhelpful social media posts, including not only posts that they thought were false or misleading but also stories that they did not claim to be literally false but nevertheless wanted obscured,” Alito wrote.

While Barrett and the majority found that platforms were more likely responsible for injury, Alito disagreed, writing that with the threat of antitrust probes or Section 230 amendments, Facebook acted like “a subservient entity determined to stay in the good graces of a powerful taskmaster.”

Alito wrote that the majority was “applying a new and heightened standard” by requiring plaintiffs to “untangle Government-caused censorship from censorship that Facebook might have undertaken anyway.” In his view, it was enough that Hines showed that “one predictable effect of the officials’ action was that Facebook would modify its censorship policies in a way that affected her.”

“When the White House pressured Facebook to amend some of the policies related to speech in which Hines engaged, those amendments necessarily impacted some of Facebook’s censorship decisions,” Alito wrote. “Nothing more is needed. What the Court seems to want are a series of ironclad links.”

“That is regrettable,” Alito said.

Elon Musk’s X defeats Australia’s global takedown order of stabbing video

Australia’s safety regulator has ended a legal battle with X (formerly Twitter) after threatening approximately $500,000 daily fines for failing to remove 65 instances of a religiously motivated stabbing video from X globally.

Enforcing Australia’s Online Safety Act, eSafety commissioner Julie Inman-Grant had argued it would be dangerous for the videos to keep spreading on X, potentially inciting other acts of terror in Australia.

But X owner Elon Musk refused to comply with the global takedown order, arguing that it would be “unlawful and dangerous” to allow one country to control the global Internet. And Musk was not alone in this fight. The legal director of a nonprofit digital rights group called the Electronic Frontier Foundation (EFF), Corynne McSherry, backed up Musk, urging the court to agree that “no single country should be able to restrict speech across the entire Internet.”

“We welcome the news that the eSafety Commissioner is no longer pursuing legal action against X seeking the global removal of content that does not violate X’s rules,” X’s Global Government Affairs account posted late Tuesday night. “This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed.”

Inman-Grant was formerly Twitter’s director of public policy in Australia and used that experience to land what she told The Courier-Mail was her “dream role” as Australia’s eSafety commissioner in 2017. Since issuing the order to remove the video globally on X, Inman-Grant had traded barbs with Musk (along with other Australian lawmakers), responding to Musk labeling her a “censorship commissar” by calling him an “arrogant billionaire” for fighting the order.

On X, Musk arguably got the last word, posting, “Freedom of speech is worth fighting for.”

Safety regulator still defends takedown order

In a statement, Inman-Grant said early Wednesday that her decision to discontinue proceedings against X was part of an effort to “consolidate actions,” including “litigation across multiple cases.” She ultimately determined that dropping the case against X would be the “option likely to achieve the most positive outcome for the online safety of all Australians, especially children.”

“Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community,” Inman-Grant said, still defending the order despite dropping it.

In court, X’s lawyer Marcus Hoyne had pushed back on such logic, arguing that the eSafety regulator’s mission was “pointless” because “footage of the attack had now spread far beyond the few dozen URLs originally identified,” the Australian Broadcasting Corporation reported.

“I stand by my investigators and the decisions eSafety made,” Inman-Grant said.

Other Australian lawmakers agree the order was not out of line. According to AP News, Australian Minister for Communications Michelle Rowland shared a similar statement in parliament today, backing up the safety regulator while scolding X users who allegedly took up Musk’s fight by threatening Inman-Grant and her family. The safety regulator has said that Musk’s X posts incited a “pile-on” from his followers who allegedly sent death threats and exposed her children’s personal information, the BBC reported.

“The government backs our regulators and we back the eSafety Commissioner, particularly in light of the reprehensible threats to her physical safety and the threats to her family in the course of doing her job,” Rowland said.

Judge halts Texas probe into Media Matters’ reporting on X

Texas Attorney General Ken Paxton speaks during the annual Conservative Political Action Conference (CPAC) meeting on February 23, 2024.

A judge has preliminarily blocked what Media Matters for America (MMFA) described as Texas Attorney General Ken Paxton’s attempt to “rifle through” confidential documents to prove that MMFA fraudulently manipulated X (formerly Twitter) data to ruin X’s advertising business, as Elon Musk has alleged.

After Musk accused MMFA of publishing reports that Musk claimed were designed to scare advertisers off X, Paxton promptly launched his own investigation into MMFA last November.

Suing MMFA over alleged violations of Texas’ Deceptive Trade Practices Act—which prohibits “disparaging the goods, services, or business of another by false or misleading representation of facts”—Paxton sought a wide range of MMFA documents through a civil investigative demand (CID). Filing a motion to block the CID, MMFA told the court that the CID had violated the media organization’s First Amendment rights, providing evidence that Paxton’s investigation and CID had chilled MMFA speech.

Paxton had requested Media Matters’ financial records—including “direct and indirect sources of funding for all Media Matters operations involving X research or publications”—as well as “internal and external communications” on “Musk’s purchase of X” and X’s current CEO Linda Yaccarino. He also asked for all of Media Matters’ communications with X representatives and X advertisers.

But perhaps most invasive, Paxton wanted to see all the communications about Media Matters’ X reporting that triggered the lawsuits, which, as US District Judge Amit Mehta wrote in an opinion published Friday, was a compelled disclosure that “poses a serious threat to the vitality of the newsgathering process.”

Mehta was concerned that MMFA showed that “Media Matters’ editorial leaders have pared back reporting and publishing, particularly on any topics that could be perceived as relating to the Paxton investigation”—including two follow-ups on its X reporting. Because of Paxton’s alleged First Amendment retaliation, MMFA said it did not publish “two pieces concerning X’s placement of advertising alongside antisemitic, pro-Nazi accounts”—”not out of legitimate concerns about fairness or accuracy,” but “out of fear of harassment, threats, and retaliation.”

According to Mehta’s order, Paxton did not contest that Texas’ lawsuit had chilled MMFA’s speech. Further, Paxton had given at least one podcast interview where he called upon other state attorneys general to join him in investigating MMFA.

Because Paxton had “projected himself across state lines and asserted a pseudo-national executive authority” and repeatedly described MMFA as a “radical anti-free speech” or “radical left-wing organization,” Mehta wrote, the court had seen sufficient “evidence of retaliatory intent.”

“Notably,” Mehta wrote, Paxton remained “silent” and never “submitted a sworn declaration that explains his reasons for opening the investigation.”

In his press release, Paxton justified the investigation by saying, “We are examining the issue closely to ensure that the public has not been deceived by the schemes of radical left-wing organizations who would like nothing more than to limit freedom by reducing participation in the public square.”

Ultimately, Mehta granted MMFA’s request for a preliminary injunction to block Paxton’s CID because the judge found that the investigation and the CID have caused MMFA “to self-censor when making research and publication decisions, adversely affected the relationships between editors and reporters, and restricted communications with sources and journalists.”

“Only injunctive relief will ‘prevent the [ongoing] deprivation of free speech rights,'” Mehta’s opinion said, deeming MMFA’s reporting as “core First Amendment activities.”

Mehta’s order also banned Paxton from taking any steps to further his investigation until the lawsuit is decided.

In a statement Friday, MMFA President and CEO Angelo Carusone celebrated the win as not just against Paxton but also against Musk.

“Elon Musk encouraged Republican state attorneys general to use their power to harass their critics and stifle reporting about X,” Carusone said. “Ken Paxton was one of those AGs that took up the call and he was defeated. Today’s decision is a victory for free speech.”

Paxton has not yet responded to the preliminary injunction, and his office did not respond to Ars’ request for comment.

Media Matters’ lawyer, Aria C. Branch, a partner at Elias Law Group, told Ars that “while Attorney General Paxton’s office has not yet responded to Friday’s ruling, the preliminary injunction should certainly put an end to these kind of lawless, politically motivated attempts to muzzle the press.”
