First Amendment

SCOTUS nixes injunction that limited Biden admin contacts with social networks

On Wednesday, the Supreme Court tossed out claims that the Biden administration coerced social media platforms into censoring users by removing COVID-19 and election-related content.

Complaints alleging that high-ranking government officials were censoring conservatives had previously convinced a lower court to order an injunction limiting the Biden administration’s contacts with platforms. But now that injunction has been overturned, re-opening lines of communication just ahead of the 2024 elections—when officials will once again be closely monitoring the spread of misinformation online targeted at voters.

In a 6–3 vote, the majority ruled that none of the plaintiffs suing—including five social media users and Republican attorneys general in Louisiana and Missouri—had standing. They had alleged that the government had “pressured the platforms to censor their speech in violation of the First Amendment,” demanding an injunction to stop any future censorship.

The plaintiffs might have succeeded had they instead sought damages for past harms. But in her opinion, Justice Amy Coney Barrett wrote that partly because the Biden administration seemingly stopped influencing platforms’ content policies in 2022, none of the plaintiffs could show evidence of a “substantial risk that, in the near future, they will suffer an injury that is traceable” to any government official. Thus, they did not seem to face “a real and immediate threat of repeated injury,” Barrett wrote.

“Without proof of an ongoing pressure campaign, it is entirely speculative that the platforms’ future moderation decisions will be attributable, even in part,” to government officials, Barrett wrote, finding that an injunction would do little to prevent future censorship.

Instead, plaintiffs’ claims “depend on the platforms’ actions,” Barrett emphasized, “yet the plaintiffs do not seek to enjoin the platforms from restricting any posts or accounts.”

“It is a bedrock principle that a federal court cannot redress ‘injury that results from the independent action of some third party not before the court,'” Barrett wrote.

Barrett repeatedly noted “weak” arguments raised by plaintiffs, none of which could directly link their specific content removals with the Biden administration’s pressure campaign urging platforms to remove vaccine or election misinformation.

According to Barrett, the lower court initially granting the injunction “glossed over complexities in the evidence,” including the fact that “platforms began to suppress the plaintiffs’ COVID-19 content” before the government pressure campaign began. That’s an issue, Barrett said, because standing to sue “requires a threshold showing that a particular defendant pressured a particular platform to censor a particular topic before that platform suppressed a particular plaintiff’s speech on that topic.”

“While the record reflects that the Government defendants played a role in at least some of the platforms’ moderation choices, the evidence indicates that the platforms had independent incentives to moderate content and often exercised their own judgment,” Barrett wrote.

Barrett was similarly unconvinced by arguments that plaintiffs risk platforms removing future content based on stricter moderation policies that were previously coerced by officials.

“Without evidence of continued pressure from the defendants, the platforms remain free to enforce, or not to enforce, their policies—even those tainted by initial governmental coercion,” Barrett wrote.

Alito: SCOTUS “shirks duty” to defend free speech

Justices Clarence Thomas and Neil Gorsuch joined Samuel Alito in dissenting, arguing that “this is one of the most important free speech cases to reach this Court in years” and that the Supreme Court had an “obligation” to “tackle the free speech issue that the case presents.”

“The Court, however, shirks that duty and thus permits the successful campaign of coercion in this case to stand as an attractive model for future officials who want to control what the people say, hear, and think,” Alito wrote.

Alito argued that the evidence showed that while “downright dangerous” speech was suppressed, so was “valuable speech.” He agreed with the lower court that “a far-reaching and widespread censorship campaign” had been “conducted by high-ranking federal officials against Americans who expressed certain disfavored views about COVID-19 on social media.”

“For months, high-ranking Government officials placed unrelenting pressure on Facebook to suppress Americans’ free speech,” Alito wrote. “Because the Court unjustifiably refuses to address this serious threat to the First Amendment, I respectfully dissent.”

At least one plaintiff who opposed masking and vaccines, Jill Hines, was “indisputably injured,” Alito wrote, arguing that evidence showed that she was censored more frequently after officials pressured Facebook into changing its policies.

“Top federal officials continuously and persistently hectored Facebook to crack down on what the officials saw as unhelpful social media posts, including not only posts that they thought were false or misleading but also stories that they did not claim to be literally false but nevertheless wanted obscured,” Alito wrote.

While Barrett and the majority found that platforms were more likely responsible for injury, Alito disagreed, writing that with the threat of antitrust probes or Section 230 amendments, Facebook acted like “a subservient entity determined to stay in the good graces of a powerful taskmaster.”

Alito wrote that the majority was “applying a new and heightened standard” by requiring plaintiffs to “untangle Government-caused censorship from censorship that Facebook might have undertaken anyway.” In his view, it was enough that Hines showed that “one predictable effect of the officials’ action was that Facebook would modify its censorship policies in a way that affected her.”

“When the White House pressured Facebook to amend some of the policies related to speech in which Hines engaged, those amendments necessarily impacted some of Facebook’s censorship decisions,” Alito wrote. “Nothing more is needed. What the Court seems to want are a series of ironclad links.”

“That is regrettable,” Alito said.

Elon Musk’s X defeats Australia’s global takedown order of stabbing video

Australia’s safety regulator has ended a legal battle with X (formerly Twitter) after threatening fines of approximately $500,000 per day unless X removed 65 instances of a religiously motivated stabbing video from the platform globally.

Enforcing Australia’s Online Safety Act, eSafety commissioner Julie Inman-Grant had argued it would be dangerous for the videos to keep spreading on X, potentially inciting other acts of terror in Australia.

But X owner Elon Musk refused to comply with the global takedown order, arguing that it would be “unlawful and dangerous” to allow one country to control the global Internet. And Musk was not alone in this fight. The legal director of a nonprofit digital rights group called the Electronic Frontier Foundation (EFF), Corynne McSherry, backed up Musk, urging the court to agree that “no single country should be able to restrict speech across the entire Internet.”

“We welcome the news that the eSafety Commissioner is no longer pursuing legal action against X seeking the global removal of content that does not violate X’s rules,” X’s Global Government Affairs account posted late Tuesday night. “This case has raised important questions on how legal powers can be used to threaten global censorship of speech, and we are heartened to see that freedom of speech has prevailed.”

Inman-Grant was formerly Twitter’s director of public policy in Australia and used that experience to land what she told The Courier-Mail was her “dream role” as Australia’s eSafety commissioner in 2017. Since issuing the order to remove the video globally on X, Inman-Grant had traded barbs with Musk (along with other Australian lawmakers), responding to Musk labeling her a “censorship commissar” by calling him an “arrogant billionaire” for fighting the order.

On X, Musk arguably got the last word, posting, “Freedom of speech is worth fighting for.”

Safety regulator still defends takedown order

In a statement, Inman-Grant said early Wednesday that her decision to discontinue proceedings against X was part of an effort to “consolidate actions,” including “litigation across multiple cases.” She ultimately determined that dropping the case against X would be the “option likely to achieve the most positive outcome for the online safety of all Australians, especially children.”

“Our sole goal and focus in issuing our removal notice was to prevent this extremely violent footage from going viral, potentially inciting further violence and inflicting more harm on the Australian community,” Inman-Grant said, still defending the order despite dropping it.

In court, X’s lawyer Marcus Hoyne had pushed back on such logic, arguing that the eSafety regulator’s mission was “pointless” because “footage of the attack had now spread far beyond the few dozen URLs originally identified,” the Australian Broadcasting Corporation reported.

“I stand by my investigators and the decisions eSafety made,” Inman-Grant said.

Other Australian lawmakers agree the order was not out of line. According to AP News, Australian Minister for Communications Michelle Rowland shared a similar statement in parliament today, backing up the safety regulator while scolding X users who allegedly took up Musk’s fight by threatening Inman-Grant and her family. The safety regulator has said that Musk’s X posts incited a “pile-on” from his followers who allegedly sent death threats and exposed her children’s personal information, the BBC reported.

“The government backs our regulators and we back the eSafety Commissioner, particularly in light of the reprehensible threats to her physical safety and the threats to her family in the course of doing her job,” Rowland said.

TikTok vaguely disputes report that it’s making a US-only app

Exploring a different route

TikTok has spent months separating code for US-only algorithm, insiders claim.

TikTok is now disputing a Reuters report claiming that the short-video app is cloning its algorithm, potentially to offer US users a separate version of the app that might degrade over time.

Sources “with direct knowledge” of the project—granted anonymity because they’re not authorized to discuss it publicly—told Reuters that the TikTok effort began late last year. They said that the project will likely take a year to complete, requiring hundreds of engineers to separate millions of lines of code.

As these sources reported, TikTok’s tremendous undertaking could potentially help prepare its China-based owner ByteDance to appease US lawmakers who passed a law in April forcing TikTok to sell its US-based operations by January 19 or face a ban. But TikTok has maintained that the “qualified divestiture” required by the law would be impossible, and on Thursday, TikTok denied the accuracy of Reuters’ report while reiterating its stance that a sale is not in the cards.

“The Reuters story published today is misleading and factually inaccurate,” the TikTok Policy account posted on X (formerly Twitter). “As we said in our court filing, the ‘qualified divestiture’ demanded by the Act to allow TikTok to continue operating in the United States is simply not possible: not commercially, not technologically, not legally. And certainly not on the 270-day timeline required by the Act.”

It remains unclear precisely which parts of Reuters’ report are supposedly “misleading and factually inaccurate.” A Reuters spokesperson said that Reuters stands by its reporting.

A TikTok spokesperson told Ars that “while we have continued work in good faith to further safeguard the authenticity of the TikTok experience, it is simply false to suggest that this work would facilitate divestiture or that divestiture is even a possibility.”

TikTok is currently suing to block the US law on First Amendment grounds, and this week a court fast-tracked that legal challenge to attempt to resolve the matter before the law takes effect next year. Oral arguments are scheduled to start this September, with a ruling expected by December 6, which Reuters reported leaves time for a potential Supreme Court challenge to that ruling.

However, in the meantime, sources told Reuters that TikTok is seemingly exploring all its options to avoid running afoul of the US law, including separating its code base and even once considering open-sourcing parts of its algorithm to increase transparency.

Separating out the code is not an easy task, insiders told Reuters.

“Compliance and legal issues involved with determining what parts of the code can be carried over to TikTok are complicating the work,” one source told Reuters. “Each line of code has to be reviewed to determine if it can go into the separate code base.”

But creating a US-only content-recommendation algorithm could be worth it, as it could allow TikTok US to operate independently from the China-based TikTok app in a manner that could satisfy lawmakers worried about the Chinese government potentially spying on Americans through the app.

However, there’s no guarantee that the US-only version of TikTok’s content-recommendation algorithm will perform as well as the currently available app, sources told Reuters. Because the split could cut off TikTok US from the China-based engineers who developed and maintain the algorithm, US users could miss out on code updates that improve or maintain desired functionality. That means TikTok’s popular For You Page recommendations may suffer, as “TikTok US may not be able to deliver the same level of performance as the existing TikTok,” sources told Reuters.

Judge halts Texas probe into Media Matters’ reporting on X

Texas Attorney General Ken Paxton speaks during the annual Conservative Political Action Conference (CPAC) meeting on February 23, 2024.

A judge has preliminarily blocked what Media Matters for America (MMFA) described as Texas Attorney General Ken Paxton’s attempt to “rifle through” confidential documents to prove that MMFA fraudulently manipulated X (formerly Twitter) data to ruin X’s advertising business, as Elon Musk has alleged.

After Musk accused MMFA of publishing reports that Musk claimed were designed to scare advertisers off X, Paxton promptly launched his own investigation into MMFA last November.

Investigating MMFA over alleged violations of Texas’ Deceptive Trade Practices Act—which prohibits “disparaging the goods, services, or business of another by false or misleading representation of facts”—Paxton sought a wide range of MMFA documents through a civil investigative demand (CID). In a motion to block the CID, MMFA told the court that the demand violated the media organization’s First Amendment rights, providing evidence that Paxton’s investigation and CID had chilled MMFA’s speech.

Paxton had requested Media Matters’ financial records—including “direct and indirect sources of funding for all Media Matters operations involving X research or publications”—as well as “internal and external communications” on “Musk’s purchase of X” and X’s current CEO Linda Yaccarino. He also asked for all of Media Matters’ communications with X representatives and X advertisers.

But perhaps most invasive, Paxton wanted to see all the communications about Media Matters’ X reporting that triggered the lawsuits, which, as US District Judge Amit Mehta wrote in an opinion published Friday, was a compelled disclosure that “poses a serious threat to the vitality of the newsgathering process.”

Mehta credited MMFA’s showing that “Media Matters’ editorial leaders have pared back reporting and publishing, particularly on any topics that could be perceived as relating to the Paxton investigation”—including two follow-ups on its X reporting. Because of Paxton’s alleged First Amendment retaliation, MMFA said it did not publish “two pieces concerning X’s placement of advertising alongside antisemitic, pro-Nazi accounts”—“not out of legitimate concerns about fairness or accuracy,” but “out of fear of harassment, threats, and retaliation.”

According to Mehta’s order, Paxton did not contest that Texas’ lawsuit had chilled MMFA’s speech. Further, Paxton had given at least one podcast interview where he called upon other state attorneys general to join him in investigating MMFA.

Because Paxton had “projected himself across state lines and asserted a pseudo-national executive authority,” Mehta wrote, and had repeatedly described MMFA as a “radical anti-free speech” or “radical left-wing organization,” the court had seen sufficient “evidence of retaliatory intent.”

“Notably,” Mehta wrote, Paxton remained “silent” and never “submitted a sworn declaration that explains his reasons for opening the investigation.”

In his press release, Paxton justified the investigation by saying, “We are examining the issue closely to ensure that the public has not been deceived by the schemes of radical left-wing organizations who would like nothing more than to limit freedom by reducing participation in the public square.”

Ultimately, Mehta granted MMFA’s request for a preliminary injunction to block Paxton’s CID because the judge found that the investigation and the CID have caused MMFA “to self-censor when making research and publication decisions, adversely affected the relationships between editors and reporters, and restricted communications with sources and journalists.”

“Only injunctive relief will ‘prevent the [ongoing] deprivation of free speech rights,'” Mehta’s opinion said, deeming MMFA’s reporting as “core First Amendment activities.”

Mehta’s order also banned Paxton from taking any steps to further his investigation until the lawsuit is decided.

In a statement Friday, MMFA President and CEO Angelo Carusone celebrated the win as not just against Paxton but also against Musk.

“Elon Musk encouraged Republican state attorneys general to use their power to harass their critics and stifle reporting about X,” Carusone said. “Ken Paxton was one of those AGs that took up the call and he was defeated. Today’s decision is a victory for free speech.”

Paxton has not yet responded to the preliminary injunction, and his office did not respond to Ars’ request to comment.

Media Matters’ lawyer, Aria C. Branch, a partner at Elias Law Group, told Ars that “while Attorney General Paxton’s office has not yet responded to Friday’s ruling, the preliminary injunction should certainly put an end to these kind of lawless, politically motivated attempts to muzzle the press.”

X filing “thermonuclear lawsuit” in Texas should be “fatal,” Media Matters says

Ever since Elon Musk’s X Corp sued Media Matters for America (MMFA) over a pair of reports that X (formerly Twitter) claims caused an advertiser exodus in 2023, one big question has remained for onlookers: Why is this fight happening in Texas?

In a motion to dismiss filed in Texas’ northern district last month, MMFA argued that X’s lawsuit should be dismissed not just because of a “fatal jurisdictional defect,” but “dismissal is also required for lack of venue.”

Notably, MMFA is based in Washington, DC, while “X is organized under Nevada law and maintains its principal place of business in San Francisco, California, where its own terms of service require users of its platform to litigate any disputes.”

“Texas is not a fair or reasonable forum for this lawsuit,” MMFA argued, suggesting that “the case must be dismissed or transferred” because “neither the parties nor the cause of action has any connection to Texas.”

Last Friday, X responded to the motion to dismiss, claiming that the lawsuit—which Musk has described as “thermonuclear”—was appropriately filed in Texas because MMFA “intentionally” targeted readers and at least two X advertisers located in Texas, Oracle and AT&T. According to X, because MMFA “identified Oracle, a Texas-based corporation, by name in its coverage,” MMFA “cannot claim surprise at being held to answer for its conduct in Texas.” X also claimed that Texas has jurisdiction because Musk resides in Texas and “makes numerous critical business decisions about X while in Texas.”

This so-called targeting of Texans caused a “substantial part” of alleged financial harms that X attributes to MMFA’s reporting, X alleged.

According to X, MMFA specifically targeted X in Texas by sending newsletters sharing its reports with “hundreds or thousands” of Texas readers and by allegedly soliciting donations from Texans to support MMFA’s reporting.

But MMFA pushed back, saying that “Texas subscribers comprise a disproportionately small percentage of Media Matters’ newsletter recipients” and that MMFA did “not solicit Texas donors to fund Media Matters’s journalism concerning X.” Because of this, X’s “efforts to concoct claim-related Texas contacts amount to a series of shots in the dark, uninformed guesses, and irrelevant tangents,” MMFA argued.

On top of that, MMFA argued that X could not attribute any financial harms allegedly caused by MMFA’s reports to either of the two Texas-based advertisers that X named in its court filings. Oracle, MMFA said, “by X’s own admission,… did not withdraw its ads” from X, and AT&T was not named in MMFA’s reporting, and thus, “any investigation AT&T did into its ad placement on X was of its own volition and is not plausibly connected to Media Matters.” MMFA has argued that advertisers, particularly sophisticated Fortune 500 companies, made their own decisions to stop advertising on X, perhaps due to widely reported increases in hate speech on X or even Musk’s own seemingly antisemitic posting.

Ars could not immediately reach X, Oracle, or AT&T for comment.

X’s suit allegedly designed to break MMFA

MMFA President Angelo Carusone, who is a defendant in X’s lawsuit, told Ars that X’s recent filing has continued to “expose” the lawsuit as a “meritless and vexatious effort to inflict maximum damage on critical research and reporting about the platform.”

“It’s solely designed to basically break us or stop us from doing the work that we were doing originally,” Carusone said, confirming that the lawsuit has negatively impacted MMFA’s hate speech research on X.

MMFA argued that Musk could have sued in other jurisdictions, such as Maryland, DC, or California, and MMFA would not have disputed the venue, but Carusone suggested that Musk sued in Texas in hopes that it would be “a more friendly jurisdiction.”

Bill that could ban TikTok passes in House despite constitutional concerns

On Wednesday, the US House of Representatives passed a bill with a vote of 352–65 that could block TikTok in the US. Fifteen Republicans and 50 Democrats voted in opposition, and one Democrat voted present, CNN reported.

TikTok is not happy. A spokesperson told Ars, “This process was secret and the bill was jammed through for one reason: it’s a ban. We are hopeful that the Senate will consider the facts, listen to their constituents, and realize the impact on the economy, 7 million small businesses, and the 170 million Americans who use our service.”

Lawmakers insist that the Protecting Americans from Foreign Adversary Controlled Applications Act is not a ban. Instead, they claim the law gives TikTok a choice: either divest from ByteDance’s China-based owners or face the consequences of TikTok being cut off in the US.

Under the law—which still must pass the Senate, a more significant hurdle, where less consensus is expected and a companion bill has not yet been introduced—app stores and hosting services would face steep consequences if they provide access to apps controlled by US foreign rivals. That includes providing updates or maintenance to US users who already have the app on their devices.

Violations subject app stores and hosting services to fines of $5,000 for each individual US user “determined to have accessed, maintained, or updated a foreign adversary-controlled application.” With 170 million Americans currently on TikTok, that could add up quickly to eye-popping fines.
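As a rough illustration of that scale (a hypothetical ceiling, not a predicted penalty—it assumes the per-user fine applied to every reported US user), the arithmetic works out to:

```python
# Hypothetical maximum exposure under the bill's penalty clause,
# assuming the $5,000 per-user fine applied to all 170 million US users.
fine_per_user = 5_000          # dollars per violating user, per the bill
us_users = 170_000_000         # TikTok's reported US user count
max_exposure = fine_per_user * us_users
print(f"${max_exposure:,}")    # → $850,000,000,000
```

In practice, fines would apply only to users an app store or host was "determined" to have served, but even a small fraction of that base would be ruinous.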

If the bill becomes law, app stores and hosting services would have 180 days to limit access to foreign adversary-controlled apps. The bill specifically names TikTok and ByteDance as restricted apps, making it clear that lawmakers intend to quash the alleged “national security threat” that TikTok poses in the US.

House Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Wash.), a proponent of the bill, has said that “foreign adversaries like China pose the greatest national security threat of our time. With applications like TikTok, these countries are able to target, surveil, and manipulate Americans.” The proposed bill “ends this practice by banning applications controlled by foreign adversaries of the United States that pose a clear national security risk.”

McMorris Rodgers has also made it clear that “our goal is to get this legislation onto the president’s desk.” Joe Biden has indicated he will sign the bill into law, leaving the Senate as the final hurdle to clear. Senators told CNN that they were waiting to see what happened in the House before seeking a path forward in the Senate that would respect TikTok users’ civil liberties.

Attempts to ban TikTok have historically not fared well in the US, with a recent ban in Montana being reversed by a federal judge last December. Judge Donald Molloy granted TikTok’s request for a preliminary injunction, denouncing Montana’s ban as an unconstitutional infringement of Montana-based TikTok users’ rights.

More recently, the American Civil Liberties Union (ACLU) has slammed House lawmakers for rushing the bill through Congress, accusing lawmakers of attempting to stifle free speech. ACLU senior policy counsel Jenna Leventoff said in a press release that lawmakers were “once again attempting to trade our First Amendment rights for cheap political points during an election year.”

“Just because the bill sponsors claim that banning TikTok isn’t about suppressing speech, there’s no denying that it would do just that,” Leventoff said.

Some Calif. cops still sharing license plate info with anti-abortion states

Dozens of California police agencies are still sharing automated license plate reader (ALPR) data with out-of-state authorities without a warrant, the Electronic Frontier Foundation has revealed. This is occurring despite guidance issued by State Attorney General Rob Bonta last year.

Clarifying a state law that limits state public agencies to sharing ALPR data only with other public agencies, Bonta’s guidance pointed out that “importantly,” the law’s definition of “public agency” “does not include out-of-state or federal law enforcement agencies.”

Bonta’s guidance came after EFF uncovered more than 70 California law enforcement agencies sharing ALPR data with cops in other states, including anti-abortion states. After Bonta clarified the statute, approximately half of these agencies told EFF that they updated their practices to fall in line with Bonta’s reading of the law. Some agencies could not yet verify that the practice had ended, though.

In a letter to Bonta, EFF praised the guidance as protecting Californians’ privacy but also flagged more than 30 police agencies that either expressly rejected Bonta’s guidance or else refused to confirm that they’ve stopped sharing data with out-of-state authorities. EFF staff attorney Jennifer Pinsof told Ars that it’s likely that additional agencies are also failing to comply, such as agencies that EFF never contacted or that recently acquired ALPR technology.

“We think it is very likely other agencies in the state remain out of compliance with the law,” EFF’s letter said.

EFF is hoping that making Bonta aware of the ongoing non-compliance will end sharing of highly sensitive location data with police agencies in states that do not provide as many privacy protections as California does. If Bonta “takes initiative” to enforce compliance, Pinsof said that police may be more willing to consider the privacy risks involved, since Bonta can “communicate more easily with the law enforcement community” than privacy advocates can.

However, even Bonta may struggle, as some agencies “have dug their heels in,” Pinsof said.

Many state police agencies simply do not agree with Bonta’s interpretation of the law, which they claim does allow sharing ALPR data with cops in other states. In a November letter, a lawyer representing the California State Sheriffs’ Association, California Police Chiefs Association, and California Peace Officers’ Association urged Bonta to “revisit” his position that the law “precludes sharing ALPR data with out-of-state governmental entities for legitimate law enforcement purposes.”

The cops argued that sharing ALPR data with cops in other states assists “in the apprehension and prosecution of child abductors, narcotics traffickers, human traffickers, extremist hate groups, and other cross-state criminal enterprises.”

They told Bonta that the law “was not designed to block law enforcement from sharing ALPR data outside California where the information could be used to intercede with criminal offenders moving from state to state.” As they see it, cooperation between state authorities is “absolutely imperative to effective policing.”

Here’s where cops say the ambiguity lies. The law defines public agency as “the state, any city, county, or city and county, or any agency or political subdivision of the state or a city, county, or city and county, including, but not limited to, a law enforcement agency.” According to cops, because the law does not “specifically refer to the State of California” or “this state,” it could be referring to agencies in any state.

“Had the legislation referred to ‘a State’ rather than ‘the State,’ there would be no debate about whether sharing was prohibited,” the police associations’ letter said. “We see no basis to read such a limitation into the legislation based on the word ‘the’ rather than ‘a.'”

The police associations also reminded Bonta that the California Legislature considered passing a bill that would have explicitly “prohibited the out-of-state sharing of ALPR information” with states interfering with “the right to seek abortion services” but “rejected it.” They told Bonta that “the Legislature’s refusal to adopt a position consistent with the position” he is “advancing is troubling.”

EFF said that California police can still share ALPR data with out-of-state police in situations permitted by law, like when out-of-state cops have a “warrant for ALPR information based on probable cause and particularity.” Instead, EFF alleged that cops are “dragnet sharing through commercial cloud storage systems” without warrants, which could be violating Californians’ right to privacy, as well as First and Fourth Amendment rights.

EFF urged Bonta to reject the police associations’ “crabbed interpretation” of the statute, but it’s unclear if Bonta will ever respond. Pinsof told Ars that Bonta did not directly respond to EFF’s initial investigation, but the guidance he later issued seemed to suggest that he got EFF’s message.

Police associations and Bonta’s office did not respond to Ars’ request to comment.
