police surveillance

Cop busted for unauthorized use of Clearview AI facial recognition resigns

Secret face scans —

Indiana cop easily hid frequent personal use of Clearview AI face scans.

An Indiana cop has resigned after it was revealed that he frequently used Clearview AI facial recognition technology to track down social media users not linked to any crimes.

According to a press release from the Evansville Police Department, this was a clear “misuse” of Clearview AI’s controversial face scan tech, which some US cities have banned over concerns that it gives law enforcement unlimited power to track people in their daily lives.

To help identify suspects, police can scan what Clearview AI describes on its website as “the world’s largest facial recognition network.” The database pools more than 40 billion images collected from news media, mugshot websites, public social media, and other open sources.

But these scans must always be linked to an investigation, and Evansville police chief Philip Smith said that instead, the disgraced cop repeatedly disguised his personal searches by deceptively “utilizing an actual case number associated with an actual incident” to evade detection.

Smith’s department discovered the officer’s unauthorized use after performing an audit before renewing its Clearview AI subscription in March. That audit showed “an anomaly of very high usage of the software by an officer whose work output was not indicative of the number of inquiry searches that they had.”

Another clue to the officer’s abuse of the tool was that most face scans conducted during investigations are “usually live or CCTV images”—shots taken in the wild—Smith said. However, the officer who resigned was mainly searching social media images, which was a red flag.
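The two audit signals the department described — search volume out of proportion to an officer’s case output, and an unusual share of social media images rather than live or CCTV shots — can be sketched as a simple log review. This is an illustrative sketch only; the field names and thresholds are hypothetical assumptions, not Clearview AI's actual tooling or API.

```python
# Hypothetical sketch of an agency-side usage audit over a simple
# in-house log of facial recognition searches. Thresholds are made up.
from collections import Counter

def flag_anomalous_officers(search_log, case_counts,
                            max_searches_per_case=10.0,
                            max_social_media_share=0.5):
    """Return officer IDs whose usage pattern warrants manual review.

    search_log: list of dicts like
        {"officer": "o1", "image_source": "cctv" | "live" | "social_media"}
    case_counts: dict mapping officer ID -> number of cases worked
    """
    searches = Counter(entry["officer"] for entry in search_log)
    social = Counter(entry["officer"] for entry in search_log
                     if entry["image_source"] == "social_media")
    flagged = set()
    for officer, total in searches.items():
        cases = case_counts.get(officer, 0)
        # Signal 1: far more searches than the officer's case work explains.
        if total > max_searches_per_case * max(cases, 1):
            flagged.add(officer)
        # Signal 2: searches mostly target social media images
        # instead of the usual live or CCTV shots.
        if social[officer] / total > max_social_media_share:
            flagged.add(officer)
    return flagged
```

A review like this only surfaces candidates for investigation; as the Evansville case shows, confirming misuse still required checking the flagged searches against actual casework.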

An investigation quickly “made clear that this officer was using Clearview AI” for “personal purposes,” Smith said, declining to name the officer or verify whether targets of these searches were notified.

As a result, Smith recommended that the department terminate the officer. However, the officer resigned “before the Police Merit Commission could make a final determination on the matter,” Smith said.

Easily dodging Clearview AI’s built-in compliance features

Clearview AI touts the face image network as a public safety resource, promising to help law enforcement make arrests sooner while committing to “ethical and responsible” use of the tech.

On its website, the company says that it understands that “law enforcement agencies need built-in compliance features for increased oversight, accountability, and transparency within their jurisdictions, such as advanced admin tools, as well as user-friendly dashboards, reporting, and metrics tools.”

To “help deter and detect improper searches,” its website says that a case number and crime type is required, and “every agency is required to have an assigned administrator that can see an in-depth overview of their organization’s search history.”
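The per-search requirement described above amounts to input validation on each query. A minimal sketch of that kind of check might look like the following — the case-number format, crime-type list, and function are hypothetical assumptions for illustration, not Clearview AI's actual code:

```python
# Hypothetical pre-search compliance check: reject a facial recognition
# query unless it carries a plausibly formatted case number and a
# recognized crime type. Format and categories are assumed, not real.
import re

VALID_CRIME_TYPES = {"burglary", "assault", "fraud", "robbery", "homicide"}

def validate_search_request(case_number: str, crime_type: str) -> bool:
    """Allow a search only if both required fields pass basic checks."""
    # Assumed case-number format: four-digit year, dash, six-digit sequence.
    if not re.fullmatch(r"\d{4}-\d{6}", case_number):
        return False
    return crime_type.lower() in VALID_CRIME_TYPES
```

Note the limitation this incident exposes: a check like this verifies only that the fields *look* valid. An officer who supplies a real case number from an unrelated incident passes it, which is exactly how the Evansville officer evaded detection.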

It seems that neither of those safeguards stopped the Indiana cop from repeatedly scanning social media images for undisclosed personal reasons: he apparently satisfied the case number and crime type requirement by attaching real case numbers to personal searches, and he went unnoticed by his agency’s administrator. This incident could have broader implications in the US, where police have widely used Clearview AI’s technology to conduct nearly 1 million searches, CEO Hoan Ton-That told the BBC last year.

In 2022, Ars reported when Clearview AI told investors it had ambitions to collect more than 100 billion face images, ensuring that “almost everyone in the world will be identifiable.” As privacy concerns about the controversial tech mounted, it became hotly debated. Facebook moved to stop the company from scraping faces on its platform, and the ACLU won a settlement that banned Clearview AI from contracting with most businesses. But the US government retained access to the tech, including “hundreds of police forces across the US,” Ton-That told the BBC.

Most law enforcement agencies are hesitant to discuss their Clearview AI tactics in detail, the BBC reported, so it’s often unclear who has access and why. But the Miami Police confirmed that “it uses this software for every type of crime,” the BBC reported.

Now, at least one Indiana police department has confirmed that an officer can sneakily abuse the tech and conduct unapproved face scans with apparent ease.

According to Kashmir Hill—the journalist who exposed Clearview AI’s tech—the disgraced cop was following in the footsteps of “billionaires, Silicon Valley investors, and a few high-wattage celebrities” who got early access to Clearview AI tech in 2020 and considered it a “superpower on their phone, allowing them to put a name to a face and dig up online photos of someone that the person might not even realize were online.”

Advocates have warned that stronger privacy laws are needed to stop law enforcement from abusing Clearview AI’s network, which Hill described as “a Shazam for people.”

Smith said the officer disregarded department guidelines by conducting the improper face scans.

“To ensure that the software is used for its intended purposes, we have put in place internal operational guidelines and adhere to the Clearview AI terms of service,” Smith said. “Both have language that clearly states that this is a tool for official use and is not to be used for personal reasons.”

Amazon Ring stops letting police request footage in Neighbors app after outcry

Neighborhood watch —

Warrantless access may still be granted during vaguely defined “emergencies.”

Amazon Ring has shut down a controversial feature in its community safety app Neighbors that has allowed police to contact homeowners and request doorbell and surveillance camera footage without a warrant for years.

In a blog post, head of the Neighbors app Eric Kuhn confirmed that “public safety agencies like fire and police departments can still use the Neighbors app to share helpful safety tips, updates, and community events,” but the Request for Assistance (RFA) tool will be disabled.

“They will no longer be able to use the RFA tool to request and receive video in the app,” Kuhn wrote.

Kuhn did not explain why Neighbors chose to “sunset” the RFA tool, but privacy advocates and lawmakers have long criticized Ring for helping to expand police surveillance in communities, seemingly threatening privacy and enabling racial profiling, CNBC reported. Among the staunchest critics of Ring’s seemingly tight relationship with law enforcement is the Electronic Frontier Foundation (EFF), which has long advocated for Ring and its users to stop sharing footage with police without a warrant.

In a statement provided to Ars, EFF senior policy analyst Matthew Guariglia noted that Ring had launched the RFA tool after EFF and other organizations had criticized Ring for allowing police to privately email warrantless requests for footage in the Neighbors app. Rather than end requests through the app entirely, Ring appeared to see the RFA tool as a middle ground, providing transparency about how many requests were being made, without ending police access to community members readily sharing footage on the app.

“Now, Ring hopefully will altogether be out of the business of platforming casual and warrantless police requests for footage to its users,” Guariglia said.

Moving forward, police and public safety agencies with warrants will still be able to request footage, which Amazon documents in transparency reports published every six months. These reports show thousands of search warrant requests and even more “preservation requests,” which allow government agencies to request to preserve user information for up to 90 days, “pending the receipt of a legally valid and binding order.”

“If we are legally required to comply, we will provide information responsive to the government demand,” Ring’s website says.

Ring rebrand embraces “hope and joy”

Guariglia said that Ring sunsetting the RFA tool “is a step in the right direction,” but it has “come after years of cozy relationships with police and irresponsible handling of data” that has, for many, damaged trust in Ring.

In 2022, EFF reported that Ring admitted that “there are ’emergency’ instances when police can get warrantless access to Ring personal devices without the owner’s permission.” And last year, Ring reached a $5.8 million settlement with the Federal Trade Commission, refunding customers for what the FTC described as “compromising its customers’ privacy by allowing any employee or contractor to access consumers’ private videos and by failing to implement basic privacy and security protections, enabling hackers to take control of consumers’ accounts, cameras, and videos.”

Because of this history, Guariglia said that EFF is “still deeply skeptical about law enforcement’s and Ring’s ability to determine what is, or is not, an emergency that requires the company to hand over footage without a warrant or user consent.”

EFF recommends additional steps that Ring could take to enhance user privacy, like enabling end-to-end encryption by default and turning off default audio collection, Guariglia said.

Bloomberg noted that this change to the Neighbors app comes after a new CEO, Liz Hamren, came on board, announcing last year that “Ring was rethinking its mission statement.” Because Ring was adding indoor and backyard home monitoring and business services, the company’s initial mission statement—”to reduce crime in neighborhoods”—was no longer, as founding Ring CEO Jamie Siminoff had promoted it, “at the core” of what Ring does.

In Kuhn’s blog post, barely any attention is given to ending the RFA tool. A Ring spokesperson declined to tell Ars how many users had volunteered to use the tool, so it remains unclear how popular it was.

Rather than clarifying the RFA tool controversy, Kuhn’s blog primarily focused on describing how much Ring users loved “heartwarming or silly” footage like a “bear relaxing in a pool.” Under Hamren and Kuhn’s guidance, it appears that the Neighbors app is embracing a new mission of connecting communities to find “hope and joy” in their areas by adding new features to Neighbors like Moments and Best of Ring.

By contrast, when Ring introduced the RFA tool, it said that its mission was “to make neighborhoods safer for everyone.” On a help page, Ring bragged that police had used Neighbors to recover stolen guns and medical supplies. Because of these selling points, Ring’s community safety features may still be priorities for some users. So, while Ring may be ready to move on from highlighting its partnership with law enforcement as a “core” part of its service, its users may still be used to seeing their cameras as tools that should be readily accessible to police.

As law enforcement agencies lose access to Neighbors’ RFA tool, Guariglia said that it’s important to raise awareness among Ring owners that police can’t demand access to footage without a warrant.

“This announcement will not stop police from trying to get Ring footage directly from device owners without a warrant,” Guariglia said. “Ring users should also know that when police knock on their door, they have the right to, and should, request that police get a warrant before handing over footage.”
