biometric data


Vending machine error reveals secret face image database of college students

“Stupid M&M machines” —

Facial-recognition data is typically used to prompt more vending machine sales.


Aurich Lawson | Mars | Getty Images

Canada-based University of Waterloo is racing to remove M&M-branded smart vending machines from campus after outraged students discovered the machines were covertly collecting facial-recognition data without their consent.

The scandal started when a student using the alias SquidKid47 posted an image on Reddit showing a campus vending machine error message, “Invenda.Vending.FacialRecognitionApp.exe,” displayed after the machine failed to launch a facial recognition application that nobody expected to be part of the process of using a vending machine.

Reddit post shows error message displayed on a University of Waterloo vending machine (cropped and lightly edited for clarity).


“Hey, so why do the stupid M&M machines have facial recognition?” SquidKid47 pondered.

The Reddit post sparked an investigation from a fourth-year student named River Stanley, who was writing for a university publication called MathNEWS.

Stanley sounded the alarm after consulting Invenda sales brochures that promised “the machines are capable of sending estimated ages and genders” of every person who used the machines, without ever requesting consent.

This frustrated Stanley, who discovered that Canada’s privacy commissioner had investigated a shopping mall operator called Cadillac Fairview years earlier, after some of the malls’ informational kiosks were found to be secretly “using facial recognition software on unsuspecting patrons.”

Only because of that official investigation did Canadians learn that “over 5 million nonconsenting Canadians” were scanned into Cadillac Fairview’s database, Stanley reported. Where Cadillac Fairview was ultimately forced to delete the entire database, Stanley wrote that consequences for collecting similarly sensitive facial recognition data without consent for Invenda clients like Mars remain unclear.

Stanley’s report ended with a call for students to demand that the university “bar facial recognition vending machines from campus.”

A University of Waterloo spokesperson, Rebecca Elming, eventually responded, confirming to CTV News that the school had asked to disable the vending machine software until the machines could be removed.

Students told CTV News that their confidence in the university’s administration was shaken by the controversy. Some students claimed on Reddit that they attempted to cover the vending machine cameras while waiting for the school to respond, using gum or Post-it notes. One student pondered whether “there are other places this technology could be being used” on campus.

Elming was not able to confirm the exact timeline for when machines would be removed other than telling Ars it would happen “as soon as possible.” She told Ars she is “not aware of any similar technology in use on campus.” And for any casual snackers on campus wondering when, if ever, the vending machines will be replaced with snack dispensers not equipped with surveillance cameras, Elming confirmed that “the plan is to replace them.”

Invenda claims machines are GDPR-compliant

MathNEWS’ investigation tracked down responses from companies responsible for smart vending machines on the University of Waterloo’s campus.

Adaria Vending Services told MathNEWS that “what’s most important to understand is that the machines do not take or store any photos or images, and an individual person cannot be identified using the technology in the machines. The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface—never taking or storing images of customers.”
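Adaria's claim, taken at face value, describes a design where face detection is used only as a wake-up trigger and no image ever leaves the detection step. A minimal sketch of that privacy property is below; it is purely illustrative, the detector is a stand-in, and nothing here reflects Invenda's actual (proprietary) software:

```python
# Illustrative sketch of "face detection as a motion sensor," as Adaria
# describes it: a camera frame can wake the purchase UI, but only a boolean
# result survives the detection step and no frame is ever stored.
# The detector is a stand-in; Invenda's real software is not public.

from dataclasses import dataclass, field

def face_present(frame: bytes) -> bool:
    """Stand-in detector: pretend any non-empty frame contains a face."""
    return len(frame) > 0

@dataclass
class VendingUI:
    awake: bool = False
    frames_stored: list = field(default_factory=list)  # stays empty by design

    def on_camera_frame(self, frame: bytes) -> None:
        # Only the boolean detection result leaves this method;
        # the frame itself is discarded when the function returns.
        if face_present(frame):
            self.awake = True
        # Deliberately no self.frames_stored.append(frame)

ui = VendingUI()
ui.on_camera_frame(b"\x00" * 1024)  # a camera frame arrives
print(ui.awake, len(ui.frames_stored))  # → True 0
```

Note that even this benign design is in tension with Invenda's own brochures, which advertise sending estimated ages and genders upstream, a step that requires analyzing the face image rather than merely detecting its presence.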

According to Adaria and Invenda, students shouldn’t worry about data privacy because the vending machines are “fully compliant” with the world’s toughest data privacy law, the European Union’s General Data Protection Regulation (GDPR).

“These machines are fully GDPR compliant and are in use in many facilities across North America,” Adaria’s statement said. “At the University of Waterloo, Adaria manages last mile fulfillment services—we handle restocking and logistics for the snack vending machines. Adaria does not collect any data about its users and does not have any access to identify users of these M&M vending machines.”

Under the GDPR, face image data is considered among the most sensitive data that can be collected, typically requiring explicit consent to collect, so it’s unclear how the machines may meet that high bar based on the Canadian students’ experiences.

According to a press release from Invenda, Mars, the maker of M&M candies, was a key part of Invenda’s expansion into North America. Only after closing a $7 million funding round that included deals with Mars and other major clients like Coca-Cola could Invenda push for the global growth that has seemingly vastly expanded its smart vending machines’ data collection and surveillance opportunities.

“The funding round indicates confidence among Invenda’s core investors in both Invenda’s corporate culture, with its commitment to transparency, and the drive to expand global growth,” Invenda’s press release said.

But University of Waterloo students like Stanley now question Invenda’s “commitment to transparency” in North American markets, especially since the company is seemingly openly violating Canadian privacy law, Stanley told CTV News.

On Reddit, while some students joked that SquidKid47’s face “crashed” the machine, others asked if “any pre-law students wanna start up a class-action lawsuit?” One commenter summed up students’ frustration by typing in all caps, “I HATE THESE MACHINES! I HATE THESE MACHINES! I HATE THESE MACHINES!”



FTC suggests new rules to shift parents’ burden of protecting kids to websites

Ending the endless tracking of kids —

FTC seeking public comments on new rules to expand children’s privacy law.


The Federal Trade Commission (FTC) is currently seeking comments on new rules that would further restrict platforms’ efforts to monetize children’s data.

Through the Children’s Online Privacy Protection Act (COPPA), the FTC initially sought to give parents more control over what kinds of information various websites and apps can collect from their kids. Now, the FTC wants to update COPPA and “shift the burden from parents to providers to ensure that digital services are safe and secure for children,” the FTC’s press release said.

“By requiring firms to better safeguard kids’ data, our proposal places affirmative obligations on service providers and prohibits them from outsourcing their responsibilities to parents,” FTC chair Lina Khan said.

Among proposed rules, the FTC would require websites to turn off targeted advertising by default and prohibit sending push notifications to encourage kids to use services more than they want to. Surveillance in schools would be further restricted, so that data is only collected for educational purposes. And data security would be strengthened by mandating that websites and apps “establish, implement, and maintain a written children’s personal information security program that contains safeguards that are appropriate to the sensitivity of the personal information collected from children.”

Perhaps most significantly, COPPA would also be updated to stop companies from retaining children’s data forever, explicitly stating that “operators cannot retain the information indefinitely.” In a statement, commissioner Alvaro Bedoya called this a “critical protection” at a time when “new, machine learning-fueled systems require ever larger amounts of training data.”

These proposed changes were designed to address “the evolving ways personal information is being collected, used, and disclosed, including to monetize children’s data,” the FTC said.

Keeping up with advancing technology, the FTC said, also requires expanding COPPA’s definition of “personal information” to include biometric identifiers. That change was likely inspired by charges brought against Amazon earlier this year, when the FTC accused Amazon of violating COPPA by retaining tens of thousands of children’s Alexa voice recordings forever.

Once the notice of proposed rulemaking is published to the Federal Register, the public will have 60 days to submit comments. The FTC likely anticipates that thousands of parents and stakeholders will weigh in, noting that the last time COPPA was updated, in 2019, more than 175,000 comments were submitted.

Endless tracking of kids not a “victimless crime”

Bedoya said that updating the already-expansive children’s privacy law would prevent known harms. He also expressed concern that these harms are increasingly being overlooked, citing a federal judge in California who preliminarily enjoined California’s Age-Appropriate Design Code in September. That judge had suggested that California’s law was “actually likely to exacerbate” online harm to kids, but Bedoya challenged that decision as reinforcing a “critique that has quietly proliferated around children’s privacy: the idea that many privacy invasions do not actually hurt children.”

For decades, COPPA has protected against the unauthorized or unnecessary collection, use, retention, and disclosure of children’s information, which Bedoya said “endangers children’s safety,” “exposes children and families to hacks and data breaches,” and “allows third-party companies to develop commercial relationships with children that prey on their trust and vulnerability.”

“I think each of these harms, particularly the latter, undermines the idea that the pervasive tracking of children online is [a] ‘victimless crime,'” Bedoya said, adding that “the harms that COPPA sought to prevent remain real, and COPPA remains relevant and profoundly important.”

According to Bedoya, COPPA is more vital than ever, as “we are only at the beginning of an era of biometric fraud.”

Khan characterized the proposed changes as “much-needed” in an “era where online tools are essential for navigating daily life—and where firms are deploying increasingly sophisticated digital tools to surveil children.”

“Kids must be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data,” Khan said.
