Digital Services Act

EU considers calculating X fines by including revenue from Musk’s other firms

“After Breton resigned in September, he bequeathed his fining powers to competition and digital boss Margrethe Vestager. Decisions on the penalties and how they are calculated would ultimately lie with Vestager,” Bloomberg wrote. The European Commission would have the final say.

“The commission hasn’t yet decided whether to penalize X, and the size of any potential fine is still under discussion,” Bloomberg wrote, citing its anonymous sources. “Penalties may be avoided if X finds ways to satisfy the watchdog’s concerns.”

X says SpaceX revenue should be off-limits

Although X faces potential DSA fines, it will avoid penalties under the EU’s Digital Markets Act (DMA). The European Commission announced yesterday that X does not “qualify as a gatekeeper in relation to its online social networking service, given that the investigation revealed that X is not an important gateway for business users to reach end users.”

But documents related to the DMA probe of X raise the possibility of treating multiple Musk-led companies as a single entity called the “Musk Group” for compliance purposes. In a March 2024 letter to Musk and X Holdings Corp., “the Commission set out its preliminary views on the possible designation of Mr. Elon Musk and the companies that he controls (‘the Musk Group’) as a gatekeeper,” according to a document signed by Breton.

X has argued that it wouldn’t make sense to include Musk’s other companies in revenue calculations when issuing penalties. “X Holdings Corp. submits that the combined market value of the Musk Group does not accurately reflect X’s monetization potential in the Union or its financial capacity,” the document said. “In particular, it argues that X and SpaceX provide entirely different services to entirely different users, so that there is no gateway effect, and that the undertakings controlled by Mr. Elon Musk ‘do not form one financial front, as the DMA presumes.’”

We contacted X and SpaceX today and will update this article if they provide any comment.

Meta risks sanctions over “sneaky” ad-free plans confusing users, EU says

Under pressure —

Consumer laws may change Meta’s ad-free plans before EU’s digital crackdown does.

The European Commission (EC) has finally taken action to block Meta’s heavily criticized plan to charge a subscription fee to users who value privacy on its platforms.

Surprisingly, this step wasn’t taken under laws like the Digital Services Act (DSA), the Digital Markets Act (DMA), or the General Data Protection Regulation (GDPR).

Instead, the EC announced Monday that Meta risked sanctions under EU consumer laws if it could not resolve key concerns about Meta’s so-called “pay or consent” model.

Meta’s model is seemingly problematic, the commission said, because Meta “requested consumers overnight to either subscribe to use Facebook and Instagram against a fee or to consent to Meta’s use of their personal data to be shown personalized ads, allowing Meta to make revenue out of it.”

Because users were given such short notice, they may have been “exposed to undue pressure to choose rapidly between the two models, fearing that they would instantly lose access to their accounts and their network of contacts,” the EC said.

To protect consumers, the EC joined national consumer protection authorities, sending a letter to Meta requiring the tech giant to propose solutions to resolve the commission’s biggest concerns by September 1.

That Meta’s “pay or consent” model may be “misleading” is a top concern because it uses the term “free” for ad-based plans, even though Meta “can make revenue from using their personal data to show them personalized ads.” While Meta does not consider giving away personal information to be a cost to users, the EC’s commissioner for justice, Didier Reynders, apparently does.

“Consumers must not be lured into believing that they would either pay and not be shown any ads anymore, or receive a service for free, when, instead, they would agree that the company used their personal data to make revenue with ads,” Reynders said. “EU consumer protection law is clear in this respect. Traders must inform consumers upfront and in a fully transparent manner on how they use their personal data. This is a fundamental right that we will protect.”

Additionally, the EC is concerned that Meta users might be confused about how “to navigate through different screens in the Facebook/Instagram app or web-version and to click on hyperlinks directing them to different parts of the Terms of Service or Privacy Policy to find out how their preferences, personal data, and user-generated data will be used by Meta to show them personalized ads.” They may also find Meta’s “imprecise terms and language” confusing, such as Meta referring to “your info” instead of clearly referring to consumers’ “personal data.”

To resolve the EC’s concerns, Meta may have to give EU users more time to decide if they want to pay to subscribe or consent to personal data collection for targeted ads. Or Meta may have to take more drastic steps by altering language and screens used when securing consent to collect data or potentially even scrapping its “pay or consent” model entirely, as pressure in the EU mounts.

So far, Meta has defended its model against claims that it violates the DMA, the DSA, and the GDPR, and Meta’s spokesperson told Ars that Meta continues to defend the model while facing down the EC’s latest action.

“Subscriptions as an alternative to advertising are a well-established business model across many industries,” Meta’s spokesperson told Ars. “Subscription for no ads follows the direction of the highest court in Europe and we are confident it complies with European regulation.”

Meta’s model is “sneaky,” EC said

Since last year, the social media company has argued that its “subscription for no ads” model was “endorsed” by the highest court in Europe, the Court of Justice of the European Union (CJEU).

However, privacy advocates have noted that this alleged endorsement came following a CJEU case under the GDPR and was only presented as a hypothetical, rather than a formal part of the ruling, as Meta seems to interpret.

What the CJEU said was that “users must be free to refuse individually”—”in the context of” signing up for services—”to give their consent to particular data processing operations not necessary” for Meta to provide such services “without being obliged to refrain entirely from using the service.” That “means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations,” the CJEU said.

The nuance here may matter when it comes to Meta’s proposed solutions, even if the EC accepts the CJEU’s suggestion of an acceptable alternative as setting some sort of legal precedent. Because the consumer protection authorities brought the action over Meta’s sudden change to the consent model for existing users, rather than “in the context of” signing up for services, Meta may struggle to persuade the EC that existing users weren’t misled and pressured into paying for a subscription or consenting to ads, given how fast Meta’s policy shifted.

Meta risks sanctions if a compromise can’t be reached, the EC said. Under the EU’s Unfair Contract Terms Directive, for example, Meta could be fined up to 4 percent of its annual turnover if consumer protection authorities are unsatisfied with Meta’s proposed solutions.

The EC’s vice president for values and transparency, Věra Jourová, provided a statement in the press release, calling Meta’s abrupt introduction of the “pay or consent” model “sneaky.”

“We are proud of our strong consumer protection laws which empower Europeans to have the right to be accurately informed about changes such as the one proposed by Meta,” Jourová said. “In the EU, consumers are able to make truly informed choices and we now take action to safeguard this right.”

EU accuses TikTok of failing to stop kids pretending to be adults

Getting TikTok’s priorities straight —

TikTok becomes the second platform suspected of Digital Services Act breaches.

The European Commission (EC) is concerned that TikTok isn’t doing enough to protect kids, alleging that the short-video app may be sending kids down rabbit holes of harmful content while making it easy for kids to pretend to be adults and avoid the protective content filters that do exist.

The allegations came Monday when the EC announced a formal investigation into how TikTok may be breaching the Digital Services Act (DSA) “in areas linked to the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content.”

“We must spare no effort to protect our children,” Thierry Breton, European Commissioner for Internal Market, said in the press release, reiterating that the “protection of minors is a top enforcement priority for the DSA.”

This makes TikTok the second platform investigated for possible DSA breaches after X (aka Twitter) came under fire last December. Both are being scrutinized after submitting transparency reports in September that the EC said fell short of the DSA’s strict standards, with predictable shortcomings such as insufficient advertising transparency and inadequate data access for researchers.

But while X is additionally being investigated over alleged dark patterns and disinformation—following accusations last October that X wasn’t stopping the spread of Israel/Hamas disinformation—it’s TikTok’s young user base that appears to be the focus of the EC’s probe into its platform.

“As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online,” Breton said. “We are launching this formal infringement proceeding today to ensure that proportionate action is taken to protect the physical and emotional well-being of young Europeans.”

Over the coming months, the EC will likely request more information from TikTok, picking apart its DSA transparency report. The probe could also include interviews with TikTok staff or inspections of TikTok’s offices.

Upon concluding its investigation, the EC could require TikTok to take interim measures to fix any issues that are flagged. The Commission could also make a decision regarding non-compliance, potentially subjecting TikTok to fines of up to 6 percent of its global turnover.

An EC press officer, Thomas Regnier, told Ars that the Commission suspected that TikTok “has not diligently conducted” risk assessments to properly maintain mitigation efforts protecting “the physical and mental well-being of their users, and the rights of the child.”

In particular, its algorithm may risk “stimulating addictive behavior,” and its recommender systems “might drag its users, in particular minors and vulnerable users, into a so-called ‘rabbit hole’ of repetitive harmful content,” Regnier told Ars. Further, TikTok’s age verification system may be subpar, with the EU alleging that TikTok perhaps “failed to diligently assess the risk of 13-17-year-olds pretending to be adults when accessing TikTok,” Regnier said.

To better protect TikTok’s young users, the EU’s investigation could force TikTok to update its age-verification system and overhaul its default privacy, safety, and security settings for minors.

“In particular, the Commission suspects that the default settings of TikTok’s recommender systems do not ensure a high level of privacy, security, and safety of minors,” Regnier said. “The Commission also suspects that the default privacy settings that TikTok has for 16-17-year-olds are not the highest by default, which would not be compliant with the DSA, and that push notifications are, by default, not switched off for minors, which could negatively impact children’s safety.”

TikTok could avoid steep fines by committing to remedies recommended by the EC at the conclusion of its investigation.

Regnier told Ars that the EC does not comment on ongoing investigations, but its probe into X has spanned three months so far. Because the DSA does not provide any deadlines that may speed up these kinds of enforcement proceedings, ultimately, the duration of both investigations will depend on how much “the company concerned cooperates,” the EU’s press release said.

A TikTok spokesperson told Ars that TikTok “would continue to work with experts and the industry to keep young people on its platform safe,” confirming that the company “looked forward to explaining this work in detail to the European Commission.”

“TikTok has pioneered features and settings to protect teens and keep under-13s off the platform, issues the whole industry is grappling with,” TikTok’s spokesperson said.

All online platforms are now required to comply with the DSA, but enforcement on TikTok began near the end of July 2023. A TikTok press release last August promised that the platform would be “embracing” the DSA. But in its transparency report, submitted the next month, TikTok acknowledged that the report only covered “one month of metrics” and may not satisfy DSA standards.

“We still have more work to do,” TikTok’s report said, promising that “we are working hard to address these points ahead of our next DSA transparency report.”

Musk’s X hit with EU’s first investigation of Digital Services Act violations

EU investigates X —

EU probes disinformation, election policy, Community Notes, and paid checkmarks.

The European Union has opened a formal investigation into whether Elon Musk’s X platform (formerly Twitter) violated the Digital Services Act (DSA), which could result in fines of up to 6 percent of global revenue. A European Commission announcement today said the agency “opened formal proceedings to assess whether X may have breached the Digital Services Act (DSA) in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers.”

This is the commission’s first formal investigation under the Digital Services Act, which applies to large online platforms and has requirements on content moderation and transparency. The step has been in the works since at least October, when a formal request for information was sent amid reports of widespread Israel/Hamas disinformation.

The European Commission today said it “decided to open formal infringement proceedings against X under the Digital Services Act” after reviewing X’s replies to the request for information on topics including “the dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel.” The commission said the investigation will focus on dissemination of illegal content, the effectiveness of measures taken to combat information manipulation on X, transparency, and “a suspected deceptive design of the user interface.”

The illegal content probe will focus on “risk assessment and mitigation measures” and “the functioning of the notice and action mechanism for illegal content” that is mandated by the DSA. The commission said this will be evaluated “in light of X’s content moderation resources,” a reference to the deep staff cuts made by Musk since purchasing Twitter in October 2022.

Community Notes and paid checkmarks under review

The information manipulation portion of the investigation will evaluate “the effectiveness of X’s so-called ‘Community Notes’ system in the EU and the effectiveness of related policies mitigating risks to civic discourse and electoral processes,” the announcement said. The transparency probe “concerns suspected shortcomings in giving researchers access to X’s publicly accessible data as mandated by Article 40 of the DSA, as well as shortcomings in X’s ads repository,” the commission said.

Musk’s decision to make “verification” checkmarks a paid feature will figure into the commission’s probe of whether the X user interface has a deceptive design. The commission said it will evaluate “checkmarks linked to certain subscription products, the so-called Blue checks.”

The investigation will include more requests for information, interviews, and “inspections,” the commission said. There is no legal deadline for completing the investigation.

“The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures, and non-compliance decisions. The Commission is also empowered to accept any commitment made by X to remedy on the matters subject to the proceeding,” the announcement said.

In a statement today, X said it is committed to complying with the Digital Services Act and is cooperating with regulators. “It is important that this process remains free of political influence and follows the law,” the company said. “X is focused on creating a safe and inclusive environment for all users on our platform, while protecting freedom of expression, and we will continue to work tirelessly towards this goal.”
