End-to-end encryption

Backdoors that let cops decrypt messages violate human rights, EU court says

Building of the European Court of Human Rights in Strasbourg (France).

The European Court of Human Rights (ECHR) has ruled that weakening end-to-end encryption disproportionately risks undermining human rights. The international court’s decision could potentially disrupt the European Commission’s proposed plans to require email and messaging service providers to create backdoors that would allow law enforcement to easily decrypt users’ messages.

This ruling came after Russia’s intelligence agency, the Federal Security Service (FSB), began requiring Telegram to share users’ encrypted messages to deter “terrorism-related activities” in 2017, the ECHR’s ruling said. A Russian Telegram user alleged that the FSB’s requirement violated his rights to a private life and private communications, as well as the rights of all Telegram users.

The Telegram user, apparently disturbed, moved to block the required disclosures after Telegram refused to comply with an FSB order to decrypt the messages of six users suspected of terrorism. According to Telegram, “it was technically impossible to provide the authorities with encryption keys associated with specific users,” and therefore, “any disclosure of encryption keys” would affect the “privacy of the correspondence of all Telegram users,” the ECHR’s ruling said.
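Telegram’s actual MTProto protocol is more involved, but a minimal, illustrative sketch of generic end-to-end key agreement helps show why a relay service has no per-user decryption keys to hand over: the symmetric message key is derived independently on the two endpoints, and the server only ever sees public keys and ciphertext. This sketch uses Python’s third-party cryptography package; the function and variable names here are hypothetical and not drawn from Telegram’s implementation.

```python
# Illustrative sketch only: generic X25519 key agreement + AES-GCM, not Telegram's protocol.
# Requires the third-party "cryptography" package (pip install cryptography).
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_message_key(own_private, peer_public):
    """Each endpoint derives the same symmetric key from the Diffie-Hellman shared secret."""
    shared_secret = own_private.exchange(peer_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"demo-e2e-message-key").derive(shared_secret)


# Alice and Bob each generate a key pair on their own devices.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Only the public halves are exchanged; a relay server may carry these,
# but it never learns either private key or the derived message key.
alice_key = derive_message_key(alice_priv, bob_priv.public_key())
bob_key = derive_message_key(bob_priv, alice_priv.public_key())
assert alice_key == bob_key  # both endpoints hold the key; the server holds neither

# Alice encrypts; the server stores and forwards only the nonce and ciphertext.
nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at noon", None)

# Bob decrypts with his independently derived copy of the key.
print(AESGCM(bob_key).decrypt(nonce, ciphertext, None))
```

Under these assumptions, the only way for the operator to satisfy a decryption order is to change the software itself, for example by escrowing keys or weakening key generation, which is why such a change would affect every user rather than only the targeted accounts.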

For refusing to comply, Telegram was fined, and one court even ordered the app blocked in Russia, while dozens of Telegram users rallied to keep challenging the order and preserve Telegram services in the country. Ultimately, the users’ multiple court challenges failed, sending the case before the ECHR while Telegram remained, somewhat tenuously, available in Russia.

The Russian government told the ECHR that “allegations that the security services had access to the communications of all users” were “unsubstantiated” because their request only concerned six Telegram users.

The government further argued that Telegram providing encryption keys to the FSB “did not mean that the information necessary to decrypt encrypted electronic communications would become available to its entire staff.” Essentially, the government believed that FSB staff’s “duty of discretion” would prevent any intrusion on Telegram users’ private lives as described in the ECHR complaint.

Seemingly most critically, the government told the ECHR that any intrusion on private lives resulting from decrypting messages was “necessary” to combat terrorism in a democratic society. To back up this claim, the government pointed to a 2017 terrorist attack that was “coordinated from abroad through secret chats via Telegram.” The government claimed that a second terrorist attack that year was prevented after the government discovered it was being coordinated through Telegram chats.

However, privacy advocates backed up Telegram’s claim that the messaging service couldn’t technically build a backdoor for governments without impacting all of its users. They also argued that the threat of mass surveillance alone could be enough to infringe on human rights. The European Information Society Institute (EISI) and Privacy International told the ECHR that even if governments never used the required disclosures to mass surveil citizens, the mere capability could have a chilling effect on users’ speech or prompt service providers to issue radical software updates weakening encryption for all users.

In the end, the ECHR concluded that the Telegram user’s rights had been violated, partly due to privacy advocates and international reports that corroborated Telegram’s position that complying with the FSB’s disclosure order would force changes impacting all its users.

The “confidentiality of communications is an essential element of the right to respect for private life and correspondence,” the ECHR’s ruling said. Thus, requiring messages to be decrypted by law enforcement “cannot be regarded as necessary in a democratic society.”

Martin Husovec, a law professor who helped to draft EISI’s testimony, told Ars that EISI is “obviously pleased that the Court has recognized the value of encryption and agreed with us that state-imposed weakening of encryption is a form of indiscriminate surveillance because it affects everyone’s privacy.”

Apple warns proposed UK law will affect software updates around the world

Apple may leave the UK if required to provide advance notice of product updates.

Apple is “deeply concerned” that proposed changes to a United Kingdom law could give the UK government unprecedented power to “secretly veto” privacy and security updates to its products and services, the tech giant said in a statement provided to Ars.

If passed, potentially this spring, the amendments to the UK’s Investigatory Powers Act (IPA) could deprive not just UK users, but all users globally of important new privacy and security features, Apple warned.

“Protecting our users’ privacy and the security of their data is at the very heart of everything we do at Apple,” Apple said. “We’re deeply concerned the proposed amendments” to the IPA “now before Parliament place users’ privacy and security at risk.”

The IPA was initially passed in 2016 to ensure that UK officials had lawful access to user data to investigate crimes like child sexual exploitation or terrorism. Proposed amendments were announced last November, after a review showed that the “Act has not been immune to changes in technology over the last six years” and “there is a risk that some of these technological changes have had a negative effect on law enforcement and intelligence services’ capabilities.”

The proposed amendments would require any company that fields government data requests to notify UK officials of any updates it plans to make that could restrict the UK government’s access to this data, including any updates impacting users outside the UK.

UK officials said that this would “help the UK anticipate the risk to public safety posed by the rolling out of technology by multinational companies that precludes lawful access to data. This will reduce the risk of the most serious offenses such as child sexual exploitation and abuse or terrorism going undetected.”

According to the BBC, the House of Lords will begin debating the proposed changes on Tuesday.

Ahead of that debate, Apple described the amendments on Monday as “an unprecedented overreach by the government” that “if enacted” could allow the UK to “attempt to secretly veto new user protections globally, preventing us from ever offering them to customers.”

In a letter last year, Apple argued that “it would be improper for the Home Office to act as the world’s regulator of security technology.”

Apple told the UK Home Office that imposing “secret requirements on providers located in other countries” that apply to users globally “could be used to force a company like Apple, that would never build a backdoor, to publicly withdraw critical security features from the UK market, depriving UK users of these protections.” It could also “dramatically disrupt the global market for security technologies, putting users in the UK and around the world at greater risk,” Apple claimed.

The proposed changes, Apple said, “would suppress innovation, stifle commerce, and—when combined with purported extraterritorial application—make the Home Office the de facto global arbiter of what level of data security and encryption are permissible.”

UK defends proposed changes

The UK Home Office has repeatedly stressed that these changes do not “provide powers for the Secretary of State to approve or refuse technical changes,” but “simply” require companies “to inform the Secretary of State of relevant changes before those changes are implemented.”

“The intention is not to introduce a consent or veto mechanism or any other kind of barrier to market,” a UK Home Office fact sheet said. “A key driver for this amendment is to give operational partners time to understand the change and adapt their investigative techniques where necessary, which may in some circumstances be all that is required to maintain lawful access.”

The Home Office has also claimed that “these changes do not directly relate to end-to-end encryption,” while admitting that they “are designed to ensure that companies are not able to unilaterally make design changes which compromise exceptional lawful access where the stringent safeguards of the IPA regime are met.”

This seems to suggest that companies will not be allowed to cut off the UK government from accessing encrypted data under certain circumstances, which concerns privacy advocates who consider end-to-end encryption a vital user privacy and security protection. Earlier this month, civil liberties groups including Big Brother Watch, Liberty, Open Rights Group and Privacy International filed a joint brief opposing the proposed changes, the BBC reported, warning that passing the amendments would be “effectively transforming private companies into arms of the surveillance state and eroding the security of devices and the Internet.”

“We have always been clear that we support technological innovation and private and secure communications technologies, including end-to-end encryption, but this cannot come at a cost to public safety,” a UK government official told the BBC.

The UK government may face opposition to the amendments from more than just tech companies and privacy advocates, though. In Apple’s letter last year, the tech giant noted that the proposed changes to the IPA could conflict with EU and US laws, including the EU’s General Data Protection Regulation, considered the world’s strongest privacy law.

Under the GDPR, companies must implement measures to safeguard users’ personal data, Apple said, noting that “encryption is one means by which a company can meet” that obligation.

“Secretly installing backdoors in end-to-end encrypted technologies in order to comply with UK law for persons not subject to any lawful process would violate that obligation,” Apple argued.
