WhatsApp says no government will make it weaken its encryption | Technology

WhatsApp’s leadership says it will not “lower the security” of the messaging app at the demand of any country.

It would be “foolish” to heed any government that asked the app to weaken its encryption, said Will Cathcart, the head of WhatsApp, in an interview with the BBC.

In the UK, a government proposal to identify child sexual abuse images raises the possibility of scanning private messages.

In response to WhatsApp’s stance, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) criticized the app, claiming it is on the “front line” of online child sexual abuse.

The British government says tech companies have an obligation to tackle the problem. Its proposal is part of the Online Safety Bill, whose consideration has been postponed to the coming months (when Boris Johnson will formally step down as prime minister and a new Conservative leader will take over the post).


“They (companies) should not ignore the clear risks that end-to-end encryption could blind them to this (child abuse) content and undermine efforts to identify perpetrators,” a British government spokesperson said.

“We will continue to work with the technology industry to support the development of innovative methods that protect public safety without compromising privacy.”

End-to-end encryption (E2EE) provides the most robust level of security for messages, because only the recipient holds the key to decrypt them. This is considered essential for private communication.

This technology underpins online exchanges in apps like WhatsApp and Signal, and can optionally be used on Facebook Messenger and Telegram.
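The core property described above – that a message is readable only by whoever holds the key, so servers in between see nothing – can be illustrated with a deliberately simplified sketch. This is a toy one-time-pad in Python, not WhatsApp’s actual scheme (which is based on the Signal protocol with key exchange and ratcheting); it only demonstrates the keyholder-only-decryption idea.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher: illustrative only, NOT a real messaging protocol.
    return bytes(p ^ k for p, k in zip(plaintext, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"hello"
key = secrets.token_bytes(len(message))  # known only to sender and recipient

ciphertext = encrypt(message, key)

# The recipient, holding the key, recovers the message.
assert decrypt(ciphertext, key) == message

# A server relaying the ciphertext without the key sees only noise;
# guessing a different key yields garbage, not the message.
wrong_key = secrets.token_bytes(len(message))
assert decrypt(ciphertext, wrong_key) != message
```

The point of contention in the article follows directly from this property: since the intermediary cannot read the ciphertext, any content inspection must happen on the user’s device, before encryption.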


Only the sender and recipient can read these messages – so neither tech companies nor the police have access to them.

The debate in the tech community centers on the British government’s promise to support the development of tools that can detect illegal images within or around the E2EE environment, in theory without violating user privacy.

Experts question whether this is feasible – and many believe the only option would be so-called “client-side scanning”. These are systems that scan messages (texts, photos, videos and files) and compare their content against databases of flagged material – in this case, material involving child abuse. This processing takes place on the user’s device, before the message reaches its recipient.
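The mechanism the paragraph describes – compare each outgoing attachment against a database of known material before it is encrypted and sent – can be sketched as follows. This is a hypothetical illustration: real systems use perceptual hashes (such as PhotoDNA) so that near-duplicates also match, not the plain SHA-256 used here, and the hash database and function names are invented for the example.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# Real deployments would use perceptual hashing, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def scan_before_send(attachment: bytes) -> bool:
    """Runs on the sender's device BEFORE encryption - which is exactly
    why critics argue it undermines the guarantees of E2EE."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

# A known image is flagged; anything else passes through.
assert scan_before_send(b"known-bad-image-bytes") is True
assert scan_before_send(b"holiday-photo-bytes") is False
```

The sketch makes the critics’ objection concrete: the scan must see the plaintext on the device, so the content is inspected before encryption ever applies.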

For critics, this, in practice, destroys the foundations of E2EE, since messages are no longer private.

“Client-side scanning doesn’t work in practice,” argues Cathcart.

As millions of people around the world use WhatsApp to communicate, the app needs to maintain the same privacy standards in all countries, he adds.

“If we had to lower security for the whole world to accommodate the requirement of one country, (…) it would be foolish for us to accept that. It would make our product less desirable to 98% of users because of the requirements of 2%,” Cathcart told BBC News.

The European Commission, for its part, said tech companies must “detect, report, block and remove” child abuse content from their platforms.

“What is being proposed is that we – either directly or indirectly, via software – read everyone’s messages,” argues Cathcart. “I don’t think people want that.”

The UK and the European Union plan to echo Apple’s efforts, announced last year, to scan personal photographs on iPhones for abusive content before they were uploaded to iCloud.

However, Apple backed off after privacy advocacy groups claimed the company was setting a dangerous precedent and that the system could be leveraged by authoritarian states to surveil and act against their citizens.

Ella Jakubowska, an adviser to the European Digital Rights group, said that using “‘client-side scanning’ is almost equivalent to spying on everyone’s phone.”

“This opens a door for malicious actors to be able to see your messages,” she said.

Monica Horten, policy manager at the Open Rights Group, agreed. “If Apple can’t get this right, how can governments do it? ‘Client-side scanning’ is a form of mass surveillance. It’s a profound interference with privacy.”

Alan Woodward, a professor at the University of Surrey (UK), told BBC News that this type of scan can be misused.

“Of course, if you say ‘do you think we should protect children?’ everybody will say ‘yes’. But what if you say ‘I’m going to put something on your phone that will scan all your pictures and compare them to a database’. Then you start to see the implications of that.”

‘Only a fraction of abuse is detected’

Cathcart claims that WhatsApp can already detect hundreds of thousands of child abuse images online.

“We report more (harmful content) than any other internet service in the world,” he says.

But this claim is challenged by the child protection society.

“The reality is that currently, under the cloak of encryption, they (WhatsApp) identify only a fraction of the level of abuse that sister platforms Facebook and Instagram are able to detect,” argues Andy Burrows, head of child safety online policy at the NSPCC.

Burrows claims the direct messaging app has become the “front line” of online child sexual abuse.

“Two-thirds of the child abuse content currently identified and removed is found in and removed from private messages,” Burrows tells BBC News.

“It is becoming increasingly clear that child protection does not need to be pitted against adult privacy. We want a discussion about how to strike that balance.”

This text was originally published at https://www.bbc.com/portuguese/geral-62361452
