The European Parliament’s Civil Liberties Committee has voted to adopt a position that rules out mass scanning of encrypted messages as part of an effort to identify and remove exploitative content on messaging platforms.
The committee on Tuesday voted by a wide margin to adopt the position, which is part of a lengthy process of negotiations and discussions to adopt an EU-wide mandate on detection and prevention of child exploitation material. One of the major points of contention in the draft law was a provision that would have required platform providers to give law enforcement a method for scanning encrypted messages for illegal content. Similar provisions have surfaced in proposed legislation in the United States and the UK as well, raising privacy concerns and drawing criticism from civil liberties groups around the world.
The vote by the Civil Liberties Committee is a preliminary step, as the broader European Parliament will take up the proposal next week.
“To meet this compelling challenge effectively, we have found a legally sound compromise supported by all political groups. It will create uniform rules to fight the sexual abuse of children online, meaning that all providers will have to assess if there is a risk of abuse in their services and mitigate those with tailor-made measures,” said Javier Zarzalejos, a member of EU Parliament from Spain.
“As a last resort, detection orders can be used to take down abusive material still circulating on the internet. This agreement strikes a balance between protecting children and protecting privacy.”
In lieu of mass scanning of end-to-end encrypted messages, which would require fundamental technological changes in those platforms, the committee proposed time-limited judicial orders as the main legal means for removing exploitation material. Those orders would be highly targeted and used only as a last resort, when other mitigation efforts have not worked.
“To avoid mass surveillance or generalised monitoring of the internet, the draft law would allow judicial authorities to authorise time-limited orders, as a last resort, to detect any CSAM and take it down or disable access to it, when mitigation measures are not effective in taking it down. In addition, MEPs emphasise the need to target detection orders to individuals or groups (including subscribers to a channel) linked to child sexual abuse using ‘reasonable grounds of suspicion’”, the committee said in a statement.
“In the adopted text, MEPs excluded end-to-end encryption from the scope of the detection orders to guarantee that all users’ communications are secure and confidential. Providers would be able to choose which technologies to use as long as they comply with the strong safeguards foreseen in the law, and subject to an independent, public audit of these technologies.”
The committee’s position is a positive sign for providers of encrypted messaging services such as Signal and WhatsApp, and civil liberties groups are encouraged by the move.
“The European Parliament’s position states that end-to-end encrypted private message services – like WhatsApp, Signal or ProtonMail – are not subject to scanning technologies. This is a strong and clear protection to stop encrypted message services from being weakened in a way that could harm everyone that relies on them – a key demand of civil society and technologists,” said European Digital Rights, a network of civil liberties organizations in Europe.
“Several other provisions throughout the text, such as a horizontal protection of encrypted services, give further confirmation of the Parliament’s will to protect one of the only ways we all have to keep our digital information safe.”
The issue will go to the full EU Parliament for consideration on Nov. 20.