An EU plan under which all WhatsApp, iMessage and Snapchat accounts could be screened for child abuse content has hit a significant obstacle after internal legal advice said it would probably be annulled by the courts for breaching users’ rights.
Under the proposed “chat controls” regulation, any encrypted service provider could be forced to survey billions of messages, videos and photos for “identifiers” of certain types of content where it was suspected a service was being used to disseminate harmful material.
The providers issued with a so-called “detection order” by national bodies would have to alert police if they found evidence of suspected harmful content being shared or the grooming of children.
Privacy campaigners and the service providers have already warned that the proposed EU regulation, and a similar online safety bill in the UK, risk end-to-end encrypted services such as WhatsApp disappearing from Europe.
Now leaked internal EU legal advice, which was presented to diplomats from the bloc’s member states on 27 April and has been seen by the Guardian, raises significant doubts about the lawfulness of the proposed regulation.