Remember when Apple announced plans to build a scanning feature into iOS that would check photos for CSAM content? If you thought it would remain an iOS-only thing, think again: according to a leaked document obtained by Alec Muffett, the EU is considering legislation that would make such scanning mandatory across all devices.
According to the document, the EU has deemed the measures taken so far insufficient to combat the spread of CSAM content, and as such, it believes legislation is now required to prevent it.
“The proposed Regulation consists of two main building blocks: first, it imposes on providers obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges, and, second, it establishes the EU Centre on Child Sexual Abuse as a decentralised agency to enable the implementation of the new Regulation.”
What this means is that should this legislation pass, platforms that offer encrypted messaging, such as WhatsApp and iMessage (just to name a few), would be required to scan messages to detect CSAM content.
Now, Apple has yet to roll out its CSAM scanning feature, as it was put on the backburner due to immense backlash. The backlash stems from privacy concerns: the system would work by checking photos against hashes of known CSAM content, and privacy experts have asked what is to stop governments or other nefarious parties from abusing the system to target political rivals, minority groups, and so on.
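To see what hash-based matching means in principle, here is a minimal Python sketch. Note that this is a simplified illustration using a plain cryptographic hash and a made-up hash set; Apple's actual proposal used a perceptual hash (NeuralHash) combined with cryptographic matching techniques, precisely because plain hashes break on any re-encode or crop.

```python
import hashlib

# Hypothetical set of hashes of known prohibited images (illustrative only).
# This example value is simply the SHA-256 of the bytes b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

A byte-identical copy of a flagged file would match, but changing even one byte produces a completely different hash, which is why real systems rely on perceptual hashing, and why critics worry about who controls the hash database.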
Keep in mind that nothing has been officially announced and that this legislation needs to go through several steps in order to become law, so it might be a while before we see any changes, if there are any to begin with.