The dark side of Apple’s new child safety features

In early August, Apple announced a set of new child safety features, slated to arrive in upcoming updates to iOS, iPadOS and macOS. The first change has the Messages app warn minors about sexually explicit images, with an option for parents to be alerted if a child views or sends such an image. The second tweaks Siri and Search to intervene when someone makes queries related to Child Sexual Abuse Material (CSAM). The last, and most significant, introduces automatic on-device matching of photos stored in iCloud Photos against a database of known CSAM. If the system finds enough flagged pictures, a report is sent to a moderator for evaluation; if the moderator confirms the match, Apple decrypts the photos and shares them with the relevant authorities.
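To make that threshold mechanism concrete, here is a rough sketch of the idea in Swift. It is not Apple's actual implementation: the real system uses a perceptual hash (NeuralHash) together with private set intersection and threshold secret sharing, whereas this toy version substitutes a plain SHA-256 and an in-memory hash set. The ~30-image threshold is the figure Apple has mentioned publicly; the function and variable names are purely illustrative.

```swift
import Foundation
import CryptoKit

// Stand-in for Apple's perceptual "NeuralHash". A real perceptual hash is
// robust to resizing and re-encoding; a cryptographic hash like SHA-256 is not.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical on-device database of hashes of known CSAM. In Apple's design
// this list ships on-device in a blinded, non-readable form.
let knownHashes: Set<String> = [
    // ...hashes supplied by child-safety organisations would go here...
]

// Number of matches required before anything is surfaced for human review.
// Apple has said the threshold is on the order of 30 matched images.
let reportThreshold = 30

// Returns true when the library crosses the threshold. Only at that point
// would flagged photos be decrypted, reviewed by a moderator and, if the
// match is confirmed, reported to the authorities.
func shouldEscalate(photos: [Data]) -> Bool {
    let matchCount = photos.filter { knownHashes.contains(imageHash($0)) }.count
    return matchCount >= reportThreshold
}
```

The key design point the sketch captures is that matching happens on your device, against a list you cannot inspect, and that a single match is never acted on; only an accumulation of matches past the threshold triggers human review.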


These features were announced with the stated intent of protecting children from sexual predators, and on paper they sound well intentioned. Yet the changes have been met with considerable backlash and a sense of betrayal among users.


Of the three features, the changes to Siri and Search have been largely uncontroversial. The other two, however, have drawn massive opposition, ranging from discontent over the tweak to the Messages app to outrage over the CSAM scanning. The pushback was strong enough that Apple was forced to delay (but not stop) the rollout of these features.

It may still be unclear to you why there is any opposition at all, or why I'm asking you to be scared.


Even if well-intended, these new features are a massive invasion of privacy and have the potential to inf ..