Story

10 Aug 2021

Apple plans to scan iPhones for child sexual abuse images; privacy concerns raised

Apple announced in August 2021 that it would scan photo libraries stored on iPhones for images of child sexual abuse and introduce a system to monitor end-to-end encrypted messages on children's accounts, which parents could activate by opting in. Customers and civil society raised privacy concerns following the announcement. Access Now urged Apple to roll back its plans for client-side scanning and other measures that would undermine end-to-end encryption.

In September 2021, Apple announced it would delay the rollout of the plan, citing 'feedback from customers, advocacy groups, researchers and others'.

Timeline