Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).
Source: Gizmodo – Apple Officially Cancels Its Plans to Scan iCloud Photos for Child Abuse Material