Scanning “private” content

Child pornography and other types of sexual abuse of children are unquestionably
heinous crimes; those who participate in them should be caught and severely
punished. But some recent efforts to combat these scourges have gone a long
way down the path toward a kind of AI-driven digital panopticon: one that
would invade everyone's privacy in order to try to catch the people violating
laws against those activities. It is thus no surprise that privacy
advocates are up in arms about an Apple plan to scan iPhones and
an EU measure to allow companies to scan private messages, both looking
for “child sexual abuse material” (CSAM). As with many things of this
nature, there are concerns about the collateral damage that these efforts will
cause—not to mention the slippery slope that is being created.