Sep 11, 2021
In this Safeguarding Podcast with Hany Farid, Professor at the University of California, Berkeley: PhotoDNA, what it is and how it works; what PhotoDNA doesn't do; what hashes are and whether they work in an end-to-end encrypted world; whether Apple's NeuralHash child safety proposal is the incipient slippery slope many claim it is; Apple's Secret Sharing Threshold and why that's a problem; and "WhatsApp's hypocrisy".
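PhotoDNA itself is proprietary, so the toy "average hash" below is not PhotoDNA, only a hedged sketch of the general idea behind perceptual hashing that the episode discusses: visually similar images produce hashes that differ in few bits, so an image can be matched against a database of known hashes by Hamming distance rather than exact equality.

```python
# Toy perceptual hash ("average hash") to illustrate the concept only.
# Real systems such as PhotoDNA use far more robust, proprietary features.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int:
    one bit per pixel, set when the pixel is at or above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# A toy "image" and a slightly brightened copy of it.
img = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img_bright = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(img_bright)
# Small perturbations flip few (here, zero) bits; a match is declared
# when the Hamming distance falls below a chosen threshold.
print(hamming(h1, h2))
```

This is why, as discussed in the episode, such hashing can flag known imagery without a human ever viewing the content, and also why it cannot by itself detect new, previously unhashed material.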
Links to other relevant content:
The Good, the Bad and the Ugly of Apple's Curate's Egg:
CSI Apple: The Omnibus Edition
You've Already Agreed to Apple's CSAM Detection but you just didn't know it:
Safeguarding Podcast with Glen Pounder CCO Child Rescue Coalition:
Apple's notice on Expanded Protections for Children:
WhatsApp's website on on-device scanning for contraband content:
WhatsApp automatically performs checks to determine if a link is suspicious. To protect your privacy, these checks take place entirely on your device, and because of end-to-end encryption, WhatsApp can’t see the content of your messages.
WhatsApp’s website on CSAM detection:
Our detection methods include the use of advanced automated technology, including photo- and video-matching technology, to proactively scan unencrypted information such as profile and group photos and user reports for known CEI. We have additional technology to detect new, unknown CEI within this unencrypted information. We also use machine learning classifiers to both scan text surfaces, such as user profiles and group descriptions, and evaluate group information and behavior for suspected CEI sharing.
Using these techniques, WhatsApp bans more than 300,000 accounts per month for suspected CEI sharing.
Hany Farid's Newsweek piece on WhatsApp's Hypocrisy: