Apple Will Launch A New Tool To Fight Against Child Predators, But It May Cost Consumers

Apple is on a mission to combat child pornography with a new tool. Gizmodo reports that although the tech company’s goal of cracking down on child pornography seems like a great idea, the tool could very well be misused. It will use a “neural matching function” called NeuralHash, which can determine whether photos on a person’s device match known child sexual abuse material (CSAM) fingerprints. Security experts like Matthew Green say the tool is a disaster waiting to happen. “I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” Green said via Twitter. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.” Green, an associate professor at the Johns Hopkins Information Security Institute, has studied and written about privacy methods used by Apple over the years. He has also previously...

Shanique Yates

Aug 6, 2021
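
To make the perceptual-hash matching Green describes more concrete, here is a minimal Python sketch. It is an illustration only: the hash values are stand-in integers (real NeuralHash output is produced by a neural network from image content), and the hash width, match distance, and reporting threshold below are assumed placeholders, not Apple’s actual parameters.

```python
# Illustrative sketch of client-side perceptual-hash matching as described
# in the article. All values below are assumptions for demonstration, not
# Apple's actual NeuralHash parameters.

MATCH_DISTANCE = 4     # assumed max Hamming distance to count as a "match"
REPORT_THRESHOLD = 30  # assumed number of matches before a report triggers

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two perceptual hashes."""
    return bin(a ^ b).count("1")

def count_matches(device_hashes: list[int], known_fingerprints: list[int]) -> int:
    """Count device photos whose hash is close to any known CSAM fingerprint."""
    return sum(
        1
        for h in device_hashes
        if any(hamming_distance(h, k) <= MATCH_DISTANCE for k in known_fingerprints)
    )

# Toy data: integers standing in for real perceptual hashes.
known = [0x1F2E3D4C5B6A7988, 0xABCDEF0123456789]
device = [0x1F2E3D4C5B6A7989,  # one bit away from a known fingerprint
          0x0F0F0F0F0F0F0F0F]  # far from both fingerprints

matches = count_matches(device, known)
if matches >= REPORT_THRESHOLD:
    print("Threshold exceeded; photos would be flagged to the server.")
else:
    print(f"{matches} match(es); below the reporting threshold.")
```

The counting step mirrors Green’s point that reporting is triggered only “if too many appear”; because perceptual matching is approximate (the distance tolerance above), such a design trades some false-match risk for robustness to small image edits.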