


Apple has plans to detect images of child sexual abuse on some of its devices
Yuichiro Chino/Getty Images
Apple’s soon-to-be-launched algorithm to detect images of child sexual abuse on iPhones and iPads may incorrectly flag people as being in possession of illegal images, warn researchers.
NeuralHash will be launched in the US with an update to iOS and iPadOS later this year. The tool will compare a hash – a string of characters derived from an image by an algorithm – of every image uploaded to the cloud with a database of hashes for known images …
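Apple has not published NeuralHash's internals in full, so the following is only a rough illustration of the general approach the article describes – hashing an image and checking the result against a database of known hashes. The average_hash function, the flag_if_known helper and the KNOWN_HASHES set below are hypothetical stand-ins written for this sketch, not Apple's code; NeuralHash itself derives its hashes from a neural network so that visually similar images produce the same hash.

# Minimal sketch of hash-based image matching. This is NOT Apple's
# NeuralHash: it uses a simple "average hash" over a grayscale pixel
# grid to show how an on-device hash can be checked against a
# database of hashes for known images.

from typing import List

def average_hash(pixels: List[List[int]]) -> int:
    """Hash a grayscale image (2D list of 0-255 values): each bit is
    1 if the pixel is brighter than the image's mean, else 0.
    Visually similar images tend to produce the same hash."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database seeded with the hash of one "known" image.
KNOWN_HASHES = {average_hash([[200, 10], [220, 30]])}

def flag_if_known(pixels: List[List[int]]) -> bool:
    """Return True if the image's hash matches a known hash."""
    return average_hash(pixels) in KNOWN_HASHES

if __name__ == "__main__":
    # A slightly brightened copy of the known image: every pixel
    # still sits on the same side of the mean, so the hash matches.
    similar = [[210, 20], [230, 40]]
    print(flag_if_known(similar))  # True

Because perceptual hashes are deliberately designed so that near-identical images hash alike, two unrelated images can in principle produce the same hash – a collision – which is the source of the incorrect flags the researchers warn about.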