(AMERICAN THINKER) – Recently, Apple announced that it will deploy a new image-matching algorithm, NeuralHash (reported in early coverage as "NeuralMatch"), on its devices, along with a separate Messages feature that scans photos sent to or from children's accounts for nudity. NeuralHash itself does not judge image content directly; it matches each photo against a database of digital fingerprints (hashes) of known child sexual abuse material. If enough matching images are uploaded to an iCloud account to cross a set threshold, Apple can decrypt and inspect the flagged photos and report the user, through the National Center for Missing and Exploited Children, to law enforcement (a sketch of the threshold logic follows below). Police could then investigate or prosecute the user for possession of child pornography, for child sex abuse, or for other offenses. But, well-intentioned as the motive may sound,...
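Apple has published only a high-level description of this pipeline. NeuralHash is a proprietary neural perceptual hash, and the real protocol wraps the comparison in private set intersection and threshold secret sharing, so that nothing about an account is learnable below the threshold. The minimal sketch below illustrates only the threshold idea; the exact-match lookup, the function names, and the sample hashes are hypothetical stand-ins, not Apple's code.

```python
# Illustrative sketch of threshold-based hash matching, loosely modeled
# on Apple's published description. The plain set lookup here stands in
# for NeuralHash plus private set intersection; all names are invented.

MATCH_THRESHOLD = 30  # Apple's stated initial threshold for human review

def count_matches(uploaded_hashes: list[bytes],
                  known_csam_hashes: set[bytes]) -> int:
    """Count how many uploaded image hashes appear in the known database."""
    return sum(1 for h in uploaded_hashes if h in known_csam_hashes)

def should_escalate(uploaded_hashes: list[bytes],
                    known_csam_hashes: set[bytes]) -> bool:
    """Only past the threshold do the encrypted 'safety vouchers'
    become decryptable for human review; a single match exposes nothing."""
    return count_matches(uploaded_hashes, known_csam_hashes) >= MATCH_THRESHOLD

# Hypothetical usage: 31 matching uploads cross the threshold of 30.
known = {b"\x01\x02", b"\x03\x04"}
uploads = [b"\x01\x02"] * 31
print(should_escalate(uploads, known))  # True
```

The threshold is the system's main safeguard against false positives: by Apple's account, one stray perceptual-hash collision cannot by itself expose an account for review.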