
Apple Reveals New Software Designed to Scan iPhone Users' Photos

Privacy watchdog groups sounded an alarm on Thursday when Apple revealed that the company will be installing software on users' iPhones that can scan for images of child sex abuse. Observers of the tech giant warned that the move creates a backdoor into the private lives of iPhone owners, that it effectively opens "Pandora's box," and that governments will likely exploit it eventually.

“Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind,” Apple’s announcement said.

According to The Financial Times, “Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.”

The paper also reported that the automated system would proactively alert human reviewers if it believes illegal imagery has been detected. Those reviewers would then contact law enforcement if the material could be verified.

Apple's new system is called "neuralMatch," and it will be rolled out only in the United States. Apple noted in a blog post that the software will "evolve and expand over time." The software is planned for inclusion in iOS 15, which is set to be released next month.

Apple stated that the new software provides “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.”

But even with these claims from Apple, academics and privacy watchdogs are deeply concerned about what the new program will mean over the long term.

Ross Anderson, a professor of security engineering at the University of Cambridge, said: "It is an absolutely appalling idea because it is going to lead to distributed bulk surveillance of . . . our phones and laptops."

The New York Times gave a detailed description of how the new technology will work: “The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.”

The New York Times continued, “Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked.”
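To make the process the Times describes more concrete, here is a minimal sketch of threshold-based hash matching. It is an illustration only: plain SHA-256 stands in for Apple's perceptual NeuralHash, and the function names, directory path, and threshold value are assumptions rather than details from Apple's actual on-device system.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: plain SHA-256 stands in for Apple's perceptual
// NeuralHash, and the names and threshold below are assumptions.

/// Hashes that a known-CSAM database (e.g. from NCMEC) would supply.
/// Left empty here for illustration.
let knownHashes: Set<String> = []

/// Assumed reporting threshold; the article does not state Apple's number.
let matchThreshold = 10

/// Returns a hex digest for one photo file.
func photoHash(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Counts how many photos in a directory match the known-hash set.
func countMatches(in directory: URL) throws -> Int {
    let photos = try FileManager.default.contentsOfDirectory(
        at: directory, includingPropertiesForKeys: nil)
    return try photos
        .filter { $0.pathExtension.lowercased() == "jpg" }
        .filter { knownHashes.contains(try photoHash(at: $0)) }
        .count
}

// Only when the match count crosses the threshold would anything be
// surfaced for human review, per the process the Times describes.
let library = URL(fileURLWithPath: "Photos")
let matches = (try? countMatches(in: library)) ?? 0
print(matches >= matchThreshold
      ? "Threshold reached: matched photos would be queued for human review."
      : "Below threshold: nothing is reported.")
```

The key design point the reporting emphasizes is the threshold: no single match triggers review, and photos that do not match the known-hash database are never examined.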

Apple believes that this approach means that people without child sexual abuse material on their phones would not have their photos seen by Apple or the authorities.

Erik Neuenschwander, who is Apple’s privacy chief, said, “If you’re storing a collection of [child sexual abuse material], yes, this is bad for you. But for the rest of you, this is no different.”

In contrast, Edward Snowden had this to say in a tweet: “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”

Despite Apple's assurances, security experts and privacy advocates criticized the plan. Matthew Green, a cryptography professor at Johns Hopkins University, tweeted: "With its plan to deploy software that performs on-device scanning and share selected results with authorities, Apple is coming dangerously close to acting as a tool for government surveillance."
