
Apple's child safety protocol and how it endangers privacy

Updated: Sep 24, 2022


By Sakshi Shairwal and Padmalaya Kanungo


The recent announcement by the tech giant Apple that it will undertake new child safety measures, including scanning iPhone photos for child sexual abuse material (CSAM), has not gone down well with many privacy advocates. The move has drawn criticism all over the world, and an open letter with more than 4,000 signatures is making the rounds online, appealing to the company to re-evaluate the decision because of the serious privacy concerns it raises. The key questions, then, are: what exactly is the measure Apple has taken to tackle child abuse, and why are so many people protesting against it?


Apple plans to introduce new technologies and measures in iMessage, iCloud, Siri, and Search to prevent the sexual abuse of children. The first measure is a set of new communication tools that help parents play a more informed role in guiding their children through online communication. The Messages app will be designed to issue a warning whenever sensitive content appears, while keeping private communications unreadable to Apple. In practice, Apple will warn both children and parents whenever a sexually explicit image is sent or received. Any sexually explicit image a child receives will be blurred, and the child will be shown a warning along with helpful resources; similarly, when a child attempts to send such a photo, the child will again be warned before sending, and the parents will be notified if the photo is sent anyway. Apple has clarified, however, that this analysis is carried out by machine learning on the device itself, and that the company has no access to any of the messages on the phone.
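To make the flow concrete, the following is a minimal Swift sketch of the kind of on-device decision logic described above. Every name in it (`SensitivityClassifier`, `ScreeningAction`, and so on) is a hypothetical illustration rather than Apple's actual API; the one point it captures is that the classification runs entirely on the device and only the local handling of the message changes.

```swift
import Foundation

// Hypothetical sketch of the Messages screening flow described above.
// None of these types are Apple API; they only illustrate the design:
// the classifier runs on-device and Apple never sees the image.

enum ScreeningAction {
    case deliverNormally
    case blurAndWarnChild          // child sees a warning plus helpful resources
    case blurWarnAndNotifyParents  // parents are additionally notified
}

struct SensitivityClassifier {
    // Stand-in for an on-device ML model (e.g. a Core ML classifier).
    func isSexuallyExplicit(_ imageData: Data) -> Bool {
        false  // placeholder: a real model would score the image here
    }
}

func screenIncomingImage(_ imageData: Data,
                         parentalAlertsEnabled: Bool,
                         classifier: SensitivityClassifier) -> ScreeningAction {
    guard classifier.isSexuallyExplicit(imageData) else {
        return .deliverNormally
    }
    // The message itself is never uploaded; only local presentation changes.
    return parentalAlertsEnabled ? .blurWarnAndNotifyParents : .blurAndWarnChild
}
```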


Coming to the second measure, the iPhone maker plans to integrate a cryptographic matching system, called NeuralHash, into iPhones, iPads, and Macs to curb the spread of CSAM online. NeuralHash compares the photos on a user's device against databases of known CSAM images by checking hashes. Before images are stored in iCloud, the device scans them against a database provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations; this database is transformed into an indecipherable set of hashes stored securely on the user's phone. The result of the scan is not disclosed; instead, the device creates a cryptographic safety voucher that is uploaded to iCloud, and a technology named 'threshold secret sharing' ensures that the contents of these vouchers cannot be interpreted by the company unless the photos match known CSAM content. Multiple checks are built into the tool to avoid any chance of error before a report is transferred to NCMEC and then to law enforcement, and the company has assured users time and again that their privacy will not be compromised.
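As a rough illustration of the matching step, here is a hedged Swift sketch. A plain SHA-256 stands in for NeuralHash purely for readability (a real perceptual hash tolerates resizing and recompression, which SHA-256 does not), and the blinded database, private set intersection, and voucher construction Apple describes are all far more involved than what is shown here.

```swift
import Foundation
import CryptoKit

// Illustrative only: SHA-256 stands in for the NeuralHash perceptual hash,
// and the plain hash set stands in for Apple's blinded NCMEC-derived database.

struct SafetyVoucher {
    let photoID: UUID
    let encryptedPayload: Data  // opaque to the server until the match threshold is met
}

func perceptualHashStandIn(_ photo: Data) -> Data {
    Data(SHA256.hash(data: photo))  // assumption: placeholder, not NeuralHash
}

func voucherForUpload(photo: Data,
                      photoID: UUID,
                      knownCSAMHashes: Set<Data>,
                      voucherKey: SymmetricKey) throws -> SafetyVoucher? {
    let hash = perceptualHashStandIn(photo)
    guard knownCSAMHashes.contains(hash) else { return nil }
    // Simplification: in Apple's design every upload carries a voucher and
    // even the device cannot tell whether a given photo matched.
    let sealed = try AES.GCM.seal(photo, using: voucherKey)
    return SafetyVoucher(photoID: photoID, encryptedPayload: sealed.combined!)
}
```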


Cupertino stated, “The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.” Only once that threshold is exceeded does the cryptographic technology permit Apple to access the contents of the safety vouchers associated with the matching CSAM images. The company then carries out a manual review of every report, and if a match is confirmed, the report is sent to NCMEC. “If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.” In other words, the company would only have access to images with confirmed CSAM content, and not to any other images. Lastly, Apple plans to expand Siri and Search to provide parents and children with help and information whenever they encounter a problem or find themselves in unsafe situations, adding that “interventions will explain to users that interest in this topic is harmful and problematic and provide resources from partners to get help with this issue.”
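The 'threshold secret sharing' idea can be illustrated with textbook Shamir secret sharing: the key that unlocks the vouchers is split so that any `threshold` shares reconstruct it, while fewer reveal nothing. The toy Swift sketch below uses a tiny prime field for readability; Apple's actual construction differs, and these parameters are far too small for real use.

```swift
import Foundation

// Toy Shamir secret sharing over a small prime field. Demo only.
let p = 7919

func modPow(_ base: Int, _ exp: Int, _ m: Int) -> Int {
    var result = 1, b = base % m, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % m }
        b = b * b % m
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (valid because m is prime).
func modInverse(_ a: Int, _ m: Int) -> Int {
    modPow((a % m + m) % m, m - 2, m)
}

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    var coeffs = [secret]
    for _ in 1..<threshold { coeffs.append(Int.random(in: 0..<p)) }
    return (1...count).map { x in
        var y = 0
        for (i, c) in coeffs.enumerated() { y = (y + c * modPow(x, i, p)) % p }
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from `threshold` shares.
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where j != i {
            num = num * (p - sj.x) % p             // (0 - x_j) mod p
            den = den * ((si.x - sj.x + p) % p) % p
        }
        secret = (secret + si.y * num % p * modInverse(den, p)) % p
    }
    return secret
}

let shares = makeShares(secret: 1234, threshold: 3, count: 5)
print(reconstruct(from: Array(shares.prefix(3))))  // prints 1234
```

Roughly speaking, each safety voucher in Apple's scheme carries a piece of this kind of puzzle: only once enough matching vouchers accumulate can the server assemble the key and decrypt them for human review.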


Scanning data stored on servers is nothing new, as Google, Microsoft, Dropbox, and several other cloud services already do the same. What is different here is that Apple is going one step further: its scans will occur on the iPhone itself, not merely in iCloud. The move has drawn considerable criticism and raises questions about Apple's long-standing promise that “what happens on your iPhone stays on your iPhone.” Critics believe the issue is not what the technology might do today but what it could do tomorrow: a tool built to identify CSAM could be repurposed for other ends, thereby eroding privacy. Encryption experts such as Alec Muffett and Johns Hopkins cryptographer Matthew Green have raised concerns that Apple might be pressured by governments to look into other material on a user's device as well. “How such a feature might be repurposed in an illiberal state is fairly easy to visualize. Apple is performing proactive surveillance on client-purchased devices to defend its own interests, but in the name of child protection,” Muffett added. “What will China want them to block? It is already a moral earthquake.”


The Electronic Frontier Foundation was of the view that implementing such a system would be like building a backdoor into users' devices: Apple might face great pressure from the U.S. government to expand the system beyond identifying child abuse material, allowing the government to suppress any content it does not deem fit and thereby violating freedom of expression. The company therefore faces a difficult balancing act between eliminating the exploitation of children and staying true to its commitment to protecting its users' privacy.



For any information, kindly reach out to us at saksharlawassociates@gmail.com
