Apple Playing with a Seesaw: Apple's effort to curb the spread of Child Sexual Abuse Material (CSAM) "raises" child safety while seemingly "lowering" user privacy.

By: Tianyue Liao

The formulas behind Apple's Private Set Intersection ("PSI") system might seem confusing to most of us.  Many of them will soon be incorporated into Apple's new operating systems and installed on users' devices with the upcoming iOS 15 and iPadOS 15.  So what is Apple doing with this PSI system, and how will it affect us?

18 U.S.C. § 2258A lays out how companies must handle Child Sexual Abuse Material ("CSAM").  Earlier this month, Apple announced that it would add three new on-device features to its operating systems, all designed to fight child sexual exploitation and to protect children by limiting the spread of CSAM.  The first feature introduces iCloud Photos scanning; the second has the Messages app scan incoming and outgoing "sexually explicit" photos; and the third expands guidance in Siri and Search when users perform searches related to CSAM.

The most remarkable, yet controversial, feature is the updated CSAM Detection technology.  In short, CSAM Detection enables Apple to identify and report iCloud users who store known CSAM in their iCloud Photos accounts.  While the technology sounds simple, questions linger: how would Apple identify these images, to whom would it report, and what are the consequences?

To identify CSAM photos, Apple uses a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children ("NCMEC") and other child-safety organizations.  Before an image is stored in iCloud Photos, an on-device matching process compares that image against the database of known CSAM hashes (Apple converts the CSAM database into an unreadable set of hashes, and NeuralHash analyzes each image and converts it to a number specific to that image).  This matching process is powered by a cryptographic technology called private set intersection (PSI), which allows Apple to learn whether a user's image hash matches a known CSAM image hash without learning anything about non-matching images.  Once a match is found, the device creates a cryptographic safety voucher that conceals the match result, and the image is then uploaded to iCloud along with the voucher.  Using another technology called threshold secret sharing, the system ensures that Apple cannot examine the contents of the safety vouchers unless the account exceeds a threshold of known CSAM matches (reportedly about 30 images).  Only when the number of matching images crosses that threshold does the cryptographic system flag the account and allow Apple to view and inspect the matching content.  Lastly, there is a manual review process in which Apple scrutinizes each report to confirm the match, disables the user's account, and sends a report to NCMEC.  If a user believes their account was erroneously flagged, they can file an appeal to have the account reinstated.
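To make the sequence above more concrete, here is a minimal Swift sketch of the on-device control flow it describes: hash the image, check it against a database of known hashes, attach a voucher to every upload, and only treat the account as reviewable once a threshold of matches is exceeded.  This is purely illustrative: the type names are invented for this post, a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and in the real system PSI and threshold secret sharing keep both the match result and the running count cryptographically hidden from Apple (and from any single component) until the threshold is crossed.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the matching flow described above.
// All names are illustrative assumptions, not Apple's actual APIs.

/// Stand-in for a NeuralHash value (assumed here to be raw bytes).
typealias ImageHash = Data

/// A database of known CSAM hashes (modeled as a simple Set for the sketch;
/// the real database is blinded so devices cannot read it).
struct KnownHashDatabase {
    private let hashes: Set<ImageHash>
    init(hashes: Set<ImageHash>) { self.hashes = hashes }
    func contains(_ hash: ImageHash) -> Bool { hashes.contains(hash) }
}

/// Placeholder "safety voucher": in the real system the match result inside
/// is cryptographically concealed until the threshold is exceeded.
struct SafetyVoucher {
    let encryptedPayload: Data
}

final class UploadPipeline {
    private let database: KnownHashDatabase
    private let matchThreshold = 30        // illustrative threshold of ~30 images
    private var matchingVoucherCount = 0   // the real system hides this count too

    init(database: KnownHashDatabase) { self.database = database }

    /// Hash the image (stand-in for NeuralHash, which is perceptual, unlike SHA-256)
    /// and produce a voucher that accompanies the upload to iCloud.
    func prepareForUpload(imageData: Data) -> SafetyVoucher {
        let hash = Data(SHA256.hash(data: imageData))
        if database.contains(hash) {
            matchingVoucherCount += 1
        }
        // Every upload gets a voucher, so the server cannot tell which ones matched.
        return SafetyVoucher(encryptedPayload: hash)
    }

    /// Only once the threshold is exceeded could the vouchers be opened for review.
    var thresholdExceeded: Bool { matchingVoucherCount > matchThreshold }
}
```

The point of the design, as Apple describes it, is that no single step in this flow reveals anything about a specific image below the threshold; the sketch only shows where each piece sits in the pipeline.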

Apple plans to add more sprinkles to its child-safety cake, the first being in the Messages app.  A new feature warns children and their parents when sexually explicit photos are received or sent.  To be clear, this applies only to child accounts set up in Family Sharing.  Once the parent or guardian account opts in, an on-device machine learning classifier in the Messages app blurs the image and warns the child; for children under 13, the warning also indicates that if the child chooses to view or send the image anyway, Apple will send a notification to their parents.  The other sprinkle is Siri and Search, where Apple provides resources to help children and parents stay safe online and get help in unsafe situations.  For instance, a user who asks Siri how to report CSAM is directed to resources for filing a report.  Siri and Search also intervene when users perform searches related to CSAM, explaining that the topic is harmful and pointing to resources for getting help.  These updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
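The Messages feature boils down to a small decision rule: do nothing unless the parent has opted in and the on-device classifier flags the photo, and notify parents only for children under 13 who proceed anyway.  The sketch below captures that rule under stated assumptions; the protocol and type names (SensitivityClassifier, FamilyAccount, and so on) are hypothetical, and the real classifier is an on-device machine learning model rather than anything shown here.

```swift
import Foundation

// Hypothetical sketch of the opt-in warning flow described above.
// Names are illustrative assumptions, not Apple's actual APIs.

/// Stand-in for the on-device machine learning classifier.
protocol SensitivityClassifier {
    func isSexuallyExplicit(_ imageData: Data) -> Bool
}

struct FamilyAccount {
    let childAge: Int
    let parentOptedIn: Bool   // the feature is off unless the parent/guardian opts in
}

enum MessagePhotoAction {
    case deliverNormally
    case blurAndWarnChild(notifyParentsIfViewed: Bool)
}

/// Decide what happens when a child account receives (or is about to send) a photo.
func handlePhoto(_ imageData: Data,
                 account: FamilyAccount,
                 classifier: SensitivityClassifier) -> MessagePhotoAction {
    guard account.parentOptedIn,
          classifier.isSexuallyExplicit(imageData) else {
        return .deliverNormally
    }
    // Parental notification applies only to children under 13 who choose
    // to view or send the flagged image anyway.
    return .blurAndWarnChild(notifyParentsIfViewed: account.childAge < 13)
}
```

Framed this way, the entire feature hinges on who controls the opt-in switch and who receives the notification, which is exactly where the criticism discussed below comes in.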

The CSAM detection for iCloud Photos remains the most controversial component of Apple's new technology.  Although Facebook and Google have scanned images uploaded to their platforms for years, it is a completely different story for Apple.  Understandably, some privacy and security experts have criticized the new technology as dangerous and as a threat to privacy, security, democracy, and freedom.  In 2016, Apple went to court rather than give the FBI access to an iPhone that belonged to the shooter in the San Bernardino attack; to critics, these new features mark the first time Apple has run counter to its own privacy-first ethos.

Apple has already published several documents outlining privacy and security assurances, including that Apple does not learn information about images that do not match the known CSAM database, and that users cannot obtain the CSAM database used for matching, which protects its contents from malicious use.  The most significant cause for concern is that the new feature could be abused or misused in the future.  Some experts claim the move could be exploited by oppressive governments seeking other material for censorship or arrests, opening a backdoor to more troubling surveillance and censorship.  Moreover, the change might further weaken Apple's ability to stand up to foreign governments: under pressure from the Chinese government, Apple has already moved Chinese users' iCloud data onto servers run by a state-owned enterprise and provided the encryption keys necessary to access it.

Regarding the new Messages features, some research shows that parents can be a bigger danger to their children than strangers.  Not all homes are safe; statistically, some homes are the most likely sites of sexual abuse.  One study of 150 adult survivors who indicated they had appeared in sexual abuse material as children found that 42% identified their biological or adoptive/stepfather as the primary offender, and 28% identified their biological mother, mostly as a co-offender.  More than two-thirds of such images appear to have been made at home.  Giving parents the power to opt a child's account into the feature and receive information about the child's Messages activity, without first giving the child an opportunity to grant or deny access, can therefore lead to disastrous consequences.  This is even more true for LGBTQ+ children, many of whom fear expressing their LGBTQ+ identity to family members.  A child quietly exploring their sexual orientation online could suddenly trigger a notification to their parents, who would then discover that identity; this could cause needless conflict and, in some cases, lead to the child being forced out of the home.  In a broader sense, children might not be the only victims.  Any individual in an abusive household, regardless of age, could be coerced into using a "child" account, and once it is set up, the feature could allow an abusive family member to gain more control over the victim.  This feature might therefore open the door to more frequent stalking, controlling, and spying by those in power.

Looking forward, "based on feedback from customers, advocacy groups, researchers," and security experts, Apple will "take additional time over the coming months to collect input and make improvements" before finally releasing these significant child safety features.  Apple has repeatedly claimed that this set of techniques allows it to provide information to NCMEC and law enforcement regarding the proliferation of known CSAM while also providing significant privacy benefits.  With those safeguards in place, and with statutory provisions governing companies' duty to report CSAM, it may be feasible to simply leave the rest to Apple.  Overall, being "out of the blue" is okay sometimes, so long as Apple's new expanded protections for children keep their promises.

Student Bio: Tianyue Liao is currently a second-year law student at Suffolk University Law School, focusing in Family Law. She is a staffer on the Journal of High Technology Law. Prior to law school, Tianyue received a Bachelor of Arts in International Relations and French from Mount Holyoke College.

Disclaimer: The views expressed in this blog are the views of the author alone and do not represent the views of JHTL or Suffolk University Law School.
