Apple’s New Child Sexual Abuse Material Detection System: Responsible Prevention or Dangerous Precedent?

Technology Policy Brief # 62 | By: Scout Burchill | August 26, 2021

Header photo taken from: Electronic Frontier Foundation




Photo taken from: The Washington Post

Policy Summary


Earlier this month, Apple announced three new features to protect children and crack down on child sexual abuse material (CSAM). While these new features, which will be rolled out on all iPhone and iPad devices in the coming months, may be well-intentioned, a number of security researchers and civil rights groups are raising the alarm about their potential to open the floodgates to increasing government and corporate surveillance.

The least controversial feature affects only Apple’s Search application and Siri. The company announced that it will introduce systems to help users report CSAM and will provide warnings and links to mental health resources for users who search for topics related to CSAM. The remaining two initiatives, however, are far more controversial. One adds new parental control features to Apple’s iMessage application. If parents with a family account opt in, on-device image scanning will be activated on their child’s device to identify and obscure any sexually explicit images shared via iMessage. If the child still wants to view or send such material, they will have to confirm the choice, and if a child 12 or under does so, a warning alert will be sent to their parents.

The final and most controversial new feature will make use of similar technology, using people’s personal devices to scan their iCloud Photos pictures for CSAM. If CSAM is identified, Apple will report it to its moderators, who will then review the images and notify authorities if necessary. Apple claims that disabling iCloud Photos completely deactivates the local scanning system on a user’s device.

The main reason so many security researchers and civil liberties advocates are raising red flags is that the system Apple has unveiled scans images locally on users’ iPhones or iPads. This is entirely different from typical CSAM scanning systems, which run remotely and only check files uploaded to and stored on a company’s servers. In effect, they argue, people’s personal phones are being equipped with the technology to surveil and monitor them, and once that technology is installed, there is no limit to what it can be used for. In short, the road toward unchecked surveillance is paved with good intentions.

Policy Analysis

Before unpacking the possible downstream effects of this type of technology, let’s take a brief look at how it actually works. Apple’s CSAM scan system uses a tool called NeuralHash, which breaks each picture into a string of numbers corresponding to the visual information and characteristics of the image.

Think of it as a program that creates a numerical map of an image. On its own, this “map” is incomprehensible and does not provide enough information to reconstruct the image, but it does allow the unique numerical hash, as the string or “map” of numbers is called, to be compared against a registry of hashes of known CSAM compiled by the National Center for Missing and Exploited Children (NCMEC). If a match is found, a “safety voucher” is created on the user’s device and uploaded to iCloud Photos along with the image.
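To make this concrete, here is a minimal sketch of the hash-and-compare idea in Python. It is emphatically not NeuralHash: it uses a simple “average hash” built with the Pillow imaging library and a made-up registry value, purely to illustrate how an image can be reduced to a numerical fingerprint and checked against a list of known hashes. Apple’s published design layers cryptography on top of this basic idea (the device compares against a blinded database and never learns whether a match occurred), which the sketch makes no attempt to reproduce.

```python
# Illustrative sketch only, NOT Apple's NeuralHash. A crude "average
# hash" reduces an image to a 64-bit fingerprint, which is then
# compared against a (hypothetical) registry of known hashes.
from PIL import Image  # assumes the Pillow library is installed

def average_hash(path: str, size: int = 8) -> int:
    """Shrink and grayscale the image, then set one bit per pixel
    depending on whether that pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical registry value; in reality NCMEC supplies the hashes and
# the device only ever sees a blinded version of the database.
KNOWN_HASHES = {0x8F3C_0000_0000_00FF}

def is_match(path: str) -> bool:
    return average_hash(path) in KNOWN_HASHES
```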

Only after a certain number of these “safety vouchers” accumulate are the flagged images decrypted and sent to human moderators, who then review them for CSAM. Up until that point, the entire process is encrypted, meaning the data is indecipherable, in order to protect the security of users’ personal information.
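The threshold step can be sketched in the same spirit. The toy class below models only the counting logic: vouchers accumulate as opaque blobs and nothing is surfaced for review until a threshold is crossed. Apple’s actual design enforces this with threshold secret sharing, so the vouchers are cryptographically unreadable below the threshold (reportedly set around 30 matches); the sketch does not attempt that.

```python
# Toy model of the threshold step only, NOT Apple's cryptography.
# In the real system, threshold secret sharing keeps vouchers
# undecryptable until enough matches exist; here we just count them.
MATCH_THRESHOLD = 30  # Apple reportedly set the real threshold around 30

class AccountVouchers:
    """Tracks the opaque 'safety vouchers' uploaded for one account."""

    def __init__(self) -> None:
        self.vouchers: list[bytes] = []  # opaque blobs, unreadable below threshold

    def upload(self, voucher: bytes) -> None:
        self.vouchers.append(voucher)

    def ready_for_human_review(self) -> bool:
        # Only once the count crosses the threshold would the flagged
        # images be decrypted and passed to moderators.
        return len(self.vouchers) >= MATCH_THRESHOLD

# Example: 29 matches stay sealed; the 30th tips the account into review.
acct = AccountVouchers()
for _ in range(29):
    acct.upload(b"opaque-voucher")
print(acct.ready_for_human_review())  # False
acct.upload(b"opaque-voucher")
print(acct.ready_for_human_review())  # True
```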

Photo taken from: WSWS

Photo taken from: 9to5mac.com

So why would Apple choose to create this local image scanning system that operates directly on users’ personal devices instead of a remote one that scans files stored on company servers, as Facebook, Reddit, and other tech companies do?

The answer to this question is currently the source of heated disagreement. On one hand, by designing its CSAM system to scan images locally, Apple can retain total control over the data. Scanning and analyzing data on cloud servers can open the door for third parties to access it; iCloud, for example, is powered by Amazon Web Services and Microsoft Azure. By building CSAM scanning directly into its devices, Apple argues, the entire process will be more secure.

However, accepting this argument also requires placing a lot of trust in Apple, the company. And this is exactly what security researchers and privacy advocates are concerned about. From their perspective, these new features are a Trojan horse. Once the local scanning system is built into Apple devices, a permanent backdoor will be wrenched open, allowing for surveillance capabilities that far surpass the system’s original intent.

According to security experts, once this scanning system is integrated into Apple’s devices, it could easily be adapted to detect any number of things governments around the world would foam at the mouth for. The slow creep from CSAM to videos of violent terrorism might be rationalized as necessary interventions, but what about the tracking and detection of LGBTQ+ people or the textual analysis of documents and protest signs? These hypotheticals are not far-fetched possibilities, particularly in a number of repressive regimes around the world. As the past two decades of the War on Terror’s securitization politics prove, compromises on civil liberties made in times of exceptional circumstances or states of emergency are extremely hard to reverse and tend to further incentivize governmental overreach. In other words, it’s a slippery slope.

Even Apple’s own reassurances that our personal data is safe in its hands are far from reassuring, given the company’s recent history. There are still many questions surrounding Apple’s and other tech companies’ willingness to hand over the personal data of reporters, Democratic lawmakers, and sitting White House officials to Trump Administration lackeys investigating the steady stream of leaks seeping out of the Trump White House. In China, Apple has already made many disturbing concessions to the government, even as it now urges Americans not to worry. Bending to the Chinese government’s demands, Apple now stores all of its Chinese users’ data on state-owned servers in China, and the company has even abandoned the encryption technology designed to keep personal data secure, handing the digital keys over to government data centers. Furthermore, it actively aids the Chinese government in censoring the App Store.

Photo taken from: OLTNEWS

While there is no disputing the importance of protecting children from sexual abuse, Apple’s new features reflect an incredible about-face for a brand built on prioritizing and protecting user privacy. Not too long ago, in 2016, Apple was widely celebrated for refusing to help the FBI unlock the iPhone of one of the terrorists behind the mass shooting and attempted bombing in San Bernardino, California. In a letter to the public, Apple argued:

“…the order would set a legal precedent that would expand the powers of the government and we simply don’t know where that would lead us. Should the government be allowed to order us to create other capabilities for surveillance purposes, such as recording conversations or location tracking? This would set a very dangerous precedent.”

Now, only five years later, Apple’s newest features seem to represent a stark reversal of priorities.

Leaked internal memos reveal an air of scorn and dismissiveness toward security researchers and privacy advocates, characterizing their outspoken concerns as the “screeching voices of the minority.” But according to recent reporting, even some of Apple’s own employees are pushing back on the idea. To make matters worse, within hours of Apple’s NeuralHash algorithm becoming available for the public to test, a “hash collision” was discovered, meaning two entirely different images were found to produce the same hash.

Hash collisions have the potential to render hash-based detection systems unreliable and insecure. In response, Apple downplayed the significance of the finding. Apple’s attempts to brush aside criticism and minimize legitimate concerns betray a real arrogance and an abandonment of principles that the Apple of yesteryear at least paid lip service to. The reality is that iPhones have always been insatiable collectors of personal information, but in recent months the mask has finally started to slip, and the illusion of privacy and security is fading fast.
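To see what a “hash collision” means in practice, it helps to look at a deliberately weak toy hash. NeuralHash is far more sophisticated than the function below, and it operates on images rather than byte strings, but the reported collision is the same phenomenon: two distinct inputs producing an identical hash, which is precisely what a matching system must not confuse.

```python
# A deliberately weak toy hash to make "collision" concrete: it just
# sums the byte values, so rearranging the same characters yields a
# different input with an identical hash value.
def weak_hash(data: bytes) -> int:
    return sum(data) % 2**32

a = b"innocent vacation photo"
b = b"photo vacation innocent"   # different bytes, same byte sum

assert a != b
assert weak_hash(a) == weak_hash(b)   # a collision
print(hex(weak_hash(a)), hex(weak_hash(b)))
```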

Plenty of people online have been expressing fear and outrage at the prospect that Apple will soon be scanning their photo libraries, perhaps flagging pictures of a beloved son or daughter taking a bath or leaping through backyard sprinklers as CSAM. This outrage is understandable, even though many would still argue it is an acceptable price to pay to target those who sexually abuse children.

Photo taken from: Forbes

However, I want to conclude by briefly bringing attention to a crucial node in this system, and in every other content moderation system, that is almost always overlooked: the team of human moderators who review CSAM and other disturbing content daily.

Through their labor, societies protect themselves from the darkest corners of human existence. These workers do not receive glamorous tech salaries and yet they are subjected to unimaginable emotional and psychological trauma.

Their stories are rarely, if ever, told, shrouded in the secrecy of non-disclosure agreements and the reality of a contracted and fractured workforce. As we think about how to create more secure and ethical systems of content moderation that protect the privacy and civil liberties of users, let’s not forget about them, too.

Engagement Resources

Center for Democracy and Technology
https://cdt.org/

Electronic Frontier Foundation
https://www.eff.org/

Take control of your online privacy (Global Privacy Control)
https://globalprivacycontrol.org/#about

Sources

Apple’s Announcement
https://www.apple.com/child-safety/

Apple’s Letter to the Public from 2016
https://www.apple.com/customer-letter/answers/

Open Letter Against Apple’s New Policies
https://appleprivacyletter.com/

The Verge’s Explainer
https://www.theverge.com/2021/8/10/22613225/apple-csam-scanning-messages-child-safety-features-privacy-controversy-explained

The Secret Lives of Content Moderators
https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

Apple’s Concessions in China
https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html

Apple Hands Over Data to the Trump Administration
https://www.nytimes.com/2021/06/13/us/politics/justice-department-apple-donald-mcgahn.html

Apple Employees Resist
https://www.reuters.com/technology/exclusive-apples-child-protection-features-spark-concern-within-its-own-ranks-2021-08-12/

Problems Detected in Apple’s NeuralHash System
https://techcrunch.com/2021/08/18/apples-csam-detection-tech-is-under-fire-again/

The “Screeching Voices of the Minority” Internal Memo
https://9to5mac.com/2021/08/06/apple-internal-memo-icloud-photo-scanning-concerns/
