Is Apple Scanning Your iPhone Photos? Here’s What You Need to Know

Updated: May 16, 2022

Experts explain Apple's controversial new plan to scan your iPhone photos—and what it could mean for your privacy.

Apple has touted its top-notch user privacy standards for years, but its new plan to scan iPhone photos for child sexual abuse material (CSAM) is raising alarms in the tech world. While everyone agrees on the importance of cracking down on sexually explicit content involving children, privacy experts warn that Apple scanning photos could open the door to broader surveillance.

“There are a lot of unintended consequences that this new CSAM policy could lead to,” says Karim Hijazi, CEO of cybersecurity company Prevailion. “This is just one of many ways that our privacy is being eroded on a daily basis.”

Here’s what you need to know about Apple’s new photo-scanning system on iPhones, including how it works, why experts are concerned, and how to protect your privacy. For more iPhone security, turn on these iPhone privacy settings and learn how to hide messages on an iPhone, how to turn off location tracking on your iPhone, how to tell if your phone has been hacked, and what your smartphone knows about you.

What is Apple doing?

Apple will use its CSAM detection technology for three new features in the iOS 15 update, according to the company’s official statement. The first is a parental control feature that, when enabled, scans photos in the Messages app on children’s phones and sends notifications if it detects explicit content. The second scans photos uploaded to iCloud and flags and reports any that contain known child sexual abuse material. The third warns users when they use Siri or Search to look up topics related to child sexual abuse.

Even though the iPhone is considered one of the most secure phones, the second feature on that list especially concerns privacy experts like Hijazi.

How will this new feature work?

Apple’s CSAM detection technology uses a hashing algorithm to convert each image uploaded to your iCloud account into a digital fingerprint, or hash, and compares it against the hashes of known child sexual abuse images stored in databases maintained by child safety organizations, including the National Center for Missing and Exploited Children (NCMEC). Upload 30 or more images that match the databases, and a human reviewer will examine each one to determine whether it really is CSAM, according to Alex Hamerstone, a director with security-consulting firm TrustedSec. If it is confirmed to be CSAM, Apple will shut down your account and report you to NCMEC and law enforcement.
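To make that process concrete, here is a minimal sketch of threshold-based hash matching. It is not Apple’s actual NeuralHash or private-set-intersection system; the hash values and names below are hypothetical placeholders, and the only number taken from the reporting above is the 30-image review threshold.

```swift
// Minimal sketch of threshold-based hash matching. This is NOT Apple's
// actual pipeline; hash values and names are hypothetical placeholders.

// Hypothetical database of hashes ("digital fingerprints") of known CSAM images.
let knownHashes: Set<String> = ["a1b2c3d4", "e5f6a7b8"]

// The reported threshold: human review kicks in only after this many
// of an account's uploads match the database.
let reviewThreshold = 30

// Count how many uploaded image hashes appear in the known-hash set.
func countMatches(_ uploadedHashes: [String]) -> Int {
    uploadedHashes.filter { knownHashes.contains($0) }.count
}

let uploads = ["a1b2c3d4", "00000000", "e5f6a7b8"]
let matches = countMatches(uploads)

if matches >= reviewThreshold {
    print("\(matches) matches: account queued for human review")
} else {
    print("\(matches) matches: below the threshold, no action taken")
}
```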

Parents may worry that the technology will flag them for innocent photos of their children, such as a silly, suds-covered kid at bath time. But Apple says the system isn’t looking for just any child photos; it’s looking only for those that match known, validated child sexual abuse images from at least two child safety organization databases.

Why is Apple scanning photos?

Child sexual abuse material is spreading faster than ever, with millions of images or videos online and over 19,100 victims identified by law enforcement. To help curb the spread of CSAM, Apple is increasing its efforts to detect and flag sexually explicit content involving children on its devices. In a statement, Apple said that its new features aim “to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.” But experts believe this new photo-scanning technology could be one of the top smartphone security threats.

Why are people concerned?

Unfortunately, Apple’s new CSAM detection system might backfire, experts say. Many worry that Apple scanning photos could lead to abuse by bad actors like hackers, governments, and even Apple itself. “Apple has essentially created a bypass to its end-to-end encryption that it—or anyone it authorizes—can use to gain access to a person’s device without their permission and snoop on their content,” Hijazi says.

End-to-end encryption is a security feature that ensures only the sender and the intended recipient can read a message; not even the company transmitting it can see the content. But experts say Apple’s new CSAM technology creates an opening that others, such as hackers or rogue Apple employees, could exploit to access and spy on users’ photos.
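For readers curious what “end-to-end” means in practice, the sketch below uses Apple’s CryptoKit framework to show the core idea: two parties derive a shared key from their own private keys and each other’s public keys, so a message sealed with that key is unreadable to any server that merely relays it. This is a simplified illustration of the concept, not how iMessage is actually implemented.

```swift
import CryptoKit
import Foundation

// Simplified illustration of end-to-end encryption using CryptoKit.
// This is a concept demo, not iMessage's actual protocol.

// Each party generates a key pair; only the public halves are ever shared.
let alicePrivateKey = Curve25519.KeyAgreement.PrivateKey()
let bobPrivateKey = Curve25519.KeyAgreement.PrivateKey()

do {
    // Alice combines her private key with Bob's public key; Bob can derive
    // the identical secret from his private key and Alice's public key.
    let sharedSecret = try alicePrivateKey.sharedSecretFromKeyAgreement(
        with: bobPrivateKey.publicKey
    )
    let messageKey = sharedSecret.hkdfDerivedSymmetricKey(
        using: SHA256.self,
        salt: Data(),
        sharedInfo: Data(),
        outputByteCount: 32
    )

    // A message sealed with the derived key is opaque ciphertext to the
    // service carrying it; only the two endpoints hold the key.
    let sealedBox = try AES.GCM.seal(Data("See you at noon".utf8), using: messageKey)
    print("Ciphertext bytes the server sees:", sealedBox.ciphertext.count)
} catch {
    print("Key agreement or encryption failed:", error)
}
```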

The feature could also be used to search users’ devices for other types of material, such as political content or material critical of a repressive government like China’s or Iran’s, according to Hamerstone. Responding to this criticism, Apple released a statement reassuring users that it “would refuse such demands” from governments. For more on iPhone photos, here’s how to edit photos on your iPhone like a pro.

Are there other risks?

In addition to making users’ devices and information more vulnerable, Apple’s photo-scanning technology might mistake harmless photos for known CSAM, errors known as false positives. Apple claims the chance of the system incorrectly flagging a given account is less than one in one trillion per year.
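As a back-of-the-envelope illustration of how a 30-match threshold drives the per-account odds down, the sketch below plugs in a made-up per-image error rate. Every number here is an assumption for illustration, not Apple’s published math, and the calculation leans on the assumption that errors are rare and independent.

```swift
import Foundation

// Back-of-the-envelope: why requiring 30 matches makes a false account
// flag astronomically unlikely. All inputs are made-up assumptions.
let perImageFalseMatchRate = 1e-6  // assumed odds one innocent photo mis-matches
let photosPerYear = 10_000.0       // assumed uploads per account per year
let threshold = 30.0               // the reported human-review threshold

// Expected false matches per account per year under these assumptions.
let expectedFalseMatches = perImageFalseMatchRate * photosPerYear  // 0.01

// Modeling false matches as rare independent events (Poisson), the leading
// term of P(at least 30 matches) is lambda^30 * e^(-lambda) / 30!.
let logProbability = threshold * log(expectedFalseMatches) -
    expectedFalseMatches - lgamma(threshold + 1)
print("Roughly 1 in 10^\(Int(-logProbability / log(10)))")  // about 1 in 10^92
```

The catch, critics note, is the independence assumption and the real-world per-image error rate, which is exactly what the research described next calls into question.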

But Hijazi notes that researchers who recently tested a version of Apple’s CSAM scanning algorithm found that cropping or rotating an image could change its hash, and that two visibly different photos could produce the same hash (a collision). This discovery raises concerns that the likelihood of false positives is higher than Apple suggests, according to Hijazi. “I believe Apple’s heart is in the right place, and it really is trying to do the right thing here, but this measure is simply too wide-ranging because it sacrifices the privacy of everyone in the process,” he says.

FYI, your cell phone can get viruses, too, so take these steps to protect your iPhone.

Is there a way to opt out?

Don’t want Apple scanning photos on your device? Because Apple scans only photos uploaded to iCloud, you can opt out of the new feature by disabling iCloud storage for your photos.


Follow these steps to disable iCloud Photos:

  1. On your iPhone or iPad, open Settings.
  2. Scroll down and tap Photos.
  3. Tap the iCloud Photos slider to the off position.
  4. In the pop-up window, tap Download Photos & Videos to download the photos from your iCloud Photos library to your device instead.

Alternatively, you can disable iCloud Photos this way:

  1. Go to Settings.
  2. Tap your name (at the top of the screen).
  3. Tap iCloud.
  4. Tap Photos.
  5. Tap the slider to turn off iCloud Photos.

Instead of storing your iPhone photos on Apple’s iCloud, Hamerstone suggests keeping them on a home computer or thumb drive to protect your privacy. If you decide to use Apple’s iCloud despite the risks, you should know how safe iCloud really is before storing your data.

How else can I protect my privacy?

Apple’s CSAM detection feature is not the only security threat lurking on your iPhone, according to Hijazi. “If you care about your privacy, you should assume that any iOS device is vulnerable to end-to-end encryption bypasses and full photo access,” he says. “There is no such thing as privacy online or on any connected device, and you should always assume that is the case and behave accordingly.”

Hamerstone recommends securing your accounts with strong passwords and multi-factor authentication, using privacy tools like encryption, and limiting which apps you download on your iPhone. Check your phone for apps that could be spying on you.
