
Apple, Child Sex Exploitation, and Your Personal Photos

The goal is admirable, but the risks are real.

Key points

  • Child sexual abuse imagery is a major problem that Apple proposes to help tackle.
  • Apple will begin scanning users' photos to detect child sexual abuse imagery.
  • Major risks come with this plan, especially for teens who may legitimately and legally want to take nude or sexual images of themselves.
  • This plan also opens the door to scanning for other illegal activities, which is concerning everywhere but especially in authoritarian regimes.
Source: Flickr / verkeorg

Last week, Apple announced a set of new iPhone features intended to reduce child sexual exploitation. The most controversial is that the company will start scanning people's photos for imagery of child sexual abuse and scanning messages for sexually explicit content.

When checking for this material, photos will be automatically scanned, and once an account crosses a threshold of images that the algorithm matches to known child sexual abuse material (CSAM), the process will "…allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated."
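To make the threshold idea concrete, here is a minimal, hypothetical sketch of that kind of matching: each photo is reduced to a hash, compared against a database of known CSAM hashes, and the account is only flagged for human review once the number of matches crosses a threshold. The function names, hash values, and threshold below are invented for illustration; Apple's actual system relies on its NeuralHash perceptual hashing and cryptographic safety vouchers, which are considerably more involved.

```swift
import Foundation

// Illustrative sketch only: a simplified threshold-based matching loop,
// not Apple's actual NeuralHash / safety-voucher protocol.
// All hash values and the threshold are made up for demonstration.

struct ScanResult {
    let matchCount: Int
    let flaggedForReview: Bool
}

/// Compare each photo's (hypothetical) perceptual hash against a set of
/// known CSAM hashes and flag the account only after the number of
/// matches reaches the threshold.
func scanPhotos(photoHashes: [String],
                knownHashes: Set<String>,
                threshold: Int) -> ScanResult {
    let matches = photoHashes.filter { knownHashes.contains($0) }.count
    // Below the threshold nothing is surfaced; at or above it, the
    // matching images would go to human review.
    return ScanResult(matchCount: matches,
                      flaggedForReview: matches >= threshold)
}

// Example usage with placeholder data.
let result = scanPhotos(photoHashes: ["aa11", "bb22", "cc33"],
                        knownHashes: ["bb22", "dd44"],
                        threshold: 2)
print("Matches: \(result.matchCount), flagged: \(result.flaggedForReview)")
```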

Apple will also scan photos that appear in messages on a user's device. If a young person is using Messages, the feature may blur a photo, ask whether they want to send or receive it, and, depending on parental settings, send a message alerting their parents.

While protecting children from sexual abuse is a laudable goal, there are many ways this can go wrong: What if underage people are taking photos of themselves naked for their own use? What if teenagers take pictures of themselves making out? Authorities have already shown that they may be willing to charge teens with child pornography offenses over these kinds of images. And what about trans kids who are taking nude photos to explore their identities or think about transition?

More broadly, what if Apple decides to start scanning photos looking for other types of illegal activity? Think beyond just the U.S.: Many countries have stricter laws around indecency that they may want Apple to enforce. If homosexuality is illegal in a country, for example, could its government demand that Apple identify photos of consenting gay adults?

While users can currently opt out by turning off iCloud photo syncing, this move by Apple opens up a complex set of privacy issues. The company believes that critics of this new plan simply don't understand what it is doing. “We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.” However, there are plenty of legitimate concerns. Child sexual abuse imagery is a major problem, but scanning every user's photos smacks of the tired "If you're not doing anything wrong, what do you have to worry about?" excuse. There are many ways this plan can go wrong, and we as users have little power to control the errors or abuses.
