iCloud photos are stored privately, and no one can access them except the owner of the iCloud account. However, that does not mean iCloud is a safe harbor for any kind of content. An automated Apple program analyzes uploaded content to detect possible child sexual abuse photos stored in iCloud. Everything points to PhotoDNA being used for this: a Microsoft program that many other technology companies use for the same purpose.
Photos that no one can access, unless they contain child abuse imagery
As Jane Horvath, Apple's senior director of global privacy, mentioned at CES 2020, Apple scans photos uploaded to iCloud from devices such as the iPhone or iPad to make sure there are no incidents of child abuse.
She did not explain how this is done; she only stated that the company uses certain technologies to screen content by default:
"We are using other technologies to help detect child sexual abuse."
Other big technology companies such as Facebook and Google are known to use PhotoDNA, an automated detection system that can identify images of child sexual abuse. Apple may use PhotoDNA as well, although there is nothing official about it.
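PhotoDNA's exact algorithm is proprietary and not publicly documented, but systems of this kind rely on perceptual hashing: visually similar images produce similar hash values, so a photo can be matched against a database of known illegal images even after resizing or minor edits. As a purely illustrative sketch (not PhotoDNA itself), here is a minimal difference-hash (dHash) over a grayscale pixel grid:

```python
# Illustrative only: PhotoDNA is proprietary. This shows the general idea of
# perceptual hashing with a simple difference hash (dHash) on grayscale pixels.

def dhash(pixels):
    """Build a hash from a grayscale grid (list of rows of 0-255 values).
    Each bit records whether a pixel is brighter than its right neighbour,
    so small brightness shifts barely change the hash."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(h1, h2):
    """Count differing bits: a small distance means visually similar images."""
    return bin(h1 ^ h2).count("1")

# Two nearly identical 4x5 grayscale grids.
img_a = [[10, 20, 30, 40, 50],
         [15, 25, 35, 45, 55],
         [12, 22, 32, 42, 52],
         [18, 28, 38, 48, 58]]
img_b = [row[:] for row in img_a]
img_b[0][0] = 25  # a tiny local edit

print(hamming(dhash(img_a), dhash(img_b)))  # prints 1: one bit differs
```

In a real deployment, the hash of an uploaded photo would be compared against hashes of known abuse imagery supplied by organizations such as NCMEC, so the provider never needs to look at ordinary photos directly.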
Jane Horvath's emphasis was on the fact that user privacy remains important (in fact, last year Apple ran one of its most prominent privacy campaigns precisely around CES) and that end-to-end encryption is always in place. End-to-end encryption means that only the user and their devices can access the content: it is encrypted at all times, and the decryption keys are available only on the end users' devices. According to Jane:
"End-to-end encryption is very important for the services we rely on … health data, payment data.
The truth is that we should not be surprised here. Even though Apple does not disclose when images are scanned, its own published policy states:
"Apple is committed to protecting children throughout our ecosystem wherever our products are used (…) As part of this commitment, Apple uses image matching technology to help find and report child exploitation. (…) Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled."
The important thing here is that user privacy continues to be respected. Finding illegal content without violating the privacy and security of all users is certainly not easy, but it seems Apple has its own plan, and that plan, at least so far, has worked.
Via | The Telegraph