Apple has reportedly started scanning iCloud for child pornography


What do we store in the cloud? These services have become the hard drives of our data: we use USB sticks less and less and move more and more files between the various clouds on the market, many of which offer free storage that can be extended with modest pricing plans. iCloud has become an invaluable service for many, and the way it syncs files and photos between Apple devices is remarkable, but you have to be careful with what you store there… Apple is scanning images uploaded to iCloud for child abuse and child pornography. After the jump we give you all the details of this important issue…

Jane Horvath, Apple's privacy chief, confirmed it at CES in Las Vegas, as reported by The Telegraph: Apple automatically scans images stored in iCloud Photos for illegal material, which may include photos of child abuse or even child pornography.

As Apple's official terms page confirms: "Apple uses image matching technology to help detect and report child abuse. Like spam filters in email, our systems use electronic signatures to find suspected child abuse material. We validate each match with individual review. Accounts with child abuse content violate our terms of service, and any account we find with this content will be disabled." That said, our data is still stored securely in iCloud, although Apple holds the encryption keys, and in the end it makes sense: if we store illegal content in a cloud that is not ours but Apple's, and Apple can be held responsible for it, it is only natural for the company to review that content.
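Apple does not explain its matching technology in detail, but the statement describes a signature-based lookup: each uploaded photo gets an electronic signature that is compared against the signatures of already-known illegal material, and only matches are escalated to human review. Here is a minimal sketch of that idea, assuming a plain cryptographic hash and a purely hypothetical KNOWN_SIGNATURES list; the real system presumably relies on far more robust image-matching techniques.

```python
import hashlib

# Hypothetical set of signatures of already-known illegal images.
# The value below is only a placeholder for illustration.
KNOWN_SIGNATURES = {
    "3f5a9c0d...placeholder...",
}

def signature(image_bytes: bytes) -> str:
    """Compute an electronic signature for an image.

    Apple does not document its matching technology; a plain SHA-256 hash
    is used here only to illustrate comparing signatures instead of
    looking at the photos themselves.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(uploaded_images: list[bytes]) -> list[int]:
    """Return the indices of uploads whose signature matches a known one.

    In the process Apple describes, matches are then validated by
    individual human review before an account is disabled.
    """
    return [
        i for i, img in enumerate(uploaded_images)
        if signature(img) in KNOWN_SIGNATURES
    ]
```

The point of such a design is that ordinary photos are never inspected: only a match against the signature list triggers the individual review mentioned in Apple's terms.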
