Consumer technology is now part of everyday life, and we are grateful for what it lets us do: check a map, find a restaurant, or send a photo in a message. However, everything has its ugly side, because child abuse material is often embedded in digital media. It is sad news that surfaces from time to time, and mobile devices are the protagonists.
The ease of use of smartphones and the high quality of their cameras mean they can be turned against children, a high-risk group, to record and photograph them. However, there is some light in all this, since Apple seems to be taking the problem seriously.
As reported by the British media, Jane Horvath, Apple's director of privacy, said at CES 2020 in Las Vegas that the company uses technology to screen for illegal images. Apparently Apple disables accounts when it finds evidence of child abuse material, although it is not clear how that evidence is obtained.
"Apple is committed to protecting children across our entire ecosystem wherever our products are used, and we continue to support innovation in this space."
The Cupertino company did not provide details on how it verifies child abuse images, but many technology companies use the PhotoDNA filtering system, in which images are checked against a database of previously identified images.
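PhotoDNA itself is proprietary and relies on a robust perceptual hash that can match an image even after resizing or recompression, but the basic database-lookup idea can be sketched as follows. This is a simplified illustration using an exact cryptographic hash as a stand-in; the function names and sample data are hypothetical, not Microsoft's or Apple's actual implementation:

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Hash an image's raw bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes, known_hashes: set) -> bool:
    """True if the image's hash appears in the database of
    previously identified images."""
    return image_hash(data) in known_hashes

# Hypothetical database of hashes of previously identified images
# (in practice, such databases are curated by child-safety organizations).
known_hashes = {image_hash(b"previously identified image bytes")}

print(is_known_image(b"previously identified image bytes", known_hashes))  # True
print(is_known_image(b"a new, unrelated photo", known_hashes))             # False
```

The key difference in the real system is the hashing step: a cryptographic hash like SHA-256 only matches byte-for-byte identical files, whereas PhotoDNA's perceptual hash is designed to tolerate the small alterations images undergo as they circulate.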
In addition, Apple takes privacy more seriously than most, encrypting data on the device itself, because, according to Horvath, "Phones are small, and they get lost and stolen. We must ensure that if you lose that device, you do not lose your personal information."