Apple wants to detect child pornography on devices

Apple is taking a radical step in the fight against child pornography. Starting in the fall, the company wants to compare photos that American users upload to its iCloud online storage service against a list of known child pornography material. On Thursday, Apple presented a complex procedure for this that is supposed to guarantee privacy.

For the comparison, a file containing so-called “hashes” of known child pornography content is to be loaded onto the devices – a kind of digital fingerprint of each image. Using special matching procedures, a copy of a photo can be identified from its hash, but the original image cannot be reconstructed from it.
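Apple has not laid out the hashing method in this announcement (reports describe a perceptual scheme called NeuralHash), so the Python sketch below is only an assumption-laden stand-in using a simplified “average hash”: it reduces an image to a short fingerprint that makes copies easy to match but cannot be inverted back into the picture.

```python
# Minimal sketch of hash-based image matching, assuming a simplified
# "average hash" as a stand-in for Apple's unpublished scheme. The hash
# is a short fingerprint: copies of a photo are easy to match, but the
# original image cannot be reconstructed from the fingerprint.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a size x size grayscale grid, then set one bit per
    pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def is_known_image(path: str, known_hashes: set[int]) -> bool:
    """True if the photo's fingerprint appears in the list of known hashes."""
    return average_hash(path) in known_hashes
```

A real perceptual hash is designed so that small changes such as recompression or resizing still map to the same or a nearby fingerprint; a plain cryptographic hash would not have that property.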

If there is a match, the suspicious image is tagged with a certificate that allows Apple, as an exception, to open it after it has been uploaded to iCloud and submit it for review. The system only sounds the alarm once a certain number of matches has been reached. How many are required is not made public.
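Apple's actual mechanism reportedly relies on cryptographic vouchers that can only be opened once enough matches exist; the toy sketch below only illustrates the threshold rule itself, and the threshold value is a made-up placeholder, since the real number is not public.

```python
# Toy illustration of the threshold rule, not Apple's cryptographic
# voucher scheme. MATCH_THRESHOLD is a hypothetical placeholder; the
# real number of required matches is not made public.
MATCH_THRESHOLD = 10

class Account:
    def __init__(self) -> None:
        self.tagged_uploads = 0  # uploads whose hash matched the list

    def record_match(self) -> None:
        self.tagged_uploads += 1

    def review_due(self) -> bool:
        # Only once the threshold is crossed may the tagged images be
        # opened and submitted for human review.
        return self.tagged_uploads >= MATCH_THRESHOLD
```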

If child pornography material is indeed discovered during the review, Apple reports it to the American non-governmental organization NCMEC (National Center for Missing & Exploited Children), which in turn can involve the authorities.

All users receive a file with hashes

While the feature is only activated for Apple customers with US accounts, the file with the hashes is an integral part of the operating system and will be loaded onto all iPhones running that version of the system. The list is only updated on the devices when new versions of the iPhone and iPad operating systems are released. Before the function can be introduced in other countries, the legal requirements must first be clarified.
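In other words, the hash list ships with the operating system on every device, while the comparison itself is gated by the account's region. A hypothetical sketch of that split (all names are assumptions):

```python
# Hypothetical sketch of the rollout gating: the hash list is part of
# the OS build on every device, but matching only runs for US accounts.
from dataclasses import dataclass

@dataclass(frozen=True)
class Device:
    os_version: str
    icloud_region: str
    known_hashes: frozenset[int]  # shipped with the OS, updated per OS release

def matching_enabled(device: Device) -> bool:
    return device.icloud_region == "US"
```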

Users on whose accounts known child pornography material is found will not be notified, but their accounts will be blocked. Hash comparison is also used by online platforms, for example, to detect such content while it is being uploaded and to prevent its publication. According to the industry, the process works virtually flawlessly for photos – but it does not yet work for videos.

New content is not detected

Critics of the encryption of private communication in chat services and on smartphones, which is standard today, often cite the fight against child sexual abuse when demanding back doors for the authorities. The system announced by Apple is an attempt to solve the problem in a different way: the company has repeatedly resisted demands by US security authorities to break the encryption of its devices in the course of investigations. The focus on hashes of already known photos also means that content newly created on devices is not detected.

Apple has published analyses by several experts who praised the privacy protections of the procedure. At the same time, Matthew Green, a cryptography expert at the American Johns Hopkins University, criticized the fact that the ability to scan files on the devices has now been created in principle. He specifically sees the danger that someone could smuggle hashes for other content onto the devices – and that authoritarian governments could mandate searches for other kinds of content.

With another function, parents will in the future be able to receive a warning message if their child receives or sends nude photos in Apple's iMessage chat service. Nudity in images is detected by software on the device itself; the company learns nothing about it.
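The described flow – classify locally, warn the parents, report nothing to Apple – can be sketched as follows; the classifier is a stub, since Apple has not published its model, and the threshold is an assumption:

```python
# Hypothetical sketch of the on-device iMessage flow. Both the nudity
# check and the parental warning happen locally; nothing is reported
# to Apple's servers.
from typing import Callable

def nudity_score(image_bytes: bytes) -> float:
    """Stub for the on-device ML classifier (model not published)."""
    raise NotImplementedError("on-device model goes here")

def handle_image(image_bytes: bytes,
                 warn_parent: Callable[[str], None],
                 threshold: float = 0.9) -> None:  # threshold is an assumption
    if nudity_score(image_bytes) >= threshold:
        warn_parent("A sensitive image was received or sent.")
```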
