How did it all start?
Most cloud services - Dropbox, Google, Microsoft and many others - already scan user files for illegal content such as child abuse imagery.
Apple will deploy a new system called NeuralHash in the USA. It will appear in iOS 15 and macOS Monterey, both scheduled for release in the next month or two. Whether, and when, the system will be rolled out internationally is unknown. Until recently, companies like Facebook were forced to turn off their child abuse detection tools across the European Union.
How will it work?
Let's start with the fact that in the USA there is a special database of images collected by NCMEC and other organizations. NCMEC, the National Center for Missing & Exploited Children, is a private non-profit organization created in 1984 by the US Congress. It is worth noting that Apple does not receive the images themselves, but their hashes: specially generated strings that are identical for identical images and different for different ones, explains T - Z.
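As an illustration of that matching property only (Apple's NeuralHash is a proprietary perceptual hash, not SHA-256), a conventional cryptographic hash behaves the same way for byte-identical files. A minimal sketch:

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Identical inputs always produce the identical digest;
    # different inputs almost certainly produce different ones.
    return hashlib.sha256(data).hexdigest()

original = b"...photo bytes..."
exact_copy = b"...photo bytes..."
other = b"...a different photo..."

assert file_hash(original) == file_hash(exact_copy)
assert file_hash(original) != file_hash(other)
```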
It should be noted that a peculiarity of hashes is that they do not allow the original file to be restored. However, they make it possible to compare the hashes of two files with each other. At the same time, the algorithm used in the process is resistant to photo manipulation. What does this mean? For example, even if you change the colors or the size of a photo, its hash will be the same as that of the original file.
The iPhone calculates such hashes for all user photos on the device itself, independently of Apple. The algorithm takes into account not specific pixels but the content of the image.
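A rough idea of how a content-based hash can survive resizing or recoloring is given by the classic difference-hash ("dHash") technique. This is a generic sketch of perceptual hashing, not Apple's NeuralHash, which uses a neural network to extract image features:

```python
from PIL import Image  # pip install Pillow

def dhash(image: Image.Image, size: int = 8) -> int:
    """Difference hash: encodes the relative brightness of adjacent
    pixels, so it is stable under resizing and mild color changes."""
    gray = image.convert("L").resize((size + 1, size), Image.LANCZOS)
    pixels = list(gray.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits; a small distance means visually similar.
    return bin(a ^ b).count("1")

# A resized or recolored copy of a photo lands within a few bits of
# the original's hash, while an unrelated photo lands far away.
```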
For the analysis to work, users' Apple devices will upload "security vouchers" to iCloud along with the photos themselves; each voucher stores the photo's hash and its visual derivative. Apple does not explain what exactly this derivative is. Most likely, it is a fragment of the snapshot or a processed version of it, which a company employee will review during the final stages of the analysis.
Security vouchers will also store fragments of a special encryption key. The key is generated on the device and is unknown to Apple from the very beginning; it encrypts the hash and the visual derivative of the snapshot. Notably, to decrypt the data, several fragments of such a key must be obtained.
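The "several fragments needed to decrypt" idea is known as threshold secret sharing. Below is a minimal sketch of Shamir's scheme, a generic textbook construction; Apple's actual protocol combines threshold sharing with private set intersection and is considerably more involved:

```python
import random

PRIME = 2**127 - 1  # field modulus; must exceed the secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them
    reconstruct it, while fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=3, count=5)
assert reconstruct(shares[:3]) == 123456789  # any 3 of 5 shares suffice
```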
Forbes notes that a single match between a hash from the user's gallery and a hash from the database of child abuse images is not enough to draw unambiguous conclusions. To avoid false positives, Apple sets a minimum number of matches needed to "get things going." The exact number of required matches is unknown.
If the number of matches is sufficient and a hash is fully identical (the photo really matches a snapshot from the database), decryption will succeed. If the hash differs, the picture is not in the NCMEC database, and the "voucher" cannot be decrypted. That is, according to Apple, the company does not get access to a photo until it finds a matching one in the database, Mediazona notes.
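Put together, the server-side check reduces to counting exact hash matches and only then allowing decryption. A simplified sketch under assumed names (the function and parameters here are hypothetical; the real protocol performs the comparison cryptographically, so Apple learns nothing about matches below the threshold):

```python
def account_exceeds_threshold(voucher_hashes: list,
                              known_csam_hashes: set,
                              threshold: int) -> bool:
    """Return True only once enough exact matches have accumulated;
    below the threshold, the vouchers remain undecryptable."""
    matches = sum(1 for h in voucher_hashes if h in known_csam_hashes)
    return matches >= threshold
```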
Project criticism
Cybersecurity professionals predictably greeted Apple's initiative with indignation: the many attempts to depersonalize the photo analysis do not remove the main problem, namely the very fact of constant access to user files. Even though the files the algorithm analyzes are uploaded to cloud storage, the user is not, in fact, sharing them with anyone.
Matthew Green, a security researcher at Johns Hopkins University, told Reuters that Apple's desire to build systems that scan iPhone users' phones for "prohibited content" could "break the dam" and lead the US government to "demand it from everyone." He also told the Associated Press that there are concerns Apple may come under pressure from other governments to scan for other kinds of information. The researcher was among the first to report on the new technology, in a series of tweets.
No matter what Apple's long-term plans are, they are sending a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users' phones for prohibited content. They are sending this signal to governments, to competing services, to China, to you personally. Whether they are right or wrong in this hardly matters. From now on nothing is sacred; governments [of different countries] will now demand the same from everyone, and by the time we realize it was a mistake, it will be too late.
Matthew Green, Johns Hopkins University Professor
Why is this necessary?
Last year, the US Department of Justice published a set of "voluntary guidelines" aimed at getting technology and social media companies to do more to combat child sexual exploitation and abuse. The agency urged companies to establish a thorough system for identifying illegal content, taking immediate action against it, and reporting it to the authorities. For example, Microsoft has created PhotoDNA to help companies identify images of child sexual abuse on the Internet. Facebook and Google already have systems in place to check for potentially illegal content. Facebook also said it is working on new tools to reduce the spread of child sexual abuse images on its platform.
According to a report by the US National Center for Missing & Exploited Children, Facebook reported 20 million child sexual abuse images to law enforcement in 2020. This number includes reports from both the Facebook and Instagram platforms. In 2019, there were 16 million such photos.