Apple is gearing up to launch a feature that scans photos in smartphone galleries for illegal content. The check will run directly on the devices themselves; if suspicious photos are found, the neural network will report them to Apple. Representatives of the company have already officially confirmed the feature.
According to the company, device owners' images will be scanned on-device by its own neuralMatch algorithm, without being sent to cloud servers. All images will be compared against a database from the US National Center for Missing & Exploited Children, which contains about 200,000 photographs.
The mechanism's hashing algorithms flag dubious images; once the number of flags reaches a certain threshold, neuralMatch notifies Apple of possible illegal content. Company employees then decide whether to contact law enforcement. A similar system, PhotoDNA, is already used by Facebook, Twitter, and Google.
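The flag-and-threshold scheme described above can be sketched in a few lines of Python. This is only an illustration of the general idea: the hash function, database, and threshold here are all hypothetical stand-ins. Real systems such as NeuralHash or PhotoDNA use perceptual hashes (so visually similar images produce similar fingerprints), and Apple has not published its actual match threshold.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in: a cryptographic hash of the raw bytes. Real systems use
    # *perceptual* hashing, which this simple example does not implement.
    return hashlib.sha256(data).hexdigest()

def should_flag(images, known_hashes, threshold=3):
    """Count how many images match the known-hash set; flag at threshold.

    `threshold=3` is purely illustrative -- the real value is not public.
    """
    matches = sum(1 for img in images if image_hash(img) in known_hashes)
    return matches >= threshold

# Example: three of four photos match the hypothetical database.
known = {image_hash(b"photo-a"), image_hash(b"photo-b"), image_hash(b"photo-c")}
library = [b"photo-a", b"photo-b", b"photo-c", b"photo-d"]
print(should_flag(library, known))  # prints True
```

The key design point the paragraph describes is that no single match triggers a report: only when the count of matches crosses the threshold does the system escalate to human review.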
In addition, in iOS 15 the Messages app will use on-device machine learning to warn users about sending sensitive content. If the device is used by a child, their parents or legal guardians will be notified when suspicious files are sent from it. At the same time, Apple has confirmed that the company will not be able to read private messages.
According to company representatives, the system will launch in the United States soon, but there is no word yet on a worldwide rollout. The algorithm is already drawing skepticism from some specialists. For example, Matthew Green, a Johns Hopkins University professor and cryptography expert, has argued that the system could be abused by bad actors and that the algorithm itself may produce false positives.
Apple later clarified that the system cannot scan photos on devices where iCloud Photos is disabled, since the flags for questionable content are attached to the copies of photos destined for cloud storage, not to the originals.