Apple emphasizes that new feature only scans for child sexual abuse material

Apple's newly announced software, which scans iPhone photos uploaded to iCloud for images of child abuse, will only be used for that purpose. The company emphasized this today in a new list of Frequently Asked Questions intended to temper criticism of the plans.

“We want to be clear that this technology is limited to detecting child sexual abuse material stored in iCloud, and we will not respond to a government request to expand this feature,” the company wrote.

“We have previously faced demands from governments to make changes that affect user privacy, and have consistently rejected those demands. We will continue to do so in the future,” it reads.

Adjustments made in Saudi Arabia and China

Security and privacy experts are critical of Apple’s plans, fearing they mark the start of monitoring more data from all Apple users. There are also concerns that such programs could be repurposed to detect other content, such as dissent against authoritarian regimes.

While Apple assures it will not comply with government requests to modify the software, tech website The Verge notes that Apple has made such concessions before. For example, video calling with FaceTime is not possible in Saudi Arabia, Pakistan and the United Arab Emirates, where encrypted telephone calls are not allowed.

And in China, many apps have been removed from the App Store, and Apple allows iCloud data of Chinese users to reside on servers managed by the Chinese government.

Only in the US, for now

The new software contains two functions. The first uses on-device machine learning to detect sexually explicit images received by children in a messaging app and blur them. In addition, parents can be notified if a child aged 12 or under decides to view or send such an image.

The second feature is designed to detect child sexual abuse images by scanning photos uploaded to iCloud. A program compares a user’s photo library against a list of confirmed child sexual abuse material whose digital signatures are stored by Apple.

If certain photos or videos are classified as child pornography, they are shared with Apple, which in turn shares them with the authorities. That only happens if a user has turned on backing up photos to iCloud.
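The matching step described above can be sketched roughly as follows. This is a minimal illustration, not Apple's implementation: the function names are invented, and SHA-256 stands in for the perceptual hash Apple's system actually uses (a perceptual hash is needed so that resized or re-encoded copies of a known image still match).

```python
import hashlib

# Hypothetical database of digital signatures of known abuse material.
# In the real system these would be perceptual-hash values supplied by
# child-safety organizations, not plain cryptographic hashes.
KNOWN_SIGNATURES: set[str] = set()


def signature(image_bytes: bytes) -> str:
    """Compute a digital signature for an image.

    SHA-256 is used here purely for illustration; it only matches
    byte-identical files, unlike a perceptual hash.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_material(image_bytes: bytes) -> bool:
    """Check an image's signature against the known-material list."""
    return signature(image_bytes) in KNOWN_SIGNATURES


# Demo: register one sample, then test two uploads against the list.
KNOWN_SIGNATURES.add(signature(b"known-sample-bytes"))
print(matches_known_material(b"known-sample-bytes"))  # True
print(matches_known_material(b"holiday-photo-bytes"))  # False
```

Only uploads whose signatures match the list would be flagged for review; everything else passes through unexamined, which is the property Apple emphasizes in its FAQ.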

Apple says it did not develop these features on behalf of or in conjunction with any government. The new plan will initially only be implemented in the United States. Whether and when the European Union will follow is unknown.
