
iOS 15.2 Beta: First signs of Apple’s controversial child protection function

Apple appears to have integrated program elements into a preview version of iOS 15 for the first time that form part of its new (and highly controversial) child protection functions. That is what MacRumors reports based on an examination of the iOS 15.2 beta. According to the report, the corresponding code is present but has not yet been activated by the iPhone maker. Apple also apparently does not plan to activate the features in the final version of iOS 15.2.

The company is planning three new “Child Safety” functions for iOS 15, at least two of which are considered problematic. The first is a built-in scanner for child sexual abuse material (CSAM), of which no trace has been found in the operating system so far. The second has not appeared yet either: it supplements Siri and Search with information on where children and their parents can get help in cases of abuse, and asks users who search for CSAM-related terms whether they “really want this”. The third function, which is now hinted at in the iOS 15.2 beta, is a nudity filter for the Messages app (iMessage) that is meant to protect children.


Critics see on-device CSAM scanning as a dangerous precedent, because Apple would suddenly be searching private data on a routine basis without the user explicitly consenting to it (merely using iCloud Photos is enough). The nudity filter in Messages has drawn criticism because it can automatically alert parents when children view such material; in troubled parent-child relationships this could lead to serious problems, for example for young people who have not yet come out, critics say.

The first details of the nudity filter, alternatively called the nude scanner, can now be found in the iOS 15.2 beta. It contains numerous text strings that Apple had already used in its preview images of the child protection functions. Among other things, they say that users should seek help from “adults whom you trust” if they receive problematic images. The system does not share pictures with Apple, but users can help the company identify incorrect matches, according to another string. One message reads, for example, that “it’s not your fault, but sensitive photos can be used to hurt you”. Another string notes that sharing nude pictures of people under 18 could have “legal ramifications”.
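
How such wording turns up in a beta can be illustrated with a small sketch: the Python snippet below searches extracted localization (.strings) files for child-safety-related phrases. It is only an illustration under assumptions; the directory path, the keyword list and the file handling are hypothetical and not the method MacRumors used.

# Hypothetical sketch: scan extracted .strings files from an iOS beta
# filesystem for child-safety-related wording. Paths and keywords are
# illustrative assumptions, not MacRumors' actual procedure.
import plistlib
import re
from pathlib import Path

# Illustrative keyword list (assumption)
KEYWORDS = re.compile(r"nudity|sensitive photo|adults? (?:you|whom you) trust",
                      re.IGNORECASE)

def iter_strings(path: Path):
    """Yield (key, value) pairs from a .strings file.

    iOS localization files ship either as binary property lists or in the
    old-style text format ("key" = "value";), so both are handled here.
    """
    try:
        with path.open("rb") as fh:
            data = plistlib.load(fh)            # binary plist variant
        yield from data.items()
    except Exception:
        text = path.read_text(errors="ignore")  # old-style text variant
        for key, value in re.findall(r'"([^"]*)"\s*=\s*"([^"]*)";', text):
            yield key, value

def find_matches(root: Path):
    """Print every localized string under root that matches the keywords."""
    for strings_file in root.rglob("*.strings"):
        for key, value in iter_strings(strings_file):
            if KEYWORDS.search(value):
                print(f"{strings_file}: {key} -> {value}")

if __name__ == "__main__":
    # The location of an extracted/mounted beta filesystem is an assumption.
    find_matches(Path("extracted_ios_15.2_beta"))

Because the .strings format varies, the sketch first tries plistlib and only then falls back to a regular expression over the plain-text variant.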

Apple takes different approaches depending on the age group: photos are only shared with parents if the child is under 13, and the function must be activated by parents as part of Family Sharing. According to MacRumors, there are no indications that Apple will activate the functions with iOS 15.2. The company had initially postponed the introduction of the features after heavy criticism. (bsc)
