Apple’s new tech will warn parents and kids about sexually explicit photos in Messages


Apple later this year will launch new tools that will warn children and parents if a child sends or receives sexually explicit photos through the Messages app. The feature is part of a handful of new technologies Apple is introducing that aim to limit the spread of Child Sexual Abuse Material (CSAM) on Apple platforms and services.

As part of these developments, Apple will be able to detect known CSAM images stored on your mobile devices, such as iPhone and iPad, and in photos uploaded to iCloud, while respecting consumer privacy.
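
The article does not describe the mechanism behind this detection. Purely as an illustration, the Swift sketch below checks a photo's fingerprint against a set of known-image fingerprints; the KnownImageMatcher type, its data source, and the use of an exact SHA-256 digest are all assumptions for the example. Real known-image detection relies on perceptual hashing that tolerates resizing and re-encoding, which this toy version does not attempt.

```swift
import Foundation
import CryptoKit

// Illustration only, not Apple's system: a naive lookup of a photo's exact
// byte-level fingerprint in a set of known fingerprints.
struct KnownImageMatcher {
    /// Fingerprints of known images (hypothetical data, e.g. loaded from a local database).
    let knownDigests: Set<Data>

    /// Returns true if the photo's bytes hash to a known fingerprint.
    func matches(_ imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        return knownDigests.contains(digest)
    }
}
```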

Meanwhile, the new Messages feature is intended to allow parents to play a more active and informed role in helping their children learn to navigate online communication. Through a software update rolling out later this year, Messages will be able to use on-device machine learning to analyze image attachments and determine whether a shared photo is sexually explicit. This technology does not require Apple to access or read the child’s private communications, as all the processing happens on the device; nothing is sent back to Apple’s servers in the cloud.
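
To illustrate what this kind of on-device screening can look like in practice, here is a minimal Swift sketch using Apple's Vision and Core ML frameworks. Apple has not published the model or code it actually uses, so the SensitiveImageClassifier model, the "explicit" label, and the confidence threshold below are assumptions for the example.

```swift
import UIKit
import Vision
import CoreML

// A minimal sketch of on-device screening with a hypothetical Core ML
// classifier. All inference runs locally; nothing is uploaded.
func isLikelySensitive(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let classifier = try? SensitiveImageClassifier(configuration: MLModelConfiguration()),
          let model = try? VNCoreMLModel(for: classifier.model) else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: model) { req, _ in
        // Flag the image only when the top label is "explicit" (an assumed label)
        // and its confidence clears an arbitrary threshold.
        let top = (req.results as? [VNClassificationObservation])?.first
        let flagged = top?.identifier == "explicit" && (top?.confidence ?? 0) > 0.8
        completion(flagged)
    }

    do {
        // Classification happens entirely on the device.
        try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    } catch {
        completion(false)
    }
}
```

A messaging client could run a check like this before displaying or sending an attachment and blur the preview whenever it returns true.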

If a sensitive photo is discovered in a message thread, the image will be locked and a label will appear below the photo stating that it “may be sensitive,” with a link to tap to view the photo. If the child chooses to view the photo, another screen appears with more information. Here, a message informs the child that sensitive photos and videos “show the private body parts that you cover with bathing suits” and that “it’s not your fault, but sensitive photos and videos can be used to hurt you.”

It also suggests that the person in the photo or video may not want it to be seen and that it could have been shared without their knowledge.

Image credits: Apple

These warnings are intended to guide the child toward the safer decision of choosing not to view the content.

However, if the child clicks to view the photo anyway, they will be shown an additional screen informing them that if they choose to view the photo, their parents will be notified. The screen also explains that their parents want them to be safe and suggests that the child talk to someone if they feel pressured. It also offers a link to more resources for help.

There is still an option at the bottom of the screen to view the photo, but again, it is not the default choice. Instead, the screen is designed so that the option not to view the photo is highlighted.

These kinds of features could help protect children from sexual predators, not only by introducing technology that interrupts the communications and offers tips and resources, but also because the system will alert parents. In many cases where a child is harmed by a predator, the parents didn’t even realize the child had started talking to that person online or by phone. This is because child predators are very manipulative and will try to gain the child’s trust, then isolate the child from their parents so the communications are kept secret. In other cases, the predators have groomed the parents, too.

Apple’s technology could help in both cases by intervening, identifying the explicit material being shared, and alerting parents to it.

However, a growing share of CSAM is what is known as self-generated CSAM, or images taken by the child, which may then be shared consensually with the child’s partner(s); in other words, sexting or sharing “nudes.” According to a 2019 poll from Thorn, a company that develops technology to combat the sexual exploitation of children, this practice has become so common that 1 in 5 girls ages 13 to 17 reported sharing their own nudes, and 1 in 10 boys did the same. But the child may not fully understand how sharing those images puts them at risk of sexual abuse and exploitation.

The new Messages feature will also offer a similar set of protections here. In this case, if a child tries to send an explicit photo, they will be warned before the photo is sent. Parents can also receive a message if the child decides to send the photo anyway.

Apple says the new technology will arrive as part of a software update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey in the US.

This update will also include Siri and Search updates that will provide expanded guidance and resources to help kids and parents stay safe online and get help in unsafe situations. For example, users can ask Siri how to report CSAM or child exploitation. Siri and Search will also step in when users search for CSAM-related queries to explain that the topic is harmful and provide resources for help.

