Apple has made a wide range of changes in its iOS 18.2 software update, including a new safety feature in Australia that lets children report nude images and videos sent to them.
The feature automatically detects images or videos that may contain nudity sent through iMessage, AirDrop, FaceTime and Photos, as well as through third-party apps that adopt Apple’s Communication Safety framework.
If nudity is detected through any of these channels, the image or video is blurred and resources are offered to help the child handle the situation.
The automatic detection is made possible through on-device machine learning.
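For developers curious about the underlying mechanism, the sketch below shows roughly how a third-party app can request the same on-device check using Apple’s SensitiveContentAnalysis framework, which has shipped since iOS 17. It is an illustrative sketch only: the function name and the printed messages are made up for this example, and a real app would need Apple’s Sensitive Content Analysis entitlement and its own UI for blurring and reporting.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch: assumes an app holding the Sensitive Content Analysis
// entitlement and a local image file to check. The analysis runs entirely
// on-device; the image is not uploaded anywhere.
func checkForNudity(at imageURL: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // If the user (or a parent, via Screen Time) has not enabled sensitive
    // content checks, the policy is .disabled and no analysis takes place.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is turned off on this device.")
        return
    }

    do {
        let result = try await analyzer.analyzeImage(at: imageURL)
        if result.isSensitive {
            // A real app would blur the image here and surface the
            // warning, reporting and support options instead of printing.
            print("Likely nudity detected: blur the image and offer help resources.")
        } else {
            print("No sensitive content detected.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```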
When the image is blurred, a warning message appears over it asking whether the user wants to report it, alongside other support options.
According to The Guardian, the report is sent to Apple, which can then pass the messages on to police.
For the report to Apple, the device compiles the sensitive images or videos along with the messages sent immediately before or after the media.
It also includes contact information for both accounts, and users can fill out a form describing what happened. Apple then reviews the report and can take further action on an account and refer the issue to law enforcement.
Apple adds further safety features for younger users
The feature extends the Communication Safety measures that have been turned on by default since iOS 17 for Apple users under the age of 13, and which are also available to all users.
Australia is the first country to receive the new addition, which is currently available through the beta update, with a global release reported to follow at a later date.
The full public rollout of iOS 18.2 hasn’t yet taken place, but the technology giant did launch it for developers on Wednesday (October 23).