- Apple has added a new child safety feature to FaceTime in iOS 26
- The latest beta pauses video when it detects nudity
- It currently affects adult accounts too, which may be a mistake
Apple has been adding parental control features designed to protect minors for years now, and a new one has been found in the iOS 26 beta. This one is proving rather controversial, though, amid concern that it may be something of an overreaction on Apple's part.
Specifically, the new feature has been added to the FaceTime video calling app. When FaceTime detects someone undressing on the call, it pauses the call and instead shows a warning message that reads, "Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call." There are then buttons labeled "Resume Audio and Video" and "End Call."
At WWDC 2025 in June, Apple published a press release covering new ways in which its systems will protect children and young people online. The release included a line foreshadowing the new FaceTime behavior: "Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos."
The actual implementation was spotted by IdeviceHelp on X. In the post, @User_101524 noted that the feature can be found in the Settings app in iOS 26 by going to Apps > FaceTime > Sensitive Content Warning.
By default, the feature is disabled, so it has to be turned on by the user, but that hasn't stopped it from stirring up debate online …
Generating controversy
While this new feature may seem sensible, it has in fact generated a degree of controversy. That's because, right now, it appears to affect all users of iOS 26, not just those using a child account. This has ruffled some feathers among people who feel that Apple is potentially censoring the behavior of consenting adults.
In addition, some users have questioned how Apple knows what is being shown on screen, and whether the company has access to customers' video calls. On this point, Apple has said the following:
"Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child's device, Apple doesn't receive an indication that nudity was detected and doesn't get access to the photos or videos as a result."
As with many of Apple's features, the on-device processing means the content is never sent to Apple's servers and is never accessible to the company. Instead, artificial intelligence (AI) is used to flag video content that likely contains nudity and then censor it.
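Apple's internal FaceTime implementation isn't public, but the company does expose this kind of on-device nudity check to third-party apps through its SensitiveContentAnalysis framework (introduced in iOS 17). As a rough sketch of how such a check works, not a description of FaceTime's actual code, an app could decide whether to blur a video frame like this:

```swift
import SensitiveContentAnalysis
import CoreGraphics

// Sketch only: uses Apple's public SensitiveContentAnalysis API to
// illustrate on-device detection. How FaceTime itself wires this up
// internally is an assumption, not documented behavior.
func shouldBlurFrame(_ frame: CGImage) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs if the user (or a parent, via Screen Time)
    // has enabled it; otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The check runs entirely on device; nothing is sent to Apple.
        let result = try await analyzer.analyzeImage(frame)
        return result.isSensitive
    } catch {
        // In this sketch, an analysis failure means no blur is applied.
        return false
    }
}
```

Because the classifier runs locally, the only thing an app ever learns is a yes/no sensitivity verdict for content already on the device, which matches Apple's statement that it receives no indication of what was detected.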
The fact that Apple's Communication Safety features are aimed at protecting minors suggests that this latest FaceTime feature may not be intended to cover adults as well as children. Its rollout to all accounts may therefore be an oversight or a bug. Although we don't know for sure, we should find out by September, when iOS 26 exits beta and is released fully to the public.



