Apple prides itself on protecting the privacy of its users and uses strong encryption across its ecosystem, going so far as to resist government efforts to break into user devices. However, a recent move by the tech behemoth to scan iPhones for Child Sexual Abuse Material (CSAM) has drawn criticism.
As a result, the company announced that it’s officially delaying the rollout of its controversial plans. “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple wrote in a statement.

Apple puts its plans to scan iPhones for CSAM on the backburner.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the statement continues.
In August, Apple announced that it planned to scan photos uploaded to its iCloud Photos service for known CSAM and report matches to Apple moderators, who could then pass them on to the National Center for Missing and Exploited Children.
The announcement included two more child safety updates:
1) If a user searches Siri or Search for topics related to child sexual abuse, Apple will direct them to resources for reporting it or for getting help with an attraction to it. That update is rolling out later this year in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
2) Apple will also add a parental control option to Messages that obscures sexually explicit pictures for users under 18 and sends parents an alert if a child 12 or younger views or sends such a picture.
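At a high level, the iCloud Photos feature works by comparing images against a database of known CSAM hashes before upload. The sketch below is only a rough illustration of that "compare against a known-hash list" idea, assuming a plain exact-match cryptographic hash; Apple's announced design instead used a perceptual hash (NeuralHash), blinded hash databases, and a match threshold before any human review, none of which are reproduced here, and all names and values in the sketch are hypothetical.

```swift
import Foundation
import CryptoKit

// Rough illustration only: exact SHA-256 matching against a known-hash list.
// Apple's announced system used a perceptual hash and a match threshold
// before human review; none of that is modeled here.

let knownHashes: Set<String> = [
    // Placeholder hex digest standing in for a vetted hash database entry
    // (this happens to be the SHA-256 of the string "test").
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"
]

/// Returns true if the payload's SHA-256 digest appears in the known-hash list.
func matchesKnownContent(_ data: Data) -> Bool {
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: check an in-memory payload before it would be uploaded.
let payload = Data("test".utf8)
print(matchesKnownContent(payload) ? "match: flag for review" : "no match")
```

A real deployment hinges on the details this sketch leaves out, which is exactly where the privacy debate lies: what goes into the hash database, who controls it, and what happens after a match.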
Apple says it designed these features specifically to protect user privacy while finding illegal content. Critics, however, argue that the system reaches too far into users' privacy under the guise of safety.
Privacy advocates are also concerned that the move could set a precedent for Apple to scan for other kinds of material in the future. In countries that tightly control what can and cannot be shared, such a system could hand governments another tool for prying into private content.
The Electronic Frontier Foundation argued in a statement at the time that “it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
For now, Apple has put its plans to scan iPhones for CSAM on the backburner. The company will have to come up with features that curb child sexual abuse without undermining user privacy.