Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The “NeuralHash” tool, which detects known images of child sexual abuse, will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human; if child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, a plan that also alarmed privacy advocates.

The detection system will flag only images that are already in the center’s database of known child pornography, so parents who snap innocent photos of a child in the bath presumably need not worry. But researchers say the matching tool, which doesn’t “see” such images but only the mathematical “fingerprints” that represent them, could be put to more sinister purposes.
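Apple has not published NeuralHash’s internals here, but the general idea of comparing fingerprints rather than the images themselves can be illustrated with a toy perceptual hash. The Python sketch below is only a simplified illustration, not Apple’s algorithm; the hash size, match threshold and file names are assumptions made for the example.

```python
from PIL import Image  # Pillow

def average_hash(path, hash_size=8):
    """Toy perceptual fingerprint: shrink the photo to an 8x8 grayscale
    grid and record which pixels are brighter than the mean (64 bits)."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming_distance(a, b):
    """Count the bits that differ between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical database of fingerprints of known, flagged images;
# in a real system these would come from a clearinghouse such as NCMEC.
known_fingerprints = [average_hash(p) for p in ("known_a.jpg", "known_b.jpg")]

def matches_known(path, threshold=5):
    """Flag an image whose fingerprint is close to any known fingerprint.
    The matcher never 'sees' the picture, only these bit strings."""
    fingerprint = average_hash(path)
    return any(hamming_distance(fingerprint, k) <= threshold
               for k in known_fingerprints)
```

A perceptual fingerprint like this survives small edits such as resizing or recompression, which is why matching systems compare fingerprints rather than exact file bytes.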

Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. Researchers have been able to fool such systems “pretty easily,” he said.

Another potential abuse is government surveillance of protesters or dissidents. “What happens if the Chinese government tells you, ‘Here are a few files we want you to scan for’?” Green asked. “Does Apple say no? I’d like them to say no, but their technology won’t say no.”

For years, tech companies such as Microsoft, Google and Facebook have shared digital fingerprints of known child sexual abuse images. Apple has used those fingerprints to scan user files stored in its iCloud service, which is not as securely encrypted as its on-device data, for child pornography.

Apple has been under government pressure for years to allow increased surveillance of encrypted data. In devising the new measures, the company had to strike a balance between cracking down on the exploitation of children and protecting the privacy of its users.

But a dejected Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Hany Farid, a researcher at the University of California, Berkeley, who more than a decade ago invented PhotoDNA, the technology law enforcement uses to identify child pornography online, acknowledged the potential for abuse.

“Is it possible? Yes. But is it something I should be concerned about? No,” Farid said, arguing that other programs designed to protect devices from various threats haven’t seen “this kind of mission creep.”

Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed Apple for access to that information to help investigate crimes such as terrorism and child sexual exploitation.
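As a rough sketch of what “end-to-end” means in practice, the example below uses the Python cryptography package: the two endpoints exchange only public keys, derive the same symmetric key, and encrypt messages that no intermediary carrying the traffic can read. This is a minimal illustration of the concept under those assumptions, not Apple’s iMessage protocol.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each endpoint generates its own key pair; only public keys are exchanged.
sender_private = X25519PrivateKey.generate()
recipient_private = X25519PrivateKey.generate()

# Both ends derive the same shared secret (Diffie-Hellman key agreement),
# so no server in the middle ever holds the key.
shared_sender = sender_private.exchange(recipient_private.public_key())
shared_recipient = recipient_private.exchange(sender_private.public_key())
assert shared_sender == shared_recipient

# Turn the shared secret into a symmetric message key.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"e2e-demo").derive(shared_sender)

# The sender encrypts; only a holder of the derived key can decrypt.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at noon", None)
assert ChaCha20Poly1305(key).decrypt(nonce, ciphertext) == b"meet at noon"
```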

Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.

John Clark, president and CEO of the National Center for Missing and Exploited Children, called Apple’s expanded protection for children a game changer, saying that with so many people using Apple products, the new safety measures have the potential to save children’s lives.

Julia Cordua, the CEO of Thorn, said Apple’s technology balances privacy with digital safety for children. Thorn, founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of “end-to-end encryption.” Scanning messages for sexually explicit content on phones or computers effectively breaks the security, it said.

The group also questioned whether Apple’s technology can reliably distinguish between dangerous content and something more benign such as art or memes; such technologies are notoriously error-prone, CDT said in an emailed statement. Apple denies that the changes amount to a backdoor that degrades its encryption, describing them instead as carefully considered innovations that do not disturb user privacy but rather strongly protect it.

Apple also announced that its messaging app will use on-device machine learning to detect and blur sexually explicit images on children’s phones, and that it can warn parents via text message. Apple also said it would “intervene” when users try to search for topics related to child sexual abuse.
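Apple has not described its classifier here; purely as an illustration of the “detect on the device, then blur” flow, the sketch below blurs a photo whenever a classifier flags it. The looks_explicit function is a hypothetical stand-in for an on-device machine-learning model, and the blur radius is an arbitrary choice for the example.

```python
from PIL import Image, ImageFilter  # Pillow

def looks_explicit(image: Image.Image) -> bool:
    """Hypothetical stand-in for an on-device ML classifier.
    A real implementation would run a trained model locally;
    here it always returns False so the sketch stays self-contained."""
    return False

def prepare_for_display(path: str, blur_radius: int = 25) -> Image.Image:
    """Blur the photo before display if the classifier flags it.
    Nothing in this flow leaves the device."""
    image = Image.open(path)
    if looks_explicit(image):
        return image.filter(ImageFilter.GaussianBlur(radius=blur_radius))
    return image
```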

Parents will need to enroll their child’s device in order to receive the warnings about explicit images; they will not receive notifications for children older than 13.

Apple said that neither feature would compromise security or notify police.