Apple is on a mission to end child pornography with a new scanning tool.

Gizmodo reports that although the tech company’s goal of cracking down on child pornography seems like a great idea, the tool could very well be misused.

The tool uses a “neural matching function” called NeuralHash, which determines whether photos on a person’s device match the fingerprints of known child sexual abuse material (CSAM).
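Apple has not published NeuralHash’s internals in full, but the general idea of perceptual-hash matching can be illustrated with a far simpler algorithm. Below is a minimal sketch in Python using a toy “average hash” (it assumes the Pillow imaging library is installed; the fingerprint value, file paths, and distance threshold are hypothetical stand-ins, not Apple’s):

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to a small grayscale grid, then set
    one bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits where two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of known fingerprints -- placeholder value only.
KNOWN_FINGERPRINTS = {0x8F3A5C7E9B1D2F40}

def matches_known(path: str, threshold: int = 5) -> bool:
    """A photo counts as a match if its hash lands within a few bits
    of a known fingerprint, rather than requiring an exact equality."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_FINGERPRINTS)
```

NeuralHash reportedly derives its hash from a neural network rather than raw pixel averages, which makes the fingerprints far more robust to edits than this toy version.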

Security experts like Matthew Green say that the tool is a disaster waiting to happen.

“I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea,” said Green via Twitter. “These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.”

Green, an associate professor at the Johns Hopkins Information Security Institute, has studied and written about Apple’s privacy methods over the years. He also previously worked with the tech company to resolve a security flaw in iMessage.

One part of the issue Green sees with the technology Apple plans to roll out is how it connects to the addition of end-to-end encryption across Apple’s products and services.

End-to-end encryption is a plus for consumer privacy, but it has long been unpopular with various governments because it makes illegal content like child pornography harder to fight. Green sees the new scanning system as the “compromise” between those two pressures, and breaks down exactly what that compromise means.

Per his reports, the scan runs on a person’s device before a photo is encrypted and sent to the cloud. In practice, that means the technology only checks photos headed for Apple’s servers.
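To make that ordering concrete, here is a hypothetical sketch of the pipeline, reusing matches_known from the earlier snippet. The threshold counter mirrors Green’s description of reporting matches “if too many appear”; the encryption and upload steps are stubs, and none of the names or numbers here are Apple’s actual implementation:

```python
match_count = 0        # running tally of flagged photos on this device
REPORT_THRESHOLD = 10  # hypothetical; Apple has not published its number

def report_to_server(path: str) -> None:
    print(f"threshold exceeded; reporting match involving {path}")

def encrypt(data: bytes) -> bytes:
    return data[::-1]  # placeholder only -- NOT real encryption

def upload(ciphertext: bytes) -> None:
    print(f"uploaded {len(ciphertext)} encrypted bytes")

def scan_then_upload(path: str) -> None:
    """Hypothetical client-side pipeline: the perceptual-hash check runs
    on-device BEFORE the photo is encrypted and sent to cloud storage."""
    global match_count
    if matches_known(path):  # on-device scan (see earlier sketch)
        match_count += 1
        if match_count >= REPORT_THRESHOLD:
            report_to_server(path)
    ciphertext = encrypt(open(path, "rb").read())
    upload(ciphertext)
```

The key point is the ordering: by the time anything leaves the device it is already encrypted, so the only way to check content against the fingerprint database is to do it locally first.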

The question Green poses is why Apple would go to the trouble of building this system at all if it does not eventually plan to apply it to end-to-end encrypted content.

While the tech has good intentions, CSAM fingerprints come with trade-offs. They are intentionally imprecise: if a fingerprint matched only an exact file, a person could evade detection simply by cropping, resizing, or re-encoding an image. That fuzziness is what makes the matching robust to edits, but it is also what leaves room for false matches on innocent photos.
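Using the toy average hash from earlier, that robustness is easy to demonstrate: a light edit barely moves the hash, so the altered copy still lands within the match threshold. The file names and brightness factor below are made up for illustration:

```python
from PIL import Image, ImageEnhance

# Assumes average_hash and hamming from the earlier sketch are in scope,
# and that photo.jpg is any local image.
original = average_hash("photo.jpg")

# Lightly brighten the image -- a stand-in for a trivial edit.
ImageEnhance.Brightness(Image.open("photo.jpg")).enhance(1.1).save("edited.jpg")
edited = average_hash("edited.jpg")

# A small Hamming distance means the edited copy still matches.
print(hamming(original, edited))
```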

For Green, once Apple opens the door to this type of technology, it’ll be very hard to close it.

“Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” he continued. “That’s the message they’re sending to governments, competing services, China, you.”

Despite Green’s reservations, Apple has confirmed it still plans to roll out the technology.