The pilot is meant to help prevent "revenge porn" - the sharing of intimate images across the social network without the owner's permission.
"They're not storing the image".
The company piloted the technology in Australia, allowing people who have shared nude images with partners to upload the files to Facebook pre-emptively, blocking those partners from posting them. Facebook says it will not store the images or videos; it will keep only a digital fingerprint, known as a hash, to prevent the content from being uploaded again by someone else.
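The idea of blocking re-uploads by fingerprint can be illustrated with a minimal sketch. The example below uses an ordinary cryptographic hash (SHA-256) purely for illustration; Facebook's actual system is not public in detail, and a production system would more likely use perceptual hashing so that resized or slightly edited copies still match, which a cryptographic hash cannot do. All function names here are hypothetical.

```python
import hashlib

# Blocklist stores only hex digests, never the images themselves.
blocklist: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length digest of the image data."""
    return hashlib.sha256(image_bytes).hexdigest()

def register(image_bytes: bytes) -> None:
    """Record the hash of an image the owner wants blocked."""
    blocklist.add(fingerprint(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    """Check an incoming upload against the blocklist by hash alone."""
    return fingerprint(upload_bytes) in blocklist

register(b"example-intimate-image")
print(is_blocked(b"example-intimate-image"))  # True: an exact copy is caught
print(is_blocked(b"slightly-altered-copy"))   # False: naive hashing misses any edit
```

The second check shows why the hedge matters: exact-match hashing is trivially defeated by changing a single pixel, which is why real image-matching systems rely on perceptual fingerprints instead.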
In the pilot scheme, users complete an online form outlining their concerns on the eSafety Commissioner's website, which then notifies Facebook of the situation.
The program in Australia is run in conjunction with the government's Office of the eSafety Commissioner, which is dedicated to promoting digital safety, especially for children.
"We've been participating in the Global Working Group to identify new solutions to keep people safe, and we're proud to partner with Facebook on this important initiative as it aims to empower Australians to stop image-based abuse in its tracks", said Julie Inman Grant, eSafety Commissioner. A new artificial intelligence technology will remember the photo and zap it if anyone tries to post it to Facebook, Messenger or Instagram. Facebook, in their April announcement of the program, called the employees "specially trained representatives from our Community Operations team".
The pilot program is also available in the US, the United Kingdom and Canada, according to CNBC. "It removes control and power from the perpetrator, who is ostensibly trying to amplify the humiliation of the victim amongst friends, family and colleagues", Inman Grant said.
It turns out that before the image can be "hashed", a human reviewer at Facebook has to look at it to make sure it "fits the definition of revenge porn".