Humans rather than algorithms will view the naked images voluntarily sent to Facebook in a scheme being trialled in Australia to combat revenge porn.
The BBC understands that members of Facebook's community operations team will look at the images in order to make a "fingerprint" of them to prevent them being uploaded again.
Facebook will notify the person once the photo has been "hashed".
It will then be up to the sender to delete the image.
The trial, which could be extended to other countries, is seen as a way of allowing users greater control over their intimate photos - offering a pre-emptive protection against future misuse of the pictures by disgruntled ex-lovers.
Users wanting to take part in the scheme must first complete an online form on the Australian e-safety commissioner's website.
Then they need to send the nude photos to themselves via Messenger, and the e-safety commissioner's office will notify Facebook of their submission.
A Facebook community officer can then access those photos to "hash" them, and will notify the person via the secure email address provided through the commissioner's website.
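Facebook has not published the details of its fingerprinting algorithm; the idea behind such image hashes, though, is that near-identical copies of a picture (re-encoded, slightly brightened, resized) produce the same or a nearby fingerprint, unlike a cryptographic hash, which changes completely with any edit. A toy illustration of one such technique, "average hashing", operating on a hypothetical 8x8 grayscale pixel grid:

```python
# Illustrative sketch only - Facebook's actual fingerprinting method is not
# public. This toy "average hash" shows the general idea: each bit records
# whether a pixel is brighter than the image's mean, so the fingerprint
# tolerates small edits that would scramble a cryptographic hash.

def average_hash(pixels):
    """Fingerprint an 8x8 grayscale image given as a list of 64 ints (0-255)."""
    assert len(pixels) == 64
    mean = sum(pixels) / 64
    bits = ''.join('1' if p > mean else '0' for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return bin(h1 ^ h2).count('1')

# A slightly brightened copy of an image hashes to the same fingerprint,
# so a re-upload would still be caught:
original = [i * 4 for i in range(64)]
brighter = [min(255, p + 10) for p in original]
print(hamming_distance(average_hash(original), average_hash(brighter)))
```

Matching a new upload against stored fingerprints then reduces to checking whether its hash falls within a small Hamming distance of any known one, rather than storing or comparing the images themselves.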
Revenge porn is a growing issue in Australia where studies suggest that one in five women aged 18-45 may have been victims.
Experts in the field have welcomed the initiative but have warned that it will only tackle the problem on Facebook and its associated sites, WhatsApp and Instagram.
Prof Clare McGlynn, from Durham Law School, said that the UK should establish a similar organisation to Australia's e-safety commission.
"We must redouble efforts to challenge the root causes - challenging those who share images without consent," she said.
"We must condemn such forms of non-consensual sexual activity in the strongest terms, and work with young people on this."