Facebook has stressed that its experiment to fight revenge porn, in which users proactively provide their intimate images to the social network so that a "specially trained" professional can review and hash them, is voluntary.
“To be clear, people can already report if their intimate images have been shared on our platform without their consent and we will remove and hash them to help prevent further sharing on our platform,” said Antigone Davis, Facebook’s Global Head of Safety, in a blog post late on Thursday.
“With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place,” Davis added.
This programme is completely voluntary.
“It’s a protective measure that can help prevent a much worse scenario where an image is shared more widely. We look forward to getting feedback and learning,” Davis said.
Facebook launched the experiment in Australia this week to help prevent non-consensual intimate images from being posted and shared on its platforms.
Facebook launched the experiment in partnership with the Australian eSafety Commissioner’s Office and an international working group of survivors, victim advocates and other experts.
With this, Australians who fear their intimate image may be shared without their consent can work with the eSafety Commissioner to provide that image in a safe and secure way to Facebook so that it can help prevent it from being shared anywhere on Facebook, Messenger, and Instagram.
How does the mechanism work?
Australians can complete an online form on the eSafety Commissioner’s official website. To establish which image is of concern, people will be asked to send the image to themselves on Messenger.
The eSafety Commissioner's office then notifies Facebook of the submission via its form; however, the office does not have access to the actual image.
Once Facebook receives this notification, a specially trained representative from its Community Operations team reviews and hashes the image, which creates a human-unreadable, numerical fingerprint of it.
Facebook stores the photo hash – not the photo – to prevent someone from uploading the photo in the future.
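Facebook has not published the exact algorithm it uses, but the idea of storing a numerical fingerprint instead of the photo can be sketched as follows. This is a simplified illustration using a cryptographic hash (SHA-256); a cryptographic hash only matches byte-identical copies, whereas a production system would use a perceptual hash that also survives resizing and re-encoding.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    """Return a human-unreadable numerical fingerprint of an image.

    Illustrative sketch only: SHA-256 stands in for whatever hashing
    scheme Facebook actually uses. Only this digest would be retained;
    the image bytes themselves are discarded.
    """
    return hashlib.sha256(image_bytes).hexdigest()

fingerprint = hash_image(b"example image bytes")
print(fingerprint)  # a 64-character hex digest, not the photo
```

The key property is one-way-ness: the stored fingerprint cannot be reversed into the original image, which is why only the hash needs to be kept.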
“If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches we do not allow it to be posted or shared,” Davis said.
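The matching step Davis describes amounts to a lookup against the stored fingerprints at upload time. A minimal sketch, assuming the same illustrative SHA-256 fingerprint as above (the real database and matching logic are not public):

```python
import hashlib

# Hypothetical database of fingerprints from reported/submitted images.
blocked_hashes = {hashlib.sha256(b"reported image bytes").hexdigest()}

def allow_upload(photo_bytes: bytes) -> bool:
    """Check an uploaded photo against the hash database.

    Every upload is hashed and compared; a match means the photo
    is not allowed to be posted or shared.
    """
    return hashlib.sha256(photo_bytes).hexdigest() not in blocked_hashes

print(allow_upload(b"reported image bytes"))  # False: upload blocked
print(allow_upload(b"some other photo"))      # True: posted normally
```

Because only hashes are compared, the check can run on every photo upload without Facebook retaining or inspecting the reported image itself.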
“Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner’s office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers,” Davis noted.