Facebook has added tools to report revenge porn and to prevent images from being shared again once they have been banned. Revenge porn is any intimate image shared without the subject's consent.

In a public post on the platform, Facebook co-founder Mark Zuckerberg wrote: “Today we’re rolling out new tools to prevent “revenge porn” from being shared on Facebook, Messenger and Instagram… It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms.”

How the new tools work

In a post on the company blog, Antigone Davis, Facebook’s Head of Global Safety, explained how the new tools would work:

  • If you see an intimate image on Facebook that looks like it was shared without permission, you can report it by using the “Report” link that appears when you tap on the downward arrow or “…” next to a post.
  • Specially trained representatives from our Community Operations team review the image and remove it if it violates our Community Standards. In most cases, we will also disable the account for sharing intimate images without permission. We offer an appeals process if someone believes an image was taken down in error.
  • We then use photo-matching technologies to help thwart further attempts to share the image on Facebook, Messenger and Instagram. If someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it.
  • We also partner with safety organizations to offer resources and support to the victims of this behavior.
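Facebook has not published how its photo-matching works, but the idea behind such systems can be sketched with a simple perceptual hash. The snippet below is an illustrative example, not Facebook’s actual implementation: an average hash (aHash) reduces an image to a 64-bit fingerprint, and a re-upload of a banned image, even after light edits like brightening, lands within a small Hamming distance of the stored fingerprint, while unrelated images land far away. The threshold value and the toy images are assumptions for demonstration.

```python
def average_hash(pixels, hash_size=8):
    """Compute a 64-bit average hash from a 2D list of grayscale values (0-255)."""
    h, w = len(pixels), len(pixels[0])
    # Downscale by block-averaging into a hash_size x hash_size grid.
    grid = []
    for gy in range(hash_size):
        for gx in range(hash_size):
            y0, y1 = gy * h // hash_size, (gy + 1) * h // hash_size
            x0, x1 = gx * w // hash_size, (gx + 1) * w // hash_size
            block = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            grid.append(sum(block) / len(block))
    mean = sum(grid) / len(grid)
    # Each cell contributes one bit: 1 if brighter than the overall mean.
    bits = 0
    for value in grid:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Toy 32x32 grayscale "images": a banned image, a slightly brightened
# re-upload of it, and an unrelated image.
banned = [[(x * y) % 256 for x in range(32)] for y in range(32)]
reupload = [[min(255, p + 3) for p in row] for row in banned]
unrelated = [[(x + y) % 256 for x in range(32)] for y in range(32)]

THRESHOLD = 10  # illustrative cut-off, not a production value
print(hamming(average_hash(banned), average_hash(reupload)))   # small distance
print(hamming(average_hash(banned), average_hash(unrelated)))  # large distance
```

Production systems use far more robust fingerprints (and keep the hash database, not the images, in the matching path), but the matching step is the same in spirit: hash the new upload and compare it against the fingerprints of previously banned images.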

Accounts sharing revenge porn will be deactivated, the post further said.

Privacy concerns

A specially trained group of employees will provide a human review of each reported image, Davis told Reuters. The proposed system requires Facebook to retain banned pictures in a database. The images are blurred and accessible to only a small number of employees, Reuters further reported.

The proposed system is similar to the existing image detection and security features on Facebook, which block anti-Semitic and other offensive content.

[Image: A prompt that prevents users from sharing banned images on Facebook]

What prompted Facebook to do it

The move comes after a controversial incident involving a secret Facebook group of US and British marines, in which nude images of servicewomen were shared. The group, which had nearly 30,000 members, was taken down in September 2016 but was back on the platform by March 2017. In some cases, the subjects in the images were identified by their names, ranks, duty stations, and branches.