Background

Facebook Introduces Tools To Stop 'Revenge Porn' From Spreading On Its Platform And Properties


‘Revenge porn’ has become a common term on the web. It's widely used to describe content that exposes sexually explicit images of somebody without their consent. With governments and companies trying to combat the issue, Facebook is also doing its part.

To do this, the social giant is introducing new tools as part of its ongoing effort to build a safer community.

On April 5th, 2017, Facebook launched new tools to help people when intimate 'revenge porn' images are shared on the platform without their permission. When this type of content is shared and reported to Facebook, it will take steps to prevent it from being shared on any of Facebook's properties, including the main social network, Messenger and Instagram.

Once Facebook receives a report, the company will have highly trained specialists review the reported content. These specialists, said to be more accurate than currently available algorithms, will determine whether the image should be considered revenge porn.

The team is also aided by software that enforces bans on those images after they've been flagged. Facebook is using 'photo-matching technologies' that will scan through Facebook, Messenger and Instagram and remove any images that have already been deemed to infringe its community standards.
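Facebook hasn't published the details of its photo-matching system, but the general idea can be illustrated with a perceptual hash: instead of comparing files byte-for-byte, the image is reduced to a compact fingerprint that survives re-encoding and small edits, and new uploads are compared against the fingerprints of banned images. The sketch below uses a simple "average hash" on an 8×8 grayscale image; the function names and the similarity threshold are illustrative assumptions, not Facebook's actual implementation.

```python
# Illustrative sketch of photo-matching via perceptual hashing.
# This "average hash" is a common textbook technique, assumed here
# for illustration; it is NOT Facebook's actual algorithm.

def average_hash(pixels):
    """Hash an 8x8 grayscale image given as 8 rows of 8 ints (0-255).

    Each bit is set if the pixel is brighter than the image's mean,
    so re-encoding or mild brightness changes rarely flip many bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def is_banned(candidate_hash, banned_hashes, threshold=5):
    """Flag an upload whose hash is close to any banned image's hash."""
    return any(hamming_distance(candidate_hash, h) <= threshold
               for h in banned_hashes)
```

With this scheme, a slightly altered copy of a flagged image still hashes to nearly the same bits and is caught, while an unrelated image falls well outside the distance threshold. Production systems such as Microsoft's PhotoDNA work on the same match-against-a-hash-list principle, with far more robust fingerprints.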

The company is also extending its reach by partnering with safety organizations that will offer support to victims of revenge porn.


Revenge porn has been an issue, especially as access to the internet and mobile technologies has become common. This has prompted a number of countries to legislate against such activities.

While the laws do play an important part, the priority is to prevent the distribution of such material from ever happening. This is why technology and internet companies have been increasingly motivated to develop new tools to stop this practice.

By introducing a new Report button within the downward arrow next to a post, users can help stop such explicit imagery from spreading like wildfire.

On the web, Facebook isn't the only company concerned about this. Google also has rules and processes in place to remove revenge porn content from its search results. Microsoft and others have similar policies.

But the issue here is the criteria for such imagery. Facebook has been criticized in the past for its human editors flagging the wrong content. For example, it notably removed the Pulitzer Prize-winning "napalm girl" picture and banned the writer who shared it.

"Specially trained representatives from our Community Operations team review the image and remove it if it violates our Community Standards," explained Antigone Davis, Facebook's head of global safety, in a blog post. "In most cases, we will also disable the account for sharing intimate images without permission. We offer an appeals process if someone believes an image was taken down in error."