Facebook launches AI tool to stop revenge porn

Facebook has been criticised in the past for not doing enough to tackle revenge porn.

Friday, March 15, 2019

Facebook will deploy nudity-detecting artificial intelligence to help victims of revenge porn. 

The social network is using machine learning to identify near-nude images or videos shared on Facebook and Instagram.

The offending account will be disabled if a reviewer decides the content violates Facebook’s standards.
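
The broad shape of that workflow is an automated classifier flagging content for a person to judge. The Python sketch below is a minimal illustration of that flag-and-review pattern; the names, the threshold value and the classifier score are assumptions made for the example, not details of Facebook's actual systems.

    from dataclasses import dataclass

    # Assumed confidence threshold above which content goes to human review;
    # Facebook's real thresholds and models are not public.
    FLAG_THRESHOLD = 0.8

    @dataclass
    class Upload:
        account_id: str
        content_id: str

    def moderate(upload: Upload, model_score: float, reviewer_confirms) -> str:
        """Flag-and-review: the model flags, but a person makes the final call."""
        if model_score < FLAG_THRESHOLD:
            return "allowed"  # classifier did not flag the upload
        if reviewer_confirms(upload):  # human reviewer applies the standards
            return "account_disabled"
        return "allowed_after_review"

    # Example: the classifier is 93% confident and the reviewer agrees.
    print(moderate(Upload("user123", "img456"), 0.93, lambda u: True))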

Diana Fawcett, chief officer at the independent charity Victim Support, said: "When images and videos are shared online without the person's consent, the impact can be devastating and can leave people feeling distressed and humiliated.

"Once images are put online and made public, victims have very little control over where they end up and who sees them and this is likely to leave people feeling extremely powerless.

"It's so important that revenge porn is taken seriously and that more measures are taken to protect victims."

Facebook and other social networks have faced criticism in the past for being too slow to remove illegal material from their platforms.

The move builds on an earlier pilot scheme in which users can create a unique digital fingerprint of an intimate image.

The fingerprint enables Facebook to recognise the image, without keeping a copy of it, and to block it from being posted should anyone try to upload it to Facebook, Messenger or Instagram.
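
Facebook has not published the details of its matching technology, but the idea can be illustrated with a common perceptual-hashing technique, average hashing: the image is reduced to a tiny greyscale grid and each pixel becomes one bit of a 64-bit fingerprint, so the original picture never needs to be stored. The Python sketch below (using the Pillow library; the file paths, function names and match threshold are assumptions for the example) compares fingerprints by Hamming distance, so near-identical copies of a reported image still match.

    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        """64-bit fingerprint: each bit records whether a pixel is above the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of bits on which two fingerprints differ."""
        return bin(a ^ b).count("1")

    # Fingerprints of previously reported images; the images themselves are not kept.
    blocklist = {average_hash("reported_image.jpg")}  # placeholder path

    def should_block(upload_path: str, max_distance: int = 5) -> bool:
        """Block an upload whose fingerprint nearly matches a reported one."""
        candidate = average_hash(upload_path)
        return any(hamming(candidate, known) <= max_distance for known in blocklist)

A perceptual hash is used rather than a cryptographic one precisely because a resized or slightly re-encoded copy of a reported image should still land within a few bits of the original fingerprint.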

Revenge porn is illegal in the UK, where those found guilty can face a fine or even imprisonment.
