Facebook Launches Tool To Remove 'Revenge Porn'

Facebook is rolling out technology to make it easier to find and remove intimate pictures and videos posted without the subject's consent, often called "revenge porn."

Currently, victims or other Facebook users have to report inappropriate pictures before content moderators review them.

The company's new machine learning tool is designed to find and flag the pictures automatically, then send them to humans to review.

Social media sites like Facebook have struggled to monitor and contain inappropriate posts -- from violent threats to conspiracy theories to nude photos.

Facebook has faced criticism for allowing offensive posts to stay up too long and for failing to remove posts that violate its standards.

The company said it has been working on expanding moderation efforts, and hopes its new technology will help catch inappropriate posts.

The technology -- which will be used across Facebook and Instagram -- was trained using pictures Facebook has previously confirmed were revenge porn.

It is trained to recognise a 'nearly nude' photo -- a lingerie shot, perhaps -- coupled with derogatory or shaming text that would suggest someone uploaded the photo to embarrass or seek revenge on someone else.
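For illustration only: the article describes pairing an image signal (a nearly nude photo) with a text signal (derogatory or shaming language) before routing a post to human reviewers. The short Python sketch below assumes two hypothetical pre-trained scoring models and a simple threshold rule; it is not Facebook's actual system, just a minimal picture of how two such signals might be combined.

    # Illustrative sketch only -- not Facebook's implementation.
    # Assumes hypothetical upstream models that return a score between 0 and 1:
    #   image_score: how "nearly nude" the picture appears
    #   text_score:  how derogatory or shaming the accompanying text appears

    def flag_for_human_review(image_score: float, text_score: float,
                              image_threshold: float = 0.8,
                              text_threshold: float = 0.7) -> bool:
        """Flag a post for moderators only when both signals are high,
        mirroring the article's description of a nearly nude photo
        combined with shaming text."""
        return image_score >= image_threshold and text_score >= text_threshold

    # Example: a lingerie-style photo (0.9) paired with derogatory text (0.85)
    # would be queued for a human moderator; the same photo with neutral text would not.
    print(flag_for_human_review(0.9, 0.85))  # True  -> send to human review
    print(flag_for_human_review(0.9, 0.10))  # False -> no automatic flag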

"This is about using technology to get ahead of the problem," Facebook's Chief Operating Officer Sheryl Sandberg told The Associated Press.

The company does not expect the new technology to catch every instance of revenge porn, and said it will still rely on users reporting photos and videos.