Meta launches tool to stop revenge porn from spreading on Facebook and Instagram


Meta launches tool to stop revenge porn from spreading on Facebook and Instagram – but users concerned about being victimized must submit their images and videos to a hashing database to make a case

  • The tool is a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images’
  • People concerned their intimate images or videos have been posted or might be posted to Facebook or Instagram can create a case through the website
  • This is done by uploading the images or videos to the website
  • Meta says the content is turned into a digital fingerprint that is used to detect the explicit content, and claims no human eyes ever see it

Meta rolled out a new tool on Thursday that stops revenge porn from spreading on Facebook and Instagram, but it requires people to submit their sexually explicit photos and videos to a hashing database.

When someone is concerned their intimate images or videos have been posted or might be posted to either of the social media platforms, they can create a case through a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images.’

Each photo or video submitted receives a digital fingerprint, or unique hash value, which is used to detect and block copies that are shared, or that someone attempts to post, without the person’s permission.
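
The article does not spell out how such a fingerprint is computed, but the general idea behind perceptual hashing can be sketched in a few lines. The snippet below is only an illustration using the off-the-shelf imagehash Python library; the file names and matching threshold are hypothetical, and this is not the algorithm StopNCII.org or Meta actually run.

```python
# Minimal sketch of the "digital fingerprint" idea described above.
# Assumption: the `imagehash` perceptual-hash library stands in for whatever
# hashing algorithm StopNCII.org actually uses, which the article does not name.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Turn an image into a compact perceptual hash (its 'fingerprint')."""
    return imagehash.phash(Image.open(path))

original = fingerprint("my_private_photo.jpg")   # hypothetical file names
reupload = fingerprint("suspected_repost.jpg")

# Perceptual hashes of near-identical images differ in only a few bits,
# so a small Hamming distance suggests the same underlying picture.
if original - reupload <= 8:   # threshold chosen purely for illustration
    print("Likely a copy of the submitted image")
```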

The website was created with 50 global partners, and sharing intimate images and videos of yourself with a third-party website may not sit well with many users, but Meta says it ‘will not have access to or store copies of the original images.’

A Meta spokesperson told DailyMail.com in an email that ‘only the person submitting a case to StopNCII.org has access to their images/videos’ and that ‘all of the computations necessary to compute an image’s hash happens in the browser, which means images never leave the person’s device.’

‘Only cases are submitted to StopNCII.org and hashes of the person’s images or videos are shared with participating tech companies like Meta,’ the spokesperson added. 
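
The spokesperson’s description implies a simple flow: the fingerprint is computed locally (in the real tool, by code running in the browser), and only that fingerprint is submitted with the case. The sketch below illustrates that flow under stated assumptions; the endpoint URL, payload fields and file name are hypothetical and not StopNCII.org’s real API.

```python
# Hedged sketch of the client-side flow described above: the hash is computed
# locally, and only the hash, never the image itself, is sent when a case is created.
import json
import urllib.request

from PIL import Image
import imagehash

def create_case(image_path: str) -> None:
    # The fingerprint is computed on the person's own device.
    local_hash = str(imagehash.phash(Image.open(image_path)))

    # Only the fingerprint is transmitted; the image file is never uploaded.
    payload = json.dumps({"hash": local_hash}).encode("utf-8")
    request = urllib.request.Request(
        "https://example.org/cases",   # placeholder case-submission endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

create_case("my_private_photo.jpg")    # hypothetical file name
```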

‘Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms,’ Antigone Davis, global head of safety for Meta, shared in a blog post.

‘This feature prevents further circulation of that NCII content and keeps those images securely in the possession of the owner.’ 

StopNCII.org builds on a pilot program launched in Australia in 2017, in which people were asked to submit photos of themselves so hashes could be created and used to detect similar images on Facebook and Instagram.

That hashing system forms the foundation of the new tool to stop revenge porn.

Meta was originally going to set up Facebook to let people submit their intimate images or videos to stop them from spreading, but the sensitive media would have been reviewed by human moderators before being converted into unique digital fingerprints, NBC News reports.

Knowing this, the social media firm opted to bring in a third party, StopNCII, which specializes in image-based abuse, online safety and women’s rights.

StopNCII.org is for adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent.

According to a 2019 report by NBC, Meta identifies nearly 500,000 cases of revenge porn every month.

To deal with the influx of revenge porn, Facebook employs a team of 25 people who, working in tandem with an algorithm developed to identify nude images, help vet reports and take pictures down.

But with StopNCII.org, human moderators are replaced by hash matching: potential victims generate fingerprints of the explicit content on their own devices, and only those hashes are shared with Meta to detect and identify the images.

WHAT CONSTITUTES REVENGE PORN ON FACEBOOK?

Much of what constitutes revenge porn is covered under Facebook’s rules on nudity.

In March 2015, however, the social network brought in specific community guidelines to address the growing problem of revenge porn.

The section, entitled ‘Sexual Violence and Exploitation’, deals specifically with the subject.

The guidelines say: ‘We remove content that threatens or promotes sexual violence or exploitation. 

‘This includes the sexual exploitation of minors and sexual assault.

‘To protect victims and survivors, we also remove photographs or videos depicting incidents of sexual violence and images shared in revenge or without permission from the people in the images.

‘Our definition of sexual exploitation includes solicitation of sexual material, any sexual content involving minors, threats to share intimate images and offers of sexual services. 

‘Where appropriate, we refer this content to law enforcement.’
