Facebook says it understands your concerns about having nude or intimate photos leaked or, worse, used as revenge porn, and is offering up a solution — although you’d be forgiven for being sceptical about using it.
The social media service is asking its users to put the ultimate amount of faith in its ability to protect people’s privacy (LOL) and send in copies of their sensitive imagery.
To refresh your memory, this is the same company that, just this year, settled a privacy lawsuit over using photo face-tagging without permission, and that has historically mined our data and shared it with the highest bidders.
In November, the platform announced it would no longer use facial recognition on the site, with the ‘face prints’ of more than 1 billion people removed from the company’s databases.
Facebook, whose parent company recently changed its name to Meta, previously used facial recognition to automatically suggest users tag themselves in photos they might be in.
The new request for copies of people’s sexy photos is part of a partnership with the UK-based nonprofit Revenge Porn Helpline, with the goal of building a tool to prevent intimate images from being uploaded without consent to Facebook, Instagram and other participating platforms.
You’d surely not be alone in wondering how on earth this process will work and how it will prevent your slimy ex from plastering intimate pics of you across the internet the next time they’re feeling scorned and vindictive.
Anyone who wishes to keep their private photos private can upload them to a central, global website called StopNCII.org, which stands for “Stop Non-Consensual Intimate Images.” The site will then ask for confirmation that they are in the image. People can select material, including manipulated images, that depicts them nude or nearly nude, and the photos or videos will then be converted into unique digital fingerprints, which will be given to participating companies, starting with Facebook and Instagram.
StopNCII.org will not have access to or store copies of the original images. Instead, the images will be converted to hashes in users’ browsers, and StopNCII.org will get only the hashed copies.
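For a rough sense of what turning a photo into a “digital fingerprint” can look like, here is a minimal Python sketch using the open-source `imagehash` library’s perceptual hash. This is purely illustrative: StopNCII.org does its hashing inside the browser and the article doesn’t specify its exact algorithm, so the library choice and the file name are assumptions.

```python
# Illustrative sketch only: StopNCII.org hashes images in the user's
# browser, and its exact algorithm isn't specified in this article.
# Here we use the open-source `imagehash` library's perceptual hash
# (pHash) to show the general idea of a compact fingerprint.
#   pip install pillow imagehash
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Reduce an image to a short perceptual hash.

    The hash is derived from the image's overall visual structure, so
    the original pixels cannot be reconstructed from it. Only this
    value (never the photo itself) would leave the user's device.
    """
    with Image.open(path) as img:
        return imagehash.phash(img)

if __name__ == "__main__":
    h = fingerprint("my_private_photo.jpg")  # hypothetical file name
    print(h)  # prints a 64-bit hash as 16 hex characters
```

The key design point the article describes is that this one-way conversion happens locally, so the service only ever sees the hash, not the image.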
Speaking to NBC News, Revenge Porn Helpline manager Sophie Mortimer said, “It’s a massive step forward.
“The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it.”
It’s not the first time that Facebook has tried its hand at mitigating the pervasive problem of revenge porn. Back in 2017, the platform launched a pilot program aimed at the issue, but it relied on human moderators to review the submissions, which, understandably, people were not thrilled about.
The new tool will see participating companies use hash-matching technology to check whether images matching the submitted hashes have been uploaded to their platforms. If a match is detected, a content moderator will review the image and determine whether it does indeed violate company policy; if so, all traces of it will be deleted and attempts to re-upload it blocked.
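As a hedged sketch of what that matching step might look like on a platform’s side: compare the hash of each new upload against the banked fingerprints and flag anything within a small Hamming distance for human review. The threshold, the in-memory list of hashes, and the function name are illustrative assumptions, not Meta’s actual pipeline.

```python
# Hypothetical sketch of server-side hash matching -- not Meta's actual
# pipeline. Perceptual hashes are compared by Hamming distance, so
# near-duplicates (recompressed or lightly edited copies) can still
# match even though their bytes differ.
from PIL import Image
import imagehash

# Fingerprints received from StopNCII.org (illustrative value).
banked_hashes = [imagehash.hex_to_hash("c5a9b2e4d1f08837")]

MAX_DISTANCE = 6  # assumed tolerance; real systems tune this carefully

def needs_review(upload_path: str) -> bool:
    """Return True if an uploaded image matches a banked fingerprint.

    A match doesn't auto-delete anything: per the article, a human
    moderator reviews the image before it is removed and blocked.
    """
    with Image.open(upload_path) as img:
        upload_hash = imagehash.phash(img)
    # Subtracting two ImageHash objects returns their Hamming distance.
    return any(upload_hash - banked <= MAX_DISTANCE
               for banked in banked_hashes)
```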
While some may scoff at the idea of Facebook/Meta being tasked with keeping sensitive information safe, Mortimer says it is crucial the social media platform participates in the program.
“Having one system open to all parts of industry is critical,” she said. “We know this material doesn’t just get shared on one platform and it needs a much more joined-up approach.”
The statistics around non-consensual image sharing are alarming, to say the least. An estimated 1 in 12 US adults report having been victims of image-based abuse, with young people aged 18–29 the most likely to report having had intimate photos of them publicly posted without consent. Meanwhile, individuals from low-income households, people of colour and members of the LGBTQIA+ community are all more likely to be victims of image-based abuse.