This week, dating app Bumble announced that as part of its commitment to combating ‘cyberflashing’, it would be open-sourcing its AI tool that detects unsolicited nudes.
Called Private Detector, the tool blurs nudes sent through the Bumble app, giving the recipient the choice of whether or not to open the image. When Bumble first unveiled it in 2019, the company said it was 98% accurate.
“Even though the number of users sending lewd images on our apps is luckily a negligible minority — just 0.1% — our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performance on the task,” the company shared in a press release.
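Bumble’s announcement describes the tool only at a high level — a classifier trained on lewd and non-lewd images that blurs anything it flags — but the general shape of such a feature is easy to picture. The sketch below is a hypothetical, generic “classify then blur” flow in Python, not Bumble’s released code: the model path, input size, preprocessing and decision threshold are all assumptions made for illustration.

```python
# Hypothetical sketch of a "classify then blur" flow, not Bumble's released code.
# Assumes a Keras binary classifier with a single sigmoid output (the probability
# that an image is lewd); the model path, input size, preprocessing and threshold
# below are made up for illustration.
import numpy as np
import tensorflow as tf
from PIL import Image, ImageFilter

MODEL_PATH = "nudity_classifier"   # hypothetical exported model
INPUT_SIZE = (480, 480)            # hypothetical input resolution
LEWD_THRESHOLD = 0.5               # hypothetical decision threshold

model = tf.keras.models.load_model(MODEL_PATH)

def lewd_probability(path: str) -> float:
    """Preprocess one image and return the classifier's predicted probability."""
    image = Image.open(path).convert("RGB").resize(INPUT_SIZE)
    batch = np.asarray(image, dtype=np.float32)[np.newaxis, ...] / 255.0
    return float(model.predict(batch, verbose=0)[0][0])

def prepare_for_recipient(path: str) -> Image.Image:
    """Blur flagged images so the recipient can choose whether to reveal them."""
    image = Image.open(path).convert("RGB")
    if lewd_probability(path) >= LEWD_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image
```

In a messaging pipeline along these lines, the blurred version is what the recipient would see first, with the original revealed only on an explicit tap — which matches the behaviour Bumble describes.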
Bumble isn’t the only social app to make moves recently to protect its users’ safety. Also this month, Instagram announced it would be expanding its automated blocking feature, making it harder for new accounts created by someone a user has blocked to contact that user.
“We want to make it as hard as possible for someone you’ve blocked to contact you again,” Instagram wrote in a press release. “Based on initial test results from this new change, we expect our community will need to block four million fewer accounts every week, since these accounts will now be blocked automatically.”
In September last year, the Meta-owned platform introduced a feature called Hidden Words, which helps creators filter out message requests containing harmful content. Now, the company says it is updating the feature so that Hidden Words is switched on automatically for anyone using a Creator account.
Meanwhile, subscription-based content creator app Sunroom launched in February this year with a feature called SunBlock, an industry-first anti-screenshot technology.
“We interviewed hundreds of creators and heard time and time again that their content was being stolen and redistributed elsewhere,” reads a Sunroom Instagram post. “We wanted to ensure creators felt safe on Sunroom, free to express themselves and share openly.”
The timing of Instagram and Bumble’s changes this month may have something to do with the fact that executives from Meta, TikTok, YouTube and Twitter had to testify before the US Senate Homeland Security Committee in September about their safety and privacy practices.
“Committee Chair Sen. Gary Peters pressed each company to disclose the number of employees they have working full-time on trust and safety and each company in turn refused to answer – even though they received the question in advance of the hearing,” reported TechCrunch at the time.
“I’ll be honest, I’m frustrated that… all of you [who] have a prominent seat at the table when these business decisions are made were not more prepared to speak to specifics about your product development process, even when you were specifically asked if you would bring specific numbers to us today,” Peters said at the end of the hearing, reported TechCrunch. “Your companies continue to avoid sharing some really very important information with us.”
These recent app updates may well indicate that the platforms are feeling the mounting pressure and are finally being forced to do something about it. It’s worth noting, though, that the updates are about protecting users’ safety, not so much their privacy.