Over the last couple of weeks I’ve been wondering whether deleting my Facebook account was really such a good move – I’ve found out second-hand about a couple of happenings among my friends and family which I’d have picked up on much sooner if I’d still been on Facebook. This had me reconsidering whether the benefit of keeping in touch with people might be worth the cost to my privacy.
Then I read an article today about a new Facebook pilot programme to protect people from revenge porn by teaching their software to recognise images of concern to users. The snag is that you first have to provide Facebook with a copy of the image you want to block – effectively sending all your nude pics to Facebook!
I understand how this could be a useful tool, but given the company’s already shady reputation it has very creepy overtones. If this were a government organisation or a reputable non-profit recognised for its work on protecting people’s privacy, I’d have a bit more confidence in the concept. Given that it is Facebook… nah, it just seems wrong.
Fortunately I have no concerns about potentially incriminating photos of me surfacing on social media, but the sheer creepiness of this pilot scheme has me recoiling in horror from the Zuckerberg monster. I think I’ll stay away for some time yet.
People shouldn’t be able to share intimate images to hurt others
By Antigone Davis, Global Head of Safety
It’s demeaning and devastating when someone’s intimate images are shared without their permission, and we want to do everything we can to help victims of this abuse. We’re now partnering with safety organizations on a way for people to securely submit photos they fear will be shared without their consent, so we can block them from being uploaded to Facebook, Instagram and Messenger. This pilot program, starting in Australia, Canada, the UK and US, expands on existing tools for people to report this content to us if it’s already been shared.
My team and I have traveled to nine countries across four continents, listening to stories about the abuse and cruelty that women face online. From Kenya to Sweden, women shared their painful, eye-opening experiences about having their most intimate moments shared without permission. From anxiety and depression to the loss of a personal relationship or a job, this violation of privacy can be devastating. And while these images, also referred to as “revenge porn” or “non-consensual pornography,” harm people of all genders, ages and sexual orientations, women are nearly twice as likely as men to be targeted.
Today, people can already report if their intimate images have been shared without their consent, and we will remove each image and create a unique fingerprint known as a hash to prevent further sharing. But we can do more to help people in crisis prevent images from being shared on our services in the first place. This week, Facebook is testing a proactive reporting tool in partnership with an international working group of safety organizations, survivors, and victim advocates, including the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and The National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline, and YWCA Canada.
People who worry that someone might want to harm them by sharing an intimate image can proactively upload it so we can block anyone else from sharing it on Facebook, Instagram, or Messenger:
– Anyone who fears an intimate image of them may be shared publicly can contact one of our partners to submit a form
– After submitting the form, the victim receives an email containing a secure, one-time upload link
– The victim can use the link to upload images they fear will be shared
– One of a handful of specifically trained members of our Community Operations Safety Team will review the report and create a unique fingerprint, or hash, that allows us to identify future uploads of the images without keeping copies of them on our servers
– Once we create these hashes, we notify the victim via email and delete the images from our servers within seven days
– We store the hashes so any time someone tries to upload an image with the same fingerprint, we can block it from appearing on Facebook, Instagram or Messenger
This is one step to help people who fear an intimate image will be shared without their consent. We look forward to learning from this pilot and further improving our tools for people in devastating situations like these. (Facebook)
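For the curious, the hash-and-block workflow the announcement describes can be sketched in a few lines. This is purely my illustration, not Facebook’s code: their system reportedly uses perceptual hashing (robust to resizing and re-encoding), whereas the plain SHA-256 used below only matches byte-identical copies, and the `blocked_hashes` store and function names are hypothetical.

```python
import hashlib

blocked_hashes = set()  # hypothetical store: fingerprints only, never images

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-size fingerprint of the image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_blocked_image(image_bytes: bytes) -> None:
    """Victim-submitted image: keep only its hash, discard the bytes."""
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a registered image."""
    return fingerprint(image_bytes) not in blocked_hashes

# Usage: once an image is registered, identical copies are blocked,
# while unrelated images still pass.
secret = b"raw bytes of the submitted image"
register_blocked_image(secret)
print(allow_upload(secret))              # False - blocked
print(allow_upload(b"some other image"))  # True - allowed
```

The key property is the one Facebook emphasises: only the fingerprint is retained, and the hash cannot be reversed to recover the image – though, as my misgivings above suggest, you still have to trust the operator to actually delete the original.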