Last year, Facebook, facing increased scrutiny over its decision-making, launched a program intended to combat a growing problem on the platform: revenge porn, also known as nonconsensual pornography. There’s a catch, however - to keep others from using nude photos against you, you have to submit your own.

Facebook’s program launched experimentally in Australia in November 2017, and expanded on May 22, 2018, to the US, the UK, and Canada. It requires those who wish to avoid being victims of revenge porn to act preemptively, before they become targets - by submitting their own nude photographs to Facebook. The idea is that Facebook can then digitally fingerprint the submitted image and block its potential future spread.

But the potential problems with this procedure are numerous. Though members of the public objected to the basic idea of the program back in November, the announcement of its spread has renewed public scrutiny about its harmful potential - and for good reason.

The technology is called perceptual image hashing, and it’s been around for years. It’s the algorithmic process that lets us do things like reverse image searches on Google and TinEye; it also allows law enforcement to stop the spread of child exploitation through the use of PhotoDNA. (A minimal sketch of how such a hash can be computed appears at the end of this piece.) But in this instance, it’s not the technology that’s alarming - it’s the vagueness of Facebook’s proposal, and the fact that anyone submitting their photo to Facebook would still be vulnerable to potentially having that image be exploited.

The program is called the Non-Consensual Intimate Image Pilot, and as described by Facebook, it’s intended to be an “emergency option” for people who are worried their images might be shared in the future.

The process by which users of the pilot programs are being asked to submit their photographs is convoluted and laborious. Users first have to fill out a form through one of Facebook’s partner networks in each of its four pilot countries. These include the Australian e-safety commissioner’s office, the Cyber Civil Rights Initiative and the National Network to End Domestic Violence in the US, the UK Revenge Porn Helpline, and YWCA Canada. At that point, the agency and Facebook email a link to the users, which will require them to upload the image to a Facebook database using an encrypted link. The commission then notifies Facebook, which processes the image - by placing it in front of at least one pair of human eyes, those of “a specially trained representative from our Community Operations team.”

Crucially, Facebook doesn’t store the nude images, only their digital image hash, which is basically a unique identifying code for each photo submitted. Once the image hash is obtained from the photo, a member of Facebook’s Community Operations team deletes the image from the Facebook server. Facebook can then match future uploaded copies of the photo with that unique ID and block those copies from being distributed (the second sketch at the end of this piece illustrates that matching step). A Facebook spokesperson confirmed to Vox that the team deletes the image from the database within a week of its submission, and the entire process must be repeated for each and every photo submitted.

“We’re constantly working to prevent this kind of abuse and to keep this content out of our community,” the spokesperson said. “We’re using technology to limit the spread of these photos and developing innovative ways to prevent them from being uploaded in the first place.”

There’s a lot that could go wrong with this.
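To make the hashing idea concrete, here is a minimal sketch of one classic perceptual hash, the “average hash” (aHash). It illustrates the general technique described above, not Facebook’s actual algorithm, which has not been published; the sketch assumes Python and the Pillow imaging library.

```python
# A toy "average hash" (aHash), one of the simplest perceptual hashes.
# Illustrative only: Facebook has not published its algorithm, and
# PhotoDNA is a separate, proprietary system.
from PIL import Image

def average_hash(path: str, hash_size: int = 8) -> int:
    """Return a 64-bit perceptual hash of the image at `path`."""
    # Shrink and desaturate first: resizing, recompression, and minor
    # color tweaks mostly survive this step, which is what makes the
    # hash "perceptual" rather than cryptographic.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: is this pixel brighter than the average?
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > avg else 0)
    return bits
```

Because the hash reflects the image’s coarse brightness pattern rather than its exact bytes, a resized or recompressed copy of the photo yields nearly the same 64 bits - which is what lets a platform recognize copies without keeping the image itself.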
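Matching a future upload against stored fingerprints is then a closeness test rather than an exact lookup, since near-duplicate images produce hashes that differ in only a few bits. A sketch of that comparison follows; the Hamming-distance threshold is an illustrative choice of ours, not a figure Facebook has disclosed.

```python
# Checking a new upload against previously stored hashes.
HAMMING_THRESHOLD = 5  # max differing bits to count as "the same" image (illustrative)

def hamming_distance(h1: int, h2: int) -> int:
    """Number of bits on which two 64-bit hashes disagree."""
    return bin(h1 ^ h2).count("1")

def should_block(upload_hash: int, blocked_hashes: set[int]) -> bool:
    """True if the upload matches any previously fingerprinted photo."""
    return any(
        hamming_distance(upload_hash, stored) <= HAMMING_THRESHOLD
        for stored in blocked_hashes
    )

# Usage: only the integer hash needs to be retained; the submitted photo
# itself can be deleted once its hash has been computed.
# blocked = {average_hash("submitted_photo.jpg")}
# should_block(average_hash("reuploaded_copy.jpg"), blocked)  # -> True
```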