Perhaps we can shed some light on the issue. Let's start with something that both the Facebook safety team and Cacho would probably agree on: most people strongly dislike child pornography. The pedophiles who do collect the stuff, however, are apparently somewhat compulsive about it. When they get busted, they generally don't just have a handful of images; it's more like hundreds or even thousands.
The internet makes it a lot easier for pedophiles to exchange material with like-minded people. Peer-to-peer software has often been used for this, but it's not too hard for law enforcement to automatically scan for known child pornography files on P2P systems and pick off the idiot who's sharing it. Social networks allow pedophiles to look for validation and acceptance as well as trading partners, so it's not too surprising that they've shown up on Facebook, Grou.ps, Grouply, MySpace, Google+, etc.
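To make that scanning concrete, here's a rough sketch of the basic idea: compute a cryptographic hash of each shared file and compare it against a list of hashes of files investigators have already identified. The hash set and folder-scanning helper below are hypothetical placeholders (real systems work from law-enforcement hash databases and also watch the network side), but the matching logic is the same. Note that exact hashing only catches byte-identical copies of already-known files, which is part of why more robust image-matching tools like PhotoDNA came along.

```python
import hashlib
from pathlib import Path

# Hypothetical set of SHA-256 hashes of files already known to investigators.
# The value below is just a placeholder, not a real entry.
KNOWN_BAD_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of(path: Path) -> str:
    """Hash a shared file so it can be compared against the known list."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_shared_folder(folder: Path) -> list[Path]:
    """Return files whose exact contents match a hash on the known list."""
    return [p for p in folder.rglob("*")
            if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES]
```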
Facebook is currently the biggest player in the social network arena, with over 500 million accounts. They've made an effort to curb predators by scanning for known child pornography and certain kinds of anomalous behavior. But some things are very hard to detect automatically, and at that scale it's impossible to review everything manually. Therein lies the rub.
Joe Sullivan, Facebook's security chief, gave the example of an account that sent all its friend requests to teenage girls. However, predators may also be looking to connect with other pedophiles rather than with potential victims. He also pointed out that Facebook is using PhotoDNA to scan photos. PhotoDNA matches uploaded images against a database of known child pornography, and unlike simple file hashing it can still recognize an image that has been resized or re-encoded. Earlier this year, a man in the UK took photos of himself sexually abusing his 2-year-old daughter and uploaded them to Facebook so that other like-minded users could view them. PhotoDNA would not have flagged those photos because they were new and not in its database. So it's commendable of Facebook to make the effort to use this technology, but it's still no silver bullet.
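For a sense of how this family of techniques works, and why it only catches known material, here's a toy perceptual-hash sketch. This is the simple "average hash" idea, not Microsoft's actual PhotoDNA algorithm, which is proprietary, and the KNOWN_IMAGE_HASHES set is a hypothetical stand-in for the database of previously identified images. A brand-new photo simply has nothing to match against.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Tiny perceptual hash: shrink, grayscale, threshold against the mean.
    Small edits (resizing, recompression) barely change the resulting bits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of images that have already been identified.
KNOWN_IMAGE_HASHES: set[int] = set()

def matches_known_image(path: str, threshold: int = 5) -> bool:
    """True only if the photo is close to something already in the database.
    A newly created image, as in the UK case, produces no match at all."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_IMAGE_HASHES)
```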
Because of the sheer volume, a lot of Facebook's account shutdowns are done automatically, with no human intervention. In the example Sullivan provided, the system might notice that a large share of an account's friend requests are going to teenage girls and disable the account. However, if an account that tripped some behavioral flags and was automatically shut down had also been posting child pornography that PhotoDNA did not detect, no human would ever see it, and it would consequently never be reported to law enforcement. It's possible that some of the failure to report this material stems from that gap rather than from willful disregard of the law.
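A hypothetical sketch of that moderation logic makes the gap easier to see. The thresholds, field names, and outcomes below are invented for illustration, not Facebook's actual rules: the point is simply that an account disabled purely on behavioral signals never gets a human look at its photos.

```python
from dataclasses import dataclass

@dataclass
class Account:
    friend_requests_total: int = 0
    friend_requests_to_minors: int = 0
    photos_flagged_by_photodna: int = 0

def automated_review(acct: Account) -> str:
    """Illustrative moderation pipeline showing the reporting gap."""
    if acct.photos_flagged_by_photodna:
        # A known-image match escalates to a person and gets reported.
        return "escalate to human reviewer -> report to law enforcement"
    behavior_flag = (acct.friend_requests_total >= 50 and
                     acct.friend_requests_to_minors / acct.friend_requests_total > 0.9)
    if behavior_flag:
        # Disabled on behavior alone: no human ever looks at the photos,
        # so material PhotoDNA missed is never seen or reported.
        return "disable account automatically (no report filed)"
    return "no action"
```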
If Facebook continues to work on combating predators, it may encourage many of them to relocate to a different social network. But that's probably about as good as it's going to get.