Friday, February 11, 2011

Examiner Article

We were recently mentioned in an Examiner article by Raymond Bechard, who has written extensively about human trafficking.

To be fair to Facebook, dealing with abusive accounts is a hard problem, and they have been making some effort.  They've got a system in place that seems to be automatically nuking a number of profiles.  However, the bad guys just come right back with new accounts, and I doubt there's any mechanism in the "auto-nuke" system to report predatory users to law enforcement.  If not, the only consequence some of these people face for soliciting and posting child pornography may be having a Facebook account shut down.

Here's an interesting article from last year about Facebook not notifying Australian LE about child pornography:
http://www.smh.com.au/technology/facebook-failed-to-tell-police-about-paedophile-porn-ring-20100826-13ual.html
It says that someone sent Facebook 10 messages telling them about the CP rings and nothing happened.  That was likely less a deliberate refusal to do anything about child pornography than an inability to process the volume of user messages.

Facebook's reporting mechanisms are limited.  For example, imagine you see the profile of a man hinting at sexually abusing his daughter.  How do you go about reporting that to Facebook?  Note that there is no "child exploitation" option anywhere.  If the profile picture contains actual nudity or pornography, that's reportable, but if he's just using a photo of a child to signal his interests, no dice.  "Inappropriate profile information" is limited to hate or violence, and there's nothing for profiles that refer to pedophilia or child porn.  There's also no free-text field that lets you provide additional information.

Realistically, Facebook has hundreds of millions of users.  Even if only 1% of accounts are abusive in some way, that's 1 million abusive accounts for every 100 million.  That's why Facebook wants to automate as much as it can -- they have to.  However, I would like to see an improved reporting system that lets users specifically flag child exploitation and other problems that don't fit neatly into the existing categories.  For example, how would you go about reporting a profile or page that seemed to be advertising an escort service (human trafficking)?
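To make that concrete, here's a hypothetical sketch of what a more complete report submission could capture.  The categories, field names, and example values are my own suggestion, not Facebook's actual reporting interface:

    from dataclasses import dataclass
    from enum import Enum, auto

    class ReportCategory(Enum):
        NUDITY_OR_PORNOGRAPHY = auto()   # reportable today
        HATE_OR_VIOLENCE = auto()        # "inappropriate profile information"
        CHILD_EXPLOITATION = auto()      # missing from the current options
        HUMAN_TRAFFICKING = auto()       # e.g. pages advertising escort services
        OTHER = auto()

    @dataclass
    class AbuseReport:
        reporter_id: int
        target_url: str
        category: ReportCategory
        details: str  # the free-text field the current system lacks

    # Example: the kind of report you currently can't file.
    report = AbuseReport(
        reporter_id=12345,
        target_url="http://facebook.com/some.profile",  # hypothetical
        category=ReportCategory.CHILD_EXPLOITATION,
        details="Profile hints at sexual abuse of the user's daughter; "
                "no nudity in the photo, so the existing options don't apply.",
    )

The important parts are the dedicated categories and the free-text details field; everything else is plumbing.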

2 comments:

  1. I once contacted (well, tried anyway) Facebook about the lack of options when making a complaint or reporting a profile/picture etc. There was once a free-text space but it was removed in an update. About 18 months ago, a group of trolls invaded a memorial group for infant loss and posted sick jokes, pictures and links showing physical abuse towards babies. A few members took it to the papers and members of parliament. The group was eventually taken down after one newspaper contacted Facebook. The activity on this group caused great upset and offence, and some grieving parents either closed their Facebook accounts or became hell-bent on ridding Facebook of these trolls. Sadly, trolls of course never use their real names, so there was no way of finding these people.
    My understanding of trolls, whether they are posting images of abuse, being racist, committing anti-Semitism, etc., is that a lot of them don't believe what they are saying themselves. These aren't their real views, and a lot of the time they will say something provoking and sit back and watch the backlash.

    When it comes to flagging, I think they need to have a "keyword" system: PTHC, hussyfan, etc. Maybe every time these are entered into the search, or groups are created with them, they should be flagged and banned instantly (a rough sketch of this idea appears after the comments). Many chat rooms do this with profanities and swearing; surely Facebook can do something similar?

    (sorry for the long post)

  2. To be fair to Facebook, it's a hard problem because they have millions of accounts. They're getting a little smarter about keywords, but the pedophiles just tweak stuff a little so the machine recognition breaks (see the second sketch after the comments).

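An aside on the first commenter's keyword suggestion: the mechanics are simple enough. Here's a minimal sketch, assuming an exact-match blocklist screening search queries (the same check could run on new group names). The terms and the print-based "flagging" are placeholders, not anything Facebook actually exposes:

    BLOCKLIST = {"badterm", "worseterm"}  # stand-ins for known trading keywords

    def contains_blocked_keyword(text: str) -> bool:
        """Case-insensitive exact word match against the blocklist."""
        return any(word in BLOCKLIST for word in text.lower().split())

    def handle_search(user_id: int, query: str) -> None:
        """Flag the account instead of silently returning results."""
        if contains_blocked_keyword(query):
            # A real system would queue the account for human review and,
            # ideally, generate a law-enforcement report -- not just a ban.
            print(f"FLAG user {user_id}: blocked keyword in {query!r}")

    handle_search(42, "looking for badterm videos")  # flagged
    handle_search(43, "family vacation photos")      # no hit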
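And here's why the "tweak stuff a little" problem from the second comment defeats that kind of exact matching, plus one common countermeasure: normalizing the text before the check. The substitution table is my guess at typical evasions, not Facebook's actual logic:

    import re

    BLOCKLIST = {"badterm"}  # placeholder for a real keyword list

    # Undo common character substitutions ("leetspeak").
    LEET = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "@": "a", "$": "s"})

    def normalize(text: str) -> str:
        text = text.lower().translate(LEET)
        # Strip separators inserted to defeat matching, e.g. "b.a.d-term".
        return re.sub(r"[^a-z]", "", text)

    def matches(text: str) -> bool:
        normalized = normalize(text)
        return any(term in normalized for term in BLOCKLIST)

    print(matches("b@dt3rm"))        # True: substitutions undone
    print(matches("b.a.d-term"))     # True: separators stripped
    print(matches("innocent chat"))  # False

Even this is easy to beat, and substring matching after stripping separators invites false positives, so keyword filters can only ever be one layer on top of human review.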