The Atlantic’s Emily Bazelon looked into the effort by organizations as diverse as Facebook and the hacker collective Anonymous to battle online harassment and bullying, something that more than 800,000 people under the age of eighteen went through last year.
One of the things she learned is that Facebook spends as little time as possible reviewing complaints:
To demonstrate how the harassment team members do their jobs, Willner introduced me to an affable young guy named Nick Sullivan, who had on his desk a sword-carrying Grim Reaper figurine. Sullivan opened the program that he uses for sorting and resolving reports, which is known as the Common Review Tool (a precursor to the tool had a better name: the Wall of Shame).
Sullivan cycled through the complaints with striking speed, deciding with very little deliberation which posts and pictures came down, which stayed up, and what other action, if any, to take. I asked him whether he would ever spend, say, 10 minutes on a particularly vexing report, and Willner raised his eyebrows. “We optimize for half a second,” he said. “Your average decision time is a second or two, so 30 seconds would be a really long time.”