Very often on Social Networks you come across people who cross the line. Their remarks, comments or posts become slanderous, racist, sexist, violent…
Invariably someone "reports" them to the authorities, be it the official Authorities (the Police in whatever form) or the "powers that be" at the Social Network at hand, in the hope that they will be removed.
The phenomenon for obvious reasons infests Facebook more than any other network and – friends tell me – its Russian and Chinese equivalents VKontakte and RenRen.
Hate speech is NOT new and there are laws (in most countries) that draw the line between free speech and illegal speech. But the matter is very delicate; I always quote a friend of mine who must remain anonymous who once told me “If you lived where I live, your perspective on what ought to be illegal would change”.
Censorship for me is almost the absolute evil, and I have on more than one occasion instantly left a group at the slightest attempt to censor my opinions and expressions.
If all we have done is move the right to decide what gets published and what not from a big publishing house to Facebook, we have changed nothing.
Freeing information is the true and only Social Media Revolution, and to keep it free I believe only the community can decide what’s good and what’s not good. Legislation exists, there is a Police and there are Judges who decide the matter of law. Sadly, it is very difficult to roll back their rule over free expression: for every rabid neo-Nazi who gets (rightly) silenced, a political opponent gets the same treatment.
So the Police does its duty, but fine-grained moderation is the duty of the community, of each and every member of it.
The best cure for hate speech is isolation and proactive rejection.
Of course, it is also the duty of the platform to provide the technical means to reject a post without making it more popular (which is exactly what the troll wants in the first place), something most Social Networks are totally inept at.
The best effort I have come across on this matter is the moderation system developed by the admins at Slashdot. I stole it for my book, so I have no qualms about stealing the bit about moderation and evaluation again for this post.
Follow Netiquette - The basic purpose of a community is to provide a dedicated space online for people to interact and discuss. Every user needs to be given the freedom to post his/her views, as long as they adhere to the rules of the community. Answers posted to questions can be acknowledged with a simple "Thank you". This can work wonders!
Ensure Good Leadership - A successful community needs a good leader or moderator who takes care of the content and the social interactions amongst users. A good moderator is one who ensures the right kind of attitude is maintained throughout the forum's discussions. A moderator who keeps a light presence yet ensures decorum is maintained is bound to have a more interactive and focused community. A community can have more than one moderator, or an expert panel, so that the responsibilities are divided.
The main thing isn’t the software. It’s your (and your management’s) attitude. It is not easy to give readers near-total control over some of your vital Web real estate. There is an endless temptation to do things like create topics you think will interest readers instead of letting your readers choose what to discuss on their own.
Nest discussions - In a flat discussion, the newest comments are simply tagged onto the top or the bottom of those already listed. A threaded discussion instead shows “discussion threads”: replies are grouped under their “parent posts,” and only their subject headers are displayed rather than the entire content.
Without reader-to-reader conversations, an online forum is nothing but a giant “letters to the editor” page. While posting responses to your published stories gives your readers more voice than they’d have without this ability, your forums or bulletin boards (or whatever you want to call them) will only achieve their full potential when readers start using them to talk directly to each other instead of merely reacting to content you have posted.
Let readers judge readers - So you don’t have to judge them yourself: Slashdot’s moderation scheme, from which many others were derived, works like this:
Moderation powers are distributed semi-randomly, and only to readers who have had login identities for at least a few weeks. And no individual reader gets more than a few moderation points at a time, so it’s hard for one knucklehead to mess up the whole scheme.
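The distribution rule above can be sketched in a few lines of Python. The specific numbers (a 21-day minimum account age, a cap of 5 points, a 10% selection) are my own illustrative assumptions, not Slashdot's actual parameters; the point is only that eligibility is gated by account age and no single reader receives many points.

```python
# Hedged sketch of semi-random moderation-point distribution.
# Thresholds are illustrative guesses, not Slashdot's real values.
import random

MIN_ACCOUNT_AGE_DAYS = 21   # "login identities for at least a few weeks"
MAX_POINTS = 5              # "no individual reader gets more than a few points"

def grant_moderation_points(users, fraction=0.1, rng=random):
    """Return {username: points} for a random subset of eligible users.

    `users` is a list of dicts like {"name": ..., "account_age_days": ...}.
    """
    eligible = [u for u in users if u["account_age_days"] >= MIN_ACCOUNT_AGE_DAYS]
    if not eligible:
        return {}
    k = max(1, int(len(eligible) * fraction))
    chosen = rng.sample(eligible, k)
    return {u["name"]: rng.randint(1, MAX_POINTS) for u in chosen}
```

Because points are scarce and scattered, one bad actor can only nudge a handful of comments before running dry.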
Obscenities, personal attacks, and other unwelcome speech will almost inevitably be moderated down into oblivion. “Community standards” have been used as a legal test of what constitutes obscenity. Give your readers the power to moderate other readers’ posts, and you will soon find what they consider obscene.
Slashdot has tried all sorts of additions and tweaks to its moderation system over the years, so many that a pretty good percentage of the Slashdot FAQ (Frequently Asked Questions) page is dedicated to comments and moderation.
The point of moderation is to separate dreck from diamonds. Readers who aren’t logged in view Slashdot comments that are rated +1 or above (on a -1 to +5 scale) but do not see comments rated 0 or -1 without special effort. Logged-in users’ comments post automatically at the +1 level, while comments from readers who are not logged in start at 0. A post from someone who is not a logged-in user therefore needs at least one logged-in reader to consider it worthy of a positive moderation point before most readers can see it at all, while a post from a logged-in user that a few users with moderation powers that day find offensive can easily drop from public view.
On the positive side, comments that add something useful to the discussion will be moderated upwards, so readers who only want to see the most cogent comments can set their preferences so that they see only comments moderated to +2, +3 or even +5.
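The scoring rules described in the last two paragraphs fit in a few lines. This is a sketch of the -1 to +5 scale, the starting scores for logged-in versus anonymous posters, and the reader-chosen viewing threshold; the function and constant names are mine, not Slashdot's.

```python
# Sketch of Slashdot-style comment scoring, with hypothetical names.
ANON_START = 0          # anonymous posts start at 0
LOGGED_IN_START = 1     # logged-in users' posts start at +1
MIN_SCORE, MAX_SCORE = -1, 5

def initial_score(logged_in: bool) -> int:
    """Starting score of a new comment, depending on login status."""
    return LOGGED_IN_START if logged_in else ANON_START

def moderate(score: int, delta: int) -> int:
    """Apply one moderation (+1 or -1), clamped to the -1..+5 scale."""
    return max(MIN_SCORE, min(MAX_SCORE, score + delta))

def visible(score: int, threshold: int = 1) -> bool:
    """Default readers see only comments at +1 or above; users may raise the bar."""
    return score >= threshold
```

So an anonymous post sits below the default threshold until someone moderates it up, while a reader who only wants the most cogent comments simply calls `visible(score, threshold=3)` or higher.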
On the negative side, you may want to give readers a little help. Most Slashdot-type posting systems allow employees or other selected forum monitors extra moderation privileges so that they can save readers from the task of removing strings of especially vituperative comments.
You may also want to only allow comments from registered, logged-in users. Slashdot allows anonymous comments because of the “whistle blower” factor; some of the site’s best posts have always come from people who might lose their jobs if they posted inside information about their employers’ actions under a traceable name. In return for occasional anonymous gems, Slashdot suffers from plenty of anonymous garbage down at the 0 and -1 moderation levels: you may decide this tradeoff isn’t worthwhile.