How publishers can ensure quality control of comments instead of removing comments sections altogether
With the emergence of digital news came a new practice: the comments section under articles. It was a new-age version of the Letters to the Editor we had always read in print. The only difference? Readers could respond instantly, on specific articles, and address specific journalists. A great way to get feedback on articles, you would think.
Except that comments sections soon became cesspools of insults, abuse and hate speech. For popular websites publishing a large number of articles, monitoring the comments grew increasingly difficult. Websites such as Firstpost had comments sections filled with rape threats and abuse directed at journalists.
Yes, readers, much like the authors of articles, need a space to voice their opinions. That is why comments sections are important.
We often take comments for granted because the feature usually sits quietly at the bottom of an article. But as a channel for User Generated Content (UGC), it is equally crucial for online publishers, community managers and everyone else, which is why it is worth investing in the right commenting system to maximise its potential.
Crucial, yet phased out
At the 2012 SxSW (South by Southwest) festival, Gawker Media founder Nick Denton claimed that 80 per cent of reader comments were off-topic, rude, hyperbolic, misogynistic, bigoted and devoid of linguistic sense.
Bevan Lakay, the Community Editor of News24, reminds us that the volume of comments was enormous and hate speech was prevalent. “These guys were clever and quite meticulous. We banned one [offensive] word and the next day it would be a version of a different word. It was hard to keep up with it.”
Faced with such abuse, many newsrooms made a shift. They phased out their commenting systems because moderating hateful and negative comments demanded ever more manpower, and stopping them entirely was almost impossible.
You no longer have to make that trade-off.
Today, artificial intelligence and machine learning are the smartest ways to tackle the problem, because they let newsroom publishers give commenters real-time feedback. One such example is Google’s Perspective, which scores the “toxicity” of a comment. Perspective uses machine learning to detect abusive speech online instantly.
According to Jigsaw, the Google unit behind Perspective, the model can assess a sentence for “toxicity” immediately, more accurately than keyword blacklists and far faster than human moderation.
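To make this concrete, here is a minimal sketch of how a newsroom backend might call the Perspective API to score an incoming comment. The API key, the 0.8 moderation threshold and the helper function name are illustrative assumptions for this sketch, not part of Perspective’s documentation.

```python
# Minimal sketch: scoring a comment's toxicity with Google's Perspective API.
# Assumes an API key obtained from Jigsaw; the 0.8 threshold is an
# illustrative choice, not an official recommendation.
import requests

PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
API_KEY = "YOUR_API_KEY"  # placeholder

def toxicity_score(comment_text: str) -> float:
    """Return Perspective's toxicity probability (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(
        PERSPECTIVE_URL, params={"key": API_KEY}, json=payload, timeout=10
    )
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

if __name__ == "__main__":
    score = toxicity_score("You are an idiot and should be silenced.")
    # A newsroom could hold comments above a chosen threshold for human review.
    if score > 0.8:
        print(f"Flag for moderation (toxicity {score:.2f})")
    else:
        print(f"Publish (toxicity {score:.2f})")
```

In practice, a publisher would tune the threshold to its own audience and still route borderline comments to human moderators rather than rejecting them automatically.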
Metype is a solution we built to tackle the real-time issues you face while running a commenting system. It helps readers discover the latest content being viewed across the site (content re-circulation) and serves content based on their previous choices (user-centric content aggregation), reducing the dependency on human intervention.
Disclaimer: The author is a sponsor of The Media Rumble.