It's Worse Than You Think, But You Can't See It Because It's Hidden

It's no secret that vile and hateful comments are rife on social media, but in most cases it's much worse than what you see -- the rest is hidden by teams of social media specialists.

When it comes to moderating comments online, weeding out hate speech is a big part of the job.

Social media moderation has become an even more crucial role, given a recent defamation case that holds media organisations accountable for defamatory user comments on their social media pages.

When Network 10 gained broadcast rights to air the documentary about former AFL star and Australian of the Year Adam Goodes, The Final Quarter, the online racist backlash came in thick and fast.

Both in the lead-up to and during Thursday night's broadcast, any trolling or vilifying comments posted on Channel 10 social media accounts were heavily moderated, with racist and offensive content removed from public view.

READ MORE: Three Pieces Of Fake News The Adam Goodes Booers Believed

Amber Robinson, a social media expert and digital consultant at Quiip, was part of the team moderating Facebook comments after The Final Quarter went to air.

"There were a lot of insults last night," she told 10 daily.

"It can be pretty depressing ... sometimes you think progress is being made then you see the comments," Robinson said.

"I feel it's a low level of debate, a lot of name-calling."

As well as Network 10's, Robinson also helps moderate other news broadcasters' social media pages.

"Anti-Muslim [posts] are probably the worst, if we are going to talk about what gets taken down the most," she said.

Robinson said that, for the sake of her mental health, she limits moderating to four hours a day, because the hate can get quite severe when groups start trolling en masse.

Who's posting?

In her book, Troll Hunting, author and journalist Ginger Gorman identified and spoke to the people most likely to be perpetrating hate speech on the web.

"The young men I spoke to were a cohort of essentially 18 to 35-year-old white men, and they saw themselves as largely socially and economically marginalised."

"Trolling is a way to express that anger," she told 10 daily.

Adam Goodes. Image: Getty Images

She said the most likely targets are people of colour, those living with a disability and members of the LGBTQI community.

"Anybody who is not a young white man gets 'othered', and that is because they are angry that other people in society are taking the place they see as rightfully theirs at the top of the food chain," she said.

Gorman refers to social media moderators as "first responders" and said they all need social media "self-defence training."

"The impact of cyberhate and the content these people are exposed to is extreme."


"You are doing the hidden dirty work which the public has no idea about and thank God the public doesn't see it," Gorman said.

Gorman said "trolls" shouldn't be dismissed as harmless keyboard warriors, and that there is a broad spectrum of offenders.

"Predator trolling can be very serious -- it's not just people being mean online. My book links this behaviour to murder, terrorism, incitement to suicide and real-life stalking," she said.

READ MORE: Adam Goodes Was Failed By The Game And Fans He Loved

Spreading the message

During The Project: Final Quarter Extra Time on Thursday evening, Waleed Aly implored Australians to unpack and tackle racism.

"We need to have a productive national conversation," he said.

But that conversation is not possible if people continue to deny racism exists in Australia, former Race Discrimination Commissioner and University of Sydney academic Professor Tim Soutphommasane told 10 daily.

"There remains a section of our society that always seeks to deny racism or deflect attention away from it.

"Often you get the impression that calling someone racist is a worse offence than perpetrating an act of racism. Or that responding to racism is somehow divisive," he said.

Now there is also a legal imperative for media organisations' online moderators to ensure nothing defamatory is published on their public Facebook pages.

In June, a NSW Supreme Court judge found media organisations are liable for all comments made by third parties on their public Facebook pages, including when they are defamatory.

Racial hate speech is often orchestrated by extremist groups online, who enlist others to attack at once, making it hard for moderators to keep up. The posts that attract the most attention get shared most widely, creating a negative feedback loop.

READ MORE: Goodes Documentary Maker Wants Australia To Have A Real Conversation About Racism

Soutphommasane said those who deny racism exists haven't been exposed to the worst of it.

"We often only see a fraction of the racism that exists. Public forums will often feature only sanitised comments and moderated sentiments."

All Together Now, Australia's only anti-racism charity, told 10 daily the online world is simply an extension of the offline world.

"Moderation may address some of the worst commentary online. However there is still a certain level of exposure, both online and offline, to content on the severe end of the scale."

"This is a challenge that extends beyond moderation -- we need individuals to step up and call out racism when they see it," a spokesperson told 10 daily.
