In the wake of a string of contentious live-streams and viral videos on social media, the question of how we regulate the internet is being asked louder than ever.
The internet is becoming a minefield for publishers, lawmakers, and the police. Right now, there is no greater battlefield than social media.
Apple CEO Tim Cook himself said in 2018 that the system as it stands isn’t working. And it seems the Federal Government agrees, with its newly announced social content regulation laws arriving as a reaction to the current climate.
So who should be responsible for policing social media?
The Australian Government seems to think it’s a job for the platforms themselves, whereas Mark Zuckerberg thinks the government and its agencies need to take on more responsibility. Let’s weigh up both sides.
What does the law say?
New legislation holds social media platforms responsible for the spread of ‘hate content’. The repercussions are severe, with executives facing substantial fines and jail time if they fail to swiftly remove inappropriate content.
But it’s not all good news. Some in the tech industry have labelled this legislation a knee-jerk reaction, with many worried it could scare tech companies away from operating in Australia altogether.
Germany and the EU are adopting similar sanctions, requiring social media companies to remove clearly illegal content within 24 hours.
This kind of legislation sounds sensible in theory, but it could restrict freedom of expression.
An impossible task
Though there will now be an army of 7,500 human moderators, alongside sophisticated algorithms, charged with checking Facebook’s content, policing everything posted on the platform will remain an impossible task.
A large part of the problem is that content needs to be reported by users as inappropriate or offensive before it reaches these moderators. This relies on a degree of self-reporting and citizen justice from the online community.
The public opinion
Many feel that law enforcement agencies should police social media themselves, but others disagree.
Some believe that social platforms should be regulated like a public utility, much like an electricity supplier. Others feel they should become publicly owned entities rather than trillion-dollar corporate businesses.
One thing that we can all agree on is that self-censoring using smart algorithms isn’t enough.
Is it part and parcel?
You could argue that what users see on social media is just part of the harsh reality of the world, and that the onus should be on users to pay attention to what matters in their feed and ignore the content that offends them.
But we know that our own self-governance doesn’t really work, otherwise we wouldn’t be having this discussion.
Could an ideal solution be one that places responsibilities on all parties involved?
Granting powers to law enforcement to prosecute users who create abhorrent content could work, and social media platforms could then be held accountable, to a certain level, for censoring illegal content.
While the answer isn’t clear, the need for a balance is.