Facebook and other social media platforms must better police content on their platforms

Observer writer

Thursday, April 18, 2019

Facebook, the social media behemoth headed by its founder Mark Zuckerberg, finds itself under intense scrutiny for its seeming inability to effectively regulate content on its platform.

A nadir was reached with the live streaming of the Christchurch massacre in New Zealand last month. This has led to incessant calls for better regulation of social media platforms, and the question being asked by governments and media operators is, should social media networks be allowed to post any content, regardless of moral turpitude?

“Facebook cannot be trusted. They are morally bankrupt pathological liars who enable genocide (Myanmar) [and] facilitate foreign undermining of democratic institutions. They allow the live streaming of suicides, rapes and murders, continue to host and publish the mosque attack video, allow advertisers to target 'Jew haters' and other hateful market segments, and refuse to accept any responsibility for any content or harm,” declared New Zealand's Privacy Commissioner John Edwards following the gruesome event that took place in his home country.

Facebook and other social media platforms have usurped the role and impact of traditional journalism. In today's paradigm, everyone is a journalist and everyone can publish whatever he or she wants, with it transmitted ubiquitously and in real time – the democratisation of journalism.

Closer to home, during the recently concluded Eastern Portland by-election a social commentator used social media constantly to comment on the social mores of Jamaica as they pertained to a particular candidate.

The shadow spokesperson on foreign affairs, Lisa Hanna, who often promotes herself on social media, was misled about the supposed death of a St Lucian diplomat and ended up posting what was in fact erroneous information.

More people consume their news from social media than from traditional media outlets, with less regard being paid to the veracity of that information – thus giving rise to President Trump's aphorism “fake news”.

According to a Maru/Matchbox poll, about half of Americans and Canadians get their news primarily from social media, yet most don't trust this source of information.

“Social media is changing our relationship with the news. While it has become a common source of news and news-related information, people are less trusting of what they find there. The upside is that many feel more engaged with the news than ever before,” said Sara Cappe of Maru/Matchbox.

What is patently clear is that it is not a level playing field. Traditional media houses are subjected to self-regulation (company rules and codes), institutional watchdogs and legal censure; not so for social media.

The tenets of journalism more often than not do not apply on social media, and there is no longer any requirement to subscribe to those practices. The president of the United States has bypassed media outlets to broadcast his thoughts and pronouncements, openly declaring that he does not need the media – only his Twitter account – to get his message across.

With more than two billion active users and annual revenues of US$56 billion, Facebook is everywhere. The company has upended media and telecommunications as we once knew them. But, unregulated as it is, does it serve us well or is it a force for evil? Should it be allowed to be a transmitter of the worst in human nature?

Fifteen years ago, when an idealistic young Mark Zuckerberg launched Facebook, he wanted to connect people and share content among friends. As he has often said, he wanted to create the digital equivalent of the town square.

“A place where you can go and interact with a lot of your friends and people you are interested in all at once,” he explained.

He did not foresee the chimera he would go on to create. As TS Eliot said: “Most of the evil in this world is done by people of good intentions.”

Today we see live streaming of heinous, dastardly acts which children readily have access to. We now can see ISIS beheadings, and elections can be manipulated via Facebook. You can say anything unfiltered without being censured – all in the name of freedom of speech. The 21st century with Facebook at the vanguard was to usher in a new era of unregulated information for the betterment of mankind.

As we near the end of the first quarter of this new century, how is it working out? Today your children can watch people shot to death in a church live on Facebook. World leaders can use profanity on Twitter, and the truth no longer has to be the shield of journalism.

Given the tragedy that occurred in New Zealand, Privacy Commissioner John Edwards's opprobrium is understandable.

He went on to say: “This is a global problem. The events that were live streamed in Christchurch could happen anywhere in the world. Governments need to come together and force the platforms to find a solution.”

Perhaps what is more galling to Edwards is that Facebook and other such platforms seem unwilling to submit to regulation and policing, and suffer no censure for the proliferation of these broadcast malevolent acts.

“The legal protection they have, the reason they have been able to launch an unsafe product and escape any liability is the Communications Decency Act in the US – which says if you are a platform, a carrier, you have no liability for the content, but I think what we're seeing around the world is a pushback on that,” said Edwards.

To counter the broadcasting of offensive content, Facebook has established an independent oversight board for content.

When asked by ABC News's George Stephanopoulos if he thinks social media has made acts of extreme violence more prevalent, Zuckerberg replied: “It's hard to say. I haven't seen the data that suggests that it has. The hope is, by giving everyone a voice you are creating a broader diversity of views that people can have out there. Even if sometimes that surfaces some ugly views, with the democratic tradition that we have, you want to get those issues on the table so that you can deal with them.”

The Facebook boss made it clear that he cares about policing harmful content and hate speech.

“I don't want our work to be something that goes towards amplifying negative stereotypes and promoting hate; that's why we are investing so much in building up these AI systems. We have 30,000 people doing content and security review to do as best a job as we can in proactively identifying and removing that kind of content,” said Zuckerberg.

As it currently stands, many are of the opinion that Facebook's efforts are akin to sticking a finger in a dyke, and that a lot more has to be done to regulate the content on its platform.

Countries like Australia and the UK are very vocal in insisting that social media companies be held responsible for proactively enforcing standards on their platforms. It is simply not good enough to say that bodies like the Federal Communications Commission should not police the First Amendment.

Facebook and others should not only police harmful content, but see to it that the integrity of elections is protected, and ensure that data privacy controls are strong. People's information should always be respected.