
The content control reckoning for tech giants

Social media networks have long abdicated their responsibilities on content control, and governments have started to notice.

Germany's Justice and Consumer Protection Minister Heiko Maas (Photo: Getty Images/Sean Gallup)
Published 27 Mar 2017 | @JohnMGooding

Over the weekend, Starbucks and Walmart joined the growing boycott of Google's YouTube over concerns that their ads might be appearing alongside hateful or offensive content. The setback is the latest in a bad start to the year for Google; in February the Wall Street Journal reported that one of YouTube's most prominent stars had featured anti-Semitic and Nazi imagery in his videos.

But Google isn't the only tech giant under pressure to better police the user-generated content it publishes (or allows to be published, depending on your perspective). In Germany, which has some of the toughest hate speech laws in the world, Minister for Justice and Consumer Protection Heiko Maas put forward a draft law earlier this month that would impose substantial fines on networks that fail to remove 'obviously criminal' material within 24 hours, and less overtly illegal content within a week.

There has been a marked uptick in hate speech in Germany in the wake of a huge influx of refugees from mostly Muslim-majority countries. In response to criticism from the government, Facebook, Twitter and others agreed in December 2015 to remove flagged hate speech within 24 hours, and signed up to an EU code of conduct stipulating similar standards in May last year. In September, however, Maas criticised Facebook for not abiding by the agreement and said the German government would legislate to enforce the standards if social media networks could not meet them voluntarily.

And in Pakistan, the Interior Ministry has claimed that Facebook will send a delegation to address the issue of blasphemous content. Unremarkably, the Pakistani government would like more effective content management to ensure blasphemy is quickly removed from the network. Rather more remarkably, the government also wants Facebook to cooperate with authorities in identifying the authors of blasphemous posts. Dawn reports that a decision on whether to ban Facebook will be made after the Interior Ministry reports on the matter to the Islamabad High Court at a hearing today.

Pakistani law stipulates the death penalty for defiling the name of the Prophet Muhammad and prison sentences for a number of similar offences. Polling indicates that the laws have overwhelming local support. Though no-one has been executed for blasphemy since the legislation was strengthened in the 1980s, many people convicted of it remain on death row awaiting execution.

Since 1990, vigilantes have killed at least 65 people publicly accused of blasphemy, while others have been forced to flee the country. Most famously, in early 2011 Federal Minister for Minorities Affairs Shahbaz Bhatti and Governor of Punjab Salmaan Taseer were both assassinated, ostensibly for committing blasphemy. Taseer was killed by his own security guard, Mumtaz Qadri, whose execution for murder five years later led to days of protests in Islamabad that subsided only after the Pakistani government gave assurances that the blasphemy legislation would not be amended and that no-one convicted of blasphemy would be pardoned. Qadri's grave is now the site of an elaborate and well-trafficked shrine, paid for by donations to his family.

It is in this context that Facebook deliberates over Interior Minister Chaudhry Nisar Ali Khan's statement, as reported by Dawn, that 'Facebook and other service providers should share all information about the people behind this blasphemous content'. Despite criticism from human rights advocates, Facebook does delete some blasphemous posts it's notified about, and it does respond to state requests to restrict access to blasphemous content. According to Facebook's government requests report, the number of pieces of content restricted in Pakistan at the government's request fell from 1773 in the first half of 2014 to just 25 in the first half of 2016. Facebook does not explain the fall, and it's unclear whether it reflects Facebook complying with fewer requests, the government making fewer requests, fewer blasphemous posts in the first place, or some other factor.

Whatever Facebook's somewhat amorphous position on the importance of unfettered free expression, the prospect of handing alleged blasphemers over to a government that would (in theory at least) seek to put them to death is surely out of the question. Quite apart from any ethical objections, obliging the Pakistani government on this issue would likely provoke severe backlash and boycotts in the countries that provide the lion's share of Facebook's revenue.

While Facebook and other social media networks have little principled opposition to restricting content in order to operate in markets such as Pakistan or Germany, implementing the level of control stipulated may prove challenging. A recent study in Germany found that Facebook deleted less than half, and Twitter just 1%, of flagged illegal hate speech within the government's 24-hour timeframe. Facebook CEO Mark Zuckerberg has outlined his hope that a combination of user input and artificial intelligence will eventually allow objectionable content to be automatically categorised and appropriately censored:

The approach is to combine creating a large-scale democratic process to determine standards with AI to help enforce them. The idea is to give everyone in the community options for how they would like to set the content policy for themselves…for those who don't make a decision, the default will be whatever the majority of people in your region selected, like a referendum. Of course you will always be free to update your personal settings anytime…Although we will still block content based on standards and local laws, our hope is that this system of personal controls and democratic referenda should minimize restrictions on what we can share.
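To make the mechanism in that passage concrete, here is a minimal sketch in Python of how such settings resolution might work. Everything in it (the level names, the tolerance scale, the function names) is invented for illustration and assumes a simple majority-vote default; it bears no relation to any actual Facebook system:

from collections import Counter

# Content-sensitivity levels a user might choose; the names are invented.
LEVELS = ("permissive", "moderate", "strict")

# Maximum content rating (0 = tame, 2 = graphic) each policy tolerates;
# the values are assumptions for this sketch.
TOLERANCE = {"permissive": 2, "moderate": 1, "strict": 0}

def regional_default(region_choices):
    """Default for users who never set a preference: the most common
    choice in their region, 'like a referendum'."""
    if not region_choices:
        return "moderate"  # assumed fallback when no-one has voted
    return Counter(region_choices).most_common(1)[0][0]

def effective_policy(user_choice, region_choices):
    """A user's explicit setting wins; otherwise fall back to the regional majority."""
    return user_choice if user_choice in LEVELS else regional_default(region_choices)

def is_visible(content_rating, user_choice, region_choices, illegal_locally):
    """The legal check runs first: content blocked by standards or local law
    stays blocked regardless of personal settings."""
    if illegal_locally:
        return False
    return content_rating <= TOLERANCE[effective_policy(user_choice, region_choices)]

# Example: in a region whose majority chose 'strict', a user with no explicit
# setting inherits that default, so mid-rated content is hidden...
assert is_visible(1, None, ["strict", "strict", "permissive"], illegal_locally=False) is False
# ...while an explicit 'permissive' setting overrides the regional default.
assert is_visible(1, "permissive", ["strict", "strict", "permissive"], illegal_locally=False) is True

Note how the legal override comes before any personal preference, mirroring Zuckerberg's caveat that content will still be blocked 'based on standards and local laws' no matter what users select.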

But Zuckerberg is quick to add that such technology is potentially years away. And implementing content control measures may lend further legitimacy to an argument tech companies have long resisted: that Facebook and others should be subject to the same expectations placed upon traditional publishers – that they are not just a medium, but are as responsible for the content posted on their sites as the users who post it.

'Overall, it is important that the governance of our community scales with the complexity and demands of its people', wrote Zuckerberg in his manifesto. But Facebook's community doesn't operate in a vacuum – it relies explicitly on the blessings of other agents. Whether the governance of Facebook, Google and others can scale fast enough to meet the increasing demands of the advertisers that fund them and the governments whose citizens they rely on for revenue remains to be seen.


