Advertiser boycott of Facebook: The great responsibility of social media platforms
The number of advertisers who have temporarily stopped advertising on Facebook and Instagram in protest against fake news and hateful content has risen to over 750. The #StopHateForProfit campaign is growing, but will Facebook do more to stop hate, outrage and disinformation? According to media scholars Prof. José van Dijck and PhD student Anna Marieke Weerdmeester, the social pressure on the company is increasing.
Legally, it is difficult to hold Facebook accountable, because as a "tech company" it is not liable for the content that users post. In addition, the lack of effective government regulation is problematic. And because Facebook plays an ever greater role in providing information to citizens, human rights are also at risk, Weerdmeester warns.
Exceptional position
"Since 1996 there has been an American law (section 230 of the Communication Decency Act, a law that has been 'translated' to Europe as the e-Commerce Directive) that exempts platforms from liability for content that users post on the platform," explains Jos茅 van Dijck, university professor of media and digital society. "This law was implemented long before social media platforms became key players in the global public debate and gave platforms (still startups back then) an exceptional position vis-脿-vis news organisations and telecoms. By positioning themselves as a 'tech company', these platforms were given all the room they needed to decide for themselves what their responsibilities were with regard to user-generated content. Some content is strengthened by mechanisms such as likes, retweets, rankings and recommendations, while others are more hidden. These are built-in, subtle mechanisms that give a platform a great deal of power over the distribution of online messages".
When the looting starts, the shooting starts
This power comes with responsibilities, which each platform deals with differently. When U.S. President Donald Trump wrote on Twitter and Facebook in early June "when the looting starts, the shooting starts", in response to the riots following the death of George Floyd, Twitter placed a notice on the tweet stating that Trump had broken the platform's rules. Facebook did not, and criticism of that decision followed. Asked about it, Mark Zuckerberg stated: "We should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies." For users, these differing policies are confusing and opaque. "Of course, news organisations also apply different rules for content moderation," Van Dijck says, "but we know more or less what they are, and their reach is often tied to a certain community. On social media platforms, any user can, so to speak, reach for a megaphone, while Facebook and Twitter control the amplifier that determines whether or not that voice has a global reach".
Further-reaching regulation
Facebook is taking measures to control the growing amount of disinformation and hate messages. "Facebook has, for example, taken various initiatives around moderating disinformation, such as partnering with other platforms and the World Health Organization to limit disinformation about the coronavirus. Facebook has also recently established an Oversight Board to monitor and improve its moderation process for illegal and unwanted content. This is partly due to pressure from society, companies and (government) agencies, as well as from the European Union. However, these are mainly voluntary initiatives or the platform's own initiatives," says Weerdmeester. Since 2019, social media platforms have had to report to the European Union every month on what they have done to combat disinformation. "Next year, the EU will evaluate this form of self-regulation, and it will then become clear whether further-reaching regulation is necessary, such as an ombudsperson or another institution. Businesses are aware that if self-regulation does not work sufficiently, there will be more government regulation," Van Dijck adds.
Right to information violated
Facebook is playing an increasing role in our information provision, and human rights are at risk because of that, Weerdmeester states. "Approximately one third of people use Facebook to access news, according to research by the Reuters Institute for the Study of Journalism. Because Facebook moderates access to information, it affects the individual's right to receive information, which is part of the right to freedom of expression. Under Article 10 of the ECHR, the European Court of Human Rights has in fact formulated several responsibilities for journalists and the media: informing society and acting as a "watchdog" over government actions. Because Facebook qualifies as a technology company, it can benefit from its power over the provision of information without having to meet the obligations of journalists and the media. This is problematic: the right to receive information, and by extension the right to freedom of expression, is a hugely important right for citizens' participation in public debate and democratic decision-making. It is also an important right for individual self-development".
Increasing pressure
Both media scholars think that the boycott by advertisers will have limited effect. It is mainly a sign of dissatisfaction and may increase the pressure on Facebook to make fundamental changes to the platform. Van Dijck: "Social media platforms have understood two market laws well: convenience and 'free'. Free services that you can use with one click of a mouse are irresistible, even if you pay at the checkout with your personal data and attention. Citizens are becoming increasingly aware of their power as consumers, but are reluctant to deny themselves the convenience of these services. User activism has been used from the outset as a means of pressure to reform Facebook and allow more public control, but its effect has been very limited. If the pressure comes from many sides, however (advertisers, users, governments and activists), it of course amplifies the signal to these platforms: if they want to be part of the solution, they will first have to tackle their problematic technology and revenue models".
Governing the Digital Society
José van Dijck is a university professor who conducts research into digital society and culture, social media and media technologies. Anna Marieke Weerdmeester conducts doctoral research on the responsibilities of social media platforms as gatekeepers of information. Both are affiliated with the focus area Governing the Digital Society. This focus area stimulates research into the social processes of datafication, algorithmisation and platformisation.