Disinformation, whether it's about climate change, COVID-19 or the war in Ukraine, is plaguing the online spaces we rely on to understand and engage with the world around us. Social media platforms and governments are taking steps to curb this phenomenon online, but the supply of deceit shows no signs of abating; if anything, it's growing more sophisticated by the day. So what are proven, effective tactics to fight disinformation on social media? Researchers at Utrecht University are still grappling with this question, but they warn that many of the existing measures fail to address the number one motive that makes disinformation thrive in the first place: too often, lies are more profitable than the truth. And that's a perverse incentive for content producers and influencers who, in the age of social media, can be any one of us.

Disinformation business

Illustration of Joe Biden. Half his face is covered by a phone screen showing a sinister version of him, an allegory for the incendiary image that disinformation paints of him.
Illustration: Bas Huissen for Utrecht University

You've probably seen them all: a manipulated video of Joe Biden making transphobic comments, memes spreading false claims that 5G networks cause COVID-19, or images of alleged atrocities during the war in Ukraine, taken out of context or reused from other conflicts. Whatever form disinformation takes, these examples all have one thing in common: they are made with the deliberate intent to deceive you.

In the art of manipulation, these examples are just the tip of the iceberg. The increasing sophistication of new technologies, such as deepfake video and audio, will make it ever harder to distinguish what is real from what is not. Just imagine what will be possible in fully immersive virtual environments a few years from now.

Already today, the rise of false or misleading news poses one of the biggest threats to open, democratic societies that depend on the free flow of accurate information for citizens to participate in public life. "Disinformation is splitting society into two camps: there's an 'us' who is right versus a 'them' who is wrong. That is leading to increased division, polarisation and mistrust", says Bruce Mutsvairo, a professor in the department of Media and Culture Studies at Utrecht University. "If people can't agree on even basic facts about vaccines or climate change, we can't work on solutions to our shared problems. We live in two different realities."

Producing and spreading disinformation can serve multiple purposes, says Mutsvairo. "Governments or political candidates are known to peddle it to mobilise support for their cause or discredit the opposition", he says, pointing to Cambridge Analytica's infamous illegitimate use of data from millions of Facebook users to tilt the outcome of the 2016 US presidential election and the Brexit referendum.

Regular citizens, too, are responsible for spreading falsehoods or conspiracy theories ("9/11 was an inside job"), Mutsvairo stresses. "I have seen how politically motivated groups use it to exacerbate conflict in Mali and Ethiopia. A lot of young people feel they have been rejected by the State. They don't trust the government or the media. They don't see a future. It's easy for these people to fall into disinformation's grip."

But the most common motive to spread disinformation online is money, even if it serves other (political) interests too, says José van Dijck, Professor of Media and Digital Society at Utrecht University. "If we're serious about fighting disinformation, we need to understand the perverse financial incentives and the wider infrastructure that enables it", Van Dijck argues. According to her, the problem, ultimately, is not so much fake news itself (after all, it has always existed) as today's information ecosystem and how vulnerable, or amenable, it has become to disinformation.

Social media platforms have a financial incentive to amplify content that is sensationalist, polarising and fake.

Illustration of a healthcare worker administering a vaccine. Superimposed, a phone screen shows the havoc the vaccine is supposed to wreak according to disinformation campaigns.
Illustration: Bas Huissen for Utrecht University

Free pass to disinformation

"Since the advent of digital and social media, anyone can post content on the Internet and reach millions within minutes. As a result, there are fewer gatekeepers to filter the veracity of the information that reaches us", says Van Dijck. "The same tools that are helping spawn pro-democracy movements around the world are also enabling disinformation to spread faster and further than ever before." In fact, research has found that false news stories are 70% more likely to be retweeted than true stories.

The virality of false information online doesn't happen by chance. As Van Dijck explains, "Facebook, Twitter, YouTube and other social media platforms use algorithms to amplify content that is more likely to get your attention. That can be photos of your friends' birthday parties or a nuanced report about mitigating COVID-19, but too often it means content that is sensationalist, polarising and fake." The logic, Van Dijck argues, is simple: these are the types of posts and stories that appeal to us emotionally, and thus keep us engaged.

That's because engagement is the platforms' core business model and money-making metric. "The more time you spend on social media, the more data you're giving away about yourself, and the easier it is for these platforms to target and sell ads. That's how platforms make money out of their free services: by selling your attention to advertisers", says Van Dijck. In 2021 alone, ads earned social media platforms $153 billion, a figure projected to grow to $252 billion by 2026.

Platforms make money by selling your attention to advertisers

"That is a perverse financial incentive to amplify content that is engaging, whether it's accurate or not. That's also the reason why recent social media initiatives, like flagging misleading information, suspending fake accounts or hiring fact-checkers, cannot fully or permanently tackle the problem", says Van Dijck, who is sceptical that platforms can ever do enough to curb disinformation while still profiting from it.

Disrupting the economic incentives is surely an effective solution, but not an easy one, Van Dijck says, in part because we're heavily dependent on these centralised platforms for our global information and communication needs. "A handful of companies with commercial interests now control and procure the information diet of several billion people: the Big Five (Google, Amazon, Apple, Meta, and Microsoft) in the West and another three (Baidu, Alibaba and Tencent) in the East", says Van Dijck, who analyses the workings of these platforms in her book.

"This monopoly situation has given big tech platforms exceptional power to decide what their responsibilities are, or are not, with regard to the content that users post on them. That Elon Musk, for example, can decide whether to allow Donald Trump back onto Twitter or what should count as hate speech is a crazy situation, where you have the owner of a particular platform deciding the rules. These should be negotiated by social contract, conditioned by legal frameworks."

Illustration of a protester holding a peace sign, while a phone screen shows the sign turned into a bomb.
Illustration: Bas Huissen for Utrecht University

Influencers: a new vehicle for disinformation

Regulators attempting to demonetise the spread of disinformation, however, are facing new challenges. "In the past decade, social media platforms have been developing new monetisation strategies that go well beyond selling advertising", says Catalina Goanta, Associate Professor in Private Law and Technology at Utrecht University and Principal Investigator of an ERC Starting Grant. "Think of live shopping on TikTok, crowdfunding, or subscriptions on platforms such as Patreon, Twitch, and YouTube. Chief amongst these is influencer marketing, which allows Internet users not only to engage with advertising, but also to become advertising."

Influencer marketing, which Goanta describes as a form of human advertising, is now a booming industry, expected to reach $15 billion by 2022. "At first, influencers earned revenue by promoting goods or services from sponsoring brands, often inconspicuously, to their large follower base. Nowadays, any Internet user can rise to fame and be compensated for sharing multimedia content about virtually any topic, including news, conspiracy theories, or elections", says Goanta. "That makes influencers a powerful vehicle for disinformation, especially since the relationship with their followers is largely perceived as one based on trust and authenticity."

Influencers are cashing in by sharing ads masked as facts.

The need to keep a steady audience may encourage influencers to engage in unethical practices, and, says Goanta, the examples are mounting: from a scheme to pay TikTok influencers to spread Kremlin propaganda about the war in Ukraine to covert attempts to recruit influencers for other disinformation campaigns, where some influencers immediately turned to their channels to disclose the approach, while others appeared to take up the offer.

"Influencers are cashing in by sharing ads masked as facts. The industry behind influencer marketing is really opaque, so we only see an influencer holding a product or spreading a political idea, but we don't see the contracting parties", says Goanta. "And at the moment, we don't yet have clear legal definitions to differentiate between content and advertising on social media."

The European Union's Digital Services Act (DSA), says Goanta, is a groundbreaking piece of legislation attempting to address some of these online harms. "The DSA covers the liability of social media platforms for the types of content and economic transactions on them. It sets higher standards of consumer and citizen protection than platforms themselves are offering, demanding greater accountability for content moderation and greater transparency about the workings of their algorithms and targeted advertising practices", explains Goanta.

"However, the European regulator has specifically left influencer marketing outside of the DSA, and in my opinion, wrongfully so. Influencer marketing is a buzzword, but the underlying phenomenon of native advertising is a long-standing problem in the world of advertising: the cat-and-mouse game of hiding advertising and influencing audiences. We've seen this with product placement, with native ads as news editorials and now with social media content", says Goanta. "At Utrecht University's Governing the social media and data economy group, we want to chart more clearly the harms that emerge in this evolving social media landscape and how they can best be regulated."

Public alternatives

José van Dijck agrees that the European Union's Digital Services Act is a step in the right direction to strengthen platform governance. Besides that, she argues, we need to invest in alternatives to corporate platforms. In a joint project, Van Dijck is cooperating with public organisations in public media, cultural heritage, festivals, museums and education to define what a fully socially responsible online space would look like.

The main goal is "to design a stack of platforms where users are not viewed as exploitable assets or data sources, but as equal partners that share a common public interest. Existing alternative platforms based on public values need to be made accessible and interoperable, while new ones may have to be designed. This is not simply a technical process but also requires serious reflection on the (local) governance and moderation of platforms."

Education is key

Until online public spaces become a solid alternative, there's one more solution in our toolbox against disinformation, because not everyone contributing to its spread is motivated by profit. Often, it's just regular citizens like you and me who share false information unknowingly (referred to as 'misinformation'). It's therefore important that everyone realises their own responsibility, but also their own power to act against attempts to confuse and deceive, says Eugène Loos, Associate Professor at the Utrecht University School of Governance. Loos believes that educating citizens to spot disinformation is key to stopping it from spreading.

"Providing media literacy for both young and older people is a durable solution, now and in the future. If you learn how to assess the trustworthiness of a message or the credibility of the source, you will be better equipped to think critically before sharing and amplifying dubious content", explains Loos, whose research on access to reliable digital information and the role of media literacy programmes suggests it's possible to start building immunity against disinformation.

Media literacy empowers people to distinguish what is reliable media content and what is not.

"Just as with vaccines, exposing people to the tactics commonly used can build immunity against future manipulation attempts", says Loos. And that works better when done preemptively, he says, after reviewing existing media literacy interventions. "Exposing students to fake news websites or having them play fake news games could be far more effective than more traditional approaches that focus on debunking misconceptions with facts."

The reason, Loos suspects, lies in the flawed assumption that we can convince people or change their minds simply by giving them the facts. "Years of behavioural research show that it's also about reaching them on an emotional level", he says. His own research among Dutch primary school children showed that we're more likely to fall for fake news when we're emotionally invested. "That's also the way out. Educational programmes will need to counter the emotional trigger of those falsehoods."

More research into what makes media literacy programmes successful is certainly needed: many of the interventions Loos came across were not evidence-based, and they were mainly confined to school settings, excluding older generations as well as the real-life scenarios where people are targeted with disinformation. Still, Loos is confident about the value of investing in research on this educational approach: "Instead of appointing platforms or governments as the arbiters of truth, media literacy empowers people so that they themselves are able to distinguish what is reliable media content and what is not."

Disinformation is ubiquitous across digital and social media platforms. Governments, social media platforms and regular citizens all have a role to play in its spread. By shining a light on its business logic, our researchers identify the shortcomings of current policies and legislation that fail to tackle the economic incentive to spread disinformation. And by studying media literacy approaches, they can understand which ones work best to blunt its appeal. In the end, disinformation may never fully disappear, but these joint efforts can already make it harder for its beneficiaries to spread falsehoods online, and provide citizens with the skills and knowledge to access reliable, accurate information. Because that is a fundamental right.


Meet the experts

  • Bruce Mutsvairo, Professor in the department of Media and Culture Studies at Utrecht University
  • José van Dijck, Professor of Media and Digital Society at Utrecht University
  • Catalina Goanta, Associate Professor in Private Law and Technology at Utrecht University
  • Eugène Loos, Associate Professor at the Utrecht University School of Governance
