Disinformation: Defining the problem and how best to counter it

Disinformation is costing lives and undermining democracy around the globe, but there are steps we can take to address it.

That was the conclusion of an international discussion on the issue hosted by SCI.

Social change activists and funders around the world are seeing a significant rise in disinformation, causing serious damage to community cohesion, democracy and wider stability and well-being.

SCI’s online discussion on disinformation was led by three respected contributors:

  • Mike Posner is director of the New York University Center for Business and Human Rights, with a focus on disinformation, social media and content moderation across platforms and countries. He served in the Obama administration as Assistant Secretary of State for Democracy, Human Rights and Labor.
  • Aoife Gallagher is an analyst at the Institute for Strategic Dialogue with a particular interest in the intersection between disinformation, conspiracy theories and far-right extremism. She was previously a journalist with Storyful.
  • Kavisha Pillay is a social change activist and former journalist working with Corruption Watch in South Africa. A qualified data scientist, Kavisha is exploring how big data techniques can be used to advance social justice, and how best to understand disinformation and prevent its exploitation in communities.

Defining the problem

Despite the challenges posed by disinformation, the panellists were keen to acknowledge the internet’s potential as a ‘force for good’.

Panellists felt that we are now at a critical juncture for addressing the damaging effects of disinformation – described by Mike Posner as “deliberately putting disinformation into the system”.

Kavisha Pillay detailed the international impact of disinformation, noting that it is deliberately deployed to do harm and to increase communal tensions. She noted how, even before the emergence of the internet, disinformation was used to weaponise poverty and unemployment, fostering xenophobia and fuelling division. The social nature of how disinformation spreads, she argued, has to be part of the solution.

For all the contributors, educating societies in media and online literacy was essential. 

Aoife Gallagher highlighted Media Literacy Ireland’s ‘Be Media Smart’ campaign.

"We need to inoculate people against these issues,” she added, “with media and online literacy in schools."

For Aoife, one significant negative consequence of the global pandemic was an increased “coalescing of movements”: groups that “used to sit on their own” are now connecting with each other, lifting tactics and approaches that spread disinformation and exploiting “natural levels of distrust”.

She outlined how the most extreme conspiracy theories take hold, noting that they can play on people's fears, pander to their confirmation biases, or give a false sense of 'control' in times of global crisis.

Here, again, education was important, but so too was greater transparency from platforms about how their algorithms promote controversial content.

Mike Posner said social media platforms are failing to enforce their own monitoring rules and are outsourcing their monitoring responsibility.

Despite huge profits, the platforms avoid real scrutiny and fail to take responsibility for the consequences of their business models. Even today, they continue to operate in countries where they have no footprint or physical presence, and no language-appropriate moderation.

He noted that the problems are particularly acute in countries where internet use is high but the platforms have little commercial interest. While there are all kinds of problems in the US and Europe, he said, people there are paying attention and pressing companies to take action. That is not the case in other places.


Where from here?

The SCI event (available in full at the end of this article) heard panellists make a series of proposals, including:

  • embedding critical thinking, digital literacy and fact-checking in school and lifelong learning curricula for information and internet users
  • supporting quality independent journalism to counter disinformation
  • platforms introducing initiatives such as tobacco-style warnings, alerting audiences to the nature of the algorithms pushing divisive content
  • pressing advertisers to use their financial muscle to ensure platforms block disinformation and hate speech
  • fostering global collaboration between groups lobbying against disinformation – doing practical things like ‘campaigns for truth’

Mike said government and civil society "have a collective responsibility" to push companies to change. The impact of disinformation on political activity means democracy is at stake.

The panellists sounded a note of caution about handing governments the power to regulate content, but Mike did list four things government can do. They should ensure that the large online platforms:

  • have a presence in countries they operate in
  • have a content overseer reporting to the platform’s CEO
  • hire their own moderation staff, in sufficient numbers
  • provide clarity and information on what they are actually doing to combat disinformation and hate speech

He added: “I am not at all pessimistic about this. I think these issues have been raised to a level where there is a recognition by governments, by societies as a whole that this is a big problem.

"I think we need to redouble our efforts to try to come up with very practical ways to have government oversee what those [social media] community standards are, create greater transparency, create some kind of ongoing oversight and regulation without individual content determinations. I think it can be done.”

Useful resources:

Watch and share the entire online discussion: