Opinion

Facebook’s failures allow authoritarianism to thrive

Written by Joseph Misuraca

It’s been three years this month since Mark Zuckerberg, CEO of Facebook, testified before the US Congress about the Cambridge Analytica scandal.

In that time, his company has supposedly been removing fake content, whether it’s misinformation about COVID-19 or disinformation produced by political parties in countries across almost every continent.

If you’re hoping the multibillion-dollar social network has learned its lessons after the storming of the US Capitol on January 6 or the recurring leaks of its users’ data since 2019, you’re bound to be disappointed.

Just within the past week, The Guardian*, as it did in March 2018 with the Cambridge Analytica scandal, has revealed in an extensive four-story news package how slow Facebook has been to deactivate fake user accounts run by state organisations in countries like Honduras and Paraguay.

Judging from these revelatory news articles, Facebook is reluctant to combat the proliferation of information pollution in lesser-known countries such as the ex-Soviet nation of Azerbaijan.

When former Facebook data-scientist-turned-whistleblower Sophie Zhang discovered that the president of the Philippines, Rodrigo Duterte, had trolls running smear campaigns against his political opponents, Facebook took months to react.

It only did so once the fake accounts commented on Donald Trump’s Facebook page.

Image of Sophie Zhang during her interview with ‘The Guardian’

How did it get to this?

Social media platforms like Facebook and its rivals, Twitter and YouTube, depend upon algorithms to present relevant information to their users.

Information pollution is disseminated via this process.

Another way political propaganda is being spread is through the use of bots and fake accounts.

Comments, likes, shares and reactions on posts by politicians and political parties’ pages are used either to empower a government or to discredit its opposition.

Zeynep Tufekci, Associate Professor at the University of California, has played the role of sibyl over the past decade.

In her opinion piece for MIT Technology Review, she explains how she foresaw the dangers of algorithms being used to suggest content to Facebook users.

She writes, “I wrote an op-ed for The New York Times [in 2012] voicing these worries”, adding, “I merely advocated transparency and accountability for political ads and content on social media.”

Since the Cambridge Analytica scandal was exposed in 2018, Facebook has been in the spotlight for its failure to properly police individuals and organisations who steal users’ data and use disinformation to cause electoral and civic harm.

What is Facebook doing to stop the spread of information pollution?

Ms Zhang has brought to the public’s attention Facebook’s technical term for using networks of fake accounts to engage users and spread content: ‘coordinated inauthentic behaviour’ (CIB).

The tech company has specialist teams – like the one Ms Zhang worked on before being fired in September last year – tasked with detecting CIB and stifling political manipulation campaigns around the globe.

While this seems to be a positive initiative on Facebook’s part, the social media platform is racially and geographically biased when prioritising which countries’ CIB it tackles first.

The US and Poland, for example, are prioritised over Argentina and Mexico, while Iran and Mongolia are ignored.

Why is it permissible to allow authoritarianism to thrive in some countries and not others?

Is it only important to stop Western nations’ democracies from being destabilised?

Why doesn’t it matter if political dissent is suppressed in developing countries?

Where to from here?

It’s clear from the BBC’s reporting on Myanmar in recent months that Facebook is removing content that “praises or supports the coup” (the Burmese military has now forced internet providers to block Facebook in Myanmar).

Image of Burmese children in Myanmar during the 2018 massacre of Rohingya Muslims. Source: Reuters.

Facebook’s response has been to call on the Burmese authorities to “restore connectivity so that people in Myanmar can communicate” and “access important information” such as updates on COVID-19.

But is this enough?

As detailed in the Tow Center Report, Friend and Foe, “moving forward, platforms…will establish the ground rules related to civic participation, content moderation, and identity authentication across the communities, geographies, and interest groups they operate.”

When will Facebook stop hiding behind its claim to be a ‘neutral platform’ and take responsibility for its actions and inactions?

It may, like its competitors, be renowned for putting profits before people, but it’s just as immoral and unethical to place profits before politics.

*Buzzfeed News initially reported whistleblower and former Facebook data scientist Sophie Zhang’s story in September 2020 by drawing upon her internal memo, not via interviews with her. The Guardian is the first media outlet to interview Ms Zhang.

(Featured Image: Cartoon depiction of Facebook as an all-knowing source from eddierockerz.com.)
About the author

Joseph Misuraca

Joseph is a freelance journalist and writer. His work has been published in the LGBTIQ+ magazine, 'Archer' and 'North & West Melbourne News'. He is currently doing RMIT's Graduate Diploma in Journalism.

5 Comments

  • I was going to say that governments should set up rules saying if social media companies want to run their services within their country then they need to follow that country’s rules…but then I guess that’s where censorship and fake news come in, as not all countries share the same ideals. Perhaps instead countries like Australia etc. can demand that if Facebook etc. wants to run their services in Australia, then they need to do a better job of enforcing free speech and removing fake news outside of the West.

    But I can imagine free speech and fake news have a weird crossover too, especially in America where people are brought up to believe they can say whatever they want as the constitution will protect them. So I dunno bro.

  • It’s inordinately frustrating to see that Facebook prioritises which countries’ CIB it addresses. While yes, it’s a good step to see Facebook address political manipulation – it is still harmful that authoritarian tendencies/regimes remain unchecked in developing countries, such as Iran and Mongolia.
    The initiatives (or lack thereof) of Facebook are an evolving and ever-changing issue. It is often difficult to remain fully informed on what is happening to its users around the world. So it is certainly an entity to always keep an eye on – as it is already watching us!

  • Funny how you mentioned that Facebook took months to react to Duterte’s paid trolls, because Facebook barely did anything about it. The general public knows about the existence of these trolls, journalists know about it, even young children know about it. That’s because up until now, these trolls continue to run rampant on the Filipino side of Facebook.

    It’s like every month they will have these scripted posts, multiple people posting the same exact thing, yet it doesn’t get reported as spam. Although the government of the Philippines can’t completely control Facebook, it still doesn’t change the fact that these trolls threaten people in posts, comments, and even private messages, and Facebook just lets it be. Some would even go so far as to create poser accounts of media people, students, and activists to mar their credibility.

    The bottom line is that more people, at least in the Philippines, would not believe what is on Facebook because of the knowledge of paid trolls and the site not actually dealing with it.

  • Excellent article, Joe. I suspect the reason Facebook is minded to at least give the appearance it is addressing the problem of disinformation in first-world countries, as opposed to lesser-known or poorer nations, lies in the regulatory power first-world countries typically wield. If the United States were hypothetically to pass reforms in the wake of a failure by Facebook to tackle disinformation in the US, that would conceivably have a domino effect across Western Europe and in countries like Australia and New Zealand. Poorer countries typically lack such influence, and so provide less incentive for Facebook to address the problem of disinformation.

  • Hey Joe, fantastic article! Totally agree with you in that Facebook needs to stop hiding behind its statement of it being a ‘neutral platform’ and start taking accountability for its actions and inactions. I believe it’s their way of attempting to remain in everyone’s good books as far as possible. #PRforthewin
