The threat of right wing extremism in Australia has been linked to the rise of the internet. Defined by the Australian Strategic Policy Institute as a commitment to an “extreme social, political or ideological position” that is suspicious of “non-white others”, right wing extremism has extended its influence to online spaces.
Far right extremist groups have been prominent on social media apps such as Telegram. A 2021 study by independent researcher Gerard Gill analyzed five conspiracist Telegram channels. Of these, Wake Up Australia had the highest membership (3403), followed by Australian Peacemakers (700), QAnon Australia (334), Australian Freedom Fighters (160), and Project Phoenix Community (84).
The study found that these channels had been promoting a variety of conspiracy theories regarding race and Covid-19. A post from the Wake Up Australia channel attributed the Federal Government’s Online Safety Bill to a failure of multiculturalism and argued that it was used by “Jews” and “communists” to “silence dissent and bully Australians into not speaking out against them”. Another post from the Australian Peacemakers channel claimed that hydroxychloroquine was “discredited as treatment for Covid-19” due to Jewish media.
Far right online spaces have also used mainstream media to spread their ideology. A study conducted by Victoria University last month analyzed 11,000 Facebook posts and 45,000 Gab posts by far right accounts. The Daily Mail was the most shared social media source among far right Facebook users, with 430 posts sharing one of its articles, followed by ABC (337), Sky News (318), and 7News (206). In contrast, the two most shared sources on Gab were Sky News (3,131 posts) and The Daily Mail (1,851), followed by News.com (675), ABC (604), and The Age (227).
The VU study found that the mainstream sources had been reframed for far right ideological messaging. A Gab user shared a balanced Sky News article about Victorian Premier Daniel Andrews’ desire to extend the state government’s emergency powers, arguing that the “plandemic” was merely “about cementing state power”. A Facebook user disapprovingly shared an SBS article which reported the Islamic call to prayer being broadcast at a Sydney mosque, labelling the religion’s ideology as “predatory”.
There have been efforts to address these sorts of online behaviors. A 2020 report noted a stark increase in the volume of hate speech content removed from Facebook. At the end of 2018, three million pieces of content were removed. This number rose to 22 million by the end of 2020.
There has also been government intervention in online activity. The Federal Government’s Online Safety Act was introduced in 2021 to make Australia’s online safety laws stronger and more expansive.
Key changes in this Act include stronger information-gathering powers, the ability to block access to violent material, and an Online Content Scheme to regulate restricted content.
But there is skepticism as to whether these responses will be effective in tackling online right wing extremism. Mario Peucker, a leading expert on the online and offline mobilization of the radical right in Australia, said regulating social media “is tempting but complicated”.
“Once people step into far right spaces and find that community in there, it’s hard to get them back because it’s based on conspiratorial and new community thinking,” according to Peucker.
Peucker said social media platforms such as Telegram “are a network of thousands of far right mini servers” and “nothing happens if you take one down”.
“On Facebook you would see someone challenge an extremist comment but in spaces such as Telegram there’s no challenging by users anymore, so it really amplifies and gets worse,” he said.
A right wing extremism researcher, who requested anonymity for safety reasons, said the power of algorithms was a key aspect of social media which must not be overlooked.
“The algorithms are concerning because they send more extreme content and create these echo chambers,” the researcher said.
The researcher therefore called for “stronger regulation around encryption and transparency”.
The researcher said education could be a helpful way to stop the next generation of Australians from being influenced by right wing extremism online.
“We would be better off investing resources in peer networks to think about having informed conversations,” the researcher said.
Communities must be “better educated” to “understand the extremist content they’re seeing” and “what resources are available”, according to the researcher.