By Yana Milcheva and April Buxton
The role of US social media companies in providing a platform for ‘alternative facts’ has become an issue of core public concern and, whatever the outcome of the election, these companies are likely to face significant regulatory and antitrust scrutiny from a political class that feels they have become too powerful. This has prompted something of a mindset change within the companies themselves, which have made efforts to improve their algorithms for detecting and flagging disinformation and coordinated fake accounts. The QAnon movement, a far-right online conspiracy community, has been identified by the FBI and others as playing a key role in amplifying disinformation and corrupting public discourse. The close links between QAnon communities, Donald Trump, and anti-vaccine narratives on Facebook and Twitter have prompted the social media networks to act to limit the movement’s reach. Whether that action will prove effective in the longer term is questionable.
The QAnon movement, whose name derives from the “Q clearance” used by the US Department of Energy, is driven by a far-right conspiracy theory. The theory surfaced between 2016 and 2017 and reportedly developed from Cicada 3301, an online movement built around cryptic coding puzzles. The movement embraces numerous theories centred on the belief that President Trump is waging a secret war against elite, Satan-worshipping paedophiles in government, business, and the media. QAnon supporters, referred to as ‘Bakers’, believe that President Trump is working to expose and dismantle these networks and to prevail against the so-called “deep state”.
Two years ago, QAnon conspiracy theories could be found only on anonymous message boards such as 4chan. Today, they have become part of mainstream political discourse. QAnon supporters’ activity mainly consists of driving and amplifying specific keywords and hashtags, and of coordinating online abuse of perceived enemies. The movement’s growing influence online has prompted the FBI and the Combating Terrorism Center at West Point to classify the conspiracy group as a domestic terrorist threat, and the movement has since been condemned by the US House of Representatives. This is not a threat defined by its number of core adherents but, rather, by its capacity, via the internet, to inject disinformation into the political process.
QAnon theories are often embedded within current news stories and political conversations. The COVID-19 outbreak enabled QAnon supporters to gain a prominent voice outside their network by sharing conspiracy videos such as ‘Plandemic’ on social media, which suggest that the pandemic was orchestrated as a means of controlling the population and restricting people’s liberty. The movement is also said to have found supporters in high places: numerous current and former members of the Trump administration, such as General Michael Flynn, have endorsed some of the conspiracy theories shared by QAnon, although it is unclear whether these endorsements were genuinely held or, rather, motivated by the theories’ popularity among Trump supporters. President Trump himself refused to condemn the movement, referring to its members as people who “love our [USA] country”. The 2020 US elections further confirmed that the movement has gained a foothold in key government institutions. On November 3, Marjorie Taylor Greene, an outspoken QAnon supporter, won a House seat in Georgia for the Republican party. Greene is only one of numerous Republican politicians who have supported QAnon, either by sharing and amplifying its theories or by refusing to condemn the movement, allowing it to grow roots within not just American but global politics.
Twitter recently announced that it will suspend accounts tweeting about the QAnon theory that have violated the platform’s user guidelines; 7,000 accounts were removed in the immediate aftermath. In addition, Twitter no longer promotes QAnon-related content and accounts in its ‘Recommendations’ section, and it blocks associated URLs from being shared. YouTube and TikTok are proceeding in the same manner: YouTube recently announced its decision to remove QAnon conspiracy content used to justify real-world violence, and TikTok has devised policies to remove QAnon-affiliated accounts and content, including various hashtags.
Similarly, Facebook announced a complete ban on QAnon-themed groups, pages, and Instagram accounts, a considerable development on the platform’s initial decision to ban only those QAnon groups and pages that openly discussed violence. The fact that the new rules will be enforced by Facebook’s Dangerous Organizations Operations team signals that the company is taking the threat of QAnon seriously. Although the recent decisions taken by many mainstream social media platforms seem satisfactory on their face, there remain potential loopholes that the QAnon community can exploit. For example, Facebook has not implemented a policy affecting individual users who post pro-QAnon content, and no individual Instagram user can be banned unless their account explicitly “represents” them as a QAnon supporter. These platforms are therefore only willing to remove explicit QAnon accounts, leaving a vast number of threats undetected.
The QAnon community anticipated several of these countermeasures and had been preparing for a potential crackdown by the social media companies. The anonymous ‘Q’ account, considered a leading figure in the movement, has advised members of the network to deploy alternative language that will not be flagged by the platforms’ moderators. For example, many pages and groups have begun using the number ‘17’ (Q being the seventeenth letter of the alphabet) as a substitute for ‘Q’, to prevent their pages from being banned.
It is still too early to determine whether the platforms’ policy changes will be effective. Progress in limiting the reach of these online communities has been slow. In June, TikTok removed various QAnon-affiliated hashtags; despite its efforts, however, at least 14 of these hashtags remained live, accumulating over 488 million views, a reflection of the movement’s vast reach. Whilst the recent steps taken by social media platforms are a start, they are unlikely to be sufficient, especially if core Trump voters organise online to contest the election result. Just as the platforms’ approach is often compromised by their advertising and business objectives, the technological solutions are also unclear. Creating a regulatory framework that effectively limits the reach of such conspiracies will require regulators worldwide to mobilise, yet in recent history coordinated multilateral action has been notoriously hard to fashion. The outlook is not optimistic.