This is how Facebook drove QAnon’s growth and enhanced its power to poison elections

Donald Trump. File

Roger McNamee, Tribune News Service

Five weeks out from the November election, virulent disinformation from domestic and foreign sources continues to fill internet platforms such as Facebook, Twitter and Google. The most extreme voices are being amplified and conspiracy theories and lies are outcompeting expertise and facts on the most critical issues this nation faces — from the legitimacy of the election to public health measures against the coronavirus.

The damage has been continuous since 2016, and there is little likelihood that these platforms will stop the subversion of truth at this point.

Postmortems of the election months or years from now may tell us with real accuracy how these platforms affected American democracy. For now, examining how this dynamic works could help inoculate voters and limit the harm from this disinformation.

Of all the many sources of falsehoods and conspiracies, the latest wave comes from QAnon, which journalists have described as a cult and a collective delusion, wholly detached from reality. QAnon claims there is a deep state conspiracy of Satan-worshipping Democrats and celebrities, operating a pedophilia ring against which Donald Trump is the only hope. A mysterious character, Q, issues clues that are decoded and shared by followers.

Beginning in 2017 on the fringe platform 4chan, QAnon has grown to global scale — made possible by social media platforms. The game designer Adrian Hon said in an interview with the New York Times that QAnon applies the principles of alternate reality games — specifically the use of the real world as a platform for storytelling — to grow its conspiracy theory.

QAnon treats every event, no matter how far off script, as part of the design. This has allowed QAnon to absorb every conspiracy it encounters and launch many new ones. The recent subversion of the hashtag #SaveTheChildren has enabled QAnon to put a softer face on its movement, attracting millions of women to a far-right network once largely filled with disaffected men.

As QAnon has grown in scale, it has become an animating force of right-wing politics around the world. A poll by Daily Kos/Civiqs revealed that 33% of Republicans believe that QAnon is “mostly true,” while an additional 23% believe “some parts” of the theory are true. Media Matters identified 70 candidates for Congress this year who expressed some level of support for QAnon. At least one of these candidates is favored to win.

Most extreme conspiracies that begin on fringe sites never get any further. The success of QAnon required far more than the embrace of gaming architectures. Internet platforms such as Facebook, Instagram, Google, YouTube and Twitter have provided the algorithmic amplification to drive QAnon from the fringes into the mainstream.

This was not an accident. QAnon is huge and dangerous because these platforms empowered it for their own profit and power.

These platforms have become the most powerful businesses in our economy by converting human attention into revenue. Among the many problematic aspects of their business model are the algorithmic amplification of emotionally engaging content to hold attention and the use of recommendation engines to steer behavior.

When Trump posts a message on Twitter or Facebook that implies support for QAnon, algorithms give it maximum reach because it grabs and holds attention. Trump’s posts appear relatively benign to nonbelievers, but to QAnon followers they are validation. The movement’s presence on Twitter and Facebook allows it to recruit, indoctrinate and influence its audience.

When Facebook’s systems analyze the immense amounts of personal data the company collects, they identify people who might be curious about conspiracy theories and recommend Facebook Groups to join. A Facebook study in 2018 found that 64% of the time a person joins an extremist Facebook Group, they do so because of a Facebook recommendation.

Under pressure from politicians, Twitter banned thousands of QAnon accounts. In August, Facebook removed thousands of QAnon pages and groups, but only those that discussed potential violence. It did not ban thousands of other QAnon pages, including ones that hijacked #SaveTheChildren.

It is bad enough to face the presidential election under the influence of a pandemic and economic contraction. But we are also struggling with unchecked assaults on reason and democracy. Thanks to amplification by internet platforms, QAnon is a key factor in both assaults.
