
Polarisation fuels medical misinformation

Faye Flam, Tribune News Service

FDA commissioner Robert Califf said in an interview that aired this week that misinformation is killing Americans, contributing to a life expectancy that is three to five years shorter than that of people in comparably wealthy countries. He called for better regulation to crack down on misinformation. But would such rules help?

I studied medical misinformation as part of a journalism fellowship, and as I’ve written in previous columns, there is a real danger when misinformed people skip lifesaving vaccines or buy into risky, untested treatments. Yet policing misinformation is tricky.

The fact-checking industry may even make the problem worse by confusing value judgments with facts, and by portraying science as a set of immutable facts, rather than a system of inquiry that constructs provisional theories based on imperfect data.

The advent of artificial intelligence tools like ChatGPT will only magnify the confusion. The latest version, built on GPT-4, is slick, articulate and quick-witted, and some experts worry it could be used as a turbocharged misinformation machine that floods us with AI-generated fake news and fake images.

As my Bloomberg Opinion colleague Niall Ferguson recently wrote, some AI enthusiasts are plotting to “flood the zone with truth” — but this is problematic when people have an inflated idea of their own abilities to identify truth.

A lot of people are upset, even outraged, about rampant misinformation online, but not especially worried about falling for it themselves. The real problem, they assume, lies with all those other, more gullible people.

But according to a new study from Oxford University, the very people who are most worried about misinformation are also the most likely to consider themselves impervious to it. They’re probably overconfident: 80% of those surveyed think they’re above average at spotting misinformation.

Sacha Altay, the cognitive scientist who led the study, said the bottom line is that there’s a strong correlation between concern about misinformation and feelings of superiority in spotting it. This makes sense. If you’re not puffed up with superiority, you’ll assume you’re not special and other people are seeing through the same misleading claims you are.

Altay, who tested participants from both the US and the UK, argued that we’re seeing a moral panic about misinformation that’s been exaggerated by people’s false sense of superior discernment. He said he thinks the media are contributing to an “alarmist” view with stories that, for example, overstate how many people believe in QAnon (a conspiracy theory alleging that liberal elites extract blood from children). Perhaps the public is not as gullible as has been assumed.

Cambridge University psychologist Sander van der Linden, author of the new book “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity,” has done research that shows small nudges can motivate people to be smarter consumers of information. One of his most recent studies tested over 3,000 US participants on their ability to spot fake news stories with a political bent, and found their performance improved remarkably if they were given a cash reward for each right answer.

For the most part, he said, people tend to discount facts that cast political rivals in a positive light. But just the promise of $1 per right answer improved volunteers’ accuracy by 30%.

Spreading misinformation might be more about demonstrating one’s own politics and less about gullibility. In another study, van der Linden and his colleagues found that what really drove engagement on social media was hurling dirt and insults at the other side — technically called “outgroup derogation.” This behavior is rewarded by the group, while those who fail to conform are sidelined or ignored.

Seen through this lens, a group’s reluctance to, say, get a vaccine may stem more from political polarisation than from medical misinformation.

How can we use insights like these to make the world less susceptible to deception and error? To Altay, stamping out misinformation is the wrong goal. Rebuilding public trust is much more important. “It’s very dangerous for a democracy to promote ideas that people are stupid and there is misinformation everywhere,” he said. It’s far better to shore up trust in institutions and in reliable sources of information.

His view reminded me of something I learned from former Soviet-bloc spy Larry Martin (formerly Ladislav Bittman), who defected to the US in the 1980s. He’d created disinformation — even more deliberately deceptive than misinformation — as deputy commander of the Czechoslovak intelligence service.

When I interviewed him in 2017 for this column, he told me that when the Soviets wanted to cause damage, they would spread such propaganda to undermine trust in our institutions — the government, universities, the press. It’s bad for democracy if people lose faith in each other.

And assuming (other) people are stupid is also bad for our health. In every country, people have a range of cognitive strengths and weaknesses. Blaming online misinformation for shrinking American life spans is a cop-out — especially when we have an overburdened health care system that has made serious mistakes, from overprescribing opioids to failing to come up with an effective COVID strategy.

Our brainpower is what it is, but our health care system can do a lot better.
