As Covid-19 Continues to Spread, So Does Misinformation About It

Nearly three years into the pandemic, Covid-19 remains stubbornly persistent. So, too, does misinformation about the virus.

As Covid cases, hospitalizations and deaths rise in parts of the country, myths and misleading narratives continue to evolve and spread, exasperating overburdened doctors and evading content moderators.

What began in 2020 as rumors casting doubt on the existence or seriousness of Covid quickly evolved into often outlandish claims about dangerous technology lurking in masks and supposed miracle cures involving unproven drugs, like ivermectin. Last year’s vaccine rollout fueled another wave of unfounded alarm. Now, in addition to all the claims still being bandied about, there are conspiracy theories about the long-term effects of the treatments, researchers say.

The ideas still thrive on social media platforms, and the constant barrage, now a yearslong accumulation, has made it increasingly difficult for accurate advice to break through, misinformation researchers say. That leaves people already suffering from pandemic fatigue further inured to Covid’s continuing dangers and more susceptible to other harmful medical content.

“It’s easy to forget that health misinformation, including about Covid, can still contribute to people not getting vaccinated or creating stigmas,” said Megan Marrelli, the editorial director of Meedan, a nonprofit focused on digital literacy and information access. “We know for a fact that health misinformation contributes to the spread of real-world disease.”

Twitter is of particular concern for researchers. The company recently gutted the teams responsible for keeping dangerous or inaccurate material in check on the platform, stopped enforcing its Covid misinformation policy and began basing some content moderation decisions on public polls posted by its new owner and chief executive, the billionaire Elon Musk.

From Nov. 1 to Dec. 5, Australian researchers collected more than half a million conspiratorial and misleading English-language tweets about Covid, using terms such as “deep state,” “hoax” and “bioweapon.” The tweets drew more than 1.6 million likes and 580,000 retweets.

The researchers said the volume of toxic material surged late last month with the release of a film that included baseless claims that Covid vaccines set off “the greatest orchestrated die-off in the history of the world.”

Naomi Smith, a sociologist at Federation University Australia who helped conduct the research with Timothy Graham, a digital media expert at Queensland University of Technology, said Twitter’s misinformation policies helped tamp down anti-vaccination content that had been common on the platform in 2015 and 2016. From January 2020 to September 2022, Twitter suspended more than 11,000 accounts over violations of its Covid misinformation policy.

Now, Dr. Smith said, the protective barriers are “falling over in real time, which is both interesting as an academic and absolutely terrifying.”

“Pre-Covid, people who believed in medical misinformation were generally just talking to each other, contained within their own little bubble, and you had to go and do a bit of work to find that bubble,” she said. “But now, you don’t have to do any work to find that information — it is presented in your feed with any other types of information.”

Several prominent Twitter accounts that had been suspended for spreading unfounded claims about Covid were reinstated in recent weeks, including those of Representative Marjorie Taylor Greene, a Georgia Republican, and Robert Malone, a vaccine skeptic.

Mr. Musk himself has used Twitter to weigh in on the pandemic, predicting in March 2020 that the United States was likely to have “close to zero new cases” by the end of that April. (More than 100,000 positive tests were reported to the Centers for Disease Control and Prevention in the last week of the month.) This month, he took aim at Dr. Anthony S. Fauci, who will soon step down as President Biden’s top medical adviser and the longtime director of the National Institute of Allergy and Infectious Diseases. Mr. Musk said Dr. Fauci should be prosecuted.

Twitter did not respond to a request for comment. Other major social platforms, including TikTok and YouTube, said last week that they remained committed to combating Covid misinformation.

YouTube prohibits content — including videos, comments and links — about vaccines and Covid-19 that contradicts recommendations from the local health authorities or the World Health Organization. Facebook’s policy on Covid-19 content is more than 4,500 words long. TikTok said it had removed more than 250,000 videos for Covid misinformation and worked with partners such as its content advisory council to develop its policies and enforcement strategies. (Mr. Musk disbanded Twitter’s advisory council this month.)

In years past, people would get medical advice from neighbors, or try to self-diagnose via Google search, said Dr. Anish Agarwal, an emergency physician in Philadelphia. Now, years into the pandemic, he still gets patients who believe “crazy” claims on social media that Covid vaccines will insert robots into their arms.

“We battle that every single day,” said Dr. Agarwal, who teaches at the University of Pennsylvania’s Perelman School of Medicine and serves as deputy director of Penn Medicine’s Center for Digital Health.

Online and offline discussions of the coronavirus are constantly shifting, with patients bringing him questions lately about booster shots and long Covid, Dr. Agarwal said. He has a grant from the National Institutes of Health to study the Covid-related social media habits of different populations.

“Moving forward, understanding our behaviors and thoughts around Covid will probably also shine light on how individuals interact with other health information on social media, how we can actually use social media to combat misinformation,” he said.

Years of lies and rumors about Covid have had a contagion effect, damaging public acceptance of all vaccines, said Heidi J. Larson, the director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.

Dr. Graham Walker, an emergency physician in San Francisco, said the rumors spreading online about the pandemic drove him and many of his colleagues to social media to try to correct inaccuracies. He has posted several Twitter threads with more than a hundred evidence-packed tweets trying to debunk misinformation about the coronavirus.

But this year, he said, he felt increasingly defeated by the onslaught of toxic content about a variety of medical issues. He left Twitter after the company abandoned its Covid misinformation policy.

“I began to think that this was not a winning battle,” he said. “It doesn’t feel like a fair fight.”
