In April this year, the CoronaVirusFacts Alliance database recorded nearly 4,000 coronavirus-related hoaxes circulating around the world. In the UK, a study by telecoms regulator Ofcom suggested that 46% of adults on the internet saw false or misleading information about Covid-19 in the first week alone of the country’s lockdown. To get a clearer picture of what some are already calling “the infodemic,” we’ve taken a look at just how much fake news has increased during the pandemic, why it’s happening, and what platforms can do to prevent it. One study also found that interactions with articles from deceptive sites increased by a worrying 242% between the third quarter of 2016 and the third quarter of 2020.
What are misinformation and disinformation?
Misinformation: Misleading advice or statistics – usually spread by well-meaning but ill-informed individuals. This type of information could be spread by governments releasing skewed data, or perhaps a family member wanting to help keep their loved ones safe. Examples include:
- Disputed causes of coronavirus
- Unproven coronavirus treatments
- Confusion over lockdown rules
Disinformation: Malicious, false information, aimed at disrupting public order or manipulating an agenda. This is made more dangerous by the speed at which fake news spreads during a crisis, on social media, or on messaging platforms. Examples include:
- Coronavirus is a hoax
- 5G caused Covid-19
- Coronavirus is government propaganda
This rapid spread of potentially harmful information has meant even the United Nations has had to set up its own initiative, named Verified, to combat misinformation.
1. An increase in social media usage
Naturally, we’ve all been using our phones a lot more since being locked into the confinement of our own four walls. Overall, 37.9% of people in the US have been spending two hours or more each day on social media during the pandemic. A recent study found that 56% of people who believe there’s no hard evidence that coronavirus exists get a lot of their information from Facebook. Plus, 60% of people who believe there is a link between 5G and Covid-19 get a great deal of their information on the virus from YouTube, rather than traditional media. In 2019, 49% of Americans who shared fake news later found out it was fake, after sharing it with friends.

Social media services are making various efforts to combat this. Twitter, for example, is asking users to read articles before they share them, and actively flags known fake articles as harmful. President Trump even went as far as suggesting the consumption of bleach could cure the virus – and we only need to take a glance at his Twitter account for examples of how disinformation can be easily spread by key political figures.

In this isolation, many of us have turned to social media – a handy online tool to make the separation more bearable. This has meant, however, that we have the potential to consume more misleading information – as we mentioned earlier, social media is a catalyst for the spread of misinformation. Researchers collected more than 200 million tweets discussing the virus since January and found that about 45% were sent by accounts that behave more like automated bots than humans.

This matters for two reasons. The first is that misinformation about the virus itself can lead to deaths. The second is that it has seeped into an increasingly polarised political environment in the run-up to an already divisive election. Below, we’ve dug a bit deeper into how these two factors have been wreaking havoc in America.
1. Misinformation about the virus causing harm
A global pandemic isn’t something many of us have faced in our lifetimes – it was always going to cause people to panic. UN Secretary-General António Guterres noted, “As soon as the virus spread across the globe, inaccurate and even dangerous messages proliferated wildly over social media, leaving people confused, misled, and ill-advised.”

A key example of how misinformation holds potential danger is Trump’s claim that bleach could cure the virus. The president has since claimed he was joking, but for the avoidance of doubt, his words were:

As you can imagine, most people were taken aback. But did anyone follow Trump’s suggestion? In the graph below, there is a clear correlation between Trump’s bizarre bleach statement in April and the number of bleach-poisoning incidents.

QAnon is a far-right conspiracy theory. It alleges that a cabal of Satan-worshiping pedophiles is running a global child sex-trafficking ring and plotting against US President Donald Trump, who is battling against the cabal. Since the beginning of the pandemic, this eccentric group has gained a huge following: in just four months, Facebook memberships of the biggest QAnon groups rose by a massive 700%. This disinformation is not only feeding people wrong information, but some believe it is spurring on the next wave of terrorism.
Facebook
Facebook said it had removed “hundreds of thousands” of coronavirus posts that could have led to harm, while putting warning labels on “90 million pieces of misinformation” globally in March and April. Through pop-ups and a new COVID-19 information center on Facebook, the company says it’s directed more than 1 billion people on Facebook and Instagram to resources from the World Health Organization. Facebook recently went as far as to remove a post from the President which claimed that Covid-19 is less deadly than the seasonal flu.
Twitter
In March, Twitter broadened its policy guidance to address content that goes directly against guidance on COVID-19. Earlier this year, Twitter also introduced a new label on Tweets containing potentially harmful, misleading information related to COVID-19. These warnings inform people that the information in the Tweet conflicts with public health experts’ guidance before they view it. In its announcement, Twitter stated that “given the dynamic situation, we will prioritize review and labeling of content that could lead to increased exposure or transmission.” This may well be why it has been so hot on Donald Trump’s tail – having deleted multiple of his tweets over misinformation since March.
YouTube
More than a quarter of the most-viewed coronavirus videos on YouTube contain misleading or inaccurate information. The site has, however, taken measures to try and combat this spread of misinformation. Content that violates its misinformation policy will be removed, and creators will be sent a confirmation email. If it’s your first violation, you’ll get a warning with no penalty to your channel. If it’s not, YouTube will issue a strike against your channel. If you get three strikes, your channel will be terminated.
Coronavirus shows no sign of going away any time soon – and with some countries going back into lockdown, we all need to keep a vigilant eye out for fake news. Whether it’s being spread intentionally or not, in some cases it has the potential to be as dangerous as the virus itself.