Billions of people around the world use social media services like Facebook to stay in touch with friends and family. Yet on July 20, 2021, President Joe Biden blamed the hesitancy around the COVID-19 vaccine on Facebook, saying that Facebook was allowing misinformation about the COVID-19 vaccine to spread on its platform. “It’s killing people. It’s bad information,” he said. “My hope is that Facebook … would do something about the misinformation, the outrageous misinformation about the vaccine.”
Biden’s statement that Facebook has become a hub of COVID-19 misinformation is TRUE. Facebook itself has admitted that misinformation is common on its platform.
To understand how misinformation spreads on social media platforms like Facebook, it is important to understand how social media uses your information to make billions of dollars a year and how it affects what kind of content you see on those platforms.
This explainer will focus on Facebook, by far the largest and most powerful social network on the planet, which shares many operating similarities with other online services (such as YouTube).
How Facebook makes money
Facebook makes its money through advertising. It is particularly effective at this because of the way it is structured. When you join Facebook, you provide information about your personal preferences, and then your activity on Facebook is used to create a more complete picture of who you are and what you like. That information is then used to show you ads based on your interests and behavior.
For instance, if you say you like cars, car companies will pay money to Facebook to show you their ads.
Every aspect of your identity that you share is used by Facebook to decide which ads to show you, including analysis of the pictures you upload to the site. For instance, Facebook can infer your body type and hair style from the photos you share, which can then affect what types of ads you are shown. Some ads can even use your profile photo without notifying you.
How Facebook Keeps You on the Platform
If Facebook only showed ads, people would leave very quickly. Facebook needs to show you content that will keep you on its website, and that content includes posts from friends and family, posts from groups that you follow, and posts from news outlets and websites.
That content is tailored specifically to your taste, through what is called an algorithm. An algorithm is a set of automated steps carried out in response to information. For instance, traffic lights operate on algorithms based on the number of cars at each light and the speed limit of the road, to make traffic flow as smoothly and safely as possible.
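The traffic-light idea can be made concrete with a toy sketch. The rule and every number below are invented purely for illustration; no real traffic controller works this way:

```python
def green_light_seconds(cars_waiting, speed_limit_mph):
    """Toy traffic-light algorithm: a fixed rule that turns inputs
    (cars waiting, speed limit) into an output (green-light time).
    All constants here are invented for illustration."""
    base = 10                      # minimum green time in seconds
    per_car = 2                    # extra seconds per waiting car
    # Faster roads clear their queues sooner, so they need less extra time.
    speed_factor = 30 / max(speed_limit_mph, 1)
    return base + per_car * cars_waiting * speed_factor

print(green_light_seconds(5, 30))   # a busier light gets a longer green
print(green_light_seconds(0, 30))   # an empty light gets only the minimum
```

The point is not the specific numbers but the shape: the same inputs always produce the same output, with no human deciding each case.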
The algorithms that determine what appears on your Facebook page primarily use machine learning to determine what to show you, with little human intervention. For instance, if you liked your friends’ beach photos, Facebook may show you more beach-related content and ads.
Any information you contribute to Facebook is used to determine what you see next. This includes your posts, photos, and likes. It also includes how quickly you scroll past content on your Facebook News Feed, and what you read without liking or commenting.
It’s also important to know that your Facebook News Feed is unique to you. The posts that you see, the order that you see them, and even what you do not see are all controlled by a system that is designed to maximize how frequently you visit Facebook.
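A drastically simplified sketch of this kind of engagement-driven ranking might look like the following. The signal names and weights are invented for illustration; Facebook's actual system is proprietary, machine-learned, and vastly larger:

```python
# Toy engagement-based feed ranking. Signal names and weights are
# invented for illustration; the real system is not public.
WEIGHTS = {
    "liked_topic_before": 3.0,     # you engaged with this topic in the past
    "friend_posted": 2.0,          # posted by a friend
    "long_dwell_time": 1.5,        # you lingered on similar content
    "scrolled_past_quickly": -1.0, # you ignored similar content
}

def score(signals):
    """Sum the weights of the signals a post triggers for this user."""
    return sum(WEIGHTS[s] for s in signals)

posts = {
    "friend's beach photo": ["friend_posted", "liked_topic_before"],
    "news article": ["long_dwell_time"],
    "ad you ignored": ["scrolled_past_quickly"],
}

# Higher-scoring posts appear earlier in the feed.
feed = sorted(posts, key=lambda p: score(posts[p]), reverse=True)
print(feed)
```

Note that everything you do, including doing nothing (scrolling past quickly), feeds back into the scores, which is why two people's feeds never look the same.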
How Facebook Spreads Misinformation
In the Internet age, it is easy for anyone to set up a website and publish false conspiracy theories, which can then be shared on Facebook. Suppose you are scrolling through your Facebook feed and click on a link from one of these disreputable websites claiming that COVID-19 is a hoax. The Facebook algorithm notices that you engaged with that topic, and will now send you more articles that say COVID-19 is a hoax.
While Facebook recently removed 18 million pieces of COVID-19 misinformation from its platforms, fake news is still a problem. Because misinformation is created to grab attention and stoke fear, people like and share this content believing they are protecting and informing their loved ones. Since the algorithm rewards liking and sharing, those posts get promoted even more aggressively by Facebook, so a single post may be seen by thousands or even millions of people, even if it is promoting false information. Most people sharing fake news online are doing so unknowingly.
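The arithmetic of sharing explains how a single post reaches millions: each round of shares exposes the post to new readers, a fraction of whom share it again, so reach grows geometrically. A back-of-envelope sketch, with all numbers invented for illustration:

```python
# Back-of-envelope model of viral sharing. All numbers are invented
# for illustration only; real sharing dynamics are far messier.
def total_reach(initial_viewers, share_rate, friends_per_share, rounds):
    """Each round, share_rate of the current viewers share the post,
    and each share reaches friends_per_share new people."""
    viewers = initial_viewers
    total = viewers
    for _ in range(rounds):
        viewers = int(viewers * share_rate * friends_per_share)
        total += viewers
    return total

# 100 initial viewers, 5% of whom share to 200 friends each, over 4 rounds:
print(total_reach(100, 0.05, 200, 4))  # over a million people reached
```

As long as each viewer generates more than one new viewer on average (here, 0.05 × 200 = 10), reach compounds every round, which is exactly the dynamic the ranking algorithm amplifies.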
According to a study from the Harvard Misinformation Review, published by Harvard University’s Kennedy School, “false reports spread more virally than real news” precisely because they appeal to people’s fears. A similar finding was also reported by the Center for Cybersecurity at New York University.
Facebook has filed numerous patents on its algorithm technology and this patent portfolio is likely worth billions of dollars to Facebook.
YouTube also has a similar algorithm, where it will show you more videos that are similar to the last video you clicked on in the platform. For example, if you click on a YouTube video about the QAnon conspiracy theory, YouTube will send you more QAnon videos.
It’s also important to know that violent extremists have been found on Facebook in private groups. While private groups can be useful, they are also very difficult to police or observe, and the moderators of the groups have absolute control over what gets posted and who can be in the group. This means that misinformation spreads more easily in private Facebook groups.
Why Are People Spreading Misinformation on Purpose?
Platforms like Facebook have created an unprecedented opportunity for strangers to interact with one another. There are now many bad actors, ranging from individuals to hostile governments, who seek to use Facebook to manipulate Americans. Sometimes these efforts are relatively benign, like tricking you into overpaying for a bad product that doesn’t work. But they can also include outright lies about public health intended to weaken the United States.
Fake news is also profitable, according to Facebook: “These spammers make money by masquerading as legitimate news publishers and posting hoaxes that get people to visit their sites, which are often mostly ads.” A report from the Center for Countering Digital Hate found that many of the creators of COVID-19 misinformation do so to profit from selling unproven health products to fight COVID-19.
What Can You Do?
First, be vigilant. The reason misinformation easily spreads on Facebook is because it appeals to our emotions, according to the Society for Personality and Social Psychology. So if you encounter a post that makes you feel scared or angry, take a breath. Who is making the claim? Are they a reputable professional? What kind of evidence is being presented? Have any reputable news outlets reported on this issue? Being vigilant and not letting your emotions lead will prevent you from being fooled by misinformation.
Remember, just because someone on the Internet said it does not mean it’s true.
Second, review your Facebook privacy settings and make sure you understand who can see what you post. There are many suggestions from reputable sources that can help you set your privacy settings, so you can still share posts with friends and family while limiting how much others may see what you share.
For instance, any content tagged as “public” on Facebook can be used in a national advertising campaign without your knowledge or consent. If there are posts or photos on Facebook that you no longer want to share, delete them. Your profile photo on Facebook is considered public by default, so if it depicts you in a way that you do not want strangers to see, remove it or change the privacy settings.
Most importantly, look at social media as part of a varied and balanced media diet that includes doing your own research and reading reputable news sources, as well as talking to your family and friends in real life.