After election-related misinformation led to widespread unrest in the United States, greater pressure has been placed on YouTube, Facebook, and Twitter to address content on these platforms that leads to misinformed views. According to many political analysts, misinformation can lead to extremist thinking and even violence.
Part of the problem is that these systems were designed to keep people engaged. Every time you watch a video, like a post, or retweet something, the platform serves you similar content so that you will engage with it too. This is an effective way to get more users to spend more time on these sites and apps.
While online services are protected from liability for the content that their users post, there is a renewed willingness from these platforms to remove election-related misinformation. Said Facebook CEO Mark Zuckerberg in a post before the 2020 Election: “We want to make sure people can speak up if they encounter problems at the polls or have been prevented from voting, but that doesn’t extend to spreading misinformation.”
So what can you, the reader, do to fight misinformation?
You can flag content so that YouTube, Twitter, and Facebook know to review and potentially remove it. These platforms have teams dedicated to identifying and taking down misinformation and incitements to violence, but they can always use more eyes.
Flagging is a way that you can help this process by identifying misleading content, which will then be reviewed by the platforms themselves. Outcomes of flagging can range from nothing to the removal of a video, user, or channel from a service. There are no penalties for flagging content, even if you make a mistake.
You must be logged into YouTube in order to report content.
On the YouTube app:
- Tap a video while it’s playing, or paused, and then tap the “More” button on the video, which looks like three vertical dots ( ⋮ ).
- Tap “Report” which has a flag icon next to it.
- Select your reason for reporting. The correct report for misinformation is “Spam or misleading”.
On the YouTube.com website:
- If it is not immediately obvious that the video is misinformation, pause the video at the point where the misinformation occurs.
- Under the video, click the ellipsis (…) and then click “Report”.
- Select your reason for reporting. The correct report for misinformation is “Spam or misleading”.
- You now have the option to add additional information. The timestamp of the video is also recorded so that YouTube knows which part of the video is misleading.
You can report almost any content on Facebook, which includes Posts, Groups, User Profiles, and Photos.
- Click the ellipsis (…) on any content you want to report.
- Click on “Find support or report post”.
- Click the reason (for example: “False News”).
- Click “Submit” or “Next”—only one of these choices will appear.
The main way to report misleading content on Twitter is by reporting a tweet or reporting a user.
- On any tweet, click the ellipsis (…) icon on the tweet itself.
- Select “Report Tweet”.
- Twitter will give you a couple of questions to click through to help categorize the issue.
- Finally, you can select additional tweets from the same account that may have the same issue (up to four more, for a maximum of five tweets). Click either “Skip” or “Add (number of additional tweets)” to finish sending your report.