Facebook Has Removed 1.3 Billion Fake Accounts to Tackle Misinformation

Traditional print news outlets are slowly dying out as social media platforms become the primary news source for more and more people.

At the center of it all is Facebook and its huge user base of 2.8 billion netizens: that's a lot of people you can either inform or misinform about important things. Facebook has just detailed how much misinformation it has dealt with over the past several months.

Facebook Continues to Fight Against Misinformation

On Monday, a post to Facebook's Newsroom revealed that 1.3 billion fake Facebook accounts were taken down between October and December 2020. The company also says it has over 35,000 people working to tackle misinformation on its platform.

Facebook says that since the beginning of the pandemic, it has used its hate-detecting AI systems in tandem with feedback from global health experts to delete posts containing misinformation.

As a result, more than 12 million posts about COVID-19 and vaccines have been removed.

Guy Rosen, Facebook’s VP of Integrity, writes that “misinformation can also be posted by people, even in good faith.” Not everyone posts misinformation with malicious intent; some may be trying to help people but are unaware that the information they’re sharing is factually untrue.

To address this challenge, Rosen writes that Facebook has built a global network of over 80 independent fact-checkers to review content in more than 60 languages:

When they rate something as false, we reduce its distribution so fewer people see it and add a warning label with more information for anyone who sees it. (…) For the most serious kinds of misinformation, such as false claims about COVID-19 and vaccines and content that is intended to suppress voting, we will remove the content.

In addition to taking down misleading or false information, Facebook has also set up some dedicated online spaces for users to find reliable information from trusted experts. Examples include its pages for COVID-19, climate science, and US voting in 2020.

The Spread of Misinformation Across Social Media

Facebook's crackdown ramped up in August 2020, when the company removed 7 million posts containing COVID-19 misinformation. Since then, it has shut down a number of Russian misinformation networks and demoted posts that may contain false claims about the US election.

But Facebook is far from the only social media platform having to deal with the rise of fake news. Recently, Twitter started banning users who repeatedly post COVID-19 misinformation, and even TikTok has removed over 300,000 videos containing election misinformation.

It’s gotten to the point where it’s hard to imagine what more any social network could do to fully address the spread of misinformation. That makes it all the more important to do your part in fighting it.

Ensure that the information you take in every day comes from a credible source. Better yet, we recommend fact-checking news stories before reacting to them.

Source: makeuseof.com
