Facebook is taking additional steps to ensure the safety of groups on the platform. The social media giant has faced backlash for its handling of controversial material in the past, prompting its crackdown on groups that break Facebook’s policies.
Tom Alison, Facebook’s vice president of engineering, detailed the new measures the platform is taking to secure groups in an About Facebook blog post. Facebook has already tightened restrictions on hate speech, and those restrictions now extend to groups as well.
Facebook has already taken down 1.5 million posts in groups for breaking its organized hate policies, and has removed more than one million groups for violating its Community Standards. Alison notes the extensive measures the platform will take to further combat harmful groups, stating:
We now limit the spread of these groups by removing them from recommendations, restricting them from search, and soon reducing their content in News Feed. We also remove these groups when they discuss potential violence, even if they use veiled language and symbols.
Now, any group member who violates Facebook’s Community Standards will need to have their posts approved by an administrator for 30 days. And if that administrator continually approves inappropriate posts, the entire group will be pulled.
Facebook is putting more responsibility into the hands of administrators, and for that reason, it will start making sure that every group has an active administrator.
Any group that goes without an active administrator for an extended period will be archived. Before archiving a group, though, Facebook will offer the administrator role to group members in good standing.
Lastly, Facebook is taking yet another measure to stop the spread of COVID-19 misinformation. The platform will no longer display health groups in recommendation lists. Although you can still join and search for these groups, you’ll no longer see them pop up in your recommendations.
Facebook clearly isn’t shy about holding groups accountable for inappropriate or misleading content. While the new restrictions may succeed in dissolving certain groups, some harmful content is bound to slip through the cracks.
In the end, it might be better to avoid the drama of Facebook altogether, and just stay off the platform.