Facebook announced Wednesday that it will limit the reach of groups and users that accrue rule violations and, in some cases, remove them altogether in an effort to reduce harmful content and misinformation.
“We’ve taken action to curb the spread of harmful content, like hate speech and misinformation, and made it harder for certain groups to operate or be discovered, whether they’re Public or Private. When a group repeatedly breaks our rules, we take it down entirely,” Facebook announced in a press release Wednesday.
Building upon its decision to stop recommending political groups to U.S. users in January after the Jan. 6 attack on the U.S. Capitol, Facebook will now reduce the privileges and popularity of groups that violate its content moderation rules.
This means users will be less likely to discover groups with past violations of Facebook's community rules and will see warnings when they try to join them.
Groups with a large number of members who have broken Facebook rules will be required to get administrator and moderator approval before posts can be published.
Individual users who have repeatedly violated Facebook policies within a group will be blocked altogether from posting in groups, inviting others to groups, or creating new groups themselves.